fix some stuff

James Ravenscroft 2024-09-07 09:22:56 +01:00
parent d5e97e4413
commit 3f0c28daf8
14 changed files with 280 additions and 165 deletions


@@ -2,3 +2,15 @@
---
Welcome to the Digital Home of James Ravenscroft, Machine Learning and NLP specialist and software generalist.
<img src="/images/avatar_small.png" style="max-width:250px;" class="u-photo">
I am a Chief Technology Officer and Software Engineer specialising in Machine Learning and, in particular, Natural Language Processing. I am an amateur musician, cook and photographer, and I love to read fiction and watch box sets and movies. I live with my wife and cats in the south of England.
On this site you will find:
- A selection of [essays and long-form posts](/posts/) that I have written about software engineering, philosophy, machine learning and AI, and personal topics.
- A [microblog of shorter content](/notes/) in response to things that interest me, including some photos.
- [Links](/bookmarks/) to content that I've found and read along with my comments and responses.
You can find more of my thoughts and in-progress writing over on my [Digital Garden](https://notes.jamesravey.me).


@@ -1,6 +1,9 @@
---
date: '2021-12-24T13:51:30.902871'
in-reply-to: https://stackoverflow.com/questions/15974730/how-do-i-get-the-different-parts-of-a-flask-requests-url
in-reply-to:
title: python - How do I get the different parts of a Flask request's url? - Stack
Overflow
url: https://stackoverflow.com/questions/15974730/how-do-i-get-the-different-parts-of-a-flask-requests-url
post_meta:
- date
type: replies
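
Each of the frontmatter diffs in this commit follows the same migration: a bare `in-reply-to` URL string becomes a mapping carrying both the `url` and the fetched page `title`. A minimal sketch of that transformation (the helper name `upgrade_in_reply_to` is hypothetical, not code from the repo):

```python
def upgrade_in_reply_to(meta, title):
    """Replace a plain in-reply-to URL string with a {url, title} mapping.

    Leaves the metadata untouched if the field is absent, already a
    mapping, or if no title was found for the page.
    """
    reply = meta.get("in-reply-to")
    if isinstance(reply, str) and title:
        meta["in-reply-to"] = {"url": reply, "title": title}
    return meta
```

Applied to the post above, `{"in-reply-to": "https://stackoverflow.com/questions/15974730/how-do-i-get-the-different-parts-of-a-flask-requests-url"}` plus the fetched title would yield the nested shape shown in the diff.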


@@ -1,6 +1,9 @@
---
date: '2022-01-28T10:39:03.399512'
in-reply-to: https://news.ycombinator.com/item?id=30114173
in-reply-to:
title: 'Tell HN: Twitter is growing increasingly unusable without an account | Hacker
News'
url: https://news.ycombinator.com/item?id=30114173
mp-syndicate-to:
- https://brid.gy/publish/mastodon
post_meta:


@@ -1,6 +1,9 @@
---
date: '2022-01-28T10:53:28.497119'
in-reply-to: https://fosstodon.org/web/@_jacobtomlinson/107700838541067318
in-reply-to:
title: 'Jacob Tomlinson: "I also hear people rave about Podman as the new D…" -
Fosstodon'
url: https://fosstodon.org/web/@_jacobtomlinson/107700838541067318
mp-syndicate-to:
- https://brid.gy/publish/mastodon
post_meta:


@@ -1,6 +1,10 @@
---
date: '2022-11-01T12:56:31.585963'
in-reply-to: https://andthentheresphysics.wordpress.com/2022/10/29/beyond-catastrophe/
in-reply-to:
title: '
Beyond Catastrophe | …and Then There''s Physics'
url: https://andthentheresphysics.wordpress.com/2022/10/29/beyond-catastrophe/
post_meta:
- date
tags:


@@ -1,6 +1,8 @@
---
date: '2022-11-03T09:32:06.885194'
in-reply-to: https://brainbaking.com/post/2022/10/should-we-build-our-own-wayback-machines/
in-reply-to:
title: Should We Build Our Own Wayback Machines? | Brain Baking
url: https://brainbaking.com/post/2022/10/should-we-build-our-own-wayback-machines/
mp-syndicate-to:
- https://brid.gy/publish/mastodon
post_meta:


@@ -1,6 +1,8 @@
---
date: '2022-11-19T18:30:36.835806'
in-reply-to: https://brainbaking.com/post/2022/11/finding-stuff-on-big-blogs/
in-reply-to:
title: Finding Stuff on Big Blogs | Brain Baking
url: https://brainbaking.com/post/2022/11/finding-stuff-on-big-blogs/
post_meta:
- date
tags:


@@ -1,6 +1,8 @@
---
date: '2022-11-19T22:30:10.027914'
in-reply-to: https://boffosocko.com/2020/05/24/a-hack-for-using-hypothes-is-to-annotate-on-mobile/
in-reply-to:
title: A hack for using Hypothes.is to annotate on mobile | Chris Aldrich
url: https://boffosocko.com/2020/05/24/a-hack-for-using-hypothes-is-to-annotate-on-mobile/
post_meta:
- date
tags:


@@ -1,6 +1,8 @@
---
date: '2023-08-01T20:24:13.925346'
in-reply-to: https://ploum.net/2023-08-01-splitting-the-web.html
in-reply-to:
title: Splitting the Web
url: https://ploum.net/2023-08-01-splitting-the-web.html
post_meta:
- date
tags:
@@ -9,7 +11,6 @@ tags:
- makers
type: replies
url: /replies/2023/08/01/1690921453
---
I really related to this blog post just as I also related to Kev Quirk's recent article about forums: https://kevquirk.com/bring-back-the-humble-forum


@@ -1,6 +1,8 @@
---
date: '2023-09-26T06:58:00.856284'
in-reply-to: https://tracydurnell.com/2023/09/25/apathy-at-work/
in-reply-to:
title: Apathy at work – Tracy Durnell's Mind Garden
url: https://tracydurnell.com/2023/09/25/apathy-at-work/
post_meta:
- date
tags:
@@ -8,7 +10,6 @@ tags:
- Philosophy
type: replies
url: /replies/2023/09/26/1695711480
---
> I don't get apathy at work because I care way too much about pretty much everything. This isn't necessarily a good thing in an office job. I was chatting with a friend about how caring can become a maladaptive trait at work, when you're asked to do something that really doesn't matter and no one cares about, but you just can't bring yourself to do mediocre work.

@@ -1 +1 @@
Subproject commit 4739bdb60036bd783ebd04fbf5151ab6346a63c9
Subproject commit b52055184906a68a52a18e8755efcfe01b91c5fb


@@ -6,12 +6,90 @@ import ujson
import frontmatter
from urllib.parse import urlparse
from bs4 import BeautifulSoup
def get_html_title(url):
"""
Fetches the HTML content from a given URL and returns its title.
Args:
url (str): The URL to fetch HTML content from.
Returns:
str: The title of the fetched HTML content, or None if it couldn't be found.
"""
try:
# Send an HTTP GET request to the URL
response = requests.get(url)
# Check if the request was successful (status code 200)
if response.status_code == 200:
# Parse the HTML content using BeautifulSoup
soup = BeautifulSoup(response.content, 'html.parser')
# Find and return the title of the HTML document
title = None
if soup.title:
title = soup.title.string
# Return None if no title could be found
if not title:
return None
return title
else:
print(
f"Failed to fetch HTML content. Status code: {response.status_code}")
except Exception as e:
print(f"An error occurred: {e}")
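
`get_html_title` depends on `requests` and BeautifulSoup. For illustration only, the title-extraction step can also be done with the standard library's `html.parser`; this is a sketch of the same idea, not the script's actual code:

```python
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collects the text content of the first <title> element seen."""

    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = None

    def handle_starttag(self, tag, attrs):
        # Only capture the first <title>; ignore any later ones
        if tag == "title" and self.title is None:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data


def extract_title(html_text):
    """Return the document title, or None if the page has none."""
    parser = TitleParser()
    parser.feed(html_text)
    return parser.title.strip() if parser.title else None
```

This only replaces the parsing step; fetching the page and handling non-200 responses would still work as in `get_html_title` above.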
@click.group()
def cli():
dotenv.load_dotenv()
@cli.command()
@click.option("--folder", type=click.Path(dir_okay=True, file_okay=False), required=True)
def fetch_link_titles(folder):
"""Fetch titles for reply and bookmark links"""
for root, _, files in os.walk(folder):
for file in files:
if file.endswith(".md"):
full_path = os.path.join(root, file)
data = frontmatter.load(full_path)
print(f"Analysing... {full_path}")
reply_data = data.get('in-reply-to')
# Guard against posts with no reply target: 'in' on None raises TypeError
if reply_data is None:
    continue
if isinstance(reply_data, str) and 'twitter.com' in reply_data:
    print("Not grabbing title for tweet")
    continue
if isinstance(reply_data, str):
title = get_html_title(reply_data)
if title is not None:
print(f"Found in-reply-to title: '{title}'")
data['in-reply-to'] = {"url": reply_data,
"title": str(title)}
print(f"Updating in-reply-to data... {full_path}")
with open(full_path, 'wb') as f:
frontmatter.dump(data, f)
@cli.command()
@click.option("--folder", type=click.Path(dir_okay=True, file_okay=False), required=True)
@click.option("--old_type", type=str, required=True)
@@ -23,20 +101,22 @@ def fix_post_types(folder: str, old_type: str, new_type: str):
for file in files:
if file.endswith(".md"):
full_path = os.path.join(root,file)
full_path = os.path.join(root, file)
data = frontmatter.load(full_path)
print(f"Analysing... {full_path}")
if 'type' not in data:
print(f"Skipping {full_path} due to incomplete frontmatter")
print(
f"Skipping {full_path} due to incomplete frontmatter")
continue
if(data['type'] == old_type):
print(f"Update type for {full_path}: {old_type}->{new_type}")
if (data['type'] == old_type):
print(
f"Update type for {full_path}: {old_type}->{new_type}")
data['type'] = new_type
with open(full_path,'wb') as f:
with open(full_path, 'wb') as f:
frontmatter.dump(data, f)
@@ -51,7 +131,7 @@ def set_page_meta(folder: str, page_meta: str):
for file in files:
if file.endswith(".md"):
full_path = os.path.join(root,file)
full_path = os.path.join(root, file)
data = frontmatter.load(full_path)
print(f"Update page_meta for {full_path}: {meta}")
@@ -59,22 +139,20 @@ def set_page_meta(folder: str, page_meta: str):
del data['page_meta']
data['post_meta'] = meta
with open(full_path,'wb') as f:
with open(full_path, 'wb') as f:
frontmatter.dump(data, f)
@cli.command()
@click.option("--mentions-file", type=click.Path(file_okay=True), required=True)
def fetch_mentions(mentions_file: str):
"""Fetch web mentions and store as json"""
mention_ids = set()
if os.path.exists(mentions_file):
print(f"Load existing mentions from {mentions_file}")
with open(mentions_file,'r') as f:
with open(mentions_file, 'r') as f:
mentions = ujson.load(f)
print(mentions.keys())
print(f"Found existing mentions for {len(mentions.keys())} urls")
@@ -85,7 +163,8 @@ def fetch_mentions(mentions_file: str):
mention_ids.update([post['id'] for post in mentionset])
print("Requesting new mentions...")
r = requests.get(f"https://webmention.io/api/mentions.json?token={os.environ.get('WEBMENTIONSIO_API_KEY')}")
r = requests.get(
f"https://webmention.io/api/mentions.json?token={os.environ.get('WEBMENTIONSIO_API_KEY')}")
if r.json().get('error') is not None:
print(f"Failed to request webmentions: {r.json()}")
@@ -98,7 +177,6 @@ def fetch_mentions(mentions_file: str):
if target not in mentions:
mentions[target] = []
if link['id'] not in mention_ids:
mention_ids.add(link['id'])
mentions[target].append(link)
@@ -107,10 +185,9 @@ def fetch_mentions(mentions_file: str):
print(f"Found {new} new mentions")
print(f"Storing mentions at {mentions_file}")
with open(mentions_file,'w') as f:
with open(mentions_file, 'w') as f:
ujson.dump(mentions, f, indent=2)
if __name__ == "__main__":
cli()
cli()
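
The `fetch_mentions` command merges newly fetched webmentions into the stored JSON, keyed by target URL and de-duplicated by mention `id`. A standalone sketch of that merge step (the function name and field names are illustrative, inferred from the variables visible in the diff):

```python
def merge_mentions(existing, fetched):
    """Merge fetched mentions into existing {target_url: [mention, ...]},
    appending only mentions whose id has not been seen before.
    Returns the updated mapping and the count of newly added mentions."""
    # Collect ids already stored, mirroring the mention_ids set in the script
    seen = {m["id"] for posts in existing.values() for m in posts}
    new = 0
    for link in fetched:
        target = link["target"]
        if target not in existing:
            existing[target] = []
        if link["id"] not in seen:
            seen.add(link["id"])
            existing[target].append(link)
            new += 1
    return existing, new
```

Re-running the merge with the same payload adds nothing, which is what makes the command safe to invoke repeatedly from a cron job or build step.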

bstools/poetry.lock (generated)

@@ -1,15 +1,23 @@
# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
[[package]]
name = "beautifulsoup4"
version = "4.12.2"
version = "4.12.3"
description = "Screen-scraping library"
category = "main"
optional = false
python-versions = ">=3.6.0"
files = [
{file = "beautifulsoup4-4.12.3-py3-none-any.whl", hash = "sha256:b80878c9f40111313e55da8ba20bdba06d8fa3969fc68304167741bbf9e082ed"},
{file = "beautifulsoup4-4.12.3.tar.gz", hash = "sha256:74e3d1928edc070d21748185c46e3fb33490f22f52a3addee9aee0f4f7781051"},
]
[package.dependencies]
soupsieve = ">1.2"
[package.extras]
cchardet = ["cchardet"]
chardet = ["chardet"]
charset-normalizer = ["charset-normalizer"]
html5lib = ["html5lib"]
lxml = ["lxml"]
@@ -17,28 +25,37 @@ lxml = ["lxml"]
name = "certifi"
version = "2021.10.8"
description = "Python package for providing Mozilla's CA Bundle."
category = "main"
optional = false
python-versions = "*"
files = [
{file = "certifi-2021.10.8-py2.py3-none-any.whl", hash = "sha256:d62a0163eb4c2344ac042ab2bdf75399a71a2d8c7d47eac2e2ee91b9d6339569"},
{file = "certifi-2021.10.8.tar.gz", hash = "sha256:78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872"},
]
[[package]]
name = "charset-normalizer"
version = "2.0.9"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
category = "main"
optional = false
python-versions = ">=3.5.0"
files = [
{file = "charset-normalizer-2.0.9.tar.gz", hash = "sha256:b0b883e8e874edfdece9c28f314e3dd5badf067342e42fb162203335ae61aa2c"},
{file = "charset_normalizer-2.0.9-py3-none-any.whl", hash = "sha256:1eecaa09422db5be9e29d7fc65664e6c33bd06f9ced7838578ba40d58bdf3721"},
]
[package.extras]
unicode_backport = ["unicodedata2"]
unicode-backport = ["unicodedata2"]
[[package]]
name = "click"
version = "8.0.3"
description = "Composable command line interface toolkit"
category = "main"
optional = false
python-versions = ">=3.6"
files = [
{file = "click-8.0.3-py3-none-any.whl", hash = "sha256:353f466495adaeb40b6b5f592f9f91cb22372351c84caeb068132442a4518ef3"},
{file = "click-8.0.3.tar.gz", hash = "sha256:410e932b050f5eed773c4cda94de75971c89cdb3155a72a0831139a79e5ecb5b"},
]
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
@@ -48,42 +65,54 @@ importlib-metadata = {version = "*", markers = "python_version < \"3.8\""}
name = "colorama"
version = "0.4.4"
description = "Cross-platform colored terminal text."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
files = [
{file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"},
{file = "colorama-0.4.4.tar.gz", hash = "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b"},
]
[[package]]
name = "idna"
version = "3.3"
description = "Internationalized Domain Names in Applications (IDNA)"
category = "main"
optional = false
python-versions = ">=3.5"
files = [
{file = "idna-3.3-py3-none-any.whl", hash = "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff"},
{file = "idna-3.3.tar.gz", hash = "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"},
]
[[package]]
name = "importlib-metadata"
version = "4.10.0"
description = "Read metadata from Python packages"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
{file = "importlib_metadata-4.10.0-py3-none-any.whl", hash = "sha256:b7cf7d3fef75f1e4c80a96ca660efbd51473d7e8f39b5ab9210febc7809012a4"},
{file = "importlib_metadata-4.10.0.tar.gz", hash = "sha256:92a8b58ce734b2a4494878e0ecf7d79ccd7a128b5fc6014c401e0b61f006f0f6"},
]
[package.dependencies]
typing-extensions = {version = ">=3.6.4", markers = "python_version < \"3.8\""}
zipp = ">=0.5"
[package.extras]
docs = ["sphinx", "jaraco.packaging (>=8.2)", "rst.linker (>=1.9)"]
docs = ["jaraco.packaging (>=8.2)", "rst.linker (>=1.9)", "sphinx"]
perf = ["ipython"]
testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.0.1)", "packaging", "pyfakefs", "flufl.flake8", "pytest-perf (>=0.9.2)", "pytest-black (>=0.3.7)", "pytest-mypy", "importlib-resources (>=1.3)"]
testing = ["flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.0.1)", "pytest-flake8", "pytest-mypy", "pytest-perf (>=0.9.2)"]
[[package]]
name = "markdownify"
version = "0.11.6"
description = "Convert HTML to markdown."
category = "main"
optional = false
python-versions = "*"
files = [
{file = "markdownify-0.11.6-py3-none-any.whl", hash = "sha256:ba35fe289d5e9073bcd7d2cad629278fe25f1a93741fcdc0bfb4f009076d8324"},
{file = "markdownify-0.11.6.tar.gz", hash = "sha256:009b240e0c9f4c8eaf1d085625dcd4011e12f0f8cec55dedf9ea6f7655e49bfe"},
]
[package.dependencies]
beautifulsoup4 = ">=4.9,<5"
@@ -93,9 +122,12 @@ six = ">=1.15,<2"
name = "python-dotenv"
version = "0.19.2"
description = "Read key-value pairs from a .env file and set them as environment variables"
category = "main"
optional = false
python-versions = ">=3.5"
files = [
{file = "python-dotenv-0.19.2.tar.gz", hash = "sha256:a5de49a31e953b45ff2d2fd434bbc2670e8db5273606c1e737cc6b93eff3655f"},
{file = "python_dotenv-0.19.2-py2.py3-none-any.whl", hash = "sha256:32b2bdc1873fd3a3c346da1c6db83d0053c3c62f28f1f38516070c4c8971b1d3"},
]
[package.extras]
cli = ["click (>=5.0)"]
@@ -104,138 +136,27 @@ cli = ["click (>=5.0)"]
name = "python-frontmatter"
version = "1.0.0"
description = "Parse and manage posts with YAML (or other) frontmatter"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "python-frontmatter-1.0.0.tar.gz", hash = "sha256:e98152e977225ddafea6f01f40b4b0f1de175766322004c826ca99842d19a7cd"},
{file = "python_frontmatter-1.0.0-py3-none-any.whl", hash = "sha256:766ae75f1b301ffc5fe3494339147e0fd80bc3deff3d7590a93991978b579b08"},
]
[package.dependencies]
PyYAML = "*"
[package.extras]
docs = ["sphinx"]
test = ["pytest", "toml", "pyaml"]
test = ["pyaml", "pytest", "toml"]
[[package]]
name = "pyyaml"
version = "6.0"
description = "YAML parser and emitter for Python"
category = "main"
optional = false
python-versions = ">=3.6"
[[package]]
name = "requests"
version = "2.26.0"
description = "Python HTTP for Humans."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
[package.dependencies]
certifi = ">=2017.4.17"
charset-normalizer = {version = ">=2.0.0,<2.1.0", markers = "python_version >= \"3\""}
idna = {version = ">=2.5,<4", markers = "python_version >= \"3\""}
urllib3 = ">=1.21.1,<1.27"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)", "win-inet-pton"]
use_chardet_on_py3 = ["chardet (>=3.0.2,<5)"]
[[package]]
name = "six"
version = "1.16.0"
description = "Python 2 and 3 compatibility utilities"
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
[[package]]
name = "soupsieve"
version = "2.4.1"
description = "A modern CSS selector implementation for Beautiful Soup."
category = "main"
optional = false
python-versions = ">=3.7"
[[package]]
name = "typing-extensions"
version = "4.0.1"
description = "Backported and Experimental Type Hints for Python 3.6+"
category = "main"
optional = false
python-versions = ">=3.6"
[[package]]
name = "ujson"
version = "5.1.0"
description = "Ultra fast JSON encoder and decoder for Python"
category = "main"
optional = false
python-versions = ">=3.7"
[[package]]
name = "urllib3"
version = "1.26.7"
description = "HTTP library with thread-safe connection pooling, file post, and more."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4"
[package.extras]
brotli = ["brotlipy (>=0.6.0)"]
secure = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "certifi", "ipaddress"]
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[[package]]
name = "zipp"
version = "3.7.0"
description = "Backport of pathlib-compatible object wrapper for zip files"
category = "main"
optional = false
python-versions = ">=3.7"
[package.extras]
docs = ["sphinx", "jaraco.packaging (>=8.2)", "rst.linker (>=1.9)"]
testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.0.1)", "jaraco.itertools", "func-timeout", "pytest-black (>=0.3.7)", "pytest-mypy"]
[metadata]
lock-version = "1.1"
python-versions = "^3.7"
content-hash = "37f6b249fc390c867f7f098d6b2a25155384e22a69faa6bb1276e93d559f3450"
[metadata.files]
beautifulsoup4 = []
certifi = [
{file = "certifi-2021.10.8-py2.py3-none-any.whl", hash = "sha256:d62a0163eb4c2344ac042ab2bdf75399a71a2d8c7d47eac2e2ee91b9d6339569"},
{file = "certifi-2021.10.8.tar.gz", hash = "sha256:78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872"},
]
charset-normalizer = [
{file = "charset-normalizer-2.0.9.tar.gz", hash = "sha256:b0b883e8e874edfdece9c28f314e3dd5badf067342e42fb162203335ae61aa2c"},
{file = "charset_normalizer-2.0.9-py3-none-any.whl", hash = "sha256:1eecaa09422db5be9e29d7fc65664e6c33bd06f9ced7838578ba40d58bdf3721"},
]
click = [
{file = "click-8.0.3-py3-none-any.whl", hash = "sha256:353f466495adaeb40b6b5f592f9f91cb22372351c84caeb068132442a4518ef3"},
{file = "click-8.0.3.tar.gz", hash = "sha256:410e932b050f5eed773c4cda94de75971c89cdb3155a72a0831139a79e5ecb5b"},
]
colorama = [
{file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"},
{file = "colorama-0.4.4.tar.gz", hash = "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b"},
]
idna = [
{file = "idna-3.3-py3-none-any.whl", hash = "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff"},
{file = "idna-3.3.tar.gz", hash = "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"},
]
importlib-metadata = [
{file = "importlib_metadata-4.10.0-py3-none-any.whl", hash = "sha256:b7cf7d3fef75f1e4c80a96ca660efbd51473d7e8f39b5ab9210febc7809012a4"},
{file = "importlib_metadata-4.10.0.tar.gz", hash = "sha256:92a8b58ce734b2a4494878e0ecf7d79ccd7a128b5fc6014c401e0b61f006f0f6"},
]
markdownify = []
python-dotenv = [
{file = "python-dotenv-0.19.2.tar.gz", hash = "sha256:a5de49a31e953b45ff2d2fd434bbc2670e8db5273606c1e737cc6b93eff3655f"},
{file = "python_dotenv-0.19.2-py2.py3-none-any.whl", hash = "sha256:32b2bdc1873fd3a3c346da1c6db83d0053c3c62f28f1f38516070c4c8971b1d3"},
]
python-frontmatter = []
pyyaml = [
files = [
{file = "PyYAML-6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4db7c7aef085872ef65a8fd7d6d09a14ae91f691dec3e87ee5ee0539d516f53"},
{file = "PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9df7ed3b3d2e0ecfe09e14741b857df43adb5a3ddadc919a2d94fbdf78fea53c"},
{file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77f396e6ef4c73fdc33a9157446466f1cff553d979bd00ecb64385760c6babdc"},
@@ -243,6 +164,13 @@ pyyaml = [
{file = "PyYAML-6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f84fbc98b019fef2ee9a1cb3ce93e3187a6df0b2538a651bfb890254ba9f90b5"},
{file = "PyYAML-6.0-cp310-cp310-win32.whl", hash = "sha256:2cd5df3de48857ed0544b34e2d40e9fac445930039f3cfe4bcc592a1f836d513"},
{file = "PyYAML-6.0-cp310-cp310-win_amd64.whl", hash = "sha256:daf496c58a8c52083df09b80c860005194014c3698698d1a57cbcfa182142a3a"},
{file = "PyYAML-6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d4b0ba9512519522b118090257be113b9468d804b19d63c71dbcf4a48fa32358"},
{file = "PyYAML-6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:81957921f441d50af23654aa6c5e5eaf9b06aba7f0a19c18a538dc7ef291c5a1"},
{file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:afa17f5bc4d1b10afd4466fd3a44dc0e245382deca5b3c353d8b757f9e3ecb8d"},
{file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dbad0e9d368bb989f4515da330b88a057617d16b6a8245084f1b05400f24609f"},
{file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:432557aa2c09802be39460360ddffd48156e30721f5e8d917f01d31694216782"},
{file = "PyYAML-6.0-cp311-cp311-win32.whl", hash = "sha256:bfaef573a63ba8923503d27530362590ff4f576c626d86a9fed95822a8255fd7"},
{file = "PyYAML-6.0-cp311-cp311-win_amd64.whl", hash = "sha256:01b45c0191e6d66c470b6cf1b9531a771a83c1c4208272ead47a3ae4f2f603bf"},
{file = "PyYAML-6.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:897b80890765f037df3403d22bab41627ca8811ae55e9a722fd0392850ec4d86"},
{file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50602afada6d6cbfad699b0c7bb50d5ccffa7e46a3d738092afddc1f9758427f"},
{file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:48c346915c114f5fdb3ead70312bd042a953a8ce5c7106d5bfb1a5254e47da92"},
@@ -270,20 +198,68 @@ pyyaml = [
{file = "PyYAML-6.0-cp39-cp39-win_amd64.whl", hash = "sha256:b3d267842bf12586ba6c734f89d1f5b871df0273157918b0ccefa29deb05c21c"},
{file = "PyYAML-6.0.tar.gz", hash = "sha256:68fb519c14306fec9720a2a5b45bc9f0c8d1b9c72adf45c37baedfcd949c35a2"},
]
requests = [
[[package]]
name = "requests"
version = "2.26.0"
description = "Python HTTP for Humans."
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
files = [
{file = "requests-2.26.0-py2.py3-none-any.whl", hash = "sha256:6c1246513ecd5ecd4528a0906f910e8f0f9c6b8ec72030dc9fd154dc1a6efd24"},
{file = "requests-2.26.0.tar.gz", hash = "sha256:b8aa58f8cf793ffd8782d3d8cb19e66ef36f7aba4353eec859e74678b01b07a7"},
]
six = [
[package.dependencies]
certifi = ">=2017.4.17"
charset-normalizer = {version = ">=2.0.0,<2.1.0", markers = "python_version >= \"3\""}
idna = {version = ">=2.5,<4", markers = "python_version >= \"3\""}
urllib3 = ">=1.21.1,<1.27"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)", "win-inet-pton"]
use-chardet-on-py3 = ["chardet (>=3.0.2,<5)"]
[[package]]
name = "six"
version = "1.16.0"
description = "Python 2 and 3 compatibility utilities"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
]
soupsieve = []
typing-extensions = [
[[package]]
name = "soupsieve"
version = "2.4.1"
description = "A modern CSS selector implementation for Beautiful Soup."
optional = false
python-versions = ">=3.7"
files = [
{file = "soupsieve-2.4.1-py3-none-any.whl", hash = "sha256:1c1bfee6819544a3447586c889157365a27e10d88cde3ad3da0cf0ddf646feb8"},
{file = "soupsieve-2.4.1.tar.gz", hash = "sha256:89d12b2d5dfcd2c9e8c22326da9d9aa9cb3dfab0a83a024f05704076ee8d35ea"},
]
[[package]]
name = "typing-extensions"
version = "4.0.1"
description = "Backported and Experimental Type Hints for Python 3.6+"
optional = false
python-versions = ">=3.6"
files = [
{file = "typing_extensions-4.0.1-py3-none-any.whl", hash = "sha256:7f001e5ac290a0c0401508864c7ec868be4e701886d5b573a9528ed3973d9d3b"},
{file = "typing_extensions-4.0.1.tar.gz", hash = "sha256:4ca091dea149f945ec56afb48dae714f21e8692ef22a395223bcd328961b6a0e"},
]
ujson = [
[[package]]
name = "ujson"
version = "5.1.0"
description = "Ultra fast JSON encoder and decoder for Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "ujson-5.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:644552d1e89983c08d0c24358fbcb5829ae5b5deee9d876e16d20085cfa7dc81"},
{file = "ujson-5.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0cae4a9c141856f7ad1a79c17ff1aaebf7fd8faa2f2c2614c37d6f82ed261d96"},
{file = "ujson-5.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ba63b789d83ca92237dbc72041a268d91559f981c01763a107105878bae442e"},
@@ -335,11 +311,39 @@ ujson = [
{file = "ujson-5.1.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:ce620a6563b21aa3fbb1658bc1bfddb484a6dad542de1efb5121eb7bb4f2b93a"},
{file = "ujson-5.1.0.tar.gz", hash = "sha256:a88944d2f99db71a3ca0c63d81f37e55b660edde0b07216fb65a3e46403ef004"},
]
urllib3 = [
[[package]]
name = "urllib3"
version = "1.26.7"
description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4"
files = [
{file = "urllib3-1.26.7-py2.py3-none-any.whl", hash = "sha256:c4fdf4019605b6e5423637e01bc9fe4daef873709a7973e195ceba0a62bbc844"},
{file = "urllib3-1.26.7.tar.gz", hash = "sha256:4987c65554f7a2dbf30c18fd48778ef124af6fab771a377103da0585e2336ece"},
]
zipp = [
[package.extras]
brotli = ["brotlipy (>=0.6.0)"]
secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)"]
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[[package]]
name = "zipp"
version = "3.7.0"
description = "Backport of pathlib-compatible object wrapper for zip files"
optional = false
python-versions = ">=3.7"
files = [
{file = "zipp-3.7.0-py3-none-any.whl", hash = "sha256:b47250dd24f92b7dd6a0a8fc5244da14608f3ca90a5efcd37a3b1642fac9a375"},
{file = "zipp-3.7.0.tar.gz", hash = "sha256:9f50f446828eb9d45b267433fd3e9da8d801f614129124863f9c51ebceafb87d"},
]
[package.extras]
docs = ["jaraco.packaging (>=8.2)", "rst.linker (>=1.9)", "sphinx"]
testing = ["func-timeout", "jaraco.itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.0.1)", "pytest-flake8", "pytest-mypy"]
[metadata]
lock-version = "2.0"
python-versions = "^3.7"
content-hash = "1672b483488a4907061160b05790c138fb3e199ace6a2b5374a4512a76a49c2a"


@@ -16,6 +16,7 @@ click = "^8.0.3"
python-dotenv = "^0.19.2"
markdownify = "^0.11.6"
python-frontmatter = "^1.0.0"
beautifulsoup4 = "^4.12.3"
[tool.poetry.dev-dependencies]