When you add a link on your blog to an online resource, odds are it will break over time. Some sites, like Wikipedia, are relatively stable, but blogs, news sites, and even open source docs change from time to time. Sometimes the authors put redirects in place; other times the links simply go stale.
It turns out to be a huge problem.
To help combat dead links, you may want to try the deadlink Python project. It offers a simple command line interface for finding bad links (and it has a well-designed logo, too).
After installing the project via

pip install deadlink

you can point it at a local README.md file to check for bad links:

deadlink check README.md
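To make the idea concrete, here is a minimal sketch of what a link checker like this does under the hood. The helper names (extract_urls, is_alive, check_file) are made up for illustration; deadlink's actual implementation differs and does much more.

```python
import re
import urllib.request

# Minimal sketch of a link checker (not deadlink's actual code):
# pull URLs out of a file's text, request each one, and flag
# anything that doesn't answer with a success status.

URL_PATTERN = re.compile(r"https?://[^\s)\"'>\]]+")

def extract_urls(text):
    """Return all http(s) URLs found in a blob of text."""
    return URL_PATTERN.findall(text)

def is_alive(url, timeout=5):
    """Best-effort check: a HEAD request answering below 400 counts as alive."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status < 400
    except Exception:
        return False

def check_file(path):
    """Return the URLs in the file at `path` that appear to be dead."""
    with open(path) as handle:
        urls = extract_urls(handle.read())
    return [url for url in urls if not is_alive(url)]
```

Calling check_file("README.md") would return the suspect URLs; the real tool adds parallel requests, allow/ignore lists, and much nicer reporting.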
You can also configure the tool, either via command line arguments or via a config file. For example, copying the config listed in the docs:
allow_urls = ["https:"]
ignore_urls = [
  "stackoverflow.com",
  "math.stackexchange.com",
  "discord.gg",
  "doi.org",
]
ignore_files = [".svg"]
There's even a subcommand that lets deadlink attempt to replace redirected URLs with their new targets:
deadlink replace-redirects ...
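Conceptually, redirect replacement boils down to two steps: follow each URL's redirect chain to its final location, then substitute the new URL for the old one in the file's text. A rough sketch, with made-up helper names that are not deadlink's API:

```python
import urllib.request

# Rough sketch of redirect replacement (helper names are hypothetical):
# resolve a URL to its final location, then rewrite the text.

def resolve(url, timeout=5):
    """urlopen follows redirects, so geturl() returns the final location."""
    with urllib.request.urlopen(url, timeout=timeout) as response:
        return response.geturl()

def replace_redirects(text, redirects):
    """Rewrite `text` given a mapping of {old_url: final_url}."""
    for old, new in redirects.items():
        if old != new:
            text = text.replace(old, new)
    return text
```

You would build the redirects mapping by calling resolve on each URL found in the file, then write the rewritten text back to disk.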
You could run this tool as a CI step for your docs, but it might be best run as a weekly cron job on GitHub Actions. Either way, it's a good habit to check for bad links once in a while!
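For the weekly cron approach, a GitHub Actions workflow could look something like the sketch below. The file path, schedule, and action versions are assumptions, not copied from deadlink's docs:

```yaml
# Hypothetical workflow sketch: .github/workflows/deadlink.yml
name: deadlink
on:
  schedule:
    - cron: "0 6 * * 1"   # every Monday at 06:00 UTC
  workflow_dispatch:       # also allow manual runs
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
      - run: pip install deadlink
      - run: deadlink check README.md
```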