403ing AI crawlers
Now that I'm self-hosting this site and its associated infrastructure, the public-facing part (where you're reading this) is served by a boring old Linux/Apache/PHP stack, which means I had to write an .htaccess file.
Naturally, because I could, I wrote the .htaccess file in Liquid, with the relevant portion below (the data is sourced from an 11ty robots.js data file):
```liquid
{%- assign userAgents = "" -%}
{% for robot in robots -%}
{%- for userAgent in robot.user_agents -%}
{%- if ...
```
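The excerpt cuts off at the `if`, so what follows is a hedged sketch rather than the author's actual file: the common pattern for a loop like this is to join every user agent into a single pipe-delimited alternation, then feed it to a `RewriteCond` and return a 403 via mod_rewrite's `[F]` flag.

```liquid
{%- assign userAgents = "" -%}
{% for robot in robots -%}
{%- for userAgent in robot.user_agents -%}
{%- if userAgents == "" -%}
{%- assign userAgents = userAgent -%}
{%- else -%}
{%- assign userAgents = userAgents | append: "|" | append: userAgent -%}
{%- endif -%}
{%- endfor -%}
{%- endfor %}
RewriteEngine On
# Case-insensitive match against the joined list of crawler user agents
RewriteCond %{HTTP_USER_AGENT} ({{ userAgents }}) [NC]
# [F] sends 403 Forbidden, [L] stops further rule processing
RewriteRule .* - [F,L]
```

With a `robots` collection containing, say, GPTBot and CCBot, this would render to something like `RewriteCond %{HTTP_USER_AGENT} (GPTBot|CCBot) [NC]` followed by the forbidden rule.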
Read more at coryd.dev