robots.txt: dynamically disallow development domains
author	Mischa POSLAWSKY <perl@shiar.org>
Wed, 15 Nov 2023 17:06:03 +0000 (18:06 +0100)
committer	Mischa POSLAWSKY <perl@shiar.org>
Sun, 3 Dec 2023 20:31:05 +0000 (21:31 +0100)
Not requested frequently enough to warrant static caching.

robots.txt [deleted file]
robots.txt.plp [new file with mode: 0644]

diff --git a/robots.txt b/robots.txt
deleted file mode 100644 (file)
index 15b6ecb..0000000
+++ /dev/null
@@ -1,5 +0,0 @@
-User-agent: *
-Disallow: /source/*::*
-
-Sitemap: http://sheet.shiar.nl/sitemap.xml
-Host: sheet.shiar.nl
diff --git a/robots.txt.plp b/robots.txt.plp
new file mode 100644 (file)
index 0000000..15e80a3
--- /dev/null
@@ -0,0 +1,10 @@
+<(common.inc.plp)><:
+$header{content_type} = 'text/plain; charset=us-ascii';
+checkmodified($ENV{SCRIPT_FILENAME});
+
+say 'User-agent: *';
+say 'Disallow: ', $Dev ? '/' : '/source/*::*';
+#say 'Disallow: /font/*?q=*';
+:>
+Sitemap: http://sheet.shiar.nl/sitemap.xml
+Host: sheet.shiar.nl
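
The new template relies on a $Dev flag provided by common.inc.plp, which is not part of this diff. A minimal sketch of how such a flag might be derived, assuming it is keyed on the request host (the actual logic may differ):

	# hypothetical: treat any host other than the canonical sheet.shiar.nl as development
	our $Dev = ($ENV{HTTP_HOST} // '') ne 'sheet.shiar.nl';

With such a flag in place, a development host would serve roughly

	User-agent: *
	Disallow: /

	Sitemap: http://sheet.shiar.nl/sitemap.xml
	Host: sheet.shiar.nl

while the canonical host keeps the original Disallow: /source/*::* rule.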