Dark Visitors is a plugin for Kirby 3 and 4 that blocks unwanted AI crawlers from your website using robots.txt. It uses the Dark Visitors API to identify known crawlers and block them.
It also allows you to add custom rules and your sitemaps to your robots.txt file.
composer require mauricerenck/darkvisitors
Or download the latest release, unzip it, and copy it to site/plugins/dark-visitors.
You need a Dark Visitors access token to use this plugin. Go to https://darkvisitors.com/, create an account, and set up your own project. Open the project and copy your access token from its settings.
Edit your config.php and add the following line:
'mauricerenck.dark-visitors.token' => 'YOUR TOKEN'
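For context, the option sits in the usual Kirby config return array; a minimal site/config/config.php could look like this (the token string is a placeholder):

```php
<?php

// site/config/config.php
return [
    // Access token from your Dark Visitors project settings
    'mauricerenck.dark-visitors.token' => 'YOUR TOKEN',
];
```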
Set which types of AI crawlers you want to block:
'mauricerenck.dark-visitors.aiTypes' => ['AI Assistant', 'AI Data Scraper', 'AI Search Crawler'],
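For each selected type, the plugin fetches the matching agents from the Dark Visitors API and disallows them in your robots.txt. The output should contain groups along these lines (the agent name is illustrative, not an exhaustive list):

```txt
User-agent: GPTBot
Disallow: /
```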
Add your custom rules to the robots.txt file:
'mauricerenck.dark-visitors.agents' => [
    [
        'userAgents' => ['Googlebot', 'Bingbot'],
        'disallow' => ['/admin'],
    ],
    [
        'userAgents' => ['Bingbot'],
        'allow' => ['/microsoft'],
    ],
],
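Given the rules above, the generated robots.txt would contain groups roughly like the following (stacking multiple User-agent lines per group is standard robots.txt syntax; the exact formatting may vary by plugin version):

```txt
User-agent: Googlebot
User-agent: Bingbot
Disallow: /admin

User-agent: Bingbot
Allow: /microsoft
```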
Setting your custom rules will overwrite the default rules, which are:
[
    'userAgents' => ['*'],
    'disallow' => ['/kirby', '/site'],
],
Add your sitemaps to the robots.txt file:
'mauricerenck.dark-visitors.sitemaps' => [
    'Sitemap: https://your-site.tld/sitemap.xml',
    'Sitemap: https://your-site.tld/sitemap2.xml',
],
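Since each entry already contains the full Sitemap directive, and assuming entries are appended verbatim, the generated robots.txt should simply end with lines like:

```txt
Sitemap: https://your-site.tld/sitemap.xml
Sitemap: https://your-site.tld/sitemap2.xml
```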
Dark Visitors offers a tracking feature. To use it, enable analytics in your config:
'mauricerenck.dark-visitors.analytics' => true,