# Botly

Botly is a Filament plugin to manage your site's robots.txt file directly from the Filament admin panel. Rules, sitemaps, and AI crawler blocks are stored in the database and served dynamically — no static file required.
## Installation

Install the package via Composer:

```bash
composer require awcodes/botly
```

Run the installation command to publish the migrations and run them:

```bash
php artisan botly:install
```

Or publish and run the migration manually:

```bash
php artisan vendor:publish --tag="botly-migrations"
php artisan migrate
```

Optionally, publish the config file:

```bash
php artisan vendor:publish --tag="botly-config"
```

Register the plugin in your Filament panel provider:
```php
use Awcodes\Botly\BotlyPlugin;

$panel->plugins([
    BotlyPlugin::make(),
]);
```

That's it. Botly registers a Robots Manager page in your panel and automatically serves `/robots.txt` via a dynamic route.
## How It Works

Botly stores your robots configuration in the database. When `/robots.txt` is requested, the rules are read from the database and formatted as valid robots.txt output on the fly. You can also export the current configuration to a static `public/robots.txt` file using the **Export Robots.txt** button on the admin page.
> **Important**
> If a static `public/robots.txt` file already exists, Botly will display a warning in the admin UI. The file must be deleted or renamed before the dynamic route can take effect.
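To illustrate, a configuration with one rule and one sitemap might produce output like the following (the rule path and sitemap URL here are hypothetical, assuming Botly emits standard robots.txt directives):

```txt
User-agent: *
Disallow: /admin

Sitemap: https://example.com/sitemap.xml
```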
## Configuration

The published config file (`config/botly.php`) lets you set default values that are used when no database record exists yet:

```php
return [
    'defaults' => [
        'rules' => [],
        'sitemaps' => [],
        'ai_crawlers' => [],
    ],

    'persistent_rules' => [],
];
```

### Persistent Rules

Persistent rules are always included in the output and cannot be edited or deleted from the admin UI. You can define them in the config file or fluently on the plugin:
Via config:
```php
// config/botly.php
'persistent_rules' => [
    [
        'user_agent' => '*',
        'directive' => 'disallow',
        'path' => '/admin',
    ],
],
```

Via plugin:
```php
BotlyPlugin::make()
    ->persistentRules([
        [
            'user_agent' => '*',
            'directive' => 'disallow',
            'path' => '/admin',
        ],
    ]),
```

Each rule is an array with three keys:
| Key | Values |
|---|---|
| `user_agent` | Any string, e.g. `*`, `Googlebot` |
| `directive` | `allow`, `disallow`, `crawl-delay`, `clean-param` |
| `path` | The path to allow or disallow, e.g. `/admin` |
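A rule with `user_agent` of `*`, `directive` of `disallow`, and `path` of `/admin` (as in the persistent-rule examples above) would render in standard robots.txt form, roughly as:

```txt
User-agent: *
Disallow: /admin
```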
## Customizing the Page

The navigation item can be customized on the plugin:

```php
BotlyPlugin::make()
    ->navigationIcon('heroicon-o-robot')
    ->navigationGroup('Settings')
    ->navigationLabel('Robots.txt'),
```

The page title and slug can also be changed:

```php
BotlyPlugin::make()
    ->title('Robots Manager')
    ->slug('robots-manager'),
```

## Blocking AI Crawlers

The admin page includes a **Block AI Crawlers** checkbox list. Selecting crawlers adds a `Disallow: /` entry for each one in the output. Botly ships with a curated list of known AI crawlers, including GPTBot, ClaudeBot, PerplexityBot, and more.
## Testing

```bash
composer test
```

## Contributing

Please see CONTRIBUTING for details.

## Security

Please review our security policy on how to report security vulnerabilities.

## License

The MIT License (MIT). Please see License File for more information.