Laravel Block Bots


Introduction

Laravel Block Bots is a package that blocks bad crawlers, scrapers, and high-usage users, while letting good and important crawlers such as GoogleBot and Bing pass through.

Features

  • Ultra fast: adds less than 1 ms to each request.
  • Verifies crawlers using reverse DNS (see the sketch after this list)
  • Highly configurable
  • Redirects users to a page when they are blocked
  • Allows logged-in users to always bypass blocks
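
The reverse DNS check follows the standard forward-confirmed reverse DNS technique recommended by Google and Bing for verifying their crawlers: resolve the requesting IP to a hostname, confirm the hostname belongs to the crawler's domain, then resolve that hostname back to an IP and check that it matches the original. The sketch below illustrates the general technique only; it is not the package's internal code, and the domain whitelist is an assumption for illustration.

// Sketch of forward-confirmed reverse DNS -- the general technique, not the
// package's actual implementation. The $allowedDomains list is illustrative.
function isVerifiedCrawler($ip, array $allowedDomains = ['googlebot.com', 'search.msn.com'])
{
    // Reverse lookup: IP -> hostname
    $host = gethostbyaddr($ip);
    if ($host === false || $host === $ip) {
        return false;
    }

    // The hostname must end in ".<crawler domain>"
    $matchesDomain = false;
    foreach ($allowedDomains as $domain) {
        $suffix = '.' . $domain;
        if (substr($host, -strlen($suffix)) === $suffix) {
            $matchesDomain = true;
            break;
        }
    }

    if (!$matchesDomain) {
        return false;
    }

    // Forward lookup: the hostname must resolve back to the original IP
    return gethostbyname($host) === $ip;
}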

Install

Via Composer

composer require potelo/laravel-block-bots

Requirements

Before Laravel 5.5

In Laravel 5.4 you'll need to manually register the \Potelo\LaravelBlockBots\BlockBotsServiceProvider::class service provider in config/app.php.
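
For example, the registration in config/app.php would look like this (only the relevant entry is shown):

'providers' => [
    // ...
    Potelo\LaravelBlockBots\BlockBotsServiceProvider::class,
],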

Config

To adjust the library, you can publish the config file to your project using:

php artisan vendor:publish --provider="Potelo\LaravelBlockBots\BlockBotsServiceProvider"

Configure variables in your .env file:

BLOCK_BOTS_ENABLED=false
BLOCK_BOTS_ALLOW_LOGGED_USER=true
BLOCK_BOTS_FAKE_MODE=false
BLOCK_BOTS_LOG_BLOCKED_REQUESTS=true
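
As a rough sketch of how these settings are wired up, the published config file reads them via env() in the usual Laravel way. The config key names below are assumptions for illustration; check the published file for the real ones.

<?php

// config/block-bots.php -- key names are assumptions, not taken from the package
return [
    'enabled'              => env('BLOCK_BOTS_ENABLED', false),
    'allow_logged_user'    => env('BLOCK_BOTS_ALLOW_LOGGED_USER', true),
    'fake_mode'            => env('BLOCK_BOTS_FAKE_MODE', false),
    'log_blocked_requests' => env('BLOCK_BOTS_LOG_BLOCKED_REQUESTS', true),
];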

Usage

It's simple. Go to app/Http/Kernel.php and add the middleware to the $routeMiddleware array:

protected $routeMiddleware = [
    // ...
    'block' => \Potelo\LaravelBlockBots\Middleware\BlockBots::class,
];

Then you can add it to the desired middleware groups. For example, let's add it to the web group:


protected $middlewareGroups = [
    'web' => [
        // ...
        \App\Http\Middleware\VerifyCsrfToken::class,
        'block:100,/limit',
    ],
];

Where:

  • 100: the number of pages an IP can access per day
  • /limit: the route the IP is redirected to after reaching the limit
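
You can also apply the middleware to individual routes or route groups instead of a whole middleware group, using standard Laravel route syntax. The routes, views, and limit values below are purely illustrative:

// routes/web.php -- routes and limit values are illustrative
Route::middleware('block:100,/limit')->group(function () {
    Route::get('/', 'HomeController@index');
    Route::get('/products', 'ProductController@index');
});

// The page an over-limit IP is redirected to
Route::get('/limit', function () {
    return view('errors.limit');
});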

Change log

Please see CHANGELOG for more information on what has changed recently.

Contributing

Please see CONTRIBUTING and CODE_OF_CONDUCT for details.

Credits

License

The MIT License (MIT). Please see License File for more information.
