Banning certain IPs from accessing site

What’s a good way to ban certain IPs from accessing a Rails app? For example, some crawlers/bots like to crawl the site and I’d like to handle this the best possible way.

My first thought is to add a before_filter to the application controller: if request.remote_ip matches one of the entries in a banned-IPs array, redirect the user. But is there a better (more efficient) way? I suppose another option is to list the banned IPs in the server config file and deny them there. Thanks for everyone’s suggestions!
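For reference, the before_filter idea can be sketched with a small check against a banned list. This is a minimal sketch, not anyone's production setup: the `BANNED_IPS` constant, the `banned_ip?` helper, and the addresses in it are all illustrative. Using `IPAddr` from the standard library lets the list hold CIDR ranges as well as single addresses:

```ruby
require "ipaddr"

# Illustrative banned list; entries can be single IPs or CIDR ranges.
BANNED_IPS = [
  IPAddr.new("203.0.113.0/24"),
  IPAddr.new("198.51.100.7")
].freeze

# Returns true if the remote IP falls inside any banned address or range.
def banned_ip?(remote_ip)
  ip = IPAddr.new(remote_ip)
  BANNED_IPS.any? { |banned| banned.include?(ip) }
end
```

In a controller this could then be wired up with something like `before_filter { render nothing: true, status: :forbidden if banned_ip?(request.remote_ip) }` (a 403 is usually more appropriate than a redirect for a ban).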

This is related but separate from a similar topic on IpSpoofAttackError that I also opened today.

Take a look at this gem: GitHub - udzura/rack-block: A rack middleware for handling search bot access, ip block, etc.
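If you'd rather not pull in a gem, the same idea can be hand-rolled as a tiny Rack middleware so banned requests are rejected before they ever reach a Rails controller. This is a sketch under assumptions: the class name, the banned list, and the addresses are all made up for illustration, and it only checks `REMOTE_ADDR` (it doesn't deal with proxies or `X-Forwarded-For`):

```ruby
require "ipaddr"

# Minimal Rack middleware that returns 403 for banned IPs
# and passes everything else through to the app.
class BlockBannedIps
  BANNED = [IPAddr.new("203.0.113.0/24")].freeze  # illustrative range

  def initialize(app)
    @app = app
  end

  def call(env)
    remote = env["REMOTE_ADDR"]
    if remote && BANNED.any? { |range| range.include?(IPAddr.new(remote)) }
      [403, { "Content-Type" => "text/plain" }, ["Forbidden"]]
    else
      @app.call(env)
    end
  end
end
```

It could then be mounted early in the stack, e.g. `config.middleware.insert_before 0, BlockBannedIps` in `config/application.rb`, so the check runs before the rest of the middleware chain.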

If it’s just a question of blocking robots from indexing your site I’d say that robots.txt is the simplest solution :wink:
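For well-behaved crawlers, a robots.txt at the root of the site is indeed all it takes, e.g.:

```
User-agent: *
Disallow: /
```

Keep in mind this is purely advisory; it only works for bots that choose to honour it.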


@seangriffin cool thanks, will check it out!
@lenartr I’m concerned more about unauthorized crawlers (not Google, Bing, etc.), which won’t respect robots.txt.