
Banning certain IPs from accessing site

(David Lee) #1

What’s a good way to ban certain IPs from accessing a Rails app? For example, some crawlers/bots like to crawl the site, and I’d like to handle this in the best possible way.

My first thought is to add a before_filter to the application controller: if request.remote_ip matches one of the entries in a banned-IPs array, redirect the user. But is there a better (more efficient) way? I suppose another option is to list the banned IPs in the server config file and deny them there. Thanks for everyone’s suggestions!
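A minimal sketch of the filter approach described above. The list contents, the `banned?` helper, and the wiring shown in comments are assumptions for illustration, not code from the original post:

```ruby
require "set"

# Hypothetical ban list; in a real app this might come from an
# initializer, an ENV variable, or a database table.
BANNED_IPS = Set.new(%w[203.0.113.7 198.51.100.23])

# True if the given client IP is on the ban list.
def banned?(ip)
  BANNED_IPS.include?(ip)
end

# In ApplicationController this would be wired up roughly as:
#
#   before_filter :block_banned_ips
#
#   def block_banned_ips
#     head :forbidden if banned?(request.remote_ip)
#   end
```

Using a Set rather than an Array keeps the per-request lookup O(1), which matters if the ban list grows.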

This is related but separate from a similar topic on IpSpoofAttackError that I also opened today.

(Sean Griffin) #2

Take a look at this gem: https://github.com/udzura/rack-block
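For a sense of what blocking at the Rack layer looks like, here is a hand-rolled middleware along the same lines (a generic sketch, not rack-block’s actual API — see the gem’s README for its DSL). Rejecting the request here means it never reaches the Rails stack at all:

```ruby
# Rack middleware that rejects requests from banned IPs before
# they reach the application.
class BlockIps
  def initialize(app, banned_ips)
    @app = app
    @banned = banned_ips
  end

  def call(env)
    if @banned.include?(env["REMOTE_ADDR"])
      # Short-circuit with a 403 instead of invoking the app.
      [403, { "Content-Type" => "text/plain" }, ["Forbidden"]]
    else
      @app.call(env)
    end
  end
end

# Hypothetical wiring in config/application.rb:
#   config.middleware.insert_before 0, BlockIps, %w[203.0.113.7]
```

Note that behind a proxy or load balancer, `REMOTE_ADDR` may be the proxy’s address rather than the client’s, so the banned IP may need to be read from a forwarded header instead.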

(Lenart Rudel) #3

If it’s just a question of blocking robots from indexing your site, I’d say that robots.txt is the simplest solution :wink:
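For reference, a minimal robots.txt along these lines (the `/admin/` path is just an example):

```
# public/robots.txt — ask all crawlers to stay out of /admin
User-agent: *
Disallow: /admin/
```

Keep in mind this is purely advisory; only well-behaved crawlers honor it.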

(David Lee) #4

@seangriffin cool thanks, will check it out!
@lenartr I’m concerned more about unauthorized crawlers (not Google, Bing, etc.), which won’t respect robots.txt.