How to prevent vulnerability scanning

DeadLine

New Member
I have a web site that reports every unexpected server-side error to my email. Quite often (once every 1-2 weeks) somebody launches automated tools that bombard the web site with a ton of different URLs:
  • sometimes they (hackers?) assume my site hosts phpMyAdmin and try to access PHP pages they believe are vulnerable...
  • sometimes they try to access pages that don't exist on my site but belong to popular CMSs
  • last time they tried to inject a bogus ViewState...
It is clearly not search engine spiders, as 100% of the requests that generated errors were requests to invalid pages. So far they haven't done much harm; the only cost is that I have to delete a ton of server error emails (200-300)... But at some point they could probably find something.

I'm really tired of this and am looking for a solution that will block such 'spiders'. Is there anything ready to use? Any tool, DLLs, etc.? Or should I implement something myself? In the latter case, could you please recommend an approach? Should I limit the number of requests per IP (say, no more than 5 requests per second and no more than 20 per minute)? I've put a rough sketch of what I have in mind below.

P.S. Right now my web site is written in ASP.NET 4.0.
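
Here is a minimal sketch of the per-IP throttling I'm considering, as an IHttpModule with an in-memory counter. The module name, the 20-requests-per-minute threshold, and the one-minute window are just placeholders; a real version would also need to clean up stale entries, deal with proxies/NAT where many users share one IP, and maybe return a friendlier error page.

    using System;
    using System.Collections.Concurrent;
    using System.Web;

    // Per-IP request throttle as an IHttpModule (names and limits are placeholders).
    // Keeps a counter per client IP for a one-minute window and short-circuits the
    // request with HTTP 429 once the limit is exceeded.
    public class RateLimitModule : IHttpModule
    {
        private const int MaxRequestsPerMinute = 20; // illustrative threshold

        // client IP -> (window start time, request count in that window)
        private static readonly ConcurrentDictionary<string, Tuple<DateTime, int>> Counters =
            new ConcurrentDictionary<string, Tuple<DateTime, int>>();

        public void Init(HttpApplication context)
        {
            context.BeginRequest += OnBeginRequest;
        }

        private static void OnBeginRequest(object sender, EventArgs e)
        {
            var app = (HttpApplication)sender;
            string ip = app.Context.Request.UserHostAddress ?? "unknown";
            DateTime now = DateTime.UtcNow;

            // Start a new window if the old one has expired, otherwise bump the count.
            var entry = Counters.AddOrUpdate(
                ip,
                _ => Tuple.Create(now, 1),
                (_, old) => (now - old.Item1).TotalSeconds >= 60
                    ? Tuple.Create(now, 1)
                    : Tuple.Create(old.Item1, old.Item2 + 1));

            if (entry.Item2 > MaxRequestsPerMinute)
            {
                app.Context.Response.StatusCode = 429; // Too Many Requests
                app.CompleteRequest();                 // skip the rest of the pipeline
            }
        }

        public void Dispose()
        {
            // nothing to release
        }
    }

The module would be registered in web.config under <system.webServer><modules> (or <system.web><httpModules> for the classic pipeline). Does this look like a reasonable direction, or is there an existing component that already does this better?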
 