GoogleBot shut down my site(s).

Fluinialurb

New Member
Short version:
- I have 46 websites.
- I send out about 10 gigs a month in bandwidth from users surfing.
- In the last 20 days, GoogleBot has crawled 100 gigs of bandwidth off my servers, shutting them down - many pages twice. I now have a nice "Service suspended" message on all my sites.
- So only 10% of my bandwidth goes to real users; the other 90% is wasted on GoogleBot crawling. Obviously I don't want to stop Google from crawling my sites, but I need to slow it down somehow when my servers are reactivated, or it will just happen again.

Suggestions?

JM

Post <meta name="revisit-after" content="10 days"> on your pages - that meta tag can reduce the bandwidth Google sucks up. You also need a new ISP, because they should just charge you for the extra bandwidth and keep the sites up continuously. Check your contract to see whether what they did breaks it; you might have a case against them. I say send Google a bill.

From Google's own FAQ:

"3. Googlebot is crawling my site too fast. What can I do?
Please contact us with the URL of your site and a detailed description of the problem. Please also include a portion of the weblog that shows Google accesses so we can track down the problem quickly."

http://www.google.com/bot.html#fast

That might help.

Bompa

Contacting Google is a waste of time - they just send you snippets of the FAQ you already read... Use revisit-after and PRAY that they observe it. I've not seen Google be too quick at observing these kinds of changes, or robots.txt changes - often ignoring robots.txt changes for up to a week while still crawling DAILY.

I had a recent problem where a number of sites we host ran perl calendar systems - the rather dumb bots were crawling back to 1906 and forward to 2036 in those calendars, hitting the perl binary HARD... It took adding nofollow AND robots.txt exclusions to get them to ignore the perl scripts, and even then it was 5-7 days per site of hammering before the crawling finally let up...
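For anyone fighting the same thing, here's a minimal robots.txt sketch along the lines of what finally worked for us - the path is made up for the example (ours were perl calendar scripts under cgi-bin), so adjust it to wherever your own scripts actually live:

    User-agent: *
    # Keep all bots out of the script directory entirely
    Disallow: /cgi-bin/
    # Some crawlers honor a delay (in seconds) between requests;
    # don't count on Google observing this any faster than the rest
    Crawl-delay: 10

Put it at the site root as /robots.txt, and expect the usual several-day lag before the bots actually notice it.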
 