Is there a way of reducing Googlebot bandwidth usage without delaying the time it takes to spider new pages?

You could use a robots.txt file to limit what Google (or any search engine) may spider. That is the only way I can think of to reduce bandwidth without denying the bot completely, and obviously you don't want to do that. Which brings up a question I've had: if you deny access to image directories (which would obviously be your biggest bandwidth hogs), does that inhibit Google's ability to cache pages that include images (and still show the images)?

This is only my understanding, and I am sure somebody will correct me if I am wrong, but:

1) Denying Googlebot access to the image folder will only affect the image search (who cares about that one).

2) The actual code, the <img src="http://www.ozzu.com/other-google-information-and-resources/"> tag, will be cached as part of the code in your pages. When someone performs a search using Google, they will be able to pull up the images without any problems, since they are not Googlebot.

I already use this in my robots file:

User-agent: *
Disallow: /moviesite/
Disallow: /images/
Disallow: /banners/
Disallow: /products/
Disallow: /*.gif$
Disallow: /*.jpg$
Disallow: /*.pdf$
Disallow: /*.avi$

Disallowing images shouldn't affect what shows in the cache (unless you have some sort of referer block in place on images, then maybe). Why I believe this (minus the exitcounter.cgi): Google's cache of this page, http://66.102.7.104/search?q=cache:wlSv ... lr=lang_en

So many different questions ...

Quote:
rtchar, you're a top dog! I've never seen the IF-MODIFIED-SINCE header before... why don't they just use a META tag? Anyway, I will look into it. How did you know it was not supported... did you look at my server? I found the code for the ASP (see the sketch at the end of the thread).

Who has control of your server? Is it hosted or does it belong to you?

Technical Guidelines: http://www.google.com/intl/en/webmasters/guidelines.html

Quote:
Thanks for the tips guys. That answered my question for sure.

Banning the Google image bot is about as far as you should safely go, I think... unless you have no reason for regular Google to index things like cgi-bin. Put this into your robots.txt:

User-agent: Googlebot-Image
Disallow: /

and possibly these, IF you don't have a shopping cart or another system that uses cgi-bin based scripts to view pages:

User-agent: *
Disallow: /PDF
Disallow: /cgi-bin
Disallow: /temp

Just to let you guys know, I have failed and given up on this. The Last-Modified header for ASP is too much like hard work, meta tags are not suitable, and the robots exclusion doesn't work.

I am still using IIS 5.0 on one server, so If-Modified-Since is supported. I can see the docs are not really clear on IIS 6.0, so I will have to look at my other server to see what is required.

It has to be easier than that .....

If Google is using too much of your bandwidth because you do not have much bandwidth to begin with, then it may be best to upgrade your hosting. I would never want Google to stop crawling any part of my site for any part of their search engine, including the image search: the more links you have within one of the largest search engines in the world, the more visitors you will receive.
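For reference, here is a minimal sketch of the If-Modified-Since handling discussed above, assuming classic ASP (VBScript) on IIS. The hard-coded date, variable names, and the simple string comparison are illustrative assumptions, not the code from the original post. The idea is to answer a conditional GET with 304 Not Modified so that Googlebot (or any client) can skip re-downloading an unchanged page.

Code:
<%
' Minimal sketch, assuming classic ASP/VBScript on IIS.
' The last-modified time would normally come from a file or database
' timestamp; it is hard-coded here for illustration (RFC 1123 GMT format).
Dim lastModified
lastModified = "Fri, 01 Oct 2004 12:00:00 GMT"

' IIS exposes request headers as server variables with an HTTP_ prefix.
Dim ifModifiedSince
ifModifiedSince = Request.ServerVariables("HTTP_IF_MODIFIED_SINCE")

If ifModifiedSince <> "" Then
    ' A strict implementation would parse both dates and compare them;
    ' a plain string match works when the server always emits one format.
    If StrComp(ifModifiedSince, lastModified, vbTextCompare) = 0 Then
        Response.Status = "304 Not Modified"
        Response.End   ' send headers only, no body
    End If
End If

' Page is new or changed: send it, along with the header that makes
' the next request conditional.
Response.AddHeader "Last-Modified", lastModified
%>
<html>
<body>Page content here.</body>
</html>

Whether this actually saves bandwidth depends on the crawler honoring the header; Google's technical guidelines (linked above) recommend supporting If-Modified-Since for exactly this reason.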