oliver11980
New Member
It looks like the Google cache of nearly all our site's pages has been corrupted somehow. Google search results for them show the problem:

is R [ 9|p#˲ %{^ , E ...F P ; c/r 'x ?9 ܐE t K {w_B 0 n\; " n S t 9) V ;o~ > c_ c [ B_ j h ...
http://www.ms-inc.net/ - 7k - Cached - Similar pages

Click here if you'd like to see the full site search as Google shows it. If you click the Cached link for any of those pages, you'll see that the content Google is working with is thoroughly horked. I tried mailing Google, and after a week got an apology that they can't send a personal reply to my question.

Has anyone ever seen such a thing? I'd love to know what's wrong, and what we can do to fix it. Thanks in advance!

Are the pages on your website static HTML pages, or are they all dynamic and run with scripts? The reason I ask is that your cached content looks almost like the source of a compiled script or some kind of binary file. If your website wasn't functioning and for some reason was outputting the compiled scripts instead of actually running them, and Google happened to crawl your site during that time, I could see how you might have ended up in your current situation.

Never seen that before! Did you send some mail to Google? Maybe they'll get interested in knowing what's going on there...

We're using ASP.NET for the site. It looked like binary code to me as well. It would be most inconvenient, and a nasty bug (on our server side!), if Google's crawling somehow managed to get at the binary code behind our pages - have you actually heard of that happening previously? Theoretically it's possible, I'm sure, but known problems are a lot better to find than previously unknown ones... I did send a mail to Google, and got nothing more than an automated reply.
I agree it's something they should be looking at - but if I can't call and they don't read my mail, there's not much I can do with them...

No, there would be no way for Google to get at the binary code unless your server wasn't operating correctly at the time of their crawl. If for some reason your server was having a problem and wasn't executing your scripts, it might have been outputting their file contents instead, and Google would have indexed that. I have had a few cases where, because of a messed-up server configuration, scripts weren't executing as they were supposed to, and their file contents were being dumped to everyone who visited the pages (including bots).

So if that was the case, I believe there is nothing wrong on your end currently (your site seems to work fine) - it was probably a server problem on your end during Google's last crawl. Maybe they crawled while your host was having a problem, but now that everything is fine, your Google cache problem may go away on the next crawl.

Makes sense to me that it was a momentary problem - the site does work fine. Hopefully the next crawl will work properly and correct the problem. I'll look into Google Sitemaps and see if I can trigger a new crawl, perhaps by loading up a sitemap file. Thanks for the info!
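For reference, a sitemap file is just an XML list of URLs in the Sitemaps format; submitting one through Google Sitemaps can encourage a fresh crawl, though there's no guarantee it forces one. A minimal sketch - the date and frequency values below are placeholders, not anything from this site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.ms-inc.net/</loc>
    <!-- placeholder values; set lastmod to the page's real modification date -->
    <lastmod>2006-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```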
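For anyone hitting something similar: one way to catch this kind of misconfiguration early is to fetch a few pages the way a crawler would and check that the body looks like rendered HTML rather than raw script source or binary. A minimal sketch in Python - the specific markers checked for (ASP.NET page directives, NUL bytes) are just illustrative assumptions, and `check_page` is a hypothetical helper:

```python
import urllib.request

# Strings that should never appear in a properly rendered ASP.NET page:
# seeing them suggests raw source (or binary) is being served unexecuted.
RAW_SOURCE_MARKERS = [b"<%@ Page", b"<%@ Control", b"\x00"]

def looks_like_raw_source(body: bytes) -> bool:
    """Return True if a response body looks like unexecuted source or binary."""
    return any(marker in body for marker in RAW_SOURCE_MARKERS)

def check_page(url: str) -> bool:
    """Fetch a URL and flag it if the body looks like raw source (hypothetical helper)."""
    with urllib.request.urlopen(url) as resp:
        return looks_like_raw_source(resp.read())
```

Run something like this from a cron job against a handful of pages, and you'd notice the problem yourself instead of finding out weeks later from Google's cache.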