Fooling Googlebot

darkness_black

New Member
What do you think about fooling the googlebot, so when you detect it you output something else, well prepared for Google's page rank algorithm? What are the drawbacks?

That would be known as cloaking. There are definite drawbacks if you get caught. The main reason people do it is so that you can have a highly optimized page for the search engine, and show your visitors something else which might be more friendly for the user. I have heard rumors that Google has unmarked robots which visit sites to see if something like cloaking might be taking place. Use at your own risk. You can find a cloaking script here, however: http://www.unmelted.com/cloaking.html

Well, it seems like an obvious way to try and fool the search engines, but there is something in that article that ain't true, heh:

"They will have no idea that you even fed a completely different page to them."

Well, yes, they will. Google caches many of the pages that it sees, especially the high-ranking ones. Simply looking at the cache will show a user that what he is seeing and what the search engine sees are two completely different things. Then all one needs to do is telnet to the server, pretend to be Googlebot with faked headers, and they'll be able to see exactly what Google sees (without having to view source on the Google cache and strip out all the extra bits that Google adds to cached pages).

But there could be legitimate reasons for having your page automatically serve several versions - reasons that could be mistaken by Google for cloaking. Let's say a page is set to detect the graphical capabilities of a browser (or at least the name of the browser - Explorer, Nutscrape, Opera, Konqueror, etc.) and serve a graphical version if it detects one of those, and a plain text version if it doesn't (for text-based browsers such as Lynx, WAP phones, etc.). Google could unintentionally get classified as a non-graphical browser, receive a different page, and it would look to Google as if cloaking were intentionally being used to trick it.

Does anybody know how actively Google looks out for this "technique" of cloaking?
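Re the telnet trick: you don't even need telnet. A few lines of Python with a faked User-Agent header do the same job. A rough sketch (example.com stands in for whatever site you want to check, and the browser User-Agent string is just an example):

Code:

# Request the same URL twice: once as an ordinary browser, once
# pretending to be Googlebot, then compare what the server returns.
import urllib.request

URL = "http://example.com/"  # placeholder - put the site you want to test here

def fetch(url, user_agent):
    """Fetch a page, sending the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as response:
        return response.read()

as_browser = fetch(URL, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)")
as_googlebot = fetch(URL, "Googlebot/2.1 (+http://www.googlebot.com/bot.html)")

if as_browser == as_googlebot:
    print("Same page served to both - no user-agent cloaking here.")
else:
    print("Different pages served - the site treats Googlebot specially.")

A byte-for-byte comparison is crude (any dynamic page will differ a little between two requests), so in practice you'd eyeball the two responses, but it makes the point. Note this only catches user-agent cloaking; a site that cloaks by IP address would still fool it.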
gkboomus wrote:
In my opinion, every page should have a dedicated version for search engines. One of the problems with the web today is its great heterogeneity and the difficulty of finding relevant information, so it is absolutely logical that a page should have versions for different agents. I don't see why the Google heads would fight this idea; in fact, they should promote it, considering the large influence they have on webmasters. GOOGLE SHOULD PROMOTE a standard or a language for information on the web. If Google said they wanted pages to present information in some format, or at least according to some loose rules, the next day all sites would implement it, and the web would be more 'searchable'.

In theory, I agree, gkboomus. But what about those unscrupulous spammers and porn sites that will do anything to get the exposure they need to earn that $0.000001 per banner impression? They'll be sending a high-ranking, good-content page to Google, and then when a human comes along: pages and pages of porn and popups. So while I think it is cool in theory to have a specific format for search engines to see, so that they can return more accurate results, I can see how this would be abused VERY quickly, and a LOT (and not just by spammers and porn sites, but by anyone who simply wishes to make their site sound better to search engines than it actually is to people).

I agree with gkboomus's theory as well, and it would probably work. However, there is one major problem with it: numerous people are not honest and will do anything to get to the top of the engines, as Axe has described.

The drawback is you'll be removed from the Google index forever. If you're not performing well in Google with a site, then I guess you have nothing to lose, so go ahead. But bear in mind that if you start performing well, you have a good chance of losing it again. It's almost a catch-22, so I'd recommend against it.

Cloaking apparently works - but it requires a lot of work. And from what I hear, cloaked sites should always be disposable, as the game is "how long before I am caught and banned" rather than "avoid ever getting caught and banned".

What does Googlebot do when there's JavaScript? Does it just ignore it? Does anybody know?

Plus, you have competitors out there who, if they find you cloaking, will report you. If you were my competitor, I would. hahaha

"Nutscrape"... is there anyone still using it?

eCommando wrote:
Do you have anything to verify that Google is learning to crawl JavaScript? Or reading it? I would find that very interesting and insightful information! An example, like something found in a Google listing that's JS-loaded, or whatever else you have, would be welcome. Thanks.
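As far as I know, the bot doesn't execute scripts at all - it just indexes the raw HTML, so anything a page writes in via JavaScript is invisible to it. A toy illustration of what a crawler that ignores JavaScript sees (Python; the page markup is made up for the example):

Code:

# Extract links the way a crawler that ignores JavaScript would:
# by parsing the raw HTML only. The link emitted by document.write
# never shows up, because script content is inert text to the parser.
from html.parser import HTMLParser

PAGE = """
<html><body>
<a href="/plain-html-page.html">Plain HTML link</a>
<script type="text/javascript">
  document.write('<a href="/js-only-page.html">JS-generated link</a>');
</script>
</body></html>
"""

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

extractor = LinkExtractor()
extractor.feed(PAGE)
print(extractor.links)  # ['/plain-html-page.html'] - the JS link is invisible

So if a site's navigation is built entirely out of document.write calls or the like, those linked pages may never get crawled at all - worth keeping in mind quite apart from any cloaking tricks.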
 