How can we specify that a page should not be indexed, and that any links on it should not be followed, by Googlebot? Are there typical pages on a website that should generally follow this practice? Please explain with an example.

1. Use Google Webmaster Tools to exclude the pages
2. Add the noindex HTML meta tag to the page
3. Make sure any linking pages use the nofollow attribute in their links to that page

The exact meta tag is:
HTML Code: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

Quote: Originally Posted by jennycarol25 (http://www.v7n.com/forums/seo-forum...should-not-indexed-any-links.html#post1445855)
I highly suggest creating a robots.txt file. Hope this helps: http://www.robotstxt.org/
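As a rough sketch of how the pieces fit together, here is the meta tag placed in a page's head, plus a rel="nofollow" link pointing at that page from elsewhere on the site. Confirmation or "thank you" pages are commonly cited candidates for this treatment; the file name thank-you.html below is only an illustration, not something from the original thread.

HTML Code:
<!-- On the page you want kept out of the index (hypothetical thank-you.html) -->
<html>
  <head>
    <meta name="robots" content="noindex, nofollow">
    <title>Thank You</title>
  </head>
  <body>
    <p>Thanks for signing up.</p>
  </body>
</html>

<!-- On any other page that links to it -->
<a href="/thank-you.html" rel="nofollow">Thank you page</a>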
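If you go the robots.txt route jennycarol25 suggests, a minimal sketch might look like the following (again, /thank-you.html is just an example path). Keep in mind that robots.txt only blocks crawling; a blocked URL can still appear in the index if other sites link to it, and Googlebot cannot even see the noindex meta tag on a page it is not allowed to crawl, so robots.txt and the meta tag are not interchangeable.

Code:
# robots.txt at the site root
User-agent: Googlebot
Disallow: /thank-you.html

# Or block all well-behaved crawlers
User-agent: *
Disallow: /thank-you.html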