I want to block a page on my site from Googlebot in robots.txt.
It's a dynamic script, so there are variables being passed to it.
Does this look right:
User-agent: *
Allow: /
User-agent: Googlebot
Disallow: /dynamicpage.php*
Allow: /

A while ago Google released this information about robots.txt and noindex: Controlling Crawling and Indexing.
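For what it's worth, robots.txt rules are prefix matches, so the trailing * is redundant and the extra Allow: / line isn't needed either; a plain path prefix already covers every query-string variant of the page. A minimal sketch of the simpler equivalent, reusing the /dynamicpage.php path from your post:

Code:
User-agent: Googlebot
Disallow: /dynamicpage.php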
Your code above does not look correct to me. Why not use a page-level instruction for search engine bots via the meta robots tag instead? Please refer to the code below. Hope it helps.
Code:
<html><head><title>...</title><META NAME="ROBOTS" CONTENT="NOINDEX"></head>
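One caveat with the noindex route: Googlebot has to be able to crawl the page to see the tag, so the page must not also be blocked in robots.txt. Since the page here is a dynamic script, you can send the same instruction as an HTTP header instead of editing the markup; a minimal sketch, assuming the script is PHP as the .php extension suggests:

Code:
<?php
// Send the noindex instruction as an HTTP header; headers must be
// sent before any page output. This applies to every query-string
// variant of the script, no robots.txt pattern needed.
header('X-Robots-Tag: noindex');
?>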
Quote: Originally Posted by jonny1
Sorry, I don't have any idea about this.

Then why did you reply to this thread?

Do it like this:

User-agent: *
Disallow:
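Just to be clear, an empty Disallow: line means "allow everything", so that file on its own blocks nothing. A sketch of a complete robots.txt that keeps the rest of the site open while blocking the dynamic page for Googlebot, again assuming the /dynamicpage.php path from the first post:

Code:
User-agent: Googlebot
Disallow: /dynamicpage.php

User-agent: *
Disallow: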