My log files keep showing 404 errors on my robots.txt file. I had the file validated and it seems fine. Here is the file:

# Allows all robots
User-agent: *
Disallow:

I have just started using the robots.txt file; it is in the root dir. I think this is stopping the site from being deep crawled. I would appreciate any input.

KozSEO wrote:
You can try browsing to it to be sure it exists:
http://www.yoursite.com/robots.txt

If it does, I would check with your hosting service to find out why it might be reporting 404s.

Try rebuilding your robots.txt file at this address; it's a free tool to use for robots:
http://www.1-hit.com/all-in-one/tool-robots.txt-generator.htm
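
If you want to check from a script instead of a browser, here is a minimal Python sketch that fetches the file and prints the HTTP status the server actually returns (the URL is a placeholder; swap in your own domain):

import urllib.request
import urllib.error

# Placeholder URL -- replace with your actual domain.
url = "http://www.yoursite.com/robots.txt"

try:
    with urllib.request.urlopen(url) as resp:
        # 200 means the file is being served; print the body to confirm its contents.
        print(resp.status)
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    # A 404 here would match what the server logs are reporting.
    print("HTTP error:", e.code)

If this prints 200 along with your file contents, the 404s in the logs may be coming from a different host or path (for example, the www vs. non-www form of your domain), so it can be worth running the check against both.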