guyht / Glog

NodeJS and Git backed blogging engine

Google bot: Crawl postponed because robots.txt was inaccessible #22

Closed: hughht5 closed this issue 11 years ago

hughht5 commented 11 years ago

Google cannot find the robots.txt file. Because the request is not a 404 and is simply forwarded to the home page, Google says it won't risk crawling disallowed URLs and will come back to crawl the site when robots.txt is available.

I will look at adding this feature.

hugh
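
A minimal sketch of what such a feature might look like. This is an assumption, not Glog's actual code: it presumes an Express-style app where an explicit /robots.txt handler is registered ahead of the catch-all route that currently forwards everything to the home page, so crawlers get either the file or a real response rather than home page HTML.

```js
// Hypothetical sketch; assumes an Express-style app, not Glog's real routing.
const express = require('express');
const app = express();

// Serve robots.txt explicitly so crawlers get a 200 with text/plain
// instead of being forwarded to the home page.
app.get('/robots.txt', (req, res) => {
  res.type('text/plain');
  res.send('User-agent: *\nAllow: /\n');
});

// The catch-all home-page route stays last, so it no longer swallows
// requests for robots.txt.
app.get('*', (req, res) => {
  res.send('<h1>Home page</h1>');
});

app.listen(3000);
```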

hughht5 commented 11 years ago

My bad!! I can just put robots.txt in the public folder in guido.

sorry.
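
For reference, a minimal sketch of that public-folder approach, assuming Express's static middleware; only the folder name "public" comes from the comment above, the rest is illustrative.

```js
// Hypothetical sketch; assumes express.static serves the public folder.
const express = require('express');
const path = require('path');
const app = express();

// Static files (including public/robots.txt) are matched first...
app.use(express.static(path.join(__dirname, 'public')));

// ...so the catch-all home-page route no longer intercepts /robots.txt.
app.get('*', (req, res) => {
  res.send('<h1>Home page</h1>');
});

app.listen(3000);
```

With robots.txt sitting in public/, a GET for /robots.txt returns the file with a 200 and a text/plain content type, which is enough for Googlebot to resume crawling.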