GoogleCodeExporter opened 9 years ago
I found a problem with option 3: it makes it much harder to do any
development on the registration part of the site, so I am going to fall back to
option 2. The plan is to:
1. Remove the existing robots.txt from the repository and add an svn:ignore
property to make sure we never re-add it.
2. Make update-all.sh write out (the current) robots.txt into the prod directory.
3. Make site-ops write out a fully restrictive robots.txt into all test site
directories.
Ugly, but no worse than anything else we could do.
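The steps above can be sketched in shell. The directory names below are placeholders rather than the real site layout, the prod robots.txt contents are a guess, and the svn steps need a working copy, so they appear as comments only:

```shell
# Step 1 (requires a working copy; shown for reference only):
#   svn delete robots.txt
#   svn propset svn:ignore 'robots.txt' .
#   svn commit -m "stop versioning robots.txt"

# Stand-ins for the real prod and test site directories:
PROD_DIR=$(mktemp -d)
TEST_DIR=$(mktemp -d)

# Step 2: update-all.sh writes the current robots.txt into the prod
# directory. The actual prod rules live in the current file; a
# permissive everything-allowed file is assumed here.
printf 'User-agent: *\nDisallow:\n' > "$PROD_DIR/robots.txt"

# Step 3: site-ops writes a fully restrictive robots.txt into every
# test site directory, blocking all crawlers.
printf 'User-agent: *\nDisallow: /\n' > "$TEST_DIR/robots.txt"
```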
Original comment by schmo...@frozencrow.org
on 23 Jun 2011 at 5:49
Ed,
I have a feeling I misunderstand something about robots.txt and svn:ignore.
Have you considered the following idea?
In every non-production account, put a robots.txt that disallows everything:
User-agent: *
Disallow: /
And set svn:ignore for robots.txt so it doesn't get checked into SVN and make
it into the production account.
Will that work? Did I miss something?
Original comment by victor.l...@gmail.com
on 23 Jun 2011 at 3:41
Yes, that is mostly what I am going to do. We also have to remove
robots.txt from the repo and make update-all.sh keep it up to date, because
svn:ignore does not apply to edits of already-versioned files, just to
`svn add` and `svn status`.
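Because svn:ignore only keeps an unversioned file out of `svn add` and `svn status`, the deploy script has to rewrite robots.txt itself on every run. A minimal sketch of that refresh step, with placeholder paths (the real update-all.sh layout is not shown in this thread):

```shell
# Hypothetical update-all.sh fragment: unconditionally refresh robots.txt
# from a canonical copy kept outside the web root.
MASTER=$(mktemp)        # stand-in for the canonical robots.txt
SITE_DIR=$(mktemp -d)   # stand-in for the prod directory
printf 'User-agent: *\nDisallow:\n' > "$MASTER"

# Clobber whatever is in place so local edits never drift from the master.
cp "$MASTER" "$SITE_DIR/robots.txt"
```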
Original comment by schmo...@frozencrow.org
on 23 Jun 2011 at 4:56
Original issue reported on code.google.com by
schmo...@frozencrow.org
on 20 Jun 2011 at 5:34