jim618 closed this issue 9 years ago
I think crawlers are downloading the out-of-date binaries we provide, which just eats bandwidth.
Perhaps a robots.txt file that excludes the releases directory would lessen this.
We already have a robots.txt file, but it was fairly sparse. I've added more detail.
robots.txt
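For reference, a minimal sketch of the kind of rule described above, assuming the binaries live under a `/releases/` path (the actual paths in the committed robots.txt may differ):

```
# Ask well-behaved crawlers to skip the release binaries
# (path is an assumption for illustration)
User-agent: *
Disallow: /releases/
```

Note that robots.txt is advisory: compliant crawlers (e.g. Googlebot) will honor it, but it won't stop misbehaving bots from fetching the binaries.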
Ready for review and close.
Should help our bandwidth a little. Closing.