CollectionBuilder / collectionbuilder-csv

CollectionBuilder-CSV is a "stand-alone" template for creating digital collection and exhibit websites using Jekyll and a metadata CSV.

Robots.txt #91

Open · maehr opened this issue 3 weeks ago

maehr commented 3 weeks ago

Describe what feature you'd like. Pseudo-code, mockups, or screenshots of similar solutions are encouraged!

Hi guys, we implemented a robots.txt over here https://github.com/Stadt-Geschichte-Basel/forschung.stadtgeschichtebasel.ch/pull/113

Do you want me to open a PR for CB-CSV as well?
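For context, a Jekyll site can serve robots.txt as an ordinary page with a fixed permalink, so no plugin is needed. A minimal sketch (the directives, the sitemap: false exclusion, and the sitemap URL are illustrative assumptions, not the contents of the linked PR):

```liquid
---
layout: null
permalink: /robots.txt
sitemap: false   # keep this file out of the generated sitemap, if jekyll-sitemap is used
---
User-agent: *
Allow: /
Sitemap: {{ "/sitemap.xml" | absolute_url }}
```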

What type of pull request would this be?

New Feature

Any links to similar examples or other references we should review?

No response

evanwill commented 3 weeks ago

@maehr I like the idea of it in "utilities". However, robots.txt should only be at the root of a domain or subdomain, and only one per domain. I think the majority of CB projects are likely not at the root, so it might just end up being inapplicable and potentially confusing. The noindex: true option will cover adding robots meta tags to individual pages.
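(For anyone following along: a page-level noindex flag like that is typically wired up with a small conditional in the site's head include, roughly as sketched below; this is illustrative, not necessarily CB-CSV's exact markup.)

```liquid
{% comment %} In the head include: tell crawlers to skip pages whose front matter sets noindex: true {% endcomment %}
{% if page.noindex %}
  <meta name="robots" content="noindex">
{% endif %}
```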

I think "how to add a robots.txt" might be a good as a cb-docs "advanced" topic, with the code for the example template? Then people who are actually at a subdomain root or root can add one in and learn more about why.

maehr commented 3 weeks ago

I will work on the docs and get back to you. PS: I could add a check for baseurl: to make sure it's not hosted in a subfolder.
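As a rough sketch of what that guard could look like inside the robots.txt template itself (the condition and file layout are assumptions, not a finished implementation):

```liquid
---
layout: null
permalink: /robots.txt
sitemap: false
---
{% if site.baseurl == "" or site.baseurl == "/" %}
User-agent: *
Sitemap: {{ "/sitemap.xml" | absolute_url }}
{% else %}
{% comment %}
The site is served from a subfolder, so this file would not sit at the domain
root; emit nothing rather than a misleading robots.txt.
{% endcomment %}
{% endif %}
```

Note that the else branch still produces an essentially empty file; documenting when to delete the file entirely may be the simpler option.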