vite-pwa / docs

Documentation: PWA integrations for Vite and the ecosystem
https://vite-pwa-org.netlify.app/

Why `robots.txt`? #55

Open · Zwyx opened this issue 1 year ago

Zwyx commented 1 year ago

Hi,

The docs mention:

> You must add a `robots.txt` file to allow search engines to crawl all your application pages.

Why is that?

A `robots.txt` file allowing everything seems to be unnecessary:

> **Do I have to include an allow rule to allow crawling?**
> No, you do not need to include an allow rule. All URLs are implicitly allowed and the allow rule is used to override disallow rules in the same robots.txt file.
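
For context, the kind of "allow everything" `robots.txt` the docs ask for is usually something like the snippet below, dropped into Vite's `public/` directory so it is copied as-is to the site root (the exact contents here are my illustration, not quoted from the docs):

```txt
# Illustrative example, not copied from the vite-pwa docs
# Allow every crawler to access every URL
User-agent: *
Allow: /
```

Per the quote above, this is effectively a no-op: crawling is already implicitly allowed when no `robots.txt` exists at all.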

Also:

Thank you!

joshas commented 1 year ago

Probably a measure to keep PageSpeed Insights happy and let you get that perfect 100 in the "SEO" category?

Zwyx commented 1 year ago

Thanks for your reply, Joshas, but that's not the reason: I have 100 for SEO without any `robots.txt` file.