@chalin opened this issue 7 months ago
To achieve 3, we could set the `HUGO_ENV` from `[context.branch-deploy.environment]` to `dev`, but it will probably propagate faster if we use a `robots.txt` disallow on those domains (otherwise, the `<meta>` tags won't take effect until Google next re-indexes each page).
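For context, the Netlify side of that proposal would look roughly like the following -- a sketch only, with illustrative values, assuming the site's templates already switch on `HUGO_ENV`:

```toml
# netlify.toml -- sketch of the proposed change, not a tested config
[build.environment]
  HUGO_ENV = "production"   # production deploys stay indexable

[context.branch-deploy.environment]
  HUGO_ENV = "dev"          # every branch deploy (old doc versions) builds in "dev" mode
```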
AFAIK, what you propose won't work. I've had to work through a similar issue for another CNCF project with multiple versions of the docs being indexed. Based on my experience, you'll need to change each old-version branch individually (to somehow configure it to emit `noindex, nofollow` as appropriate for the branch) and have it rebuilt and redeployed.
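For what it's worth, the per-branch change usually boils down to a small head partial along these lines -- a sketch only; the exact partial name and condition depend on the Hugo/Docsy version pinned in each branch:

```html
<!-- layouts/partials/hooks/head-end.html (path illustrative) -->
{{/* Emit a robots meta tag on non-production deploys so old doc versions
     drop out of search results after the next crawl. HUGO_ENV is assumed
     to be set per deploy context in netlify.toml. */}}
{{ if ne (getenv "HUGO_ENV") "production" }}
  <meta name="robots" content="noindex, nofollow">
{{ end }}
```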
Btw, you can't use `robots.txt` to prevent domains from being indexed -- see https://developers.google.com/search/docs/crawling-indexing/robots/intro.
/cc @nate-double-u
As I mentioned elsewhere, I'm OOO, but I'll be glad to help with this in the new year.
@chalin It's possible that, if the Netlify configs are defined for all branches in master (rather than in the branches themselves), as discussed in https://github.com/kubeflow/website/pull/3628#discussion_r1416262439, then we might only need to update master and then trigger a re-deploy of the older Netlify branch deploys.

(However, I think the much newer version of Hugo running in master will probably break our really old Docsy versions, and those deploys might fail.)
Originally posted by @thesuperzapper in https://github.com/kubeflow/website/issues/3628#issuecomment-1841612784
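As a rough sketch of the "configs defined in master" idea above: Netlify does support named branch contexts in a single netlify.toml. The branch names below are purely illustrative, and whether Netlify actually applies a master-side netlify.toml when re-deploying the old version branches still needs to be verified:

```toml
# netlify.toml on master -- illustrative sketch; branch names are made up
[context.production.environment]
  HUGO_ENV = "production"

# One named context per old version branch, all kept in this one file.
[context."v1.7-branch".environment]
  HUGO_ENV = "dev"

[context."v1.6-branch".environment]
  HUGO_ENV = "dev"
```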