facebook / docusaurus

Easy to maintain open source documentation websites.
https://docusaurus.io
MIT License
55.36k stars 8.3k forks

Blog/docs: unlisted #5701

Closed: slorber closed this issue 1 year ago

slorber commented 2 years ago

🚀 Feature

As @MattiaPrimavera pointed out here: https://github.com/facebook/docusaurus/issues/3417#issuecomment-939320974

Docusaurus v1 had an "unlisted" feature that allowed a blog post to exist in the production build without appearing in the sidebar, feeds, etc., so only those with the direct link knew it existed.

We should port this feature to v2 and make it consistent across all content plugins.

Things not to forget:

- We might need to add an `unlisted` flag to the created route definitions to transmit this info to the sitemap plugin.
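For context, a feature like this is naturally expressed as a single front matter flag on the page itself. A minimal sketch of what a post could opt into (the `unlisted` field name follows the v1 feature being ported; the title is hypothetical):

```md
---
title: My WIP post
# Keep the page in the production build, but hide it from
# listings, sidebars, feeds, and the sitemap
unlisted: true
---
```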

lex111 commented 2 years ago

Do a lot of users really need it? It's probably not worth the effort to implement.

slorber commented 2 years ago

> `unlisted` - The post will be accessible by directly visiting the URL, but will not show up in the sidebar in the final build; during local development, the post will still be listed. Useful in situations where you want to share a WIP post with others for feedback.

Personally, it's something I already used on my own blog for exactly this use-case.

This may not be super useful for people using deploy previews, but it can be useful for those who only use a single Git branch (e.g. GitHub Pages users). It may also help the Forestry CMS use-case reported by @suntudo here: https://github.com/facebook/docusaurus/issues/3417#issuecomment-938175676

If drafts are not written in separate branches, and those branches are not deployed anywhere, their authors have no way to share a link and gather feedback.

TanguyGiton commented 2 years ago

For those who want an alternative while waiting for an implementation, this is how I did it (`brouillon` is French for "draft"):

```css
/* ./src/css/custom.css */
/* ... */

/* Highlight draft ("brouillon") entries in the sidebar */
.menu__list li.brouillon a {
  color: var(--ifm-color-warning-dark);
}
```
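The CSS above targets a `brouillon` class on the sidebar item; with the docs plugin, that class can be attached per page through the `sidebar_class_name` front matter field. A sketch (the page title is hypothetical):

```md
---
title: Work-in-progress page
# Adds the class to this page's sidebar entry so the CSS above applies
sidebar_class_name: brouillon
---
```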
jodyheavener commented 2 years ago

I just wanted to echo the request here. Our team has a makeshift version of "beta" or "unlisted" docs: we set `sidebar_custom_props: { beta: true }` in a doc's front matter, and then in a wrap-swizzled `DocSidebarItem` we exclude the sidebar item unless the reader is on that page itself:

```tsx
import React from "react";
import DocSidebarItem from "@theme-original/DocSidebarItem";
import type { Props } from "@theme/DocSidebarItem";

const DocSidebarItemWrapper = (props: Props) => {
  // Items flagged with `sidebar_custom_props: { beta: true }` in front matter
  const isBeta = (props.item.customProps || {}).beta;
  // Keep the item visible when the reader is already on that page
  const isCurrentPage =
    props.item.type === "link" && props.item.href === props.activePath;

  if (isBeta && !isCurrentPage) {
    return null;
  }

  return <DocSidebarItem {...props} />;
};

export default DocSidebarItemWrapper;
```

And while this is very effective at removing the link from the sidebar, you can imagine how I've already been burned by this a few times.

I took a loose stab at this while working on that issue, but it didn't make the cut, so I would be happy to have another go at it.
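The filtering logic in the wrapper above reduces to a small pure predicate. A sketch with simplified stand-in types (the `SidebarItem` shape here is a hypothetical subset of the real theme types):

```typescript
// Hypothetical, simplified stand-in for the sidebar item shape used above
type SidebarItem = {
  type: "link" | "category";
  href?: string;
  customProps?: { beta?: boolean };
};

// Returns true when the item should appear in the sidebar:
// beta items are hidden everywhere except on their own page.
function shouldShowItem(item: SidebarItem, activePath: string): boolean {
  const isBeta = Boolean(item.customProps?.beta);
  const isCurrentPage = item.type === "link" && item.href === activePath;
  return !isBeta || isCurrentPage;
}
```

Keeping the predicate separate from the React wrapper makes the hide/show rule easy to unit-test without rendering anything.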

slorber commented 2 years ago

Thanks for the feedback @jodyheavener

Yes, we should provide this feature as a follow-up to https://github.com/facebook/docusaurus/pull/6457, as it's quite difficult to achieve properly in userland.

We can exclude unlisted pages from SEO/sitemap with a robots `noindex` directive, and from blog RSS feeds too.

However, I'm not sure Algolia has any metadata to exclude a page from their search engine crawler (AFAIK it doesn't respect `noindex`). @shortcuts, do you know if it's possible to add an Algolia meta tag to exclude a page from search?

For local search plugins we'd also have to figure out a way to signal which pages to exclude from search. I guess `<meta name="robots" content="noindex" />` could be respected by all (Algolia or local) search engines?
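As a userland stopgap for the sitemap side specifically, the sitemap plugin's `ignorePatterns` option can already keep known paths out of the generated sitemap. A sketch, assuming drafts live under a hypothetical `/drafts/` route:

```js
// docusaurus.config.js (sketch; the /drafts/** glob is hypothetical)
module.exports = {
  presets: [
    [
      "classic",
      {
        sitemap: {
          // Exclude matching routes from the generated sitemap.xml
          ignorePatterns: ["/drafts/**"],
        },
      },
    ],
  ],
};
```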

shortcuts commented 2 years ago

> However, I'm not sure Algolia has any metadata to exclude a page from their search engine crawler (AFAIK it doesn't respect `noindex`). @shortcuts, do you know if it's possible to add an Algolia meta tag to exclude a page from search?

Hey! We respect all restrictions and metadata (noindex, robots.txt, etc.) by default. You can freely set those on the Docusaurus side and we won't index those pages.

In case you want to let your users know about it, or let them control whether a page should be indexed, we have options to bypass those restrictions.

slorber commented 2 years ago

> Hey! We respect all restrictions and metadata (noindex, robots.txt, etc.) by default. You can freely set those on the Docusaurus side and we won't index those pages.

Thanks @shortcuts , didn't know.

Is this new? Because I think we discussed this a long time ago with Kevin and he said it wasn't supported.

shortcuts commented 2 years ago

> Is this new? Because I think we discussed that a long time ago with Kevin and he said it wasn't supported

Yup, since we've migrated to the new infra/Algolia Crawler :D

slorber commented 2 years ago

I see, thanks :)