sliptonic opened this issue 1 year ago
I'd like to run a couple of tests against the site to check metadata, JSON-LD, Open Graph tags, etc. This goes for the entire site and all pages (there are a couple of approaches to do this in Docusaurus).
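Something along these lines would do as a first smoke test (just a sketch, not anything in the repo today; the URL and the checks are assumptions, and it assumes Node 18+ for the global fetch):

```ts
// Hypothetical smoke test: fetch a rendered page and report whether
// Open Graph tags, JSON-LD, and a meta description are present.
// Assumes Node 18+ (global fetch); run with ts-node or similar.
const PAGE_URL = 'https://ondsel.com/'; // assumption: swap in each route to cover

async function checkMeta(url: string): Promise<void> {
  const res = await fetch(url);
  const html = await res.text();

  // Naive string checks are enough for a smoke test; a proper audit would
  // parse the DOM (e.g. with cheerio) and validate the actual values.
  const hasOpenGraph = /<meta[^>]+property=["']og:/i.test(html);
  const hasJsonLd = /<script[^>]+type=["']application\/ld\+json["']/i.test(html);
  const hasDescription = /<meta[^>]+name=["']description["']/i.test(html);

  console.log(url);
  console.log(`  Open Graph tags:  ${hasOpenGraph ? 'present' : 'MISSING'}`);
  console.log(`  JSON-LD:          ${hasJsonLd ? 'present' : 'MISSING'}`);
  console.log(`  meta description: ${hasDescription ? 'present' : 'MISSING'}`);
}

checkMeta(PAGE_URL).catch((err) => {
  console.error(err);
  process.exitCode = 1;
});
```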
For now, the best bet is to provide solid content with high-quality outbound links. We'll also want to pursue any means of getting high-quality inbound links (the more the merrier for Google's PageRank algorithm, especially these days).
Then the site should be registered with the major search engines, and I'll want to check that sitemaps are created and submitted as well. I can try to initiate a scan, but I'll need to check permissions on the Google-y side first. :)
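On the sitemap front, the classic Docusaurus preset already bundles @docusaurus/plugin-sitemap, so it mostly comes down to making the options explicit. A minimal config sketch (every value below is an assumption, not what the site actually uses):

```ts
// docusaurus.config.ts sketch -- the classic preset ships the sitemap
// plugin; these options just spell out the defaults we'd want to confirm.
import type {Config} from '@docusaurus/types';

const config: Config = {
  title: 'Ondsel',
  url: 'https://ondsel.com',
  baseUrl: '/',
  presets: [
    [
      'classic',
      {
        sitemap: {
          changefreq: 'weekly',
          priority: 0.5,
          filename: 'sitemap.xml', // served at /sitemap.xml after build
        },
      },
    ],
  ],
};

export default config;
```

The generated /sitemap.xml is then what gets submitted in Google Search Console (and Bing Webmaster Tools) once we have access.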
Looking at PageSpeed, we've got reasonable coverage for everything but performance (for now). We'll need to solidify some image formats and add size attributes to the markup to make the googles happy there. Since this is React, we may also need some code splitting and lazy loading to get the FCP (First Contentful Paint) score up.
https://pagespeed.web.dev/analysis/https-ondsel-com/1vppapvipr?form_factor=desktop
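Roughly what those two fixes look like in one place (a sketch only; the component and asset names are hypothetical): explicit width/height so the browser reserves space before the image loads, and React.lazy/Suspense so heavy components stay out of the initial bundle.

```tsx
import React, {lazy, Suspense} from 'react';

// Heavy below-the-fold widget, loaded only when it actually renders.
// 'ModelViewer' is a hypothetical component name.
const ModelViewer = lazy(() => import('./ModelViewer'));

export default function Hero(): JSX.Element {
  return (
    <div>
      {/* Explicit width/height lets the browser reserve layout space;
          the WebP path is an assumption (use whatever asset we settle on). */}
      <img
        src="/img/hero.webp"
        alt="Ondsel hero"
        width={1200}
        height={630}
        loading="lazy"
      />
      <Suspense fallback={<div>Loading…</div>}>
        <ModelViewer />
      </Suspense>
    </div>
  );
}
```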
There is no metadata on the page to parse, so I'm going to add react-helmet to inject elements into the document head for each route. This plus a sitemap submission should go a long way towards giving the search engines a heads-up that we're here.
https://www.npmjs.com/package/react-helmet
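Roughly what I have in mind per route (just a sketch; the title, description, and JSON-LD payload are placeholders, not the real metadata):

```tsx
// Per-route head injection with react-helmet; values are placeholders.
import React from 'react';
import {Helmet} from 'react-helmet';

export default function PageMeta(): JSX.Element {
  return (
    <Helmet>
      <title>Ondsel | placeholder page title</title>
      <meta name="description" content="Placeholder description for this route." />
      <meta property="og:title" content="Ondsel | placeholder page title" />
      <meta property="og:type" content="website" />
      <script type="application/ld+json">
        {JSON.stringify({
          '@context': 'https://schema.org',
          '@type': 'Organization',
          name: 'Ondsel',
          url: 'https://ondsel.com',
        })}
      </script>
    </Helmet>
  );
}
```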
[edit]: there is metadata on the page once JS renders stuff (ugh). Docusaurus has a built-in Head component that leverages react-helmet, so I'll look into using that instead.
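Same idea with the built-in component, so no extra dependency is needed (again a sketch with placeholder values):

```tsx
// Docusaurus's Head component wraps react-helmet; values are placeholders.
import React from 'react';
import Head from '@docusaurus/Head';

export default function DocsMeta(): JSX.Element {
  return (
    <Head>
      <meta property="og:title" content="Ondsel | placeholder page title" />
      <meta property="og:description" content="Placeholder description." />
      <link rel="canonical" href="https://ondsel.com/" />
    </Head>
  );
}
```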
We are using an off-the-shelf Docusaurus site. Very little effort has been made to tune SEO or to evaluate analytics data. (The site is instrumented to gather Google Analytics data.)
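For context, that instrumentation is typically wired through the classic preset's gtag option, roughly like this (a sketch; the tracking ID is a placeholder, not our real measurement ID):

```ts
// docusaurus.config.ts sketch of Google Analytics wiring via the
// classic preset's gtag option. Tracking ID below is a placeholder.
import type {Config} from '@docusaurus/types';

const config: Config = {
  title: 'Ondsel',
  url: 'https://ondsel.com',
  baseUrl: '/',
  presets: [
    [
      'classic',
      {
        gtag: {
          trackingID: 'G-XXXXXXXXXX', // placeholder
          anonymizeIP: true,
        },
      },
    ],
  ],
};

export default config;
```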
We would appreciate an audit of the current system and recommendations for improvement.