kodadot / workers

Implementation of Cloudflare Workers in KodaDot
https://kodadot.xyz

Flex prerender on kodadot.xyz #111

Closed · yangwao closed this 8 months ago

yangwao commented 1 year ago

https://www.checkbot.io/guide/seo/?utm_source=checkbot-extension&utm_medium=extension&utm_content=learn-more#rule-set-mobile-scaling


roiLeo commented 1 year ago

Open issue on ssr worker

yangwao commented 1 year ago

Something is still fundamentally bad with our prerender.

Only the developers site, which is plain VitePress, seems to work well.

Even Hello is somehow crappy (GitBook, boo)

I've used checkbot.io

At kodadot.xyz, only 9 URLs were found 🤔, most of which are JS bundles.

Testing https://canary-netlify.kodadot.xyz

which is supposed to have Netlify prerender...


yangwao commented 1 year ago

cc @preschian: the OpenGraph issue seems to have been closed, yet it doesn't seem to work that well.

Let's have a look, where might the issue be? Just noticed that I even looked at Netlify Prerender and it seems it doesn't work either, no idea why

vikiival commented 1 year ago

Disallow _nuxt/* in robots.txt?
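
A minimal sketch of what that rule could look like, assuming the Nuxt build assets are served under `/_nuxt/`:

```txt
# robots.txt: keep crawlers away from JS/CSS build assets
User-agent: *
Disallow: /_nuxt/
```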

yangwao commented 1 year ago

> Disallow _nuxt/* in robots.txt?

Well, the crawl bot should get an HTML fragment in the first place 😅
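
As a rough sketch of that idea in a Cloudflare Worker (not the actual worker in this repo; the bot list and the `PRERENDER_ORIGIN` binding are assumptions): detect crawler user agents and serve them prerendered HTML instead of the SPA shell.

```ts
// Sketch: serve crawlers prerendered HTML, regular visitors the SPA.
// BOT_UA and PRERENDER_ORIGIN are illustrative assumptions.
const BOT_UA = /googlebot|bingbot|twitterbot|facebookexternalhit|linkedinbot/i;

export default {
  async fetch(request: Request, env: { PRERENDER_ORIGIN: string }): Promise<Response> {
    const ua = request.headers.get('user-agent') ?? '';
    const { pathname, search } = new URL(request.url);

    if (BOT_UA.test(ua)) {
      // Crawlers get static HTML with title, description, and OG tags baked in.
      return fetch(`${env.PRERENDER_ORIGIN}${pathname}${search}`);
    }

    // Everyone else gets the normal client-rendered app.
    return fetch(request);
  },
};
```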

preschian commented 1 year ago

> Just noticed that I even looked at Netlify Prerender and it seems it doesn't work either, no idea why

interesting, even Netlify prerender can't catch that. what's the result from Google Search Console?

yangwao commented 1 year ago

This might be related?

vikiival commented 1 year ago

Fixed in #112?

preschian commented 1 year ago

which routes should we run through the prerender? without prerender, OG tags won't work, like this: no title or description in Google search results

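For reference, the head a prerendered route would need to emit for Google to show a title and description looks roughly like this (all values are placeholders):

```html
<head>
  <title>Collection Name | KodaDot</title>
  <meta name="description" content="Short summary of the collection." />
  <!-- OpenGraph tags, also used for social previews -->
  <meta property="og:title" content="Collection Name | KodaDot" />
  <meta property="og:description" content="Short summary of the collection." />
  <meta property="og:image" content="https://kodadot.xyz/placeholder-preview.png" />
</head>
```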

preschian commented 1 year ago

just want to make sure: why do we disallow collection/items in the newer robots.txt anyway? we already have some collection pages indexed on Google


imo, let's keep Google indexing our collection/items pages. if those pages mostly get poor performance results in Google Search Console reports, it's our task to improve that. at least the pages are indexed in Google Search. if we disallow those pages, they will be completely lost from Google search results, right?

for example, if we search "kusama canary fractal" on Google, our website shows up in the first results

if we disallow collection pages, we'll get the same results as this, won't we?

vikiival commented 1 year ago

Cc @yangwao 👀

yangwao commented 1 year ago

Re robots: we've been penalized for having a lot of item pages, which we should take down for now. The intent was to decrease the number of pages that Google indexes but doesn't show for some "reason". We went from around 2.75k and are now at 1666, so let's see. A positive trade-off is that some SEO pages started working, which is great. Simply said, to have a lot of pages we need some strong backlink traffic going, which isn't the case currently, but will be in the future.

Collections we could gradually put back up as well, though.

We will be focusing on a few pages which, by Hello insights, already get a lot of views; those should be on main KodaDot to drive more SEO traffic through.

I'll make issues once we circle back on which pages should be put back on main for indexing.

preschian commented 1 year ago

> We've been penalized

interesting, penalized for what exactly? penalized for thin content / cloaked content, maybe?

because this is quite different from my experience

I just wanted to share my experience from when I was still at an ecommerce company. What we did back then was index as many product pages as possible, even though the Page Speed Insights scores for those pages were still in the "poor" or "needs improvement" category. Being indexed by Google was still better than not being indexed at all. After that, we created several sitemaps, for example a sitemap for popular products. The point is to create a sitemap for the important pages

Roughly, it's similar to what Magic Eden did: https://magiceden.io/robots.txt. In that file there is a special sitemap for popular collections, and that sitemap is the one submitted to Google Search Console

I think we can follow what Magic Eden has done: create additional sitemaps for the important collection/items pages, then submit those sitemaps to Google Search Console
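
A sketch of what that could look like for kodadot.xyz, following the Magic Eden pattern (the sitemap paths and the collection URL are hypothetical):

```txt
# robots.txt: advertise a dedicated sitemap for the important pages
User-agent: *
Disallow: /_nuxt/

Sitemap: https://kodadot.xyz/sitemap.xml
Sitemap: https://kodadot.xyz/sitemap-popular-collections.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-popular-collections.xml: only the pages worth indexing -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://kodadot.xyz/rmrk/collection/some-popular-collection</loc>
  </url>
</urlset>
```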

Then, make sure the links in those sitemaps use prerendering. At least use the minimal prerendering that we already have, or even better, use prerender.io. OpenGraph is not only used for social media sharing but is also useful for SEO
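
And a sketch of wiring prerender.io into a Worker, mirroring the earlier bot-detection sketch (the `PRERENDER_TOKEN` binding name is an assumption; `service.prerender.io` with the `X-Prerender-Token` header is prerender.io's documented entry point, but double-check their current docs):

```ts
// Sketch: proxy crawler requests through prerender.io, which renders the
// JS page and returns static HTML. PRERENDER_TOKEN is an assumed binding.
const BOT_UA = /googlebot|bingbot|twitterbot|facebookexternalhit|linkedinbot/i;

export default {
  async fetch(request: Request, env: { PRERENDER_TOKEN: string }): Promise<Response> {
    const ua = request.headers.get('user-agent') ?? '';

    if (BOT_UA.test(ua)) {
      return fetch(`https://service.prerender.io/${request.url}`, {
        headers: { 'X-Prerender-Token': env.PRERENDER_TOKEN },
      });
    }

    return fetch(request);
  },
};
```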

| without prerendering (keyword: "kodadot carbonless") | with prerendering (keyword: "kusama canary fractal") |
| --- | --- |
| no title, no description, blank page | at least there is a title and description with the minimal prerendered page |