Closed · yangwao closed this issue 8 months ago
Open issue on SSR worker
Something is still fundamentally wrong with our prerender.
Only the developers site, which runs plain VitePress, seems to work well.
Even Hello is somehow poor (GitBook, boo).
I've used checkbot.io — it found only 9 URLs 🤔, most of which are JS bundles, on a site which is supposed to have Netlify prerender...
cc @preschian — it seems the Opengraph issue was closed, yet it still doesn't seem to work that well.
Let's have a look: where might the issue be? I just noticed that I even looked at Netlify Prerender and it seems that doesn't work either, no idea why.
Disallow `_nuxt/*` in robots.txt?
> Disallow `_nuxt/*` in robots.txt?
Well, the crawl bot should get an HTML fragment in the first place 😅
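For reference, the rule itself would be a one-liner; a sketch only, assuming the Nuxt build assets are served under `/_nuxt/`:

```
User-agent: *
Disallow: /_nuxt/
```

Note this only hides the JS bundles from crawlers — it doesn't by itself give them prerendered HTML to index.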
> Just noticed that I even looked on Netlify Prerender and it's seems doesn't work either idk why
Interesting, even Netlify prerender can't catch that. What's the result from Google Search Console?
This might be related?
Fixed in #112 ?
Which routes should we serve through prerender? Without prerender, OG tags won't work like this: no title or description in Google search results.
Just want to make sure: why do we disallow collection/items in the newer robots.txt anyway? We've already had some collection pages indexed on Google.
IMO, let's keep Google indexing our collection/items pages. If those pages mostly show poor performance results in Google Search Console reports, it's our task to improve that. At least the pages are indexed in Google search; if we disallow them, they will be completely lost from Google search results, right?
For example, if we search "kusama canary fractal" on Google, our website shows up in the first results.
If we disallow collection pages, we'll get the same results as this, won't we?
Cc @yangwao 👀
Re robots: we've been penalized for having a lot of item pages, which we should take down for now, so the intent was to decrease the number of pages Google indexes but doesn't show for some "reason". We went from something like 2.75k and are now at 1666, so let's see. A positive trade-off is that some SEO pages started working, which is great. Simply said, to carry a lot of pages we need some strong backlink traffic coming in, which isn't the case currently, but will be in the future.
Collections we could gradually put back up as well, though.
We will focus on the few pages which, by Hello insights, already get a lot of views; those should be on the main kodadot to drive more SEO traffic.
I'll make issues once we settle on which pages should be put back on main for indexing.
> We've been penalized
Interesting — penalized for what exactly? Penalized for thin content/cloaked content maybe?
Because this is quite different from my experience.
I just wanted to share my experience from when I was still at an ecommerce company. What we did back then was index as many product pages as possible, even though the PageSpeed Insights scores for those pages were still in the "poor" or "needs improvement" category. Being indexed by Google was still better than not being indexed at all. After that, we created several sitemaps — for example, a sitemap for popular products. The point is to create a sitemap for important pages.
Roughly, it's similar to what Magic Eden did: https://magiceden.io/robots.txt. At that link there is a special sitemap for popular collections, and that sitemap is the one submitted to Google Search Console.
I think we can follow what Magic Eden has done: create additional sitemaps for important collection/items pages, then submit those sitemaps to Google Search Console.
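Such a dedicated sitemap is just a plain `urlset`; a sketch with a hypothetical collection URL (the path and ID here are placeholders, not real routes):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- hypothetical entry: one important collection page -->
  <url>
    <loc>https://kodadot.xyz/collection/EXAMPLE-COLLECTION-ID</loc>
  </url>
</urlset>
```

It could then be referenced from robots.txt with a `Sitemap:` line and submitted in Google Search Console, as Magic Eden does.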
Then, make sure the links in those sitemaps use prerendering. At least use the minimal prerendering that we already have, or even better, use prerender.io. OpenGraph is not only used for social media sharing but is also useful for SEO.
| without prerendering. keyword: "kodadot carbonless" | with prerendering. keyword: "kusama canary fractal" |
| --- | --- |
| no title, no description, blank page | at least there is a title and description with a minimal page |
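Checking whether a route actually serves OG tags can be automated instead of eyeballed; a minimal Python sketch (the sample markup, title, and description below are hypothetical, not taken from the live site) that extracts `og:` meta tags from served HTML:

```python
from html.parser import HTMLParser


class OGTagParser(HTMLParser):
    """Collects <meta property="og:*" content="..."> tags from an HTML document."""

    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        prop = a.get("property", "")
        # Keep only Open Graph properties that carry a content value
        if prop.startswith("og:") and a.get("content") is not None:
            self.og[prop] = a["content"]


def extract_og_tags(html: str) -> dict:
    """Return a dict of og:* properties found in the given HTML string."""
    parser = OGTagParser()
    parser.feed(html)
    return parser.og


# Hypothetical prerendered markup — a crawler-facing page should expose
# at least og:title and og:description
sample = """<html><head>
<meta property="og:title" content="Kusama Canary Fractal | KodaDot">
<meta property="og:description" content="Collection page.">
</head><body></body></html>"""

print(extract_og_tags(sample))
```

Running this against the HTML a crawler receives (rather than what the browser renders) would show exactly the difference in the table above: an empty dict for the non-prerendered page, populated tags for the prerendered one.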
https://www.checkbot.io/guide/seo/?utm_source=checkbot-extension&utm_medium=extension&utm_content=learn-more#rule-set-mobile-scaling