Open bennycode opened 5 years ago
There is no real reason for that in 2019..! But you still need to add the description and other Open Graph tags into the `<head>`.
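For reference, a minimal sketch of the kind of tags meant here — these belong in the document `<head>`, and the values below are placeholders:

```html
<!-- Placeholder values; description and Open Graph tags go in the <head> -->
<head>
  <title>Example Page</title>
  <meta name="description" content="A short summary of this page.">
  <meta property="og:title" content="Example Page">
  <meta property="og:description" content="A short summary of this page.">
  <meta property="og:image" content="https://example.com/preview.png">
</head>
```

Crawlers that build link previews (Facebook, Twitter, Slack, ...) generally read these tags from the delivered HTML without executing JavaScript.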
> There is no real reason for that in 2019..!
@lipis There are several good reasons to do server-side rendering: https://www.elephate.com/blog/ultimate-guide-javascript-seo/
If you trust your JavaScript, it's not worth the effort.. :)
> If you trust your JavaScript, it's not worth the effort.. :)
Boring claim without valuable input. I wish you would work with data and facts. 🙄
If the Googlebot is crawling OK, which in this case it does, then there is not much need for it :)
The Facebook crawler is also doing fine, and I'm sure others are working as well.. so if you can verify that it's all good with the main players, you are set.. that's my point :)
According to this article:
> Google and Ask cover only ~64% of the whole search engine market [...] This means that your new, shiny, JavaScript-rich website can cost you ~36% of your website’s visibility on all search engines.
Of course, the article is from August 2017, but it draws attention to the fact that pre-rendered HTML can be interpreted more reliably than dynamic JS content. To be on the safe side and not risk search engine visibility, I would go the extra mile and do pre-rendering.
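To illustrate the risk (a hypothetical sketch, not taken from the article): a crawler that does not execute JavaScript only sees the markup the server delivers, so a client-rendered page looks nearly empty to it, while a pre-rendered page exposes its real content:

```python
from html.parser import HTMLParser

# What the server delivers for a client-rendered SPA:
# the real content is only injected later by JavaScript.
SPA_HTML = """<html><head><title>Loading...</title></head>
<body><div id="root"></div><script src="app.js"></script></body></html>"""

# The same page after pre-rendering: the content is in the markup itself.
PRERENDERED_HTML = """<html><head><title>My Product Page</title>
<meta name="description" content="Buy great products here."></head>
<body><div id="root"><h1>My Product Page</h1></div></body></html>"""

class TextExtractor(HTMLParser):
    """Collects visible text, the way a non-JS-executing crawler would."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.text.append(data.strip())

def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return parser.text

print(visible_text(SPA_HTML))          # only the "Loading..." placeholder
print(visible_text(PRERENDERED_HTML))  # the real title and heading
```

A crawler in the ~36% that cannot run JavaScript indexes only the first, nearly empty result.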
Even Facebook writes that pre-rendering "increases the likelihood that each route of your application will be picked up by search engines" (source). That's why I created this ticket here.
There is also the concept of zero-configuration pre-rendering, which sounds like pre-rendering is a manageable amount of work.
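As an illustration of how small that setup can be — assuming a tool like react-snap, which advertises itself as zero-configuration prerendering — it amounts to one extra script in `package.json`:

```json
{
  "scripts": {
    "build": "react-scripts build",
    "postbuild": "react-snap"
  }
}
```

After `npm run build`, the `postbuild` hook crawls the built app with a headless browser and writes out static HTML for each route.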
Instead of dynamically creating the page contents on the client, it would be nice if we could pre-render the HTML pages and serve them as they are.
Read more: