codinginflow opened this issue 1 year ago
Is JavaScript required to have `loading.tsx` move on to the actual page? If yes, doesn't this make the page invisible to search engine crawlers? Seems like a downgrade from `getServerSideProps`.
@codinginflow
JavaScript is indeed required to render streamed UI in the browser. This process, known as "hydration", uses `ReactDOMClient.hydrateRoot`. Here is an example taken from https://github.com/reactwg/react-18/discussions/37:
```jsx
import * as ReactDOMClient from 'react-dom/client';
import App from 'App';

const container = document.getElementById('app');

// Create *and* render a root with hydration.
const root = ReactDOMClient.hydrateRoot(container, <App tab="home" />);
// Unlike with createRoot, you don't need a separate root.render() call here.
```
This approach is similar to pre-streaming Server-Side Rendering (SSR), where you need to call `ReactDOMClient.hydrate` on the client to resume interactivity.
When streaming is not used, the server has to generate all of the UI before sending it to the browser. This may take longer, but you get a page that is presentable without JavaScript, even before `ReactDOMClient.hydrate` can be executed (if JavaScript is enabled at all).
However, when streaming is used, the server sends over a shell first, which results in a shorter waiting time; subsequent UI elements are then streamed over. With JavaScript disabled, `ReactDOMClient.hydrateRoot` cannot be called to fill in these streamed elements, resulting in a stuck loading shell.
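To make the shell/streamed split concrete, here is a minimal sketch of an App Router page using `Suspense`; the `Posts` component and `getPosts` helper are hypothetical, not from this issue:

```tsx
// app/posts/page.tsx: a sketch under assumed names, not code from this issue.
import { Suspense } from 'react';

// Hypothetical stand-in for a real data source.
async function getPosts(): Promise<{ id: number; title: string }[]> {
  await new Promise((resolve) => setTimeout(resolve, 100));
  return [{ id: 1, title: 'Hello' }];
}

// This subtree suspends until the data is ready, so it is streamed later.
async function Posts() {
  const posts = await getPosts();
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}

export default function Page() {
  return (
    <main>
      {/* The <h1> and the fallback are part of the shell sent immediately. */}
      <h1>Posts</h1>
      <Suspense fallback={<p>Loading posts…</p>}>
        <Posts />
      </Suspense>
    </main>
  );
}
```

Without JavaScript, the browser keeps the fallback from the shell, because the script that swaps in the streamed `<Posts />` HTML never runs.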
According to the Next.js documentation on Loading UI and Streaming > SEO, Next.js waits for data fetching inside `generateMetadata` to complete before streaming UI to the client, which guarantees that the first part of a streamed response includes the `<head>` tags. You can verify this by examining the actual HTML document received in your browser's devtools, under the Network tab, for streamed content.
@williamli
Thank you for the clarification!
Google search crawlers can execute JavaScript, but how about other crawlers? Will they see my web page content without JS?
This seems like a step backwards from getServerSideProps, which would show the full page even with JS disabled.
> Google search crawlers can execute JavaScript, but how about other crawlers? Will they see my web page content without JS?
Other crawlers will also see the content. You can verify this by examining the actual HTML document received in your browser's devtools (with JS disabled); just search for your site content in the raw HTML.
> This seems like a step backwards from getServerSideProps, which would show the full page even with JS disabled.
If you need to render the full page without JS, you can still use SSR (do not use `loading.js` and `<Suspense>`). The new options are there to enhance performance for users with JS enabled.
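For comparison, here is a sketch of a page that renders fully on the server with no `loading.js` or `Suspense` boundary involved; the URL and `Product` type are placeholders:

```tsx
// app/product/page.tsx: a sketch, not code from this issue.
type Product = { name: string; price: number };

export default async function ProductPage() {
  // Without loading.js or <Suspense>, Next.js waits for this fetch
  // and sends one fully rendered HTML document, visible without JS.
  const res = await fetch('https://api.example.com/product/1', {
    cache: 'no-store', // render dynamically on every request
  });
  const product: Product = await res.json();

  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.price} USD</p>
    </main>
  );
}
```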
@williamli see this issue: https://github.com/facebook/react/issues/24794. This is exactly what's going on; the content is hidden by React until this piece of code runs, and only then does the content show up:

```js
function $RC(a,b){a=document.getElementById(a);b=document.getElementById(b);b.parentNode.removeChild(b);if(a){a=a.previousSibling;var f=a.parentNode,c=a.nextSibling,e=0;do{if(c&&8===c.nodeType){var d=c.data;if("/$"===d)if(0===e)break;else e--;else"$"!==d&&"$?"!==d&&"$!"!==d||e++}d=c.nextSibling;f.removeChild(c);c=d}while(c);for(;b.firstChild;)f.insertBefore(b.firstChild,c);a.data="$";a._reactRetry&&a._reactRetry()}};$RC("B:0","S:0")
```
I think Next.js builds deeply on React here, but this feature isn't working the way it should: there is no hydration to do in this case, yet the content is rendered into another div and only moved into place once React's script runs.
> Other crawlers will also see the content.
Only if they execute JS? Or can they still see the content without JS?
I'm a bit puzzled by the default behavior of Next.js regarding streaming components. By default, Next.js renders the content of async components into a hidden div with a unique id:
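(Roughly, the streamed document looks like this; a sketch with illustrative ids and content, not output copied from this issue:)

```html
<!-- The fallback from loading.tsx, wrapped in React's boundary markers -->
<!--$?--><template id="B:0"></template><p>Loading…</p><!--/$-->
...
<!-- The real content, streamed near the end of <body> with a hidden attribute -->
<div hidden id="S:0"><div>Hello World!</div></div>
<script>/* the $RC helper quoted above swaps S:0 into the B:0 boundary */</script>
```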
This div is invisible and rendered almost at the bottom of the body, and then the script mentioned above kicks in to move it into the correct position.

My question is: why not render it at the correct position in the DOM in the first place, instead of at the bottom? If Next.js did that, people with JavaScript disabled would still see the content in the correct position, with no JS needed to move fully formed HTML (with data) into place.
Is it some technical limitation?
I got a similar issue: a Suspense fallback was always shown in the initial HTML document. It seems it was caused by the root layout (app/layout.tsx) being an async function; when I changed it to a regular function, the initial HTML was fixed.
> Other crawlers will also see the content.

> Only if they execute JS? Or can they still see the content without JS?
Yes, I actually have the same question. I'm facing a similar issue: if I disable JS, the navbar and footer load, but the server-side content within page.tsx does not render; instead, loading.tsx is what gets rendered.
The issue still persists 😔. What is interesting, though, is that it appears to behave inconsistently, depending on how long the server component takes to return its response.
I created a page with this content:

```tsx
export default async function PageTest() {
  // Resolve to the page content after an artificial 100 ms delay.
  return new Promise((resolve) =>
    setTimeout(() => resolve(<div>Hello World!</div>), 100)
  );
}
```
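For reference, a minimal global fallback along these lines (hypothetical, not the project's actual file) is enough to trigger the behavior:

```tsx
// app/loading.tsx: a hypothetical minimal fallback.
export default function Loading() {
  return <p>Loading…</p>;
}
```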
When I remove the `loading.tsx` files from the project, the content is always rendered straight away, no matter the timeout. Is there some config that could be changed to accommodate slightly slower responses? In my case, I'd rather delay the response by 100 ms than have an ugly loading screen displayed for just a few milliseconds. 🙏
I'm seeing this too. Disabling JavaScript while having a loading.tsx means the page is stuck at loading.tsx, which is quite surprising.

Is this intended? For now, I've removed loading.tsx so that it works as I would have expected.
I found the same case here and noticed some points in #66244. I will also put them here:

- The SSR response does contain the HTML of the content, but it is not displayed: it is hidden via a `[hidden]` attribute, together with the loading.tsx markup. Perhaps, instead of removing the `[hidden]` attribute via script, it would be ideal to start with the content visible and replace it later if necessary.
- It only ever occurs on pages whose page.tsx is async; for example, if you need to fetch an API with await on the server side, you will probably run into this problem.
- Even if I remove loading.tsx, the content is still invisible.
- Because the content starts out invisible, Core Web Vitals are impacted, mainly FCP, LCP, and INP. Immediate display would be better, because hydration already guarantees that the content does not change between the SSR output and the first client-side render. Even on pages with JavaScript enabled normally, there is still an impact on display performance, which largely takes away the advantage and purpose of using SSR.
- In some very rare and intermittent cases, even with JavaScript disabled, the page renders correctly without displaying loading.tsx. I couldn't find out why, or what causes this.

Summing up and reinforcing: I believe the approach with the `[hidden]` attribute is not ideal. Perhaps the content could start visible (without `[hidden]`) and be replaced later by the result of the render/hydration; I don't think that would greatly alter the current structure, and it would solve the problem.
Similar behavior here, can confirm: when JS is disabled, the loading state is displayed forever. Usually a loading toggle is driven by JavaScript, but it can also be achieved with CSS.
Verify canary release
Provide environment information
Which area(s) of Next.js are affected? (leave empty if unsure)
App directory (appDir: true), Data fetching (gS(S)P, getInitialProps), Routing (next/router, next/navigation, next/link)
Link to the code that reproduces this issue or a replay of the bug
https://codesandbox.io/p/sandbox/relaxed-davinci-o1i9yl?file=%2Fapp%2Fpage.tsx%3A5%2C15
To Reproduce
1. Create a page that dynamically fetches some data on every request, either via `export const revalidate = 0` or the `no-store` configuration on `fetch`.
2. Set up a global `loading.tsx` in the `app` folder.
3. Build and run the project.
4. Disable JavaScript in the web browser and open the dynamic page. The `loading.tsx` page is shown indefinitely (see the sketch below).

Note: You can disable JavaScript in Chrome by pressing `F12` to open the Chrome dev tools, then pressing `Ctrl + Shift + P` and searching for "disable JavaScript".
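A minimal dynamic page along these lines (a sketch; the linked sandbox's exact code may differ), combined with a global `app/loading.tsx` fallback, reproduces the stuck state when JS is off:

```tsx
// app/page.tsx: illustrative repro, not the sandbox's exact code.
export const revalidate = 0; // opt the page into dynamic rendering

export default async function Page() {
  // Stand-in for a real per-request data fetch.
  await new Promise((resolve) => setTimeout(resolve, 100));
  return <h1>Hello World!</h1>;
}
```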
Describe the Bug

When you open a dynamic page with JavaScript disabled, you get stuck on the global `loading.tsx` indefinitely. This also means that the page is effectively invisible to web crawlers that don't execute JavaScript, which kills SEO for that page.
Expected Behavior
The page should be loaded even with JavaScript disabled.
Which browser are you using? (if relevant)
Google Chrome
How are you deploying your application? (if relevant)
Locally and on Vercel