vercel / next.js

The React Framework
https://nextjs.org
MIT License

Dynamic pages stuck on loading.tsx when JavaScript is disabled #50150

Open codinginflow opened 1 year ago

codinginflow commented 1 year ago

Verify canary release

Provide environment information

Operating System:
  Platform: win32
  Arch: x64
  Version: Windows 10 Home
Binaries:
  Node: 18.16.0
  npm: N/A
  Yarn: N/A
  pnpm: N/A
Relevant packages:
  next: 13.4.3
  eslint-config-next: 13.4.3
  react: 18.2.0
  react-dom: 18.2.0

Which area(s) of Next.js are affected? (leave empty if unsure)

App directory (appDir: true), Data fetching (gS(S)P, getInitialProps), Routing (next/router, next/navigation, next/link)

Link to the code that reproduces this issue or a replay of the bug

https://codesandbox.io/p/sandbox/relaxed-davinci-o1i9yl?file=%2Fapp%2Fpage.tsx%3A5%2C15

To Reproduce

Create a page that dynamically fetches some data on every request, either via export const revalidate = 0 or the no-store configuration on fetch.

Set up a global loading.tsx in the app folder.

Build and run the project.

Disable JavaScript in the web browser and open the dynamic page. The loading.tsx page is shown indefinitely.

Note: You can disable JavaScript in Chrome by pressing F12 to open the Chrome dev tools. Then press Ctrl + Shift + P and search for "disable JavaScript".
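For reference, a minimal sketch of the setup described in the steps above (file names follow the App Router convention; the fetch URL is a placeholder, not part of the original report):

```tsx
// app/loading.tsx — global loading UI shown while the page streams
export default function Loading() {
  return <p>Loading...</p>;
}
```

```tsx
// app/page.tsx — rendered dynamically on every request
export const revalidate = 0; // alternatively, rely on { cache: "no-store" } below

export default async function Page() {
  const res = await fetch("https://example.com/api/data", { cache: "no-store" });
  const data = await res.json();
  return <pre>{JSON.stringify(data)}</pre>;
}
```

After `next build && next start`, visiting this page with JavaScript disabled reproduces the stuck loading state.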

Describe the Bug

When you open a dynamic page with JavaScript disabled, you get stuck on the global loading.tsx indefinitely.

This also means that this page is effectively invisible to web crawlers, which kills SEO for that page.

Expected Behavior

The page should be loaded even with JavaScript disabled.

Which browser are you using? (if relevant)

Google Chrome

How are you deploying your application? (if relevant)

Locally and on Vercel

codinginflow commented 1 year ago

Is JavaScript required to have loading.tsx move on to the actual page? If yes, doesn't this make the page invisible to search engine crawlers? Seems like a downgrade from getServerSideProps.

williamli commented 1 year ago

@codinginflow

JavaScript is indeed required to render streamed UI on the browser. This process, known as "hydration", uses ReactDOMClient.hydrateRoot. Here is an example taken from https://github.com/reactwg/react-18/discussions/37:

import * as ReactDOMClient from 'react-dom/client';
import App from 'App';

const container = document.getElementById('app');

// Create *and* render a root with hydration.
const root = ReactDOMClient.hydrateRoot(container, <App tab="home" />);
// Unlike with createRoot, you don't need a separate root.render() call here

This approach is similar to pre-streaming Server-Side Rendering (SSR) where you need to call ReactDOMClient.hydrate in the client to resume interactivity.

When streaming is not used, the server has to generate all UI and send it to the browser. This process may take longer, but you get a page that is more presentable without JavaScript even before ReactDOMClient.hydrate can be executed (if JavaScript is enabled).

However, when streaming is used, the server sends over a shell first, which results in a shorter waiting time. Subsequent UI elements are then streamed over. With JavaScript disabled, ReactDOMClient.hydrateRoot cannot be called to fill in these streamed elements, resulting in a stuck loading shell.

According to the Next.js documentation on Loading UI and Streaming > SEO:

  1. Next.js will wait for data fetching inside generateMetadata to complete before streaming UI to the client. This guarantees the first part of a streamed response includes <head> tags.
  2. Since streaming is server-rendered, it does not impact SEO. You can use the Mobile Friendly Test tool from Google to see how your page appears to Google's web crawlers and view the serialized HTML (source).

You can verify this by examining the actual HTML document received in your browser's devtools, under the network tab, for streamed content.

codinginflow commented 1 year ago

@williamli

Thank you for the clarification!

Google search crawlers can execute JavaScript, but how about other crawlers? Will they see my web page content without JS?

This seems like a step backwards from getServerSideProps, which would show the full page even with JS disabled.

williamli commented 1 year ago

Google search crawlers can execute JavaScript, but how about other crawlers? Will they see my web page content without JS?

Other crawlers will also see the content. You can verify this by examining the actual HTML document received in your browser's devtools (with JS disabled), just search for your site content in the raw HTML.

This seems like a step backwards from getServerSideProps, which would show the full page even with JS disabled.

If you need to render the full page without JS, you can still use SSR (do not use loading.js, and avoid wrapping the page content in a Suspense boundary).

The new options are there to enhance performance for those with JS enabled.
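To illustrate the workaround above: with no app/loading.tsx (and no Suspense boundary around the page content), the server sends the complete document instead of a streamed shell. A minimal sketch, assuming a placeholder fetch URL:

```tsx
// app/page.tsx — with no app/loading.tsx alongside this route,
// the response is the fully rendered page, visible without JS
export const dynamic = "force-dynamic"; // still rendered per request

export default async function Page() {
  const res = await fetch("https://example.com/api/data", { cache: "no-store" });
  const data = await res.json();
  return <main>{JSON.stringify(data)}</main>;
}
```

The trade-off is that the response is not sent until all data fetching has finished, which is essentially the pre-streaming SSR behavior described earlier in this thread.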

RomainSF commented 1 year ago

@williamli see this issue: https://github.com/facebook/react/issues/24794. This is exactly what's going on: the content is hidden by React until this inlined script runs, and only then does the content show up (formatted for readability; "B:0" identifies the Suspense boundary placeholder and "S:0" the hidden segment holding the streamed content):

function $RC(a, b) {
  a = document.getElementById(a);
  b = document.getElementById(b);
  // detach the hidden segment that holds the streamed content
  b.parentNode.removeChild(b);
  if (a) {
    a = a.previousSibling;
    var f = a.parentNode, c = a.nextSibling, e = 0;
    // walk and remove the fallback nodes between the boundary's
    // comment markers ($ ... /$)
    do {
      if (c && 8 === c.nodeType) {
        var d = c.data;
        if ("/$" === d) {
          if (0 === e) break;
          else e--;
        } else "$" !== d && "$?" !== d && "$!" !== d || e++;
      }
      d = c.nextSibling;
      f.removeChild(c);
      c = d;
    } while (c);
    // move the streamed content into the fallback's former position
    for (; b.firstChild; ) f.insertBefore(b.firstChild, c);
    a.data = "$";
    a._reactRetry && a._reactRetry();
  }
}
$RC("B:0", "S:0");

I think Next.js relies deeply on React here, but this feature isn't working the way it should: there is no hydration to do in this case, yet the content is rendered into another (hidden) div and only moved into position once React's script runs.

codinginflow commented 1 year ago

Other crawlers will also see the content.

Only if they execute JS? Or can they still see the content without JS?

MrHus commented 1 year ago

I'm a bit puzzled by the default behavior of Next.js regarding streaming components. By default, Next.js renders the content of async components into a hidden div with a unique id:

[Screenshot: the hidden div with a unique id in the rendered HTML, as seen in devtools]

This div is invisible and rendered almost at the bottom of the body and then the script mentioned above kicks in to place it in the correct position.

My question is: why not render it at the correct position in the DOM instead of at the bottom? If Next.js did that, users with JavaScript disabled would still see the content in the correct position, since no JS would be needed to move fully formed HTML (with data) into place.

Is it some technical limitation?

MarkLauer commented 1 year ago

Got a similar issue: a Suspense fallback was always shown in the initial HTML document. It seems it was caused by the root layout (app/layout.tsx) being an async function; when I changed it to a regular function, the initial HTML was fixed.
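A sketch of the change described above (whether this helps may depend on the Next.js version):

```tsx
// app/layout.tsx
// before: an async root layout, which reportedly kept the Suspense
// fallback in the initial HTML
//   export default async function RootLayout({ children }) { ... }

import type { ReactNode } from "react";

// after: a regular (non-async) root layout
export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body>{children}</body>
    </html>
  );
}
```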

sracela commented 9 months ago

Other crawlers will also see the content.

Only if they execute JS? Or can they still see the content without JS?

Yes, I actually have the same question. I'm facing a similar issue: if I disable JS, the navbar and footer load, but the server-rendered content within page.tsx is not rendered; instead, loading.tsx is shown.

zounar commented 4 months ago

The issue still persists 😔. What is interesting, though, is that it appears to behave inconsistently, depending on how long the server component takes to return a response.

I created a page with this content:

export default async function PageTest() {
  return new Promise((resolve) =>
    setTimeout(() => resolve(<div>Hello World!</div>), 100)
  );
}

Is there some config that could be changed to accommodate for slightly slower responses? In my case, I'd rather delay the response by 100 ms than have an ugly loading screen displayed for just a few milliseconds. 🙏

harrisrobin commented 4 months ago

I'm seeing this too. Disabling JavaScript with a loading.tsx present means the page is stuck at loading.tsx, which is quite surprising.

Is this intended? For now, I removed loading.tsx so that it works the way I expected.

dfiedlerx commented 1 month ago

I ran into the same case and noted some points in #66244. I'll also put them here:

To sum up and reinforce: I believe the approach with the [hidden] attribute is not ideal. Perhaps the content could start out visible (without [hidden]) and be replaced later by the result of the render/hydration. I don't think that would greatly alter the current structure, and it would solve the problem.

Carduelis commented 1 week ago

Similar behavior, can confirm. When JS is disabled, the loading state is displayed forever. Usually a loading toggle is handled by JavaScript, but it can also be achieved with CSS.