facebook / docusaurus

Easy to maintain open source documentation websites.
https://docusaurus.io
MIT License

[v2] npm start, build fails with `JavaScript out of memory` because of huge files #4785

Closed. idontknowjs closed this issue 2 years ago.

idontknowjs commented 3 years ago

πŸ› Bug Report

npm start crashes with a JavaScript out of memory error after reaching 42%. npm run build crashes with the same error after 23%.

Error logs for npm start:

* Client β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ building (42%) 3/5 entries 746/800 dependencies 92/379 modules 79 active 
 babel-loader Β» mdx-loader Β» docs\contribute\guidelines-ui\icon.md

i ο½’wdsο½£: Project is running at http://localhost:3000/
i ο½’wdsο½£: webpack output is served from /
i ο½’wdsο½£: Content not from webpack is served from ...
i ο½’wdsο½£: 404s will fallback to /index.html

<--- Last few GCs --->

[16236:000001C176447760]   186754 ms: Mark-sweep (reduce) 2034.7 (2052.1) -> 2034.3 (2053.1) MB, 4644.7 / 0.1 ms  (average mu = 0.089, current mu = 0.002) allocation failure scavenge might not succeed
[16236:000001C176447760]   192112 ms: Mark-sweep (reduce) 2035.3 (2055.1) -> 2034.8 (2055.8) MB, 5350.2 / 0.1 ms  (average mu = 0.041, current mu = 0.001) allocation failure scavenge might not succeed

<--- JS stacktrace --->

FATAL ERROR: MarkCompactCollector: young object promotion failed Allocation failed - JavaScript heap out of memory
 1: 00007FF70D36046F napi_wrap+109311
 2: 00007FF70D305156 v8::internal::OrderedHashTable<v8::internal::OrderedHashMap,2>::NumberOfElementsOffset+33302
 3: 00007FF70D305F26 node::OnFatalError+294
 4: 00007FF70DBD2B4E v8::Isolate::ReportExternalAllocationLimitReached+94
 5: 00007FF70DBB792D v8::SharedArrayBuffer::Externalize+781
 6: 00007FF70DA61CCC v8::internal::Heap::EphemeronKeyWriteBarrierFromCode+1516
 7: 00007FF70DA4C86B v8::internal::NativeContextInferrer::Infer+59451
 8: 00007FF70DA31CFF v8::internal::MarkingWorklists::SwitchToContextSlow+56991
 9: 00007FF70DA4591B v8::internal::NativeContextInferrer::Infer+30955
10: 00007FF70DA3CA3D v8::internal::MarkCompactCollector::EnsureSweepingCompleted+6269
11: 00007FF70DA44B6E v8::internal::NativeContextInferrer::Infer+27454
12: 00007FF70DA48B2B v8::internal::NativeContextInferrer::Infer+43771
13: 00007FF70DA52472 v8::internal::ItemParallelJob::Task::RunInternal+18
14: 00007FF70DA52401 v8::internal::ItemParallelJob::Run+641
15: 00007FF70DA25C63 v8::internal::MarkingWorklists::SwitchToContextSlow+7683
16: 00007FF70DA3CEEC v8::internal::MarkCompactCollector::EnsureSweepingCompleted+7468
17: 00007FF70DA3B734 v8::internal::MarkCompactCollector::EnsureSweepingCompleted+1396
18: 00007FF70DA392B8 v8::internal::MarkingWorklists::SwitchToContextSlow+87128
19: 00007FF70DA67A91 v8::internal::Heap::LeftTrimFixedArray+929
20: 00007FF70DA69B75 v8::internal::Heap::PageFlagsAreConsistent+789
21: 00007FF70DA5EDE1 v8::internal::Heap::CollectGarbage+2033
22: 00007FF70DA5D005 v8::internal::Heap::AllocateExternalBackingStore+1317
23: 00007FF70DA7D2A7 v8::internal::Factory::NewFillerObject+183
24: 00007FF70D7ACC31 v8::internal::interpreter::JumpTableTargetOffsets::iterator::operator=+1409
25: 00007FF70DC5B50D v8::internal::SetupIsolateDelegate::SetupHeap+463949
26: 00007FF70DC35A68 v8::internal::SetupIsolateDelegate::SetupHeap+309672
27: 00007FF70DBEFDD1 v8::internal::SetupIsolateDelegate::SetupHeap+23825
28: 0000021D94260858

Error logs for npm run build:

[en] Creating an optimized production build...

* Client β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ building (23%) 0/1 entries 1358/1378 dependencies 115/433 modules 140 active
 babel-loader Β» mdx-loader Β» docs\troubleshoot\faq.md

* Server β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ building (35%) 0/1 entries 1302/1423 dependencies 303/600 modules 82 active
 babel-loader Β» mdx-loader Β» docs\contribute\guidelines-ui\ui.md

<--- Last few GCs --->

[14572:000001F439CA0710]   163441 ms: Mark-sweep (reduce) 2032.7 (2054.4) -> 2032.2 (2054.9) MB, 4677.5 / 0.1 ms  (average mu = 0.154, current mu = 0.128) allocation failure scavenge might not succeed
[14572:000001F439CA0710]   168904 ms: Mark-sweep (reduce) 2033.2 (2051.9) -> 2032.8 (2052.9) MB, 4305.5 / 0.1 ms  (average mu = 0.184, current mu = 0.212) allocation failure scavenge might not succeed

<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 00007FF70D36046F napi_wrap+109311
 2: 00007FF70D305156 v8::internal::OrderedHashTable<v8::internal::OrderedHashMap,2>::NumberOfElementsOffset+33302
 3: 00007FF70D305F26 node::OnFatalError+294
 4: 00007FF70DBD2B4E v8::Isolate::ReportExternalAllocationLimitReached+94
 5: 00007FF70DBB792D v8::SharedArrayBuffer::Externalize+781
 6: 00007FF70DA61CCC v8::internal::Heap::EphemeronKeyWriteBarrierFromCode+1516
 7: 00007FF70DA6D04A v8::internal::Heap::ProtectUnprotectedMemoryChunks+1258
 8: 00007FF70DA6A1F9 v8::internal::Heap::PageFlagsAreConsistent+2457
 9: 00007FF70DA5EDE1 v8::internal::Heap::CollectGarbage+2033
10: 00007FF70DA5D005 v8::internal::Heap::AllocateExternalBackingStore+1317
11: 00007FF70DA7D2A7 v8::internal::Factory::NewFillerObject+183
12: 00007FF70D7ACC31 v8::internal::interpreter::JumpTableTargetOffsets::iterator::operator=+1409
13: 00007FF70DC5B50D v8::internal::SetupIsolateDelegate::SetupHeap+463949
14: 00007FF70DC35A68 v8::internal::SetupIsolateDelegate::SetupHeap+309672
15: 00007FF70DBEFDD1 v8::internal::SetupIsolateDelegate::SetupHeap+23825
16: 000001B9069CD34E

Have you read the Contributing Guidelines on issues?

yes

To Reproduce

  1. git clone https://github.com/covalentbond/docs-site.git
  2. npm install
  3. npm start

Expected Behavior

npm start does not crash and the dev server starts running at port 3000. npm run build successfully builds the app.

Your Environment

slorber commented 3 years ago

Have you tried increasing the memory allocated to the Node.js process?

Something like this might help for large sites: NODE_OPTIONS="--max-old-space-size=8192"
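
For example (a rough sketch; the 8192 MB value is just an example to adjust to the machine's RAM, and the PowerShell form is an assumption based on the Windows paths in the logs above):

    # macOS / Linux shells:
    NODE_OPTIONS="--max-old-space-size=8192" npm run build

    # Windows PowerShell:
    $env:NODE_OPTIONS="--max-old-space-size=8192"; npm run build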

idontknowjs commented 3 years ago

I tried changing the start script to `node --max-old-space-size=8192 node_modules/@docusaurus/core/bin/docusaurus start`, but the process still freezes at 21%.

Also a note comes up:

[BABEL] Note: The code generator has deoptimised the styling of \docs-site\docs\appendix\tpsr.md as it exceeds the max of 500KB.

idontknowjs commented 3 years ago

hey @slorber - yes, increasing the allocated memory seems to work!

But the file tpsr.md, which has around 23k lines and is nearly 4 MB, still causes the dev server to freeze at a certain percentage. Without this particular file, the app both runs and builds fine.

slorber commented 3 years ago

We use MDX to parse the markdown content and transform it into React components.

Unfortunately, it is probably difficult to process a file of that size, but there might be other ways to create that page?

If the content is pure markdown, we'll enable later to provide an alternate md parser (https://github.com/facebook/docusaurus/issues/3018), which may be more memory efficient (but also more limited due to the inability to use React inside markdown).

Why does this page need to be so long in the first place? Can it be split into multiple pages? Is the content auto-generated?
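
If splitting is an option, here is a minimal Node sketch of the idea (the source path, the output folder, and splitting on "## " headings are all assumptions about the generated file):

    // split-tpsr.js - split one huge generated markdown file into smaller docs
    const fs = require('fs');
    const path = require('path');

    const srcFile = path.join('docs', 'appendix', 'tpsr.md'); // hypothetical path
    const outDir = path.join('docs', 'appendix', 'tpsr');
    fs.mkdirSync(outDir, { recursive: true });

    // Keep everything before the first "## " as an index page; write each
    // "## Title" section to its own file so MDX only compiles small documents.
    const sections = fs.readFileSync(srcFile, 'utf8').split(/^## /m);
    sections.forEach((section, i) => {
      const body = i === 0 ? section : '## ' + section;
      const name = i === 0 ? 'index.md' : `part-${i}.md`;
      fs.writeFileSync(path.join(outDir, name), body);
    });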

My feeling is that this page is autogenerated, and generating markdown might not be the best solution in the first place. I would rather:

adventure-yunfei commented 3 years ago

If the content is pure markdown, we'll enable later to provide an alternate md parser (#3018), which may be more memory efficient (but also more limited due to the inability to use React inside markdown).

Will this alternative md parser be the key to making Docusaurus v2 viable for large doc sites? Or is there any other workaround?

I've run into a similar problem when switching from Docusaurus v1 to v2. I have a very, very large doc site (7000+ md files, auto-generated from *.d.ts files as API docs). Docusaurus v1 generates the doc site in about one or two minutes, but Docusaurus v2 is unable to generate it: the process breaks with a strange error after an hour.

Docusaurus v2 has more amazing features than v1 (in my opinion v1 is only barely usable and is missing many configuration options), so I'm really looking forward to switching to v2. With some effort I've resolved the md syntax parsing errors, but I'm stuck on compile time.

slorber commented 3 years ago

@adventure-yunfei here we are talking about a single very large file, not a lot of small files.

Docusaurus 2 compiles each md file to a React component by default (with MDX), so this adds a bit of overhead if you don't need MDX. Allowing a less powerful parser can be useful, and would also make it easier to adopt Docusaurus for sites that use CommonMark and aren't willing to change their docs during the migration.
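
To illustrate where that overhead comes from (a heavily simplified sketch, not the actual code MDX emits), every md file ends up as a compiled React component along these lines, so a 4 MB file becomes one enormous JS module for Babel and webpack to process:

    // Simplified illustration of "each md file becomes a React component";
    // the real MDX output has more wrapping and metadata.
    // Input markdown such as "## Hello" followed by "Some **bold** text"
    // compiles to roughly:
    import React from 'react';

    export default function MDXContent() {
      return (
        <>
          <h2>Hello</h2>
          <p>Some <strong>bold</strong> text</p>
        </>
      );
    }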

I suggest opening another issue dedicated to your specific problem. It will be hard to troubleshoot without a repro that I can run. Note that we already have a few very large sites on Docusaurus, but yes, build time is definitely a pain point (ex: https://xsoar.pan.dev/docs/reference/index).

Josh-Cena commented 2 years ago

Closing in favor of #4765. We should work hard on reducing memory usage and build time, as I've heard many complaints about it.