nodejs / node

Node.js JavaScript runtime ✨🐢🚀✨
https://nodejs.org

100% CPU usage for a long time after execution of the last statement #48581

Open sidorares opened 1 year ago

sidorares commented 1 year ago

Version

v12.x-v20.x: severe; v8.x-v10.x: less severe but still visible

Platform

Darwin Andreys-MacBook-Pro.local 22.2.0 Darwin Kernel Version 22.2.0: Fri Nov 11 02:03:51 PST 2022; root:xnu-8792.61.2~4/RELEASE_ARM64_T6000 arm64

Subsystem

suspect GC

What steps will reproduce the bug?

run first script in the gist: https://gist.github.com/sidorares/128160e6b3dea1da3ad45cd672651d2d#file-repro1-js

After the last line, console.log("done");, is executed, the CPU spikes to 100% and stays like that for about 5 minutes. With Node v8 and v10 the spike is much shorter, 5-7 seconds. Bun executes the same script almost instantly.
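The gist itself is not reproduced here, but the shape of the code matters: a single machine-generated function with thousands of straight-line property loads and stores is the kind of input that makes TurboFan's optimizing passes expensive. A minimal sketch of how such a function can be generated (hypothetical, not the actual repro from the gist; with a small N it runs instantly):

```javascript
// Sketch (assumption): build one huge function body of sequential
// property stores, similar in shape to the generated code in the gist.
// Scaling N up makes V8's optimizing compilation of `huge` ever slower.
const N = 1000; // the real repro uses a far larger body

let body = "const o = {};\n";
for (let i = 0; i < N; i++) {
  body += `o.f${i} = ${i};\n`;
}
body += `return o.f0 + o.f${N - 1};`;

const huge = new Function(body);

// Call it repeatedly so V8 considers it hot and tries to optimize it.
let result = 0;
for (let i = 0; i < 100; i++) result = huge();
console.log(result); // 0 + (N - 1) = 999
```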

How often does it reproduce? Is there a required condition?

Consistently reproducible

What is the expected behavior? Why is that the expected behavior?

The script is expected to finish in sub-second time

What do you see instead?

The CPU spikes for 300-400 seconds, which is unexpected for this workload

Additional information

The profiler only reports the code that runs before the CPU spike and shows no high GC usage; the time after the spike is "unaccounted":

 [Shared libraries]:
   ticks  total  nonlib   name
     98    0.0%          /Users/laplace/.nvm/versions/node/v20.3.0/bin/node

 [JavaScript]:
   ticks  total  nonlib   name
      2    0.0%    0.0%  JS: *parseLengthCodedInt /Users/laplace/tmp/2090/repro3.js:2014:24

 [C++]:
   ticks  total  nonlib   name

 [Summary]:
   ticks  total  nonlib   name
      2    0.0%    0.0%  JavaScript
      0    0.0%    0.0%  C++
     66    0.0%    0.0%  GC
     98    0.0%          Shared libraries
  291682  100.0%          Unaccounted

 [C++ entry points]:
   ticks    cpp   total   name

 [Bottom up (heavy) profile]:
  Note: percentage shows a share of a particular caller in the total
  amount of its parent calls.
  Callers occupying less than 1.0% are not shown.

   ticks parent  name
  291682  100.0%  UNKNOWN
bnoordhuis commented 1 year ago

You're basically hitting pathological behavior in V8. Node is waiting for V8's optimizer to finish a compilation job, but I suspect it's exhibiting quadratic behavior that gets exacerbated by the size of the function.

I suggest reporting this to V8 because it's not something we can fix.

The stack trace:

* thread #5
  * frame #0: 0x0000000101625d24 node`v8::internal::compiler::LoadElimination::AbstractField::Kill(v8::internal::compiler::LoadElimination::AliasStateInfo const&, v8::internal::MaybeHandle<v8::internal::Name>, v8::internal::Zone*) const + 68
    frame #1: 0x00000001016287cb node`v8::internal::compiler::LoadElimination::AbstractState::KillFields(v8::internal::compiler::Node*, v8::internal::MaybeHandle<v8::internal::Name>, v8::internal::Zone*) const + 107
    frame #2: 0x0000000101623484 node`v8::internal::compiler::LoadElimination::ReduceStoreField(v8::internal::compiler::Node*, v8::internal::compiler::FieldAccess const&) + 900
    frame #3: 0x000000010155578a node`v8::internal::compiler::Reducer::Reduce(v8::internal::compiler::Node*, v8::internal::compiler::ObserveNodeManager*) + 26
    frame #4: 0x00000001016931e9 node`v8::internal::compiler::(anonymous namespace)::SourcePositionWrapper::Reduce(v8::internal::compiler::Node*) + 57
    frame #5: 0x00000001015565aa node`v8::internal::compiler::GraphReducer::Reduce(v8::internal::compiler::Node*) + 154
    frame #6: 0x00000001015560f5 node`v8::internal::compiler::GraphReducer::ReduceTop() + 613
    frame #7: 0x0000000101555c38 node`v8::internal::compiler::GraphReducer::ReduceNode(v8::internal::compiler::Node*) + 216
    frame #8: 0x0000000101693dee node`v8::internal::compiler::LoadEliminationPhase::Run(v8::internal::compiler::PipelineData*, v8::internal::Zone*) + 718
    frame #9: 0x000000010168501b node`auto v8::internal::compiler::PipelineImpl::Run<v8::internal::compiler::LoadEliminationPhase>() + 123
    frame #10: 0x00000001016818f7 node`v8::internal::compiler::PipelineImpl::OptimizeGraph(v8::internal::compiler::Linkage*) + 455
    frame #11: 0x00000001016814fe node`v8::internal::compiler::PipelineCompilationJob::ExecuteJobImpl(v8::internal::RuntimeCallStats*, v8::internal::LocalIsolate*) + 142
    frame #12: 0x000000010034a01b node`v8::internal::OptimizedCompilationJob::ExecuteJob(v8::internal::RuntimeCallStats*, v8::internal::LocalIsolate*) + 43
    frame #13: 0x00000001003778e3 node`v8::internal::OptimizingCompileDispatcher::CompileNext(v8::internal::TurbofanCompilationJob*, v8::internal::LocalIsolate*) + 35
    frame #14: 0x0000000100378359 node`v8::internal::OptimizingCompileDispatcher::CompileTask::RunInternal() + 425
    frame #15: 0x000000010015304a node`node::(anonymous namespace)::PlatformWorkerThread(void*) + 362
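The hot frames above are in LoadElimination's KillFields, which walks the tracked field state on every store. A toy model of why that can turn quadratic for a function with n stores (purely illustrative; this is not V8's actual data structure):

```javascript
// Toy model (assumption): load elimination tracks known field values,
// and a store may alias other fields, so in the worst case every
// tracked entry is visited on every store.
// n stores * up to n tracked fields => O(n^2) work overall.
function countKillVisits(nStores) {
  const tracked = new Set();
  let visits = 0;
  for (let s = 0; s < nStores; s++) {
    for (const f of tracked) visits++; // the "KillFields" scan
    tracked.add(`f${s}`);              // record the stored field
  }
  return visits;
}

console.log(countKillVisits(10));  // 45   = 10*9/2
console.log(countKillVisits(100)); // 4950 = 100*99/2
```

This is why the slowdown grows so sharply with function size: doubling the number of statements roughly quadruples the compile time.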
bnoordhuis commented 1 year ago

https://groups.google.com/g/v8-dev/c/OUb4BRUnTgI/m/wF25A6SzAAAJ

The problem disappears with --noopt.
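Besides --noopt, a possible application-side mitigation (an editor's sketch, not something suggested in the thread): since the compile cost grows superlinearly with function size, code generators can emit many small functions instead of one giant one, so each optimization job stays cheap.

```javascript
// Sketch (assumption): emit the same stores as many small chunk
// functions and compose them, rather than one huge function body.
// Chunk sizes and names here are illustrative.
const TOTAL = 1000;
const CHUNK = 100;

const chunks = [];
for (let start = 0; start < TOTAL; start += CHUNK) {
  let body = "";
  for (let i = start; i < start + CHUNK; i++) {
    body += `o.f${i} = ${i};\n`;
  }
  chunks.push(new Function("o", body));
}

function buildObject() {
  const o = {};
  for (const fn of chunks) fn(o); // each small chunk optimizes quickly
  return o;
}

const o = buildObject();
console.log(o.f0 + o.f999); // 999
```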