npm / cli

the package manager for JavaScript
https://docs.npmjs.com/cli/

[BUG] npm 10.4.0+ running out of memory with no `package-lock.json` #7276

Closed: JanStureNielsen closed this issue 6 months ago

JanStureNielsen commented 8 months ago

Is there an existing issue for this?

This issue exists in the latest npm version

Current Behavior

It appears the latest dependency-resolution algorithm uses more memory than before. The following fails:

npm install -g npm@10.4.0; rm -rf node_modules/ package-lock.json; npm install

as does

npm install -g npm@10.5.0; rm -rf node_modules/ package-lock.json; npm install

with

jan@jan-lite:~/src/my-ui$ npm install -g npm@10.4.0; rm -rf node_modules/ package-lock.json; npm install

removed 13 packages, and changed 28 packages in 2s

24 packages are looking for funding
  run `npm fund` for details
npm WARN using --force Recommended protections disabled.
(⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂⠂) ⠼ idealTree:@angular/core: sill placeDep ROOT zone.js@0.14.4 OK for: my-ui@3.4.0 want: ^0.14.4
<--- Last few GCs --->

[11434:0x7323c70]    33445 ms: Scavenge (reduce) 2045.6 (2082.1) -> 2045.6 (2083.1) MB, 7.5 / 0.0 ms  (average mu = 0.634, current mu = 0.776) allocation failure; 
[11434:0x7323c70]    33785 ms: Mark-sweep (reduce) 2046.6 (2083.1) -> 2046.1 (2083.6) MB, 167.1 / 0.0 ms  (+ 70.6 ms in 16 steps since start of marking, biggest step 7.7 ms, walltime since start of marking 255 ms) (average mu = 0.507, current mu = 0.339) 

<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0xb95be0 node::Abort() [npm install]
 2: 0xa9a7f8  [npm install]
 3: 0xd6f5b0 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [npm install]
 4: 0xd6f957 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [npm install]
 5: 0xf4ceb5  [npm install]
 6: 0xf5f38d v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [npm install]
 7: 0xf39a7e v8::internal::HeapAllocator::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [npm install]
 8: 0xf3ae47 v8::internal::HeapAllocator::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [npm install]
 9: 0xf1b3c0 v8::internal::Factory::AllocateRaw(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [npm install]
10: 0xf12e34 v8::internal::FactoryBase<v8::internal::Factory>::AllocateRawWithImmortalMap(int, v8::internal::AllocationType, v8::internal::Map, v8::internal::AllocationAlignment) [npm install]
11: 0xf150e8 v8::internal::FactoryBase<v8::internal::Factory>::NewRawOneByteString(int, v8::internal::AllocationType) [npm install]
12: 0x1055349 v8::internal::JsonParser<unsigned short>::MakeString(v8::internal::JsonString const&, v8::internal::Handle<v8::internal::String>) [npm install]
13: 0x1057066 v8::internal::JsonParser<unsigned short>::ParseJsonValue() [npm install]
14: 0x1057b4f v8::internal::JsonParser<unsigned short>::ParseJson() [npm install]
15: 0xdf5213 v8::internal::Builtin_JsonParse(int, unsigned long*, v8::internal::Isolate*) [npm install]
16: 0x170e179  [npm install]
Aborted (core dumped)

I have described this issue on Stack Overflow as well.
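
For context, the trace shows V8 exhausting its heap inside JSON.parse, and 64-bit Node 18 defaults to a heap ceiling of roughly 2 GB, which matches the ~2045 MB figures in the GC log above. A minimal check of that ceiling (raising it is a workaround, not a fix):

// Print the V8 heap ceiling for the current Node process.
const v8 = require('node:v8');
const limitMiB = v8.getHeapStatistics().heap_size_limit / 1024 ** 2;
console.log(`heap_size_limit: ${limitMiB.toFixed(0)} MiB`);
// Raising it, e.g. NODE_OPTIONS="--max-old-space-size=4096" npm install,
// only postpones the crash rather than fixing whatever is allocating so much.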

Expected Behavior

The following succeeds:

npm install -g npm@10.3.0; rm -rf node_modules/ package-lock.json; npm install

as does:

npm install -g npm@10.2.5; rm -rf node_modules/ package-lock.json; npm install

Steps To Reproduce

npm install -g npm@10.4.0; rm -rf node_modules/ package-lock.json; npm install --force

or

npm install -g npm@10.5.0; rm -rf node_modules/ package-lock.json; npm install --force

with

{
  "name": "npm-oom",
  "version": "0.0.1",
  "dependencies": {
    "@angular/animations": "^16.2.12",
    "@angular/cdk": "^16.2.14",
    "@angular/cli": "^16.2.12",
    "@angular/common": "^16.2.12",
    "@angular/compiler": "^16.2.12",
    "@angular/compiler-cli": "^16.2.12",
    "@angular/core": "^16.2.12",
    "@angular/flex-layout": "^15.0.0-beta.42",
    "@angular/forms": "^16.2.12",
    "@angular/language-service": "^16.2.12",
    "@angular/material": "^16.2.14",
    "@angular/platform-browser": "^16.2.12",
    "@angular/platform-browser-dynamic": "^16.2.12",
    "@angular/router": "^16.2.12",
    "@sendgrid/mail": "^8.1.1",
    "@stomp/stompjs": "^7.0.0",
    "@types/file-saver-es": "^2.0.3",
    "cross-var": "^1.1.0",
    "enhanced-resolve": "^5.15.1",
    "file-saver-es": "^2.0.5",
    "jwt-decode": "^4.0.0",
    "ng-storages": "^1.1.5",
    "npm": "^10.5.0",
    "reflect-metadata": "^0.2.1",
    "reinstall": "^2.0.0",
    "rxjs": "^7.4.0",
    "tslib": "^2.0.0",
    "typescript": "^5.1.6",
    "zone.js": "^0.14.4"
  },
  "devDependencies": {
    "@angular-devkit/build-angular": "^16.2.12",
    "@angular-eslint/builder": "^16.3.1",
    "@angular-eslint/eslint-plugin": "^16.3.1",
    "@angular-eslint/eslint-plugin-template": "^16.3.1",
    "@angular-eslint/schematics": "^16.3.1",
    "@angular-eslint/template-parser": "^16.3.1",
    "@types/jasmine": "^5.1.4",
    "@types/node": "^20",
    "@typescript-eslint/eslint-plugin": "^7.1.1",
    "@typescript-eslint/parser": "^7.1.1",
    "jasmine-core": "^5.1.2",
    "jasmine-spec-reporter": "^7.0.0",
    "karma": "^6.3.14",
    "karma-chrome-launcher": "^3.1.0",
    "karma-coverage-istanbul-reporter": "^3.0.2",
    "karma-jasmine": "^5.1.0",
    "karma-jasmine-html-reporter": "^2.1.0",
    "protractor": "^7.0.0",
    "ts-node": "^10.0.0",
    "webdriver-manager": "^12.1.9"
  }
}

Environment

force = true

; node bin location = /home/jan/.nvm/versions/node/v18.19.1/bin/node
; node version = v18.19.1
; npm local prefix = /home/jan/src/my-ui
; npm version = 10.4.0
; cwd = /home/jan/src/my-ui
; HOME = /home/jan
; Run `npm config ls -l` to show all defaults.

ljharb commented 8 months ago

Pretty sure your dependency on "18" is a typo (not sure if it's relevant). Also, your engines.npm isn't using ^ and doesn't match the npm version you're reporting.

JanStureNielsen commented 8 months ago

Thanks, @ljharb -- report fixed; same result.

XhmikosR commented 7 months ago

I can reproduce this myself under WSL with half the host's RAM available (8 GB), and it's quite a serious regression, especially since these npm versions ship with Node.js LTS.

npm v10.2.4, which I could try with the Node.js 20.11.1 Docker image, works fine. Node.js v20.12.0, which ships npm 10.5.0, has the memory issue and brings down my machine since it starts swapping.

Could someone have a look at it please?

melroy89 commented 7 months ago

I believe I hit the same problem; see https://github.com/npm/cli/issues/7351

wraithgar commented 6 months ago

We have identified the primary culprit here: an unbounded cache in Arborist that holds every packument it fetches. Because we now fetch full packuments in all situations, this causes failures on what would otherwise have been a working system configuration.

This was always technically a problem. We are gathering metrics so we know exactly the right way to fix this, and can fix it in a way that hopefully even benefits those running in more memory-limited environments.

You can see some initial proof-of-concept testing at https://github.com/npm/cli/pull/7463.

Ultimately we plan to put in an lru-cache instead of an unbounded Map (see the sketch below), and we are doing analytics so that npm can decide how large to let that cache grow.

There is also a pre-existing bug that until now has not meaningfully impacted npm: when the cache has stale data, the packument._contentLength attribute is 0. That bug will also have to be fixed so that npm still knows the size of the packument in those situations.
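
A minimal sketch of that bounded-cache idea, using the lru-cache package; the variable names and the byte budget are illustrative assumptions, not the actual Arborist change:

// Before: an unbounded Map keeps every packument alive for the whole install.
// const packuments = new Map();

// After: a size-bounded LRU cache evicts least-recently-used packuments.
const { LRUCache } = require('lru-cache');

const packuments = new LRUCache({
  maxSize: 64 * 1024 * 1024, // hypothetical 64 MiB budget
  // _contentLength is the registry-reported size; it can be 0 for stale
  // cache entries (the bug noted above), so fall back to a rough estimate.
  sizeCalculation: (packument) =>
    packument._contentLength || JSON.stringify(packument).length,
});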

JanStureNielsen commented 6 months ago

Thank you for the update, @wraithgar -- have you identified a root-cause commit for this regression?

wraithgar commented 6 months ago

Yes, as I said.

Because we now fetch full packuments for all situations

https://github.com/npm/cli/pull/7126

However, it must be reiterated that this was always a problem; it just didn't manifest as often on systems with plenty of memory. That change simply pushed the line much lower.
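
For reference on why fetching full packuments matters: the registry also serves an abbreviated "corgi" format (Accept: application/vnd.npm.install-v1+json) that omits large fields such as readme, so full documents can be dramatically bigger. A rough size comparison, assuming Node 18+ with global fetch:

// Compare full vs. abbreviated ("corgi") packument sizes for one package.
const FULL = 'application/json';
const CORGI = 'application/vnd.npm.install-v1+json';

async function packumentBytes(name, accept) {
  const res = await fetch(`https://registry.npmjs.org/${name}`, {
    headers: { accept },
  });
  return (await res.text()).length;
}

(async () => {
  for (const accept of [FULL, CORGI]) {
    console.log(accept, await packumentBytes('@angular/core', accept));
  }
})();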