pnpm / pnpm

Fast, disk space efficient package manager
https://pnpm.io
MIT License

pnpm fetch way in CI does not work #6615

Open mrspartak opened 1 year ago

mrspartak commented 1 year ago

I followed the guide at https://pnpm.io/cli/fetch, and it seems that pnpm install does not create the node_modules folder for apps/frontend. If I run pnpm install --frozen-lockfile --filter "chai-frontend-vue..." locally (on my laptop), everything works fine.

pnpm version: 8.5.1

Code to reproduce the issue:

My Dockerfile:

FROM node:18.13.0-buster as build-stage

RUN corepack enable
RUN corepack prepare pnpm@8.5.1 --activate

WORKDIR /usr/app

COPY --chown=node:node pnpm-lock.yaml ./

RUN pnpm fetch

COPY --chown=node:node . ./
RUN pnpm install --frozen-lockfile --offline --filter "chai-frontend-vue..."

WORKDIR /usr/app/apps/frontend

RUN pnpm run build

Log:

<skip dummy info>
#11 [build-stage 2/10] RUN corepack enable
#11 DONE 1.3s
#12 [build-stage 3/10] RUN corepack prepare pnpm@8.5.1 --activate
#12 0.304 Preparing pnpm@8.5.1 for immediate activation...
#12 DONE 0.9s
#13 [build-stage 4/10] WORKDIR /usr/app
#13 DONE 0.0s
#14 [build-stage 5/10] COPY --chown=node:node pnpm-lock.yaml ./
#14 DONE 0.0s
#15 [build-stage 6/10] RUN pnpm fetch
#15 1.021 Importing packages to virtual store
#15 1.048 Already up to date
#15 1.089 Progress: resolved 1, reused 0, downloaded 0, added 0
#15 1.914 Packages are hard linked from the content-addressable store to the virtual store.
#15 1.914   Content-addressable store is at: /root/.local/share/pnpm/store/v3
#15 1.914   Virtual store is at:             node_modules/.pnpm
#15 2.096 Progress: resolved 730, reused 0, downloaded 24, added 10
#15 3.107 Progress: resolved 730, reused 0, downloaded 53, added 45
#15 4.117 Progress: resolved 730, reused 0, downloaded 83, added 81
#15 5.134 Progress: resolved 730, reused 0, downloaded 174, added 167
#15 6.137 Progress: resolved 730, reused 0, downloaded 216, added 215
#15 7.141 Progress: resolved 730, reused 0, downloaded 242, added 241
#15 8.141 Progress: resolved 730, reused 0, downloaded 271, added 271
#15 9.141 Progress: resolved 730, reused 0, downloaded 302, added 303
#15 10.15 Progress: resolved 730, reused 0, downloaded 350, added 349
#15 11.16 Progress: resolved 730, reused 0, downloaded 428, added 425
#15 12.16 Progress: resolved 730, reused 0, downloaded 465, added 465
#15 13.16 Progress: resolved 730, reused 0, downloaded 541, added 541
#15 14.16 Progress: resolved 730, reused 0, downloaded 644, added 644
#15 15.18 Progress: resolved 730, reused 0, downloaded 694, added 696
#15 16.19 Progress: resolved 730, reused 0, downloaded 724, added 727
#15 17.20 Progress: resolved 730, reused 0, downloaded 727, added 730, done
<skip postinstall info>
#15 DONE 24.2s
#16 [build-stage 7/10] COPY --chown=node:node . ./
#16 DONE 3.4s
#17 [build-stage 8/10] RUN pnpm install --frozen-lockfile --offline --filter "chai-frontend-vue..."
#17 0.701 Scope: 4 of 10 workspace projects
#17 0.735 ? The modules directory at "/usr/app/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true
#17 DONE 0.8s
#18 [build-stage 9/10] WORKDIR /usr/app/apps/frontend
#18 DONE 0.0s
#19 [build-stage 10/10] RUN pnpm run build
#19 0.724 
#19 0.724 > chai-frontend-vue@0.0.0 build /usr/app/apps/frontend
#19 0.724 > run-p type-check build-only
#19 0.724 
#19 0.731 sh: 1: run-p: not found
#19 0.734  ELIFECYCLE  Command failed.
#19 0.735  WARN   Local package.json exists, but node_modules missing, did you mean to install?
#19 ERROR: executor failed running [/bin/sh -c pnpm run build]: runc did not terminate sucessfully

Expected behavior:

pnpm install installs all the needed packages from the virtual store.

Actual behavior:

It says that it will install them (Scope: 4 of 10 workspace projects), but node_modules is missing.

Additional information:

patroza commented 1 year ago

this is a pain

thynson commented 1 year ago

I noticed this issue in a Docker build with recent pnpm too. It failed when pnpm installed packages with --prod, and after I removed --prod it worked again. Have you tried downgrading pnpm to an earlier version, say 8.1.0, to test whether it works?

thynson commented 1 year ago

After checking the code, this appears to happen when the registry setting mismatches the lockfile, or when the included dependencies (--prod / --dev) mismatch. @mrspartak, could you please check whether you're using a custom registry? If so, ensure it's in your project's .npmrc file, and in the Dockerfile copy that file into the Docker build context together with pnpm-lock.yaml before the pnpm fetch instruction.
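For illustration, a minimal sketch of that suggestion, assuming a custom registry is configured in a project-level .npmrc (the base image and pnpm version follow the reproduction above):

```dockerfile
FROM node:18.13.0-buster as build-stage

RUN corepack enable
RUN corepack prepare pnpm@8.5.1 --activate

WORKDIR /usr/app

# Copy the registry configuration alongside the lockfile, so that
# pnpm fetch resolves packages against the same registry the
# lockfile was generated with.
COPY --chown=node:node .npmrc pnpm-lock.yaml ./

RUN pnpm fetch
```

If the .npmrc is missing at fetch time, pnpm fetch may populate the virtual store against the default registry while the lockfile references the custom one, which would produce exactly this kind of mismatch.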

au-z commented 1 year ago

I have the same issue. In my case, pnpm i seems to skip the actual install after printing the "Proceed? (Y/n)" message, so no packages are installed at all.

The relevant snippet of my logs:

#15 0.811 ? The modules directory at "/api/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ #15 DONE 1.2s

zkochan commented 1 year ago

This prompt is skipped if you use the --force CLI option.

cfeenstra67 commented 1 year ago

Tip if anyone else comes across this: I took a look at the code, and it appears this only happens when confirmModulesPurge is true or unset. I'm not sure whether it's possible to set this directly on the command line (I didn't see a documented option, though I might have missed it), but pnpm also checks whether the command is running as part of a CI process, and if so it does not print this prompt: https://github.com/pnpm/pnpm/blob/347ca9f30385042584ee220bfe422911d07c304f/pkg-manager/get-context/src/index.ts#L123

So I was able to get around this by simply setting CI=1 in the environment when running the command:

CI=1 pnpm install <args>

The solution above using --force should also work, but according to the description of that flag it will also refetch dependencies. So if you're using an approach where you install production dependencies in one step and later install build dependencies on top of them as part of a multi-stage build, it's not ideal.
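In a Dockerfile, the CI=1 workaround can be sketched like this (a hypothetical adaptation of the reproduction at the top of the thread; the filter name is the reporter's):

```dockerfile
# Setting CI=1 makes pnpm treat the build as a CI run, so the
# "Proceed? (Y/n)" modules-purge prompt is skipped and the
# install proceeds non-interactively.
ENV CI=1
RUN pnpm install --frozen-lockfile --offline --filter "chai-frontend-vue..."
```

Note that ENV persists into the final image; if that matters, prefer the inline form CI=1 pnpm install ... in a single RUN instruction instead.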

iiian commented 11 months ago

FWIW, I hit this issue with pnpm@8.11.0 but not with pnpm@7.33.0. It's possible to work around it by going back a few versions.

Technoash commented 11 months ago

This just happened to me. It stalls the Vercel build, and setting CI=1 in Vercel's environment variables does not prevent it. I also tried updating the install command from

pnpm i -w --filter admin... --filter ...^admin

to

CI=1 pnpm i -w --filter admin... --filter ...^admin

That does not help.

I also tried setting "packageManager": "pnpm@7.33.0" in the root package.json.

That didn't help either.


../node_modules/@prisma/client postinstall: If you do not have a Prisma schema file yet, you can ignore this message.
--
18:42:27.738 | .../node_modules/@prisma/client postinstall: Done
18:42:28.664 | . prepare$ npx husky install
18:42:33.232 | . prepare: npm WARN exec The following package was not found and will be installed: husky@8.0.3
18:42:33.416 | . prepare: husky - Git hooks installed
18:42:33.431 | . prepare: Done
18:42:33.490 | Done in 18.8s
18:42:33.687 | Installing dependencies...
18:42:34.218 | Scope: all 11 workspace projects
18:42:34.317 | ? The modules directory at "/vercel/path0/packages/eslint-config/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/ts-config/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/assets/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/db/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/env/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/constants/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/utils/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/server/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/apps/admin/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/apps/site/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true
18:42:34.317 | (node:330) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 keypress listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit
18:42:34.317 | (Use `node --trace-warnings ...` to show where the warning was created)
18:42:34.317 | (node:330) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 end listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit

Technoash commented 11 months ago

Update: it's even happening with --force.


Running build in Washington, D.C., USA (East) – iad1
--
18:46:46.944 | Cloning bitbucket.org/releaseit-pty-ltd/greenfriday-v2 (Branch: upgrade, Commit: b85cd30)
18:47:00.704 | Cloning completed: 13.761s
18:47:01.161 | Restored build cache
18:47:01.237 | Running "vercel build"
18:47:01.730 | Vercel CLI 32.6.0
18:47:01.869 | > Detected Turbo. Adjusting default settings...
18:47:02.472 | Warning: Detected "engines": { "node": ">=18.0.0" } in your `package.json` that will automatically upgrade when a new major Node.js Version is released. Learn More: http://vercel.link/node-version
18:47:02.509 | Detected `pnpm-lock.yaml` version 6 generated by pnpm 8
18:47:02.509 | Running "install" command: `pnpm i -w --filter admin... --filter ...^admin --force`...
18:47:04.607 | .                                        \|  WARN  using --force I sure hope you know what you are doing
18:47:04.608 | Scope: 10 of 11 workspace projects
18:47:04.728 | Lockfile is up to date, resolution step is skipped
18:47:04.812 | Progress: resolved 1, reused 0, downloaded 0, added 0
18:47:05.087 | .                                        \|    +1239 ++++++++++++++++++++++++++++
18:47:05.814 | Progress: resolved 1239, reused 0, downloaded 86, added 82
18:47:06.819 | Progress: resolved 1239, reused 0, downloaded 296, added 284
18:47:07.842 | Progress: resolved 1239, reused 0, downloaded 455, added 452
18:47:08.868 | Progress: resolved 1239, reused 0, downloaded 497, added 488
18:47:09.843 | Progress: resolved 1239, reused 0, downloaded 696, added 685
18:47:10.875 | Progress: resolved 1239, reused 0, downloaded 789, added 772
18:47:11.844 | Progress: resolved 1239, reused 0, downloaded 960, added 953
18:47:12.871 | Progress: resolved 1239, reused 0, downloaded 1095, added 1084
18:47:13.859 | Progress: resolved 1239, reused 0, downloaded 1142, added 1136
18:47:14.923 | Progress: resolved 1239, reused 0, downloaded 1151, added 1145
18:47:15.924 | Progress: resolved 1239, reused 0, downloaded 1160, added 1150
18:47:16.924 | Progress: resolved 1239, reused 0, downloaded 1163, added 1150
18:47:17.924 | Progress: resolved 1239, reused 0, downloaded 1172, added 1154
18:47:18.924 | Progress: resolved 1239, reused 0, downloaded 1177, added 1159
18:47:20.252 | Progress: resolved 1239, reused 0, downloaded 1177, added 1160
18:47:21.253 | Progress: resolved 1239, reused 0, downloaded 1181, added 1163
18:47:22.253 | Progress: resolved 1239, reused 0, downloaded 1182, added 1167
18:47:23.260 | Progress: resolved 1239, reused 0, downloaded 1194, added 1181
18:47:24.262 | Progress: resolved 1239, reused 0, downloaded 1238, added 1238
18:47:24.475 | Progress: resolved 1239, reused 0, downloaded 1239, added 1239, done
18:47:24.992 | .../es5-ext@0.10.62/node_modules/es5-ext postinstall$  node -e "try{require('./_postinstall')}catch(e){}" \|\| exit 0
18:47:25.037 | .../node_modules/@prisma/engines postinstall$ node scripts/postinstall.js
18:47:25.110 | .../es5-ext@0.10.62/node_modules/es5-ext postinstall: Done
18:47:25.833 | .../node_modules/@prisma/engines postinstall: Done
18:47:25.971 | .../esbuild@0.17.6/node_modules/esbuild postinstall$ node install.js
18:47:25.971 | .../esbuild@0.18.20/node_modules/esbuild postinstall$ node install.js
18:47:26.086 | .../prisma@5.6.0/node_modules/prisma preinstall$ node scripts/preinstall-entry.js
18:47:26.132 | .../esbuild@0.17.6/node_modules/esbuild postinstall: Done
18:47:26.156 | .../esbuild@0.18.20/node_modules/esbuild postinstall: Done
18:47:26.192 | .../prisma@5.6.0/node_modules/prisma preinstall: Done
18:47:26.320 | .../node_modules/@prisma/client postinstall$ node scripts/postinstall.js
18:47:27.166 | .../node_modules/@prisma/client postinstall: prisma:warn We could not find your Prisma schema at `prisma/schema.prisma`.
18:47:27.167 | .../node_modules/@prisma/client postinstall: If you have a Prisma schema file in a custom path, you will need to run
18:47:27.167 | .../node_modules/@prisma/client postinstall: `prisma generate --schema=./path/to/your/schema.prisma` to generate Prisma Client.
18:47:27.168 | .../node_modules/@prisma/client postinstall: If you do not have a Prisma schema file yet, you can ignore this message.
18:47:27.199 | .../node_modules/@prisma/client postinstall: Done
18:47:28.343 | . prepare$ npx husky install
18:47:28.717 | . prepare: npm WARN using --force Recommended protections disabled.
18:47:33.384 | . prepare: npm WARN exec The following package was not found and will be installed: husky@8.0.3
18:47:33.829 | . prepare: husky - Git hooks installed
18:47:33.846 | . prepare: Done
18:47:33.911 | Done in 31.2s
18:47:34.158 | Installing dependencies...
18:47:34.734 | Scope: all 11 workspace projects
18:47:34.835 | ? The modules directory at "/vercel/path0/packages/eslint-config/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/ts-config/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/assets/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/db/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/env/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/constants/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/utils/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/packages/server/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/apps/admin/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true? The modules directory at "/vercel/path0/apps/site/node_modules" will be removed and reinstalled from scratch. Proceed? (Y/n) ‣ true
18:47:34.836 | (node:375) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 keypress listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit
18:47:34.836 | (Use `node --trace-warnings ...` to show where the warning was created)
18:47:34.836 | (node:375) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 end listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit

deadcoder0904 commented 8 months ago

--force worked for me; otherwise it kept breaking as described above.

BjoernRave commented 7 months ago

I have the same issue, and using --force will always reinstall all packages...