aws-samples / aws-codebuild-samples

Utilities and samples for building on CodeBuild
Apache License 2.0

Local cache of node_modules doesn't work with yarn workspaces #8

Open · jogold opened this issue 5 years ago

jogold commented 5 years ago

See https://stackoverflow.com/questions/55890275/aws-codebuild-does-not-work-with-yarn-workspaces for a description of the problem.

daniel-cottone commented 5 years ago

Also getting this error.

error An unexpected error occurred: "EEXIST: file already exists, mkdir ..."
jogold commented 5 years ago

Any update on this?

simon-lanf commented 4 years ago

It does not work for npm either.

https://github.com/aws-samples/aws-codebuild-samples/issues/6 https://stackoverflow.com/questions/55890275/aws-codebuild-does-not-work-with-yarn-workspaces https://stackoverflow.com/questions/54621412/passing-all-node-modules-path-to-aws-codebuild-cache

YoshiWalsh commented 4 years ago

I'm also encountering issues when node_modules is retrieved from cache. For me it's happening with S3 cache.

[Container] 2019/12/27 02:19:01 Entering phase BUILD 
[Container] 2019/12/27 02:19:01 Running command yarn run build 
yarn run v1.19.1 
$ gatsby build 
error There was a problem loading the local build command. Gatsby may not be installed in your site's "node_modules" directory. Perhaps you need to run "npm install"? You might need to delete your "package-lock.json" as well. 
error Command failed with exit code 1. 
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command. 

[Container] 2019/12/27 02:19:02 Command did not exit successfully yarn run build exit status 1 
[Container] 2019/12/27 02:19:02 Phase complete: BUILD State: FAILED 
[Container] 2019/12/27 02:19:02 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: yarn run build. Reason: exit status 1 
[Container] 2019/12/27 02:19:02 Entering phase POST_BUILD 
[Container] 2019/12/27 02:19:02 Running command yarn run deploy 
yarn run v1.19.1 
$ gatsby-plugin-s3 deploy --yes && aws cloudfront create-invalidation --distribution-id $CLOUDFRONT_DISTRIBUTION --paths "/*" 
/bin/sh: gatsby-plugin-s3: command not found 
error Command failed with exit code 127. 
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command. 

[Container] 2019/12/27 02:19:02 Command did not exit successfully yarn run deploy exit status 127 
[Container] 2019/12/27 02:19:02 Phase complete: POST_BUILD State: FAILED 
[Container] 2019/12/27 02:19:02 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: yarn run deploy. Reason: exit status 127 
thedmeyer commented 4 years ago

Hi, any update on this?

jpswade commented 4 years ago

@josephvusich this issue does not appear to have been resolved; would you be able to look into it?

jpswade commented 4 years ago

I've seen an approach that involves setting a cache directory, described here:

https://mechanicalrock.github.io/2019/02/03/monorepos-aws-codebuild.html

jpswade commented 4 years ago

This seems like another approach; specifying the entire directory should work.

https://stackoverflow.com/questions/58793704/aws-codebuild-local-cache-failing-to-actually-cache

I've not tried it yet.

jogold commented 4 years ago

> This seems like another approach; specifying the entire directory should work.
>
> https://stackoverflow.com/questions/58793704/aws-codebuild-local-cache-failing-to-actually-cache
>
> I've not tried it yet.

Unfortunately this doesn't work for me.

solarmosaic-kflorence commented 3 years ago

I have this same issue. I am using LOCAL cache type and LOCAL_CUSTOM_CACHE cache mode with cache paths ["/root/.sbt/", "/root/.m2/", "/root/.ivy2/", "/root/.cache/"] (this is for an SBT project). I can see in my build that it is correctly caching and restoring those folders when the builds run, but the folders are always empty, even when the builds run right after one another.
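
Expressed in buildspec terms, that configuration looks roughly like the sketch below (the project cache itself is set to type LOCAL with mode LOCAL_CUSTOM_CACHE; the glob form follows the other buildspecs in this thread):

cache:
  paths:
    - '/root/.sbt/**/*'
    - '/root/.m2/**/*'
    - '/root/.ivy2/**/*'
    - '/root/.cache/**/*'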

pbn4 commented 3 years ago

So is there a solution to this?

solarmosaic-kflorence commented 3 years ago

The best solution I can think of for now is to copy each of the cache paths (e.g. /root/.cache) into the build cache, which is stored in S3 (for example, /root/.cache -> $CODEBUILD_SRC_DIR/.cache), and then, when the build starts up, copy them back to the expected locations (e.g. /root/.cache). It works, but it adds a bit of complexity to each buildspec.
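
A minimal buildspec sketch of that workaround, assuming an SBT project with S3 caching (the sbt command and the .cache directory name under the source tree are placeholders), could look like this:

version: 0.2

phases:
  install:
    commands:
      # Restore the cached contents (if any) back to the location the build tools expect.
      - mkdir -p /root/.cache
      - if [ -d "$CODEBUILD_SRC_DIR/.cache" ]; then cp -R "$CODEBUILD_SRC_DIR/.cache/." /root/.cache/; fi
  build:
    commands:
      - sbt test
  post_build:
    commands:
      # Copy the populated cache into the source tree so it gets uploaded with the S3 cache.
      - mkdir -p "$CODEBUILD_SRC_DIR/.cache"
      - cp -R /root/.cache/. "$CODEBUILD_SRC_DIR/.cache/"

cache:
  paths:
    - '.cache/**/*'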

rdsedmundo commented 3 years ago

I was having the exact same issue ("EEXIST: file already exists, mkdir"), so I ended up using the S3 cache and it worked pretty well. Note: for some reason the first upload to S3 took far too long (about 10 minutes); the others went fine.

Before:

[5/5] Building fresh packages...
Done in 60.28s.

After:

[5/5] Building fresh packages...
Done in 6.64s.

If you already have your project configured, you can edit the cache settings under Project -> Edit -> Artifacts -> Additional configuration.
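
The same cache setting can also be applied from the AWS CLI; a sketch, where the project name and bucket/prefix are placeholders:

aws codebuild update-project --name my-project --cache type=S3,location=my-bucket/codebuild-cache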

My buildspec.yml is as follows:

version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 14
  build:
    commands:
      - yarn config set cache-folder /root/.yarn-cache
      - yarn install --frozen-lockfile
      - ...other build commands go here

cache:
  paths:
    - '/root/.yarn-cache/**/*'
    - 'node_modules/**/*'
    # This third entry is only if you're using monorepos (under the packages folder)
    # - 'packages/**/node_modules/**/*'

If you use NPM, you'd do something similar with slightly different commands:

version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 14
  build:
    commands:
      - npm config -g set prefer-offline true
      - npm config -g set cache /root/.npm
      - npm ci
      - ...other build commands go here

cache:
  paths:
    - '/root/.npm/**/*'
    - 'node_modules/**/*'
    # This third entry is only if you're using monorepos (under the packages folder)
    # - 'packages/**/node_modules/**/*'

Kudos to: https://mechanicalrock.github.io/2019/02/03/monorepos-aws-codebuild.html

acatusse commented 3 years ago

GIVEN:

cache:
  paths:
     - /tmp/cached-dir/
     - /tmp/cached-dir/**/*

The buildspec executes this script on CI:

[[ -f /tmp/cached-dir/cached-file ]] || echo 'FILE NOT CACHED YET'
touch /tmp/cached-dir/cached-file # create the file in the cache (the line above should not print on the next build)

THEN on AWS CI:

Running two builds in a row gives:

2021/09/12 08:34:48 Moving to directory /codebuild/output/src530968031/src
2021/09/12 08:34:48 Symlinking: /tmp/cached-dir /codebuild/local-cache/custom/5ed44f40b68ea51c687c07d4dfdc9f4c800e493dbad13bc37339f9d742085300/tmp/cached-dir

FILE NOT CACHED YET

Conclusion: the local cache does not work (or the correct setup is kept secret), so we use an S3 bucket and pay. :)