jenkinsci / jobcacher-plugin

Jenkins plugin that improves build performance for transient agents by caching files
https://plugins.jenkins.io/jobcacher/
MIT License

It skips cache creation when changing `compressionMethod` #106

Closed by viceice 1 year ago

viceice commented 1 year ago

Jenkins and plugins versions report

Environment ```text Jenkins: 2.361.4 OS: Linux - 5.15.0-1021-azure --- Office-365-Connector:4.17.0 ace-editor:1.1 additional-identities-plugin:1.1 analysis-model-api:10.20.0 ansicolor:1.0.2 antisamy-markup-formatter:155.v795fb_8702324 apache-httpcomponents-client-4-api:4.5.13-138.v4e7d9a_7b_a_e61 authentication-tokens:1.4 authorize-project:1.4.0 aws-credentials:191.vcb_f183ce58b_9 aws-java-sdk:1.12.287-357.vf82d85a_6eefd aws-java-sdk-cloudformation:1.12.287-357.vf82d85a_6eefd aws-java-sdk-codebuild:1.12.287-357.vf82d85a_6eefd aws-java-sdk-ec2:1.12.287-357.vf82d85a_6eefd aws-java-sdk-ecr:1.12.287-357.vf82d85a_6eefd aws-java-sdk-ecs:1.12.287-357.vf82d85a_6eefd aws-java-sdk-efs:1.12.287-357.vf82d85a_6eefd aws-java-sdk-elasticbeanstalk:1.12.287-357.vf82d85a_6eefd aws-java-sdk-iam:1.12.287-357.vf82d85a_6eefd aws-java-sdk-logs:1.12.287-357.vf82d85a_6eefd aws-java-sdk-minimal:1.12.287-357.vf82d85a_6eefd aws-java-sdk-sns:1.12.287-357.vf82d85a_6eefd aws-java-sdk-sqs:1.12.287-357.vf82d85a_6eefd aws-java-sdk-ssm:1.12.287-357.vf82d85a_6eefd basic-branch-build-strategies:71.vc1421f89888e blueocean-autofavorite:1.2.5 blueocean-commons:1.25.8 blueocean-config:1.25.8 blueocean-core-js:1.25.8 blueocean-dashboard:1.25.8 blueocean-display-url:2.4.1 blueocean-events:1.25.8 blueocean-git-pipeline:1.25.8 blueocean-github-pipeline:1.25.8 blueocean-i18n:1.25.8 blueocean-jwt:1.25.8 blueocean-personalization:1.25.8 blueocean-pipeline-api-impl:1.25.8 blueocean-pipeline-editor:1.25.8 blueocean-pipeline-scm-api:1.25.8 blueocean-rest:1.25.8 blueocean-rest-impl:1.25.8 blueocean-web:1.25.8 bootstrap4-api:4.6.0-5 bootstrap5-api:5.2.1-3 bouncycastle-api:2.26 branch-api:2.1051.v9985666b_f6cc build-token-root:151.va_e52fe3215fc caffeine-api:2.9.3-65.v6a_47d0f4d1fe checks-api:1.8.0 cloudbees-folder:6.758.vfd75d09eea_a_1 code-coverage-api:3.2.0 command-launcher:90.v669d7ccb_7c31 commons-lang3-api:3.12.0-36.vd97de6465d5b_ commons-text-api:1.10.0-27.vb_fa_3896786a_7 config-file-provider:3.11.1 credentials:1189.vf61b_a_5e2f62e credentials-binding:523.vd859a_4b_122e6 data-tables-api:1.12.1-4 display-url-api:2.3.6 docker-commons:1.21 docker-java-api:3.2.13-37.vf3411c9828b9 docker-plugin:1.2.10 docker-workflow:528.v7c193a_0b_e67c durable-task:501.ve5d4fc08b0be echarts-api:5.4.0-1 email-ext:2.92 embeddable-build-status:304.vdcf48d6b_d2eb external-monitor-job:203.v683c09d993b_9 favorite:2.4.1 font-awesome-api:6.2.0-3 forensics-api:1.16.0 git:4.13.0 git-client:3.13.0 git-forensics:1.11.0 git-parameter:0.9.18 gitea:1.4.4 gitea-checks:449.v0672c320ee47 github:1.36.0 github-api:1.303-400.v35c2d8258028 github-branch-source:1696.v3a_7603564d04 google-oauth-plugin:1.0.7 gravatar:2.2 handlebars:3.0.8 handy-uri-templates-2-api:2.1.8-22.v77d5b_75e6953 htmlpublisher:1.31 instance-identity:116.vf8f487400980 ionicons-api:31.v4757b_6987003 jackson2-api:2.13.4.20221013-295.v8e29ea_354141 jakarta-activation-api:2.0.1-2 jakarta-mail-api:2.0.1-2 javax-activation-api:1.2.0-5 javax-mail-api:1.6.2-8 jaxb:2.3.7-1 jdk-tool:63.v62d2fd4b_4793 jenkins-design-language:1.25.8 jjwt-api:0.11.5-77.v646c772fddb_0 jobcacher:304.v634d2e1b_2489 jquery:1.12.4-1 jquery3-api:3.6.1-2 jsch:0.1.55.61.va_e9ee26616e7 junit:1160.vf1f01a_a_ea_b_7f kubernetes:3734.v562b_b_a_627ea_c kubernetes-client-api:5.12.2-193.v26a_6078f65a_9 kubernetes-credentials:0.9.0 ldap:2.12 lockable-resources:2.18 mailer:438.v02c7f0a_12fa_4 matrix-auth:3.1.5 matrix-project:785.v06b_7f47b_c631 metrics:4.2.10-405.v60a_9cc74e923 mina-sshd-api-common:2.9.2-50.va_0e1f42659a_a 
mina-sshd-api-core:2.9.2-50.va_0e1f42659a_a momentjs:1.1.1 mstest:1.0.0 nunit:0.28 oauth-credentials:0.5 okhttp-api:4.9.3-108.v0feda04578cf pam-auth:1.10 pipeline-build-step:2.18 pipeline-graph-analysis:195.v5812d95a_a_2f9 pipeline-groovy-lib:621.vb_44ce045b_582 pipeline-input-step:456.vd8a_957db_5b_e9 pipeline-milestone-step:101.vd572fef9d926 pipeline-model-api:2.2118.v31fd5b_9944b_5 pipeline-model-definition:2.2118.v31fd5b_9944b_5 pipeline-model-extensions:2.2118.v31fd5b_9944b_5 pipeline-multibranch-defaults:2.1 pipeline-rest-api:2.27 pipeline-stage-step:296.v5f6908f017a_5 pipeline-stage-tags-metadata:2.2118.v31fd5b_9944b_5 pipeline-stage-view:2.27 pipeline-utility-steps:2.13.2 plain-credentials:139.ved2b_9cf7587b plugin-util-api:2.18.0 popper-api:1.16.1-3 popper2-api:2.11.6-2 prism-api:1.29.0-1 pubsub-light:1.17 saml:4.372.v89f13e4c9e97 scm-api:621.vda_a_b_055e58f7 scoring-load-balancer:1.0.1 script-security:1218.v39ca_7f7ed0a_c sidebar-link:2.2.0 simple-theme-plugin:136.v23a_15f86c53d snakeyaml-api:1.33-90.v80dcb_3814d35 sse-gateway:1.26 ssh-agent:295.v9ca_a_1c7cc3a_a_ ssh-credentials:305.v8f4381501156 ssh-slaves:2.854.v7fd446b_337c9 sshd:3.249.v2dc2ea_416e33 structs:324.va_f5d6774f3a_d timestamper:1.21 token-macro:308.v4f2b_ed62b_b_16 translation:1.16 trilead-api:2.72.v2a_3236754f73 variant:59.vf075fe829ccb versioncolumn:87.v8fe7c090a_d3b warnings-ng:9.20.1 windows-slaves:1.8.1 workflow-aggregator:590.v6a_d052e5a_a_b_5 workflow-api:1200.v8005c684b_a_c6 workflow-basic-steps:994.vd57e3ca_46d24 workflow-cps:3536.vb_8a_6628079d5 workflow-durable-task-step:1217.v38306d8fa_b_5c workflow-job:1254.v3f64639b_11dd workflow-multibranch:716.vc692a_e52371b_ workflow-scm-step:400.v6b_89a_1317c9a_ workflow-step-api:639.v6eca_cd8c04a_a_ workflow-support:839.v35e2736cfd5c ```

What Operating System are you using (both controller, and any agents involved in the problem)?

Controller: official Docker LTS image
Agent: Ubuntu 20.04 amd64 VM

Reproduction steps

  1. Run a job with tgz compression.
  2. Run the same job again with zstd compression (a reproduction sketch follows below).
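
A minimal declarative pipeline sketch of these steps, assuming the plugin's `cache` step and `arbitraryFileCache` symbol; the `compressionMethod` values shown (`TARGZ`, `TAR_ZSTD`) are assumed names and may differ depending on the plugin version.

```groovy
// Reproduction sketch; compressionMethod values are assumed names.
pipeline {
    agent { label 'linux' }
    stages {
        stage('Build') {
            steps {
                // Run 1: keep 'TARGZ'. Run 2: change it to 'TAR_ZSTD' and build again.
                cache(maxCacheSize: 1024, caches: [
                    arbitraryFileCache(path: '.yarn/cache', compressionMethod: 'TARGZ')
                ]) {
                    sh 'yarn install'
                }
            }
        }
    }
}
```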

Expected Results

It should either use the existing tgz file for extraction, or it should not skip cache creation.

Actual Results

It neither uses the existing file nor creates a new cache file:

[Cache for .yarn/cache] Skip cache creation as the cache is up-to-date

Anything else?

Workaround: delete the hash file.
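
For reference, a Script Console sketch of this workaround, assuming the default on-controller storage keeps the cache archive and its hash file under the job's `cache/` directory; the directory name, the `*.hash` suffix, and the job name are all assumptions, so check your storage configuration first.

```groovy
// Script Console sketch (runs on the controller). Cache location and the
// *.hash naming are assumptions for the default on-controller storage.
import jenkins.model.Jenkins

// Hypothetical job name; adjust to the affected job.
def job = Jenkins.get().getItemByFullName('my-folder/my-job')
// Assumed location of the cache archive and its hash file.
def cacheDir = new File(job.rootDir, 'cache')
if (cacheDir.exists()) {
    cacheDir.eachFileRecurse { f ->
        if (f.name.endsWith('.hash')) {   // assumed hash file naming
            println "deleting ${f}"
            f.delete()
        }
    }
}
```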

repolevedavaj commented 1 year ago

@viceice what do you think about restoring an existing cache but recreating it at the end of the pipeline if the compression method has changed? Alternatively, we could just keep the cache until it is really outdated and needs to be recreated.

viceice commented 1 year ago

I think we should keep it until it's outdated, so restore whatever compression type is found.
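
A rough Groovy sketch of the behaviour converging here (illustrative stubs only, not the plugin's actual code): restore whatever archive is found regardless of its compression type, and only recreate the archive, using the currently configured compression method, once the content hash shows the cache is outdated.

```groovy
// Stub actions standing in for the real restore/save logic.
def extract       = { archive -> println "restoring cache from ${archive}" }
def createArchive = { method  -> println "creating cache archive with ${method}" }

def syncCache = { File existingArchive, String configuredCompression,
                  String currentHash, String storedHash ->
    if (existingArchive?.exists()) {
        // Restore whatever compression type is found on storage, even if it
        // no longer matches the configured compressionMethod.
        extract(existingArchive)
    }
    if (storedHash != currentHash) {
        // Recreate the archive only when the cache is really outdated; the
        // new archive then uses the currently configured compression method.
        createArchive(configuredCompression)
    }
}

// Example: the job now configures zstd, but the cached content is unchanged,
// so an existing cache.tgz (if present) is restored as-is and nothing is recreated.
syncCache(new File('cache.tgz'), 'TAR_ZSTD', 'abc123', 'abc123')
```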