meltwater / drone-cache

A Drone plugin for caching current workspace files between builds to reduce your build times
https://underthehood.meltwater.com/blog/2019/04/10/making-drone-builds-10-times-faster/
Apache License 2.0

[Bug] Unable to create really big caches #253

Open jimsynz opened 11 months ago

jimsynz commented 11 months ago

Observed Behaviour

I have a job that tries to cache about 8 GB of data rather than download it again from the internet. Sadly, it fails with the following error:

level=error name=****** ts=2023-10-12T08:04:27.021698118Z caller=main.go:628 err="[IMPORTANT] build cache, rebuild failed, upload from <.nerves> to <bivouac_system_x86_64/eb0a191797624dd3a48fa681d3061212/.nerves>, rebuilder rebuild put file, upload file, pipe reader failed, storage backend put failure, put the object, MultipartUpload: upload multipart failed\n\tupload id: MWVmYTU1MzQtODY1YS00MjQ4LWEwMDItMWE5ODNkNTE5Yzg5LjQ1ODU0OTc2LWZhNzAtNDE5Mi05OTk0LThiNGZkY2EwMTViNw\ncaused by: RequestCanceled: request context canceled\ncaused by: context deadline exceeded\n"

See the entire job here: https://drone.harton.nz/bivouac/bivouac_system_x86_64/35/1/11

Steps to Reproduce

  1. Generate a lot of data.
  2. Try to cache it.

Expected Behaviour

It should push the cache to the remote storage (even if it takes a long time).
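For context, the failing step is a fairly ordinary rebuild step pointed at a large directory. A minimal sketch of that kind of step is below; the bucket, region, and secret names are hypothetical, and the settings follow the S3 example from the drone-cache README (the MultipartUpload error suggests an S3-compatible backend):

```yaml
kind: pipeline
type: docker
name: default

steps:
  - name: rebuild-cache
    image: meltwater/drone-cache
    settings:
      backend: s3              # S3-compatible storage backend (assumed from the error)
      rebuild: true            # push the cache to remote storage after the build
      bucket: my-cache-bucket  # hypothetical bucket name
      region: us-east-1        # hypothetical region
      mount:
        - .nerves              # the ~8 GB directory from the error above
    environment:
      AWS_ACCESS_KEY_ID:
        from_secret: aws_access_key_id
      AWS_SECRET_ACCESS_KEY:
        from_secret: aws_secret_access_key
```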

Nutomic commented 8 months ago

I had the same problem; you need to set --backend.operation-timeout to fix it.
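For anyone else hitting this: --backend.operation-timeout is a drone-cache CLI flag, and assuming the usual Drone convention of mapping plugin settings to PLUGIN_* environment variables, it should be settable from .drone.yml as backend_operation_timeout. A sketch (bucket name hypothetical, timeout value just an example generous enough for an 8 GB upload):

```yaml
  - name: rebuild-cache
    image: meltwater/drone-cache
    settings:
      backend: s3
      rebuild: true
      bucket: my-cache-bucket          # hypothetical bucket name
      backend_operation_timeout: 30m   # default is only a few minutes, which a multi-GB upload can exceed
      mount:
        - .nerves
```

When running the binary directly, the same value can be passed on the command line as --backend.operation-timeout.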

Edit: One remaining problem is that the cache is split into 5 MB chunks, which probably causes a lot of overhead when dealing with a few GB of data. It would be nice if that chunk size could be configured to be larger.