Open JoeHCQ1 opened 4 months ago
Good mention/link on the HEAD (un)limit in the DockerHub docs... is it possible that zarf's "use cache" code is making other GET requests here that are triggering the limit? I've definitely encountered the same behavior, hitting DockerHub rate limits, even when rebuilding the same package without clearing the cache.
Is your feature request related to a problem? Please describe.

When I am developing a UDS bundle I run multiple `uds run` or `uds run dev` iterations. Each of those results in a HEAD request to DockerHub (most often) to confirm there is no new sha for my selected tag. DockerHub eventually blocks those requests (it took about 4 hours as an anonymous user, when frankly I wasn't even working that fast). Logging into DockerHub helps, but may not solve the problem.
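For reference, the check in question is an OCI manifest HEAD request against the registry. A minimal sketch of what that looks like in Go, assuming an anonymous token and a hypothetical `library/nginx:latest` image (this is an illustration of the request, not zarf's actual code):

```go
// Illustration only: the kind of per-tag digest check each rebuild performs.
// The repo/tag and the anonymous-token flow are assumptions, not zarf code.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	repo, tag := "library/nginx", "latest" // hypothetical image

	// Anonymous pull token scoped to the repository.
	resp, err := http.Get("https://auth.docker.io/token?service=registry.docker.io&scope=repository:" + repo + ":pull")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
		panic(err)
	}

	// HEAD the manifest: returns the tag's current digest without pulling layers.
	req, _ := http.NewRequest(http.MethodHead,
		"https://registry-1.docker.io/v2/"+repo+"/manifests/"+tag, nil)
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	req.Header.Set("Accept", "application/vnd.docker.distribution.manifest.v2+json")
	head, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer head.Body.Close()

	fmt.Println("status:", head.Status)
	fmt.Println("digest:", head.Header.Get("Docker-Content-Digest"))
}
```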
Describe the solution you'd like

`package create` is the command.
Alternative/additional solution

Provide a `--no-pull` or `--force-cache` flag which skips the HEAD request altogether.
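A rough sketch of the behavior such a flag could have, where the digest recorded in the local cache is trusted and the remote lookup is skipped entirely (the function and cache shape below are hypothetical stand-ins, not zarf's real internals):

```go
// Hypothetical sketch of --force-cache behavior; the cache shape and function
// names are stand-ins, not zarf's real internals.
package main

import (
	"errors"
	"fmt"
)

// digestCache maps an image reference to the digest recorded when it was last pulled.
type digestCache map[string]string

// resolveDigest returns the digest to build against. With forceCache set it
// never touches the network, so repeated package creates issue no HEAD
// requests to DockerHub at all.
func resolveDigest(ref string, cache digestCache, forceCache bool) (string, error) {
	if d, ok := cache[ref]; ok && forceCache {
		return d, nil // trust the cached digest, skip the registry round-trip
	}
	if forceCache {
		return "", errors.New("--force-cache set but " + ref + " is not in the local cache")
	}
	// Otherwise fall back to the HEAD request sketched earlier.
	return "", errors.New("remote lookup omitted from this sketch")
}

func main() {
	cache := digestCache{"docker.io/library/nginx:latest": "sha256:abc123..."}
	digest, err := resolveDigest("docker.io/library/nginx:latest", cache, true)
	fmt.Println(digest, err)
}
```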
Additional context

The DockerHub doc on download rate limiting makes it clear that HEAD requests do not count as a pull or count towards your pull rate limit. It also says HEAD requests are themselves rate limited, but that you should never run into that limit. That document alone would suggest that the particular scenario where I am encountering this problem shouldn't exist. Well, it does, and another unicorn at Defense Unicorns encountered it recently as well. Recording this detail to avoid a future red herring.
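One way to sanity-check that claim (and the question in the comment above about whether other requests are eating the budget) is the rate-limit check described in that same doc: fetch an anonymous token for the special `ratelimitpreview/test` repository and read the `ratelimit-limit` / `ratelimit-remaining` headers before and after a rebuild. A minimal Go sketch of that documented check:

```go
// Sketch of the rate-limit check from the DockerHub docs: the special
// ratelimitpreview/test repo and the ratelimit-* headers come from that doc;
// everything else here is illustrative.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Anonymous token scoped to the rate-limit preview repository.
	resp, err := http.Get("https://auth.docker.io/token?service=registry.docker.io&scope=repository:ratelimitpreview/test:pull")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
		panic(err)
	}

	// Per the doc, this request reports your budget without consuming a pull.
	req, _ := http.NewRequest(http.MethodHead,
		"https://registry-1.docker.io/v2/ratelimitpreview/test/manifests/latest", nil)
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	head, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer head.Body.Close()

	// e.g. "100;w=21600" for anonymous users: 100 pulls per 6-hour window.
	fmt.Println("ratelimit-limit:    ", head.Header.Get("ratelimit-limit"))
	fmt.Println("ratelimit-remaining:", head.Header.Get("ratelimit-remaining"))
}
```

Comparing `ratelimit-remaining` before and after a couple of `uds run` iterations should show whether the rebuilds are actually burning pull budget, or whether the block is coming from the separate HEAD limit the doc says you should never hit.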
@mjnagel @Racer159 @schristoff-du