Closed SeriousBug closed 11 months ago
It looks like the code already supports (among others) these configs on a pipeline step: `cpu_quota`, `cpuset`, `cpu_shares`.
See here for the meaning: https://docs.docker.com/config/containers/resource_constraints/#cpu
Can you try whether this solves your problem?
As these are not documented, use at your own risk.
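For illustration, assuming these keys are passed straight through to Docker's resource constraints (untested; the step name and values below are made up), a step could look like:

```yaml
- name: build
  image: debian
  cpu_shares: 2048      # relative CPU weight vs. other containers
  cpu_quota: 400000     # with Docker's default 100000us period: roughly 4 CPUs
  cpuset: "0-3"         # restrict the step to cores 0-3
  commands:
    - make -j4
```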
I don't have any limits placed on the agent though. It seems like the agent is placing limits on the containers it spawns. Is there a configuration for that?
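One way to check this empirically (a generic sketch, not Woodpecker-specific): run a step that prints the CPU count the container sees and the cgroup quota it was given, if any.

```shell
# Print how many CPUs the container believes it can use
nproc
# Print the cgroup CPU quota, if any (path depends on cgroup v1 vs v2);
# on cgroup v2, "max 100000" means no quota was set
cat /sys/fs/cgroup/cpu.max 2>/dev/null \
  || cat /sys/fs/cgroup/cpu/cpu.cfs_quota_us 2>/dev/null \
  || echo "no cgroup CPU quota file found"
```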
Have you tried putting this into your pipeline configuration on the step/container that you want more CPU for?
The code says you can do something like this:
- name: Run on more CPU
  image: debian
  cpu_shares: 100000
  commands:
    - ...
These should be the settings used by the agent to start the container, but I have only started to understand the code.
> It looks like the code already supports (among others) these configs on a pipeline step: `cpu_quota`, `cpuset`, `cpu_shares`.
> See here for the meaning: https://docs.docker.com/config/containers/resource_constraints/#cpu
> Can you try whether this solves your problem?
> As these are not documented, use at your own risk.
Reminder: if it's not documented, it's a "feature" that can be refactored away at any time ... if it does get in the way ... until it's officially added to the docs (which should follow only after there is consensus ...)
How to handle resources ... is a bigger topic ...; ideas are welcome, but I personally consider other issues more important for now :sweat:
Clear and concise description of the problem
It is possible to increase the number of workflows that run in parallel by increasing `WOODPECKER_MAX_WORKFLOWS`. However, it seems like each workflow only gets a single CPU core/thread. I haven't confirmed this, but I did see that `cargo` was using only a single thread.

Suggested solution
It would be nice if we could increase the number of threads that each workflow gets. Workflows that can take advantage of many cores would benefit from this. For example, `cargo` will spawn multiple threads to compile things in parallel, which greatly speeds up compilation. This should be configurable through an environment variable like `WOODPECKER_MAX_THREADS_PER_WORKFLOW`.

Alternative
Using multi-pipelines does allow multiple pipelines to run in parallel, which is amazing but not enough as steps like compiling one package can't be broken into multiple pipelines.
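If the undocumented per-step keys mentioned earlier do work, they might serve as a stopgap for the `cargo` case; a hypothetical sketch (key names and values unverified):

```yaml
- name: build
  image: rust
  cpu_quota: 400000   # assumption: ~4 CPUs if Docker's default 100ms period applies
  commands:
    - cargo build --release --jobs 4
```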
Additional context
No response