-
I'm currently running a handful of periodic batch jobs. It's great that Nomad doesn't schedule another one if a current one is still running. However, I think it would be helpful to stop a batch job o…
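For reference, the overlap guard described above corresponds to Nomad's `prohibit_overlap` flag in the `periodic` stanza. A minimal sketch of such a job spec, with placeholder job, group, task, and image names:

```hcl
job "nightly-report" {          # hypothetical job name
  type = "batch"

  periodic {
    cron             = "0 2 * * *"   # run daily at 02:00
    prohibit_overlap = true          # skip a launch while a previous run is still active
  }

  group "report" {
    task "generate" {
      driver = "docker"
      config {
        image = "example/report:latest"   # placeholder image
      }
    }
  }
}
```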
-
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Current Behavior
Batch Array Jobs **ARE NOT** executed with an environment that includes AWS_BATCH_JOB_ARRAY_INDE…
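For context, each child of an AWS Batch array job is expected to receive its index via the `AWS_BATCH_JOB_ARRAY_INDEX` environment variable. A minimal sketch of how a task would consume it; the work-item list and partitioning scheme are made up for illustration:

```python
import os

# AWS Batch sets AWS_BATCH_JOB_ARRAY_INDEX on each child of an array job;
# fall back to 0 so the script also runs as a plain (non-array) job.
index = int(os.environ.get("AWS_BATCH_JOB_ARRAY_INDEX", "0"))

# Hypothetical work partitioning: pick this child's slice of the inputs.
work_items = ["shard-a", "shard-b", "shard-c", "shard-d"]
my_item = work_items[index % len(work_items)]
print(f"child {index} processing {my_item}")
```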
-
Thank you for your work on this project. I've encountered an issue while deploying the code following the instructions:
1. Updated package.json to use "sst": "^2.41.4"
2. Deployed using r7gd.4xlar…
-
After the successful deployment of our `pixi`/`micromamba`-based interactive sessions, we'd like to now create a similar system for batch jobs, where we use our ability to mount EFS volumes with pre-i…
-
### Description
When defining a batch consumer with a batch size larger than the current number of messages in the queue, it still flushes the queue unconditionally by adding a FlushBatchHandlersStamp to the en…
-
Hi, I'm trying to run a batch job using the `--array` Slurm option. I'm wondering whether this is possible using drmaa-python. I know there is runBulkJobs(...), but it seems that this doesn't run an arr…
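For what it's worth, DRMAA bulk jobs are parameterized by a (begin, end, step) triple rather than an arbitrary Slurm `--array` expression. A small pure-Python sketch of that mapping; the helper name is mine, not part of drmaa-python:

```python
def bulk_job_indices(begin: int, end: int, step: int = 1) -> list[int]:
    """Expand a DRMAA-style (begin, end, step) bulk-job spec into task indices.

    Mirrors the index range of runBulkJobs(template, begin, end, step),
    where `end` is inclusive, much like `--array=begin-end:step` in Slurm.
    """
    return list(range(begin, end + 1, step))

print(bulk_job_indices(1, 10, 2))  # indices 1, 3, 5, 7, 9
```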
-
Two new use cases came up, both pointing to a similar solution:
1. Our UDP users want to treat a UDP basically as a 'virtual collection'. To allow this, they would like to know the STAC (collection) metadata of…
-
Hi @yunguan-wang,
I noticed the manuscript notes that parallel processing actually resulted in slower jobs (presumably in the R implementation), but spacia.py has a worker pool. It appears to batch…
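As a point of comparison, the pattern of a worker pool that receives pre-chunked batches (rather than one item per task) can be sketched in plain Python; the chunk size and worker function here are illustrative, not spacia.py's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def process_batch(batch):
    # Stand-in for per-batch work; the real worker would do the model fitting.
    return [x * x for x in batch]

def run_pool(items, batch_size=3, workers=2):
    # Chunk up front so each pool task amortizes dispatch overhead over a batch.
    batches = [items[i:i + batch_size] for i in range(0, len(items), batch_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process_batch, batches)
    # map() preserves submission order, so flattening restores the input order.
    return [y for chunk in results for y in chunk]

print(run_pool(list(range(7))))  # squares of 0..6
```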
-
I wanted to know whether it is possible to submit, say, 5 folding jobs on the same GPU in parallel by increasing the batch size. I don't want to do them one by one. When I try to do multiprocessing and…
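The usual alternative to running N separate processes on one GPU is to stack the N inputs into a single batch. A toy sketch of padding variable-length sequences to a common length so they can be stacked; this is pure Python for illustration, not any folding tool's actual API:

```python
def pad_batch(seqs, pad=0):
    """Pad variable-length sequences to the longest one so they can be
    stacked into a single (batch, length) array for one forward pass."""
    longest = max(len(s) for s in seqs)
    return [list(s) + [pad] * (longest - len(s)) for s in seqs]

batch = pad_batch([[1, 2, 3], [4], [5, 6]])
print(batch)  # [[1, 2, 3], [4, 0, 0], [5, 6, 0]]
```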
-
### Label
batch, inference, meta/workflow
### Priority Label
high priority
### Is your feature request related to a problem? Please describe.
One needs to write one's own batch script (see example…
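Such a script typically amounts to chunking the inputs and calling the model once per chunk. A generic sketch with a stand-in `model` function; nothing here is the project's real API:

```python
def model(batch):
    # Placeholder for the real inference call; returns one result per input.
    return [len(x) for x in batch]

def batch_infer(inputs, batch_size=4):
    """Run inference over `inputs` in chunks of `batch_size`."""
    results = []
    for i in range(0, len(inputs), batch_size):
        results.extend(model(inputs[i:i + batch_size]))
    return results

print(batch_infer(["a", "bb", "ccc", "dddd", "ee"], batch_size=2))
```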