Is there something like a `wait` argument in `submitJobs` (as there is in its BatchJobs counterpart)? And is there somewhere in batchtools to set the maximum number of concurrent jobs? Currently, I always get the following error when I exceed the maximum number of concurrent jobs:
```r
> submitJobs(findNotSubmitted(ids), resources = list(walltime = 8L * 3600L))
Submitting 30 jobs in 30 chunks using cluster functions 'TORQUE' ...
Submitting [======================-----------------------------] 43% eta: 3s
Error in submitJobs(ids, resources = list(walltime = 8L * :
Fatal error occurred: 101. Command 'qsub' produced exit code 226. Output: 'qsub: Maximum number of jobs already in queue for user MSG=total number of current user's jobs exceeds the queue limit: user [abc], queue [def]
```
Also, I currently have to submit new jobs manually once some of my jobs are done.
Do you have any suggestions for a convenient fix to that -- other than a loop that checks every couple of minutes whether new jobs can be submitted and then submits as many as the cluster will accept?
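For concreteness, the polling loop I'd like to avoid would look roughly like this (a sketch only; `reg`, the assumed limit of 100 concurrent jobs, and the five-minute polling interval are placeholders for my setup, not batchtools defaults):

```r
library(batchtools)

max.jobs <- 100L  # assumed per-user queue limit on the cluster

repeat {
  # ids of jobs that have not been submitted yet
  not.submitted <- findNotSubmitted(reg = reg)
  if (nrow(not.submitted) == 0L)
    break

  # how many slots are free, given what is still queued/running
  on.system <- nrow(findOnSystem(reg = reg))
  free <- max.jobs - on.system
  if (free > 0L) {
    submitJobs(head(not.submitted, free),
               resources = list(walltime = 8L * 3600L),
               reg = reg)
  }

  Sys.sleep(300)  # poll again in five minutes
}
```

This works, but it ties up an R session just to babysit the queue, which is why a built-in throttle would be preferable.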