miniwdl-ext / miniwdl-slurm


Support gpus on slurm #3

Closed williamrowell closed 1 year ago

williamrowell commented 1 year ago

Checklist

I can change num_gpu to gpuCount if preferred.

I've tested all of these internally and can build a test if that would help.
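For context, here is a minimal sketch of how a WDL runtime GPU request could be mapped onto srun arguments. The attribute names (`gpu`, `gpuCount`) and the use of srun's `--gpus` flag are assumptions for illustration only; the PR itself discusses `num_gpu` versus `gpuCount`, and this is not the plugin's actual implementation.

```python
# Hypothetical sketch (not miniwdl-slurm's actual code): translate WDL runtime
# attributes into extra srun arguments for GPU allocation. Attribute names and
# the --gpus flag (--gres=gpu:N is an older equivalent) are assumptions.
from typing import Dict, List, Union


def gpu_srun_args(runtime: Dict[str, Union[bool, int, str]]) -> List[str]:
    """Return srun arguments requesting GPUs, or an empty list if none are needed."""
    args: List[str] = []
    if runtime.get("gpu"):
        # Default to a single GPU when only the boolean flag is set.
        count = int(runtime.get("gpuCount", 1))
        args.append(f"--gpus={count}")
    return args


if __name__ == "__main__":
    # A task whose runtime section sets `gpu: true` and `gpuCount: 2`
    # would then be launched with `srun ... --gpus=2 ...`.
    print(gpu_srun_args({"gpu": True, "gpuCount": 2}))  # ['--gpus=2']
```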

codecov-commenter commented 1 year ago

Codecov Report

No coverage uploaded for pull request base (develop@3a5c7d0). The diff coverage is n/a.


@@            Coverage Diff             @@
##             develop       #3   +/-   ##
==========================================
  Coverage           ?   85.71%           
==========================================
  Files              ?        1           
  Lines              ?       70           
  Branches           ?        0           
==========================================
  Hits               ?       60           
  Misses             ?       10           
  Partials           ?        0           


rhpvorderman commented 1 year ago

Release is done! https://pypi.org/project/miniwdl-slurm/0.2.0/

Thanks again @williamrowell!