segalinc opened this issue 4 months ago (status: Open)
Hi, thanks for the great contribution!
I have an instance with multiple GPUs. How can I train using all of the devices but without SLURM, e.g. with torch.distributed (torchrun) or accelerate?
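For reference, this is the kind of single-node launch the question is asking about — a sketch assuming a standard PyTorch training script named `train.py` (the script name and argument are placeholders, not from this repo):

```shell
# Launch on all GPUs of one machine with torchrun (no SLURM needed).
# --nproc_per_node sets one process per GPU; torchrun sets RANK,
# LOCAL_RANK, WORLD_SIZE, MASTER_ADDR/PORT for torch.distributed.
torchrun --standalone --nproc_per_node=4 train.py

# Equivalent launch via Hugging Face accelerate; run
# `accelerate config` once first to record the multi-GPU setup.
accelerate launch --num_processes=4 train.py
```

Both launchers only handle process spawning and environment setup; the training script itself still has to initialize a process group (e.g. `torch.distributed.init_process_group`) or use accelerate's `Accelerator` to wrap the model and dataloaders.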