lwj1980s closed this issue 3 years ago
Yeah, I think you can. But we do not just deploy, we maintain the lifecycle of the training jobs.
Thank you very much, I've got it.
Setting aside the Kubernetes-specific details, could I consider pytorch-operator to be a kind of one-click deployment for PyTorch DistributedDataParallel training?
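For context, this "one-click" style of deployment amounts to submitting a single `PyTorchJob` custom resource; the operator then creates the master and worker pods and injects the `MASTER_ADDR`, `MASTER_PORT`, `WORLD_SIZE`, and `RANK` environment variables that `torch.distributed` needs. A minimal sketch of such a manifest follows — the image name, job name, and training command are placeholder assumptions, not values from this thread:

```yaml
apiVersion: kubeflow.org/v1
kind: PyTorchJob
metadata:
  name: ddp-example          # placeholder job name
spec:
  pytorchReplicaSpecs:
    Master:
      replicas: 1
      restartPolicy: OnFailure
      template:
        spec:
          containers:
            - name: pytorch  # the operator expects this container name
              image: my-registry/ddp-train:latest   # placeholder image
              command: ["python", "train.py"]       # placeholder script
    Worker:
      replicas: 2            # scale workers by editing this one field
      restartPolicy: OnFailure
      template:
        spec:
          containers:
            - name: pytorch
              image: my-registry/ddp-train:latest   # placeholder image
              command: ["python", "train.py"]       # placeholder script
```

Inside `train.py`, calling `torch.distributed.init_process_group(backend="nccl")` with the default `env://` initialization picks up the injected environment variables, so the same script runs unchanged as the replica count changes.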