dmmiller612 / sparktorch

Train and run Pytorch models on Apache Spark.
MIT License

Inference on multi GPU #30

Open milad1378yz opened 2 years ago

milad1378yz commented 2 years ago

Hi, I have a sizeable pre-trained model and I want to run inference on multiple GPUs (I don't want to train it). Is there any way to do that? In fact, I want model parallelism. If there is a way, how is it done?
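For reference, model-parallel inference in plain PyTorch (outside sparktorch) is usually done by placing different stages of the model on different devices and moving the activations between them in `forward`. A minimal sketch, assuming a hypothetical two-stage split and falling back to CPU when two GPUs are not available:

```python
import torch
import torch.nn as nn

# Assumption: the model can be split into two sequential stages.
# With >= 2 GPUs, each stage lives on its own device; otherwise use CPU.
dev0 = torch.device("cuda:0" if torch.cuda.device_count() >= 2 else "cpu")
dev1 = torch.device("cuda:1" if torch.cuda.device_count() >= 2 else "cpu")

class TwoStageModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Hypothetical stages standing in for halves of a large model.
        self.stage1 = nn.Linear(16, 32).to(dev0)
        self.stage2 = nn.Linear(32, 4).to(dev1)

    def forward(self, x):
        x = self.stage1(x.to(dev0))
        # Move the intermediate activations to the second device.
        return self.stage2(x.to(dev1))

model = TwoStageModel().eval()
with torch.no_grad():
    out = model(torch.randn(8, 16))
print(out.shape)  # torch.Size([8, 4])
```

Note this is per-process model parallelism; it is separate from what sparktorch distributes across Spark executors, and whether the two can be combined here is exactly the open question.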