Lightning-AI / pytorch-lightning

Pretrain, finetune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
https://lightning.ai
Apache License 2.0

distributed inference #13187

Open williamFalcon opened 2 years ago

williamFalcon commented 2 years ago

We need to support distributed inference...

In addition, for now, let's make sure to add a warning when a user runs predict with accelerators > 1:

"warning: you're running .predict with devices > 1. predictions across multiple {self.accelerator}s are not yet supported."

# prints like
"warning: you're running .predict with devices > 1. predictions across multiple GPUs are not yet supported."
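A minimal sketch of what such a check might look like, using the standard `warnings` module. The helper name `check_predict_devices` and its signature are illustrative assumptions, not the actual Lightning implementation:

```python
import warnings


def check_predict_devices(devices: int, accelerator: str) -> None:
    """Hypothetical helper: warn when .predict is called with more than one device.

    Names and wording here are illustrative; the real check would live
    inside the Trainer's predict path.
    """
    if devices > 1:
        warnings.warn(
            f"you're running .predict with devices > 1. "
            f"predictions across multiple {accelerator}s are not yet supported."
        )


# Example: a multi-GPU predict call would trigger the warning.
check_predict_devices(devices=2, accelerator="GPU")
```

A single-device call (`devices=1`) would pass through silently, so existing workflows are unaffected.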

so that users could do something like: https://github.com/Lightning-AI/lightning/pull/15149#issuecomment-1282360328

cc @justusschock @kaushikb11 @awaelchli @akihironitta @rohitgr7 @Borda @carmocca

ShaneTian commented 2 years ago

In the latest release (1.7.1), multi-GPU prediction is still not supported.