pytorch / xla

Enabling PyTorch on XLA Devices (e.g. Google TPU)
https://pytorch.org/xla

Query regarding using 1 chip (2 cores of TPU v3) for Inference #8359

Open deepakkumar2440 opened 2 weeks ago

deepakkumar2440 commented 2 weeks ago

❓ Questions and Help

Hello, I am trying to benchmark the inference performance of a TPU v3, but I would like to use only 2 cores (1 chip). Please point me to any documentation I can use to get started.

Also, is it possible to launch 2 inference jobs on the 2 cores as separate, independent processes? (That would effectively give 2x the throughput of a single core.)

Thanks again,
Deepak
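For anyone landing here, one common pattern is to let `torch_xla`'s multiprocessing launcher start one process per visible TPU core, with each process running its own independent inference loop on its own device. The sketch below is only an illustration, not an official recipe: `build_model()` and `get_batches()` are hypothetical placeholders for your own model factory and data source, and whether exactly two cores are visible depends on your runtime/environment configuration.

```python
# Hedged sketch: one independent inference process per visible TPU core,
# using torch_xla's multiprocessing spawn. `build_model` and `get_batches`
# are hypothetical placeholders, not torch_xla APIs.
import torch
import torch_xla.core.xla_model as xm
import torch_xla.distributed.xla_multiprocessing as xmp


def _inference_loop(index):
    device = xm.xla_device()            # this process's TPU core
    model = build_model().to(device)    # hypothetical: your model factory
    model.eval()
    with torch.no_grad():
        for batch in get_batches():     # hypothetical: your data source
            output = model(batch.to(device))
            xm.mark_step()              # flush the lazily built XLA graph


if __name__ == "__main__":
    # With one v3 chip (2 cores) visible, this starts two processes,
    # each bound to its own core and running independently.
    xmp.spawn(_inference_loop)
```

Since the two processes never synchronize, their aggregate throughput should be close to 2x a single core, as long as host-side input feeding is not the bottleneck.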