[Open] listless-dude opened this issue 1 year ago
Do you require a specific PyTorch/XLA version, or is it fine to use the most recent stable version (2.0)? If version 2.0 is fine, can you remove the line `os.environ['XRT_TPU_CONFIG'] = "tpu_worker;0;10.128.0.29:8470"` and retry?
Also, `XRT_TPU_CONFIG`, as the name suggests, uses the XRT runtime, which we plan to drop support for in the near future.
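For anyone following along, the suggested change amounts to dropping the legacy XRT variable and selecting the PJRT runtime instead. A minimal sketch of the shell steps, assuming the variable was exported in the shell rather than hard-coded in the script:

```shell
# Remove the legacy XRT runtime configuration
unset XRT_TPU_CONFIG

# Select the newer PJRT runtime on a TPU VM
export PJRT_DEVICE=TPU

# Then re-run the script, e.g.: python test.py
```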
I removed it and did `export PJRT_DEVICE=TPU`, but still got the same error.
Same for me. @mr-oogway, any update?
Hi @vanbasten23, is it OK to assign this to you?
I tried your script in the Colab notebook at https://colab.sandbox.google.com/github/pytorch/xla/blob/master/contrib/colab/getting-started.ipynb, and it runs fine there.
❓ Questions and Help
I set up `XRT_TPU_CONFIG` with the IP address of the TPU. This is my test.py script:
Here's the error:
I don't know what I'm doing wrong. Can someone suggest a possible fix?
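To illustrate the runtime question discussed above: the two environment variables select different runtimes, with `PJRT_DEVICE` choosing PJRT and `XRT_TPU_CONFIG` implying the legacy XRT path. The sketch below is a simplified, hypothetical model of that precedence for illustration only; it is not torch_xla's actual selection code.

```python
def select_runtime(env):
    """Simplified, hypothetical model of runtime selection:
    PJRT_DEVICE takes precedence; XRT_TPU_CONFIG implies the
    legacy XRT runtime. Not torch_xla's real logic."""
    if env.get("PJRT_DEVICE"):
        return "PJRT"
    if env.get("XRT_TPU_CONFIG"):
        return "XRT"
    return "none configured"

# With the line from the issue, the legacy runtime is implied:
legacy = {"XRT_TPU_CONFIG": "tpu_worker;0;10.128.0.29:8470"}
print(select_runtime(legacy))   # XRT

# After the change suggested in the replies:
pjrt = {"PJRT_DEVICE": "TPU"}
print(select_runtime(pjrt))     # PJRT
```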