Open hiralU opened 3 months ago
I am assuming you are running on TPU. What command did you use? Do the instructions in https://github.com/pytorch/xla#tpu work?
I had updated the code as follows to resolve the issue:

```python
USE_PEFT_BACKEND = True

if is_torch_xla_available():
    import torch_xla.core.xla_model as xm
    XLA_AVAILABLE = True
else:
    XLA_AVAILABLE = False
```

but it is still not working and shows many compatibility issues.
what workload are you trying to run?
I am working on a virtual dressing room project, in which I am getting a CUDA issue.
I guess my question is more basic: it seems you are trying to use HF PEFT and you don't intend to use PyTorch/XLA or a TPU. In that case none of the torch_xla logic should be triggered, and `is_torch_xla_available` should be False. Can you check why it returns True? If `is_torch_xla_available` is False, you most likely won't run into any XLA-related code paths.
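One way to check this, sketched below with a hypothetical stand-in for the Hugging Face helper (the real `is_torch_xla_available` lives in `diffusers.utils`/`transformers.utils`; this fallback only mirrors its basic logic):

```python
import importlib.util

def torch_xla_installed() -> bool:
    # Hypothetical stand-in for Hugging Face's is_torch_xla_available():
    # True whenever the torch_xla package can be imported at all.
    return importlib.util.find_spec("torch_xla") is not None

if torch_xla_installed():
    # If this prints and you never intended to use TPU/XLA, torch_xla was
    # pulled into your environment somehow and the XLA branches will run.
    print("torch_xla is importable; XLA code paths will be taken")
else:
    print("torch_xla not found; XLA code paths should be skipped")
```

If this reports torch_xla as importable on a machine where you only want CUDA, removing the package (e.g. `pip uninstall torch-xla`) should keep the XLA branches from triggering.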
Hi @JackCaoG, is it OK to assign this ticket to you?
I am getting issues with `torch_xla.core.xla_model`, and while installing the package I also get errors:

```
ERROR: Could not find a version that satisfies the requirement torch-xla (from versions: none)
ERROR: No matching distribution found for torch-xla
```

The Python version I have installed is 3.10.0.
Any solution?
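One likely cause of that pip error is a platform mismatch: as far as I know, torch-xla wheels are only published for Linux, so pip on Windows or macOS reports "No matching distribution found" regardless of the Python version. A quick check of the environment pip is matching against:

```python
import platform
import sys

# torch-xla wheels are (to my knowledge) published for Linux only, so on
# Windows or macOS pip cannot find any matching distribution. Print the
# OS, architecture, and Python version that pip sees:
print("OS:", platform.system())
print("Arch:", platform.machine())
print("Python:", sys.version.split()[0])
```

If this prints anything other than a Linux x86_64/aarch64 environment, the pip failure is expected, and the fix is to run on a supported platform (e.g. a TPU VM) or to avoid the torch_xla dependency entirely.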