pytorch / xla

Enabling PyTorch on XLA Devices (e.g. Google TPU)
https://pytorch.org/xla

Import "torch_xla.core.xla_model" could not be resolved #7897

Open hiralU opened 3 months ago

hiralU commented 3 months ago

I'm getting errors on `torch_xla.core.xla_model`, and installing the package also fails with:

"ERROR: Could not find a version that satisfies the requirement torch-xla (from versions: none)
ERROR: No matching distribution found for torch-xla"

My installed Python version is 3.10.0.

Any solution?
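When pip reports "No matching distribution found for torch-xla", the usual cause is an unsupported platform (torch-xla wheels are published for Linux only, not Windows or macOS) or an unsupported Python version. A quick sketch to print what pip sees, assuming nothing beyond the standard library:

```python
import importlib.util
import platform
import sys

# Print the facts pip uses when resolving a wheel: interpreter version,
# operating system, and whether torch_xla is already importable.
py_version = "{}.{}".format(*sys.version_info[:2])
system = platform.system()
installed = importlib.util.find_spec("torch_xla") is not None

print("Python:", py_version)
print("OS:", system)
print("torch_xla importable:", installed)
```

If the OS is not Linux, the install error above is expected regardless of the Python version.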

JackCaoG commented 3 months ago

I am assuming you are running on a TPU. What command did you use? Do the instructions in https://github.com/pytorch/xla#tpu work?

hiralU commented 3 months ago

I updated the code as follows to resolve the issue:

```python
USE_PEFT_BACKEND = True

if is_torch_xla_available():
    import torch_xla.core.xla_model as xm
    import torch as xm  # note: this rebinds xm to plain torch

    XLA_AVAILABLE = True
else:
    XLA_AVAILABLE = False
```

but it is still not working; it shows many compatibility issues.
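The snippet above rebinds `xm` from the XLA module to plain `torch`, so any call that only exists on `torch_xla` will fail. A minimal sketch of a guard that degrades gracefully when torch_xla is absent (the `current_device` helper is hypothetical, just to show the pattern):

```python
# Guard the XLA import so the rest of the code can branch on a flag
# instead of aliasing plain torch in place of the XLA module.
try:
    import torch_xla.core.xla_model as xm  # real XLA backend
    XLA_AVAILABLE = True
except ImportError:
    xm = None  # no fallback alias; callers must check the flag
    XLA_AVAILABLE = False

def current_device():
    """Return the XLA device if available, else None so callers pick CPU/CUDA."""
    return xm.xla_device() if XLA_AVAILABLE else None
```

With this shape, a CUDA-only environment simply takes the `XLA_AVAILABLE = False` path and never touches torch_xla.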

JackCaoG commented 3 months ago

what workload are you trying to run?

hiralU commented 3 months ago

I am working on a virtual dressing room project, in which I'm getting a CUDA issue.

JackCaoG commented 3 months ago

I guess my question is more basic: it seems you are trying to use HF PEFT, and you don't intend to use PyTorch/XLA or a TPU. In that case none of the torch_xla logic should be triggered, and

`is_torch_xla_available`

should be false. Can you check why it returns True? If `is_torch_xla_available` is false, you most likely won't hit any XLA-related code.
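One way to check is to see whether torch_xla is installed at all: Hugging Face's `is_torch_xla_available()` is, roughly, a check that torch_xla can be imported (the exact internals vary by library version, so treat this as an assumption). A minimal sketch:

```python
import importlib.util

# If torch_xla is discoverable, HF availability checks will likely
# return True and route execution through the XLA code paths.
spec = importlib.util.find_spec("torch_xla")
print("torch_xla installed:", spec is not None)
# If this prints True and you don't want XLA:  pip uninstall torch-xla
```

If this prints True in a CUDA-only environment, uninstalling torch-xla is usually enough to keep the XLA branches from running.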

ManfeiBai commented 2 months ago

Hi @JackCaoG, is it ok to assign this ticket to you?