google-ai-edge / ai-edge-torch

Supporting PyTorch models with the Google AI Edge TFLite runtime.
Apache License 2.0

Could you please confirm if this project can be used on pure CPU machines that do not support TPU? #92

Closed quaeast closed 3 months ago

quaeast commented 3 months ago

Description of the bug:

No response

Actual vs expected behavior:

No response

Any other information you'd like to share?

No response

talumbau commented 3 months ago

Hi, thanks for reaching out. Models converted with this project can be run on pure CPU devices. Many (maybe most?) of the models are converted in a way that they benefit from the acceleration capabilities of the XNNPack library, which is a CPU inference engine optimized for many classes of CPUs (with very high coverage for ARM CPUs specifically). Please re-open if you have additional questions.

quaeast commented 3 months ago

Thank you for your response. I have a follow-up question. The requirements.txt file of this package includes torch_xla, and after installation I get errors because my machine only supports CPU. How should I install it on a CPU-only machine? The error is:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/data/home/patfang/miniconda3/envs/aiedge_nightly/lib/python3.10/site-packages/ai_edge_torch/__init__.py", line 16, in <module>
    from .convert.converter import convert
  File "/data/home/patfang/miniconda3/envs/aiedge_nightly/lib/python3.10/site-packages/ai_edge_torch/convert/converter.py", line 22, in <module>
    from ai_edge_torch import model
  File "/data/home/patfang/miniconda3/envs/aiedge_nightly/lib/python3.10/site-packages/ai_edge_torch/model.py", line 28, in <module>
    from ai_edge_torch.convert import conversion_utils as cutils
  File "/data/home/patfang/miniconda3/envs/aiedge_nightly/lib/python3.10/site-packages/ai_edge_torch/convert/conversion_utils.py", line 27, in <module>
    from torch_xla import stablehlo
  File "/data/home/patfang/miniconda3/envs/aiedge_nightly/lib/python3.10/site-packages/torch_xla/__init__.py", line 7, in <module>
    import _XLAC
ImportError: libpython3.10.so.1.0: cannot open shared object file: No such file or directory

talumbau commented 3 months ago

Hi! Very sorry to hear about the troubles you are having with XLA. This issue doesn't have anything to do with the hardware on your machine; XLA will work on a machine that has only a CPU. The problem is that the compiled XLA library requires the libpython for your Python version (3.10), but the loader can't find the shared library. We can resolve this by updating your LD_LIBRARY_PATH variable like this:

export LD_LIBRARY_PATH=<path to Python>/lib:$LD_LIBRARY_PATH

I see that you are using a Miniconda environment, so for your case, I believe this would be resolved by doing:

export LD_LIBRARY_PATH=/data/home/patfang/miniconda3/envs/aiedge_nightly/lib:$LD_LIBRARY_PATH

Please give that a try. We will update the documentation soon so that others are aware of this potential problem.
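If you are unsure which directory to put on LD_LIBRARY_PATH, Python itself can report where its shared library lives. A minimal sketch using only the standard library (the printed path is environment-specific; on most Linux builds, sysconfig's LIBDIR is the directory containing the shared libpython):

```python
# Print the directory that should be prepended to LD_LIBRARY_PATH.
# On typical Linux CPython builds, LIBDIR is where a shared libpython
# (e.g. libpython3.10.so.1.0) is installed.
import sysconfig

libdir = sysconfig.get_config_var("LIBDIR")
print(libdir)
```

This can be combined with the export in one shell line, e.g. `export LD_LIBRARY_PATH="$(python -c 'import sysconfig; print(sysconfig.get_config_var("LIBDIR"))')":$LD_LIBRARY_PATH`.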

quaeast commented 2 months ago

Thank you very much for your help. I have now managed to run it successfully on a CPU-only machine. I did have to set the environment variable manually: even though the rpath already pointed to the correct directory (as shown by patchelf --print-rpath `python`), torch_xla still required LD_LIBRARY_PATH to be set explicitly. It's strange that only torch_xla needs this:

export LD_LIBRARY_PATH=<path to Python>/lib:$LD_LIBRARY_PATH

Afterwards, ensure that the glibc version is >= 2.30, which can be checked with the following command:

ldd --version
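The same check can also be scripted from Python using the standard library. A sketch (the 2.30 floor is the one mentioned above; platform.libc_ver() returns an empty result on non-glibc systems such as musl or macOS, in which case fall back to `ldd --version`):

```python
import platform


def glibc_at_least(required=(2, 30)):
    """Return True/False if the glibc version can be determined, else None."""
    libc, version = platform.libc_ver()
    if libc != "glibc" or not version:
        return None  # non-glibc platform: check manually with `ldd --version`
    major, minor = (int(p) for p in version.split(".")[:2])
    return (major, minor) >= required


print(glibc_at_least())
```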