intel-analytics / ipex-llm

Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, Phi, etc.) on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, Axolotl, etc.
Apache License 2.0

libze_loader.so.1: cannot open shared object file: No such file or directory #11294

bconsolvo opened this issue 1 month ago

bconsolvo commented 1 month ago

When I run the generate.py script, I get the following error:

python ./generate.py --repo-id-or-model-path 'google/codegemma-7b-it' --prompt 'Write a hello world program in Python' --n-predict 32
Traceback (most recent call last):
  File "/root/repos/ipex-llm/./generate.py", line 21, in <module>
    from ipex_llm.transformers import AutoModelForCausalLM
  File "/root/miniconda3/envs/gpu_ipexllm/lib/python3.10/site-packages/ipex_llm/__init__.py", line 34, in <module>
    ipex_importer.import_ipex()
  File "/root/miniconda3/envs/gpu_ipexllm/lib/python3.10/site-packages/ipex_llm/utils/ipex_importer.py", line 103, in import_ipex
    self.directly_import_ipex()
  File "/root/miniconda3/envs/gpu_ipexllm/lib/python3.10/site-packages/ipex_llm/utils/ipex_importer.py", line 116, in directly_import_ipex
    import intel_extension_for_pytorch as ipex
  File "/root/miniconda3/envs/gpu_ipexllm/lib/python3.10/site-packages/intel_extension_for_pytorch/__init__.py", line 95, in <module>
    from .utils._proxy_module import *
  File "/root/miniconda3/envs/gpu_ipexllm/lib/python3.10/site-packages/intel_extension_for_pytorch/utils/_proxy_module.py", line 2, in <module>
    import intel_extension_for_pytorch._C
ImportError: libze_loader.so.1: cannot open shared object file: No such file or directory

I have already installed the oneAPI Base Toolkit, and I have verified that MKL is being found:

echo $MKLROOT
/opt/intel/oneapi/mkl/2024.1
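A similar sanity check for the missing library itself (a diagnostic sketch of my own, not part of the original report) is to ask Python whether the Level Zero loader is visible to the dynamic linker:

```python
# Diagnostic sketch: check whether the oneAPI Level Zero loader
# (libze_loader.so.1) is on the dynamic linker search path.
import ctypes.util

lib = ctypes.util.find_library("ze_loader")
if lib is None:
    print("libze_loader not found; try sourcing the oneAPI environment "
          "(e.g. `source /opt/intel/oneapi/setvars.sh`) first")
else:
    print("Level Zero loader found:", lib)
```

If this reports the loader as missing even after sourcing setvars.sh, the problem is the environment rather than the pip packages.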

I installed IPEX a different way than advised because both PyTorch and Transformers were not working with the original recommendation. The original recommendation was:

pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

The one that worked for me was:

python -m pip install torch==2.1.0.post2 torchvision==0.16.0.post2 torchaudio==2.1.0.post2 intel-extension-for-pytorch==2.1.30.post0 oneccl_bind_pt==2.1.300+xpu --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

Please advise on how to proceed.

qiuxin2012 commented 1 month ago

It looks like you have installed oneAPI 2024.1. oneAPI 2024.1 + intel-extension-for-pytorch==2.1.30.post0 is not fully supported right now. We recommend using oneAPI 2024.0 + intel-extension-for-pytorch==2.1.10+xpu. Please uninstall your oneAPI Base Toolkit and follow https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/install_linux_gpu.html#install-oneapi to install oneAPI 2024.0.
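Before retrying, it can help to confirm which IPEX build actually ended up installed (a hedged sketch; the 2.1.10+xpu version string is taken from the recommendation above):

```python
# Sketch: report the installed intel-extension-for-pytorch version and
# compare it against the recommended 2.1.10+xpu build.
import importlib.metadata as md

try:
    ver = md.version("intel-extension-for-pytorch")
except md.PackageNotFoundError:
    ver = None

print("intel-extension-for-pytorch:", ver or "not installed")
if ver is not None and not ver.startswith("2.1.10"):
    print("installed build differs from the recommended 2.1.10+xpu")
```

This avoids importing the package itself, which would fail with the same libze_loader error while the environment is broken.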