henk717 / KoboldAI

KoboldAI is generative AI software optimized for fictional use, but capable of much more!
http://koboldai.com
GNU Affero General Public License v3.0

Error installing deps on IPEX #464

Closed · kotx closed 11 months ago

kotx commented 11 months ago

For some reason torch==2.0.1a0 can't be installed, even though the specified index URL https://developer.intel.com/ipex-whl-stable-xpu has it.
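
(A minimal way to reproduce just the failing resolution step against the same index; this is a sketch, not the exact command the script runs.)

  # Sketch: ask pip to resolve the same pinned torch build against Intel's index.
  pip install torch==2.0.1a0 --find-links https://developer.intel.com/ipex-whl-stable-xpu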

Output of ./play-ipex.sh:

Installing pip packages: -f https://developer.intel.com/ipex-whl-stable-xpu, torch==2.0.1a0, intel_extension_for_pytorch==2.0.110+xpu, flask-cloudflared==0.0.10, flask-ngrok, flask-cors, lupa==1.10, transformers[sentencepiece]==4.33.1, huggingface_hub==0.16.4, optimum[onnxruntime]==1.12.0, safetensors==0.3.3, accelerate==0.20.3, git+https://github.com/VE-FORBRYDERNE/mkultra, ansi2html, flask_compress, ijson, ftfy, pydub, diffusers, git+https://github.com/0cc4m/hf_bleeding_edge/, einops, peft==0.3.0, windows-curses; sys_platform == 'win32', pynvml
Looking in links: https://developer.intel.com/ipex-whl-stable-xpu
Collecting git+https://github.com/VE-FORBRYDERNE/mkultra (from -r /home/kot/KoboldAI/environments/mambafHF4mVpnEsB (line 13))
  Cloning https://github.com/VE-FORBRYDERNE/mkultra to /tmp/pip-req-build-p9fcr7h6
  Running command git clone --filter=blob:none --quiet https://github.com/VE-FORBRYDERNE/mkultra /tmp/pip-req-build-p9fcr7h6
  Resolved https://github.com/VE-FORBRYDERNE/mkultra to commit ef544de73ec6a1a4bd55e824d0628fa0ef1323ac
  Preparing metadata (setup.py) ... done
Collecting git+https://github.com/0cc4m/hf_bleeding_edge/ (from -r /home/kot/KoboldAI/environments/mambafHF4mVpnEsB (line 20))
  Cloning https://github.com/0cc4m/hf_bleeding_edge/ to /tmp/pip-req-build-i8ntll6w
  Running command git clone --filter=blob:none --quiet https://github.com/0cc4m/hf_bleeding_edge/ /tmp/pip-req-build-i8ntll6w
  Resolved https://github.com/0cc4m/hf_bleeding_edge/ to commit f4c747d4c9f3143f7f290e35b4bf8edf6c08621c
  Preparing metadata (setup.py) ... done
Ignoring windows-curses: markers 'sys_platform == "win32"' don't match your environment
ERROR: Could not find a version that satisfies the requirement torch==2.0.1a0 (from versions: 1.4.0, 1.5.0, 1.5.1, 1.6.0, 1.7.0, 1.7.1, 1.8.0, 1.8.1, 1.9.0, 1.9.1, 1.10.0, 1.10.1, 1.10.2, 1.11.0, 1.12.0, 1.12.1, 1.13.0, 1.13.1, 2.0.0, 2.0.1)
ERROR: No matching distribution found for torch==2.0.1a0
critical libmamba pip failed to install packages
Setting OneAPI environment

:: initializing oneAPI environment ...
   play-ipex.sh: BASH_VERSION = 5.1.16(1)-release
   args: Using "$@" for setvars.sh arguments:
:: compiler -- latest
:: debugger -- latest
:: dev-utilities -- latest
:: dpl -- latest
:: mkl -- latest
:: tbb -- latest
:: oneAPI environment initialized ::

ERROR: ld.so: object '/usr/lib/libstdc++.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/usr/lib/libstdc++.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/usr/lib/libstdc++.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/usr/lib/libstdc++.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/usr/lib/libstdc++.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
Traceback (most recent call last):
  File "aiserver.py", line 15, in <module>
    from modeling.inference_model import GenerationMode
  File "/home/kot/KoboldAI/modeling/inference_model.py", line 10, in <module>
    import torch
ModuleNotFoundError: No module named 'torch'
Disty0 commented 11 months ago

Sometimes file caching messes up on Intel's side. You can download these manually: torch==2.0.1a0 and intel_extension_for_pytorch==2.0.110+xpu.
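
(A minimal sketch of that manual route, assuming the runtime environment created by play-ipex.sh is active; the local wheel directory name is illustrative.)

  # Sketch: fetch the two XPU wheels from Intel's index without resolving deps,
  # then install them from the local directory.
  pip download torch==2.0.1a0 intel_extension_for_pytorch==2.0.110+xpu \
      --find-links https://developer.intel.com/ipex-whl-stable-xpu --no-deps -d ./ipex-wheels
  pip install ./ipex-wheels/*.whl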

kotx commented 11 months ago

Turns out it works now; I didn't need to install manually.
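
(A quick way to confirm the install once play-ipex.sh finishes; the torch.xpu check is an assumption about IPEX's XPU backend, not something from the thread.)

  # Sketch: verify torch and IPEX import, and that the Intel GPU is visible.
  python -c "import torch, intel_extension_for_pytorch as ipex; print(torch.__version__, ipex.__version__, torch.xpu.is_available())"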