intel-analytics / ipex-llm

Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPU such as Arc, Flex and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, vLLM, GraphRAG, DeepSpeed, Axolotl, etc.
Apache License 2.0

Use "--extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ " for IPEX package installation #9409

Open · ywang30intel opened this issue 12 months ago

ywang30intel commented 12 months ago

Use "--extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ " instead of "-f https://developer.intel.com/ipex-whl-stable-xpu" in the instruction guide for bigdl-llm installation. IPEX installation instructions: https://intel.github.io/intel-extension-for-pytorch/xpu/latest/tutorials/installations/linux.html for generic python: python -m pip install torch==2.0.1a0 torchvision==0.15.2a0 intel_extension_for_pytorch==2.0.110+xpu --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

The "-f" may not work. I got bunch of errors in one of IDC system with following command: pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu This is due to the "-f https://developer.intel.com/ipex-whl-stable-xpu" not working and the right torch and IPEX package can't be found from that link. Use pip install --pre --upgrade bigdl-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ can solve the issue.

hkvision commented 12 months ago

@rnwang04 Shall we update to this command?

rnwang04 commented 12 months ago

@ywang30intel Thanks for the suggestion! @hkvision I will do some validation first.

rnwang04 commented 12 months ago

--extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ doesn't work on our local machine. Maybe it only works in some special cases or behind certain proxies?

I'd prefer to keep using the current -f https://developer.intel.com/ipex-whl-stable-xpu; maybe we can add --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ to our docs as a fallback. @hkvision
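If we document it as a fallback, the doc could even show the two commands chained, so users only hit the index URL when the find-links page fails. A rough sketch (the "||" chaining is illustrative, not a tested recommendation):

```bash
# Primary: current find-links page; fallback: Intel's PEP 503 package index.
# pip returns a nonzero exit code on failure, so || triggers the fallback.
pip install --pre --upgrade bigdl-llm[xpu] \
    -f https://developer.intel.com/ipex-whl-stable-xpu \
|| pip install --pre --upgrade bigdl-llm[xpu] \
    --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
```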

hkvision commented 12 months ago

I agree. Good idea.