plusbang opened this issue 1 month ago
Hi @plusbang, thanks for reporting this. We will try to reproduce it on our end and get back to you.
Hi @plusbang, I am trying to get a machine to reproduce the issue.
Hi @plusbang, this issue is reproducible, as current support for IPEX on iGPU is limited.
Describe the bug
On Windows iGPU, I tried to run LLM inference with ipex=2.1.30+xpu and oneapi=2024.1, but it failed. I waited for more than 1 hour and it was still pending here:

![image](https://github.com/intel/intel-extension-for-pytorch/assets/108676127/044b73d6-13e5-4322-8388-6a283710a3d3)

To reproduce, run the following code:
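(The reporter's actual snippet was not preserved in this thread. As a hedged diagnostic sketch — not the original repro code — one way to confirm whether the IPEX XPU backend is even visible before attempting inference, which helps distinguish "device missing" from "inference hang", is:)

```python
# Hypothetical diagnostic sketch (names are illustrative, not from the report):
# check whether an XPU device is usable before running any LLM inference.
import torch


def pick_device() -> str:
    """Return 'xpu' when IPEX has registered a usable XPU backend, else 'cpu'."""
    try:
        # Importing IPEX registers the 'xpu' device with PyTorch.
        import intel_extension_for_pytorch  # noqa: F401
    except ImportError:
        return "cpu"
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return "xpu"
    return "cpu"


if __name__ == "__main__":
    print(f"running on: {pick_device()}")
```

If this prints `cpu` on a machine where you expected the iGPU, the hang is more likely a driver/oneAPI setup issue than a model problem.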
Versions
Collecting environment information...
PyTorch version: N/A
PyTorch CXX11 ABI: N/A
IPEX version: N/A
IPEX commit: N/A
Build type: N/A

OS: Microsoft Windows 11 Home (Chinese Edition)
GCC version: (GCC) 13.2.0
Clang version: N/A
IGC version: N/A
CMake version: N/A
Libc version: N/A

Python version: 3.11.9 | packaged by conda-forge | (main, Apr 19 2024, 18:27:10) [MSC v.1938 64 bit (AMD64)] (64-bit runtime)
Python platform: Windows-10-10.0.22631-SP0
Is XPU available: N/A
DPCPP runtime version: N/A
MKL version: N/A
GPU models and configuration: N/A
Intel OpenCL ICD version: N/A
Level Zero version: N/A

CPU:
Architecture=9
CurrentClockSpeed=1200
DeviceID=CPU0
Family=1
L2CacheSize=14336
L2CacheSpeed=
Manufacturer=GenuineIntel
MaxClockSpeed=3600
Name=Intel(R) Core(TM) Ultra 5 125H
ProcessorType=3
Revision=

Versions of relevant libraries:
[pip3] intel-extension-for-pytorch==2.1.30+xpu
[pip3] numpy==1.26.4
[pip3] torch==2.1.0.post2+cxx11.abi
[pip3] torchaudio==2.1.0.post2+cxx11.abi
[pip3] torchvision==0.16.0.post2+cxx11.abi
[conda] intel-extension-for-pytorch 2.1.30+xpu pypi_0 pypi
[conda] numpy 1.26.4 pypi_0 pypi
[conda] torch 2.1.0.post2+cxx11.abi pypi_0 pypi
[conda] torchaudio 2.1.0.post2+cxx11.abi pypi_0 pypi
[conda] torchvision 0.16.0.post2+cxx11.abi pypi_0 pypi