-
We should be able to support intel GPUs! We are using the intel developer cloud. Please advise.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.4 LTS
Release: 22.04
Codename: …
-
### Describe the bug
I got this runtime error while inferring speech with XPU. Without the `device` param it works fine on CPU or an NVIDIA GPU on Colab. Please feel free to check my notebook.
chat = C…
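The fallback behavior described above (use XPU when requested, otherwise CUDA or CPU) can be sketched as a small helper. This is a hypothetical illustration: `pick_device` and its boolean flag arguments are made-up names, not part of any library's API.

```python
def pick_device(xpu_available: bool, cuda_available: bool) -> str:
    """Hypothetical helper: choose a torch device string, preferring
    Intel XPU, then NVIDIA CUDA, then falling back to CPU."""
    if xpu_available:
        return "xpu"
    if cuda_available:
        return "cuda"
    return "cpu"

# In real code the flags would come from torch.xpu.is_available() /
# torch.cuda.is_available(); hard-coded here so the sketch is self-contained.
print(pick_device(False, True))  # falls back to "cuda" when no XPU is present
```

A helper like this keeps the device string out of the pipeline call, so the same notebook runs unchanged on CPU-only, CUDA, and XPU machines.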
-
I am trying to run the TTS (English and multi-language text-to-speech) pipeline on my PC.
https://github.com/intel/intel-extension-for-transformers/blob/main/intel_extension_for_transformers/neural_chat/pipeline…
-
### Describe the bug
I am getting an issue running the code below using ipex-llm:
```
(llm_vision) spandey2@IMU-NEX-ADLP-voice-SUT:~/LLM_Computer_Vision$ cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04.4 L…
-
After upgrading the Triton commit pin from `b8c64f64c18d8cac598b3adb355c21e7439c21de` (currently used by stock PyTorch) to `514e4cdf004278c82216364d1f8534b940cd4238` (2.4 release candidate),
we found the …
-
### Why it needs to get done
In order to tackle https://github.com/canonical/data-science-stack/issues/144, we'll first need to spend some time familiarizing ourselves with the Intel DSS environme…
-
All I need is to run ollama3 on an Intel GPU (Arc™ A750). I followed the steps described in the IPEX-LLM documentation, but it runs on the CPU. Search engines can't find a solution to the problem.…
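Before blaming the runtime, one sanity check is whether PyTorch can see the Intel GPU at all. The probe below is a minimal sketch assuming the usual setup where `intel_extension_for_pytorch` registers the `torch.xpu` backend on import; on a machine missing either package it simply returns `False`.

```python
import importlib.util


def xpu_ready() -> bool:
    """Return True only if torch and intel_extension_for_pytorch are
    importable and torch.xpu reports at least one usable device."""
    if importlib.util.find_spec("torch") is None:
        return False
    if importlib.util.find_spec("intel_extension_for_pytorch") is None:
        return False
    import torch
    import intel_extension_for_pytorch  # noqa: F401 -- registers the xpu backend
    return hasattr(torch, "xpu") and torch.xpu.is_available()


print(xpu_ready())
```

If this prints `False` while the Arc GPU is installed, the problem is likely in the driver/oneAPI setup rather than in the model-serving layer.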
-
## 🚀 Feature
Support for a new device in DGL: the Intel GPU. A POC of Intel GPU support in DGL for the GraphSAGE model is available here: https://github.com/RafLit/dgl/tree/xpu_poc.
## Motivation
Intel …
-
## Motivation
As the [[RFC] Intel GPU Upstreaming](https://github.com/pytorch/pytorch/issues/114723) mentioned, to integrate the new Intel GPU device and its associated features into PyTorch, we nee…
-
Please update the TGI image from 1.4 to 2.0 in all TGI README files.
I faced issues with the Phi-3 model.