open-webui/docs · https://docs.openwebui.com

Add tutorial: Local LLM Setup with IPEX-LLM on Intel GPU #89

Closed — Oscilloscope98 closed this 4 weeks ago

Oscilloscope98 commented 4 weeks ago

IPEX-LLM is a PyTorch library for running LLMs on Intel CPUs and GPUs (e.g., a local PC with an iGPU, or discrete GPUs such as Arc A-Series, Flex, and Max) with very low latency.
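For context, IPEX-LLM is distributed as a pip package; a rough install sketch (based on the IPEX-LLM documentation, where the `[cpp]` extra pulls in the llama.cpp/Ollama integration used by this tutorial; exact package versions and environment tooling may differ) looks like:

```bash
# Create an isolated environment (conda shown here; a plain venv works too)
conda create -n llm-cpp python=3.11
conda activate llm-cpp

# Install IPEX-LLM with C++ backend support (llama.cpp / Ollama integration)
pip install --pre --upgrade "ipex-llm[cpp]"
```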

This tutorial demonstrates how to set up Open WebUI with an IPEX-LLM-accelerated Ollama backend hosted on an Intel GPU. By following this guide, you will be able to set up Open WebUI even on a low-cost PC (i.e., one with only an integrated GPU) and still get a smooth experience.
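To give a concrete sense of the flow, here is a minimal sketch based on the IPEX-LLM Ollama quickstart and the standard Open WebUI Docker command; the oneAPI path, environment variables, and port mappings are assumptions that may vary with your GPU, driver, and OS:

```bash
# Initialize the IPEX-LLM build of Ollama in the current directory
# (creates symlinks to the ollama binary; on Windows: init-ollama.bat)
init-ollama

# Offload all model layers to the Intel GPU and start the Ollama backend
export OLLAMA_NUM_GPU=999
export ZES_ENABLE_SYSMAN=1
source /opt/intel/oneapi/setvars.sh   # oneAPI runtime; Linux path assumed
./ollama serve

# In another shell: start Open WebUI and point it at the local Ollama endpoint
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

With this setup, the Open WebUI container reaches Ollama at `host.docker.internal:11434` (its default base URL), so models served by the IPEX-LLM backend appear in the web UI at `http://localhost:3000`.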

tjbck commented 4 weeks ago

Thanks!