nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License
70.54k stars 7.69k forks

[Feature] Support for Intel GPUs (HD Graphics, Iris Xe, Arc) #1676

Open zwilch opened 11 months ago

zwilch commented 11 months ago

System Info

32 GB RAM, Intel HD 520, Windows 10, Intel graphics driver version 31.0.101.2111

Information

Reproduction

Select GPU Intel HD Graphics 520

Expected behavior

All answers are unreadable, like:

Ich The „ I … ... I „ In S _ E

\ ** #smith " # Englishimster glob Qsmith

tazz4843 commented 11 months ago

I can reproduce this on an Intel Arc A770 dGPU, without fail. Arch Linux, latest updates.

goldyfruit commented 10 months ago

Same here Intel ARC A770:

Python 3.11.7 (main, Dec 18 2023, 00:00:00) [GCC 13.2.1 20231205 (Red Hat 13.2.1-6)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from gpt4all import GPT4All
>>> model = GPT4All("mistral-7b-openorca.Q4_0.gguf", device='intel')
llama_new_context_with_model: max tensor size =   102.55 MB
llama.cpp: using Vulkan on Intel(R) Arc(tm) A770 Graphics (DG2)
>>> 
>>> output = model.generate("The capital of Canada is ", max_tokens=10)
>>> print(output)
5 city K   
 S ,  “
koech-v commented 10 months ago

oneAPI might solve this issue; I have also seen this thread.

zwilch commented 9 months ago

Version 2.6.1: the bug is still there (Intel HD 520).

zwilch commented 9 months ago

Version 2.6.1: garbage output on Intel GPUs (HD 520 Graphics).

zwilch commented 8 months ago

GPT4All v2.7.0: garbage output on Intel GPUs (HD 520 Graphics).

TheWonderfulTartiflette commented 8 months ago

I have another problem with Arc and GPT4All: it just doesn't recognize my GPU at all, leaving me only the CPU option.

cebtenzzre commented 8 months ago

> I have another problem with Arc with GPT4All, that is that it just doesn't recognize my GPU at all, leaving me the option just for CPU

This is because we recently started hiding these GPUs in the UI, so that GPT4All doesn't use them by default, since they are known to be incompatible.

simplybel9000 commented 7 months ago

> I have another problem with Arc with GPT4All, that is that it just doesn't recognize my GPU at all, leaving me the option just for CPU

> This is because we recently started hiding these GPUs in the UI, such that GPT4All doesn't use them by default given that they are known not to be compatible.

Is it possible to use them, and if so, how? It's the only GPU I have, so I can't test others.
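For the Python bindings, device selection is still exposed through the `device` parameter of the `GPT4All` constructor (the same parameter used in the reproduction above), so the UI-level hiding does not prevent forcing an Intel GPU at your own risk. A minimal sketch, assuming the `gpt4all` package's constructor behaves as shown earlier in this thread (the fallback-to-CPU logic and the helper name are illustrative, not part of the library):

```python
def load_on_device(model_name: str = "mistral-7b-openorca.Q4_0.gguf",
                   device: str = "intel"):
    """Try to load a model on the requested device, falling back to CPU.

    The CPU fallback exists because Intel GPUs are reported above to
    produce garbage output with the Vulkan backend.
    """
    # Imported lazily so this sketch can be defined without gpt4all installed.
    from gpt4all import GPT4All

    try:
        return GPT4All(model_name, device=device)
    except Exception:
        # Device unsupported, hidden, or initialization failed: use CPU.
        return GPT4All(model_name, device="cpu")
```

Note that even when loading succeeds, generation quality on Intel GPUs is the subject of this issue, so verify the output against a CPU run.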

enzy commented 7 months ago

Intel Arc A770 with the latest llama.cpp (with SYCL enabled) works for me on Linux. Can support for these discrete GPUs be enabled?
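For reference, the SYCL backend mentioned here is built from llama.cpp sources roughly as follows, per llama.cpp's SYCL documentation; exact flag and binary names vary between llama.cpp versions, so treat this as a sketch rather than a recipe:

```shell
# Requires the Intel oneAPI Base Toolkit; source its environment first.
source /opt/intel/oneapi/setvars.sh

# Configure llama.cpp with the SYCL backend using Intel's icx/icpx compilers.
cmake -B build \
  -DGGML_SYCL=ON \
  -DCMAKE_C_COMPILER=icx \
  -DCMAKE_CXX_COMPILER=icpx
cmake --build build --config Release -j

# List SYCL devices to confirm the Arc GPU is visible to the runtime.
./build/bin/llama-ls-sycl-device
```

This builds llama.cpp directly, not GPT4All; it is only evidence that the hardware can work through SYCL, which GPT4All's bundled Vulkan backend does not use.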

x0rgo commented 1 month ago

> I have another problem with Arc with GPT4All, that is that it just doesn't recognize my GPU at all, leaving me the option just for CPU

> This is because we recently started hiding these GPUs in the UI, such that GPT4All doesn't use them by default given that they are known not to be compatible.

"Not compatible" as in garbage output, or simply not recognized by the library?