nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License

GPU in the Chat UI #487

Status: Open · Takodachi6969 opened this issue 1 year ago

Takodachi6969 commented 1 year ago

The Chat UI doesn't seem to run on the GPU. Is there a way to do it? The original method still works, but I don't know how to use it in the Chat UI.
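
For context, the usual GPU path outside the chat UI is the Python bindings. Below is a minimal sketch, assuming a recent `gpt4all` Python package whose `GPT4All` constructor accepts a `device` argument; the model filename is only an example and not something specified in this thread:

```python
from gpt4all import GPT4All

# Assumption: a recent gpt4all Python binding that exposes a `device` parameter.
# device="gpu" asks the library to use an available GPU backend instead of the CPU.
model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf", device="gpu")

# Run a single prompt inside a chat session and print the reply.
with model.chat_session():
    print(model.generate("Why should I run LLMs locally?", max_tokens=128))
```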

zanussbaum commented 1 year ago

There currently isn't a way to run the model on the GPU within the chat UI, AFAIK. cc @manyoso

matbgn commented 1 year ago

Do you have any plans to introduce it? Just to get an idea, do you think it's a question of weeks, months, or years?

claell commented 1 year ago

@niansa As this seems to be the oldest issue on the topic (apart from #463, which drifted towards Python specifics and has since been closed), and it also has the most votes, shouldn't this one stay open and the others be closed as duplicates?

matbgn commented 1 year ago

Let me strongly second this comment.

niansa commented 1 year ago

@claell yes, that makes sense. Thank you!