nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License

[Feature] Multi-GPU support #2528

Open otoloui opened 2 months ago

otoloui commented 2 months ago

Bug Report

Right now, GPT4All only utilizes one GPU, so on machines with multiple GPUs it blocks users from accessing higher-parameter-count models.

Steps to Reproduce

  1. 2x RTX 3090 installed
  2. Download llama-3-70b
  3. Try to load the model and watch it fill one GPU's 24 GB and then crash; the second GPU is never utilized

Expected Behavior

On rigs with multiple GPUs, the app should be able to split the model across them, enabling users to access higher-parameter-count models.
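For context, llama.cpp (the inference backend GPT4All builds on) already exposes a `tensor_split` parameter: a list of per-device fractions describing how to distribute the model's weights across GPUs. A minimal sketch of computing proportional fractions from available VRAM — the backend call at the end is assumed usage via llama-cpp-python, not a current GPT4All API:

```python
# Hypothetical helper: proportional tensor split across GPUs,
# mirroring the shape of llama.cpp's tensor_split parameter.
def tensor_split(vram_gib):
    """Given free VRAM per GPU in GiB, return split fractions summing to 1."""
    total = sum(vram_gib)
    return [v / total for v in vram_gib]

# Two RTX 3090s with 24 GiB each -> an even 50/50 split.
print(tensor_split([24, 24]))  # [0.5, 0.5]

# The fractions could then be handed to the backend, e.g. with
# llama-cpp-python (assumed usage, shown for illustration only):
# llm = Llama(model_path="llama-3-70b.gguf", n_gpu_layers=-1,
#             tensor_split=tensor_split([24, 24]))
```

With this in place, a 70B model that overflows a single 24 GB card could be sharded layer-by-layer across both cards instead of crashing.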

Your Environment

theLoDD commented 1 month ago

I have the same issue. I cannot use big models across multiple GPUs with GPT4All; I can only use one GPU. Version 3.0.0, Win10, 8x GTX 1070 8 GB.