ParisNeo / lollms

Lord of LLMS
Apache License 2.0

How to choose GPU with AllGPT binding. #13

Open ediweber opened 1 year ago

ediweber commented 1 year ago

Hello.

First things first: Thank you for creating lollms, it is, simply put, wonderful!

Unfortunately, I am struggling with the AllGPT binding. I cannot load my GGUF models; the output is:

```
Lollms webui version : 6.8
Listing all extensions
Listing all personalities
Listing models
Loading discussion for client mrQYbMGCR0JO3rJSAAAB
Checking for updates from E:\lollms-webui
update availability: True
Listing all extensions
```

I suspect that it tries to use my APU (Ryzen 5600, 4 GB) instead of my Nvidia 3060 (12 GB). Is there a simple way to force it to use the Nvidia card?

I will try with smaller models. In fact, it would be nice to have a small model running on the APU and a large one on the dedicated GPU.
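A common generic workaround (not specific to lollms, and assuming the binding ultimately goes through the CUDA runtime) is to restrict which devices CUDA can see with the `CUDA_VISIBLE_DEVICES` environment variable before the backend initializes; a minimal sketch:

```python
import os

# Must be set BEFORE any CUDA-using library (torch, llama.cpp bindings,
# etc.) initializes in this process. "0" is the device index reported by
# `nvidia-smi`; on a machine where the RTX 3060 is device 0, this hides
# all other CUDA devices from the runtime.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"
```

Note that CUDA only enumerates NVIDIA GPUs, so if the model really ends up on the Ryzen APU, that more likely indicates a CPU/iGPU fallback than CUDA picking the wrong card; still, pinning the device index rules out ambiguity on multi-GPU setups.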

ediweber commented 1 year ago

Sorry, I accidentally posted this in lollms while it is a lollms-webui issue.

ParisNeo commented 1 year ago

Well, in the parameters of the gpt4all binding you can select the GPU (CUDA).
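For reference, the underlying gpt4all Python bindings also expose a `device` argument on the model constructor, which can be used to target a specific backend directly. A minimal sketch, assuming `pip install gpt4all` and a GGUF file on disk (the filename below is hypothetical, not from this thread):

```python
def load_on_nvidia(model_file: str):
    """Load a GGUF model on the NVIDIA card via the gpt4all bindings.

    The import is deferred into the function so this sketch can be read
    and imported without gpt4all installed. "cuda" asks the bindings for
    the NVIDIA backend; "cpu" or "gpu" are other accepted values on
    recent versions.
    """
    from gpt4all import GPT4All  # assumption: gpt4all package is installed

    return GPT4All(model_file, device="cuda")

# Hypothetical usage (example filename, not from the thread):
# model = load_on_nvidia("mistral-7b-instruct.Q4_0.gguf")
# print(model.generate("Hello"))
```

Accepted `device` values vary between gpt4all versions, so check the version you have installed before relying on a specific string.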