marella / chatdocs

Chat with your documents offline using AI.

Can't load big models across multiple GPUs #70

Open pnylokken opened 1 year ago

pnylokken commented 1 year ago

Is it possible to run models that require 48 GB of VRAM by combining two 24 GB GPUs? I've tried playing around with the chatdocs.yml file to no avail.
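
For context, the usual way to shard a large model across multiple GPUs in the Hugging Face stack (which chatdocs builds on for its `huggingface` backend) is `device_map="auto"` via the `accelerate` library. Whether chatdocs.yml forwards these options isn't confirmed here; the sketch below shows the underlying technique in plain `transformers`, with the model ID and memory limits as placeholder assumptions:

```python
# Minimal sketch of multi-GPU model sharding with transformers + accelerate.
# Assumes `pip install transformers accelerate` and two visible CUDA devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/some-48gb-model"  # placeholder, not a real model ID

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",                    # splits layers across visible GPUs
    max_memory={0: "22GiB", 1: "22GiB"},  # leave headroom on each 24 GB card
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Inputs go to the first device; accelerate moves activations between GPUs.
inputs = tokenizer("Hello", return_tensors="pt").to("cuda:0")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If chatdocs passes extra keys from the `huggingface:` section of chatdocs.yml through to `from_pretrained`, setting `device_map: auto` there might achieve the same thing, but that pass-through behavior is an assumption, not something confirmed in this thread.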