marella / chatdocs

Chat with your documents offline using AI.
MIT License
684 stars 98 forks

I can't get the GPU to work #2

Open PotatoDestroyer opened 1 year ago

PotatoDestroyer commented 1 year ago

I don't understand why the GPU doesn't want to work; it always uses the CPU.

[screenshot attached]

marella commented 1 year ago

Do you want to use a ctransformers model or a huggingface model?

If you want to use huggingface, set:

llm: huggingface

If you want to use ctransformers make sure it is installed with CUDA enabled: https://github.com/marella/chatdocs#c-transformers-1
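For reference, a minimal chatdocs.yml for running a ctransformers model on the GPU might look like the sketch below, based on the README section linked above. The `gpu_layers` value is only an example; tune it to your model size and available VRAM:

```yaml
llm: ctransformers        # the default backend

ctransformers:
  config:
    gpu_layers: 50        # number of layers to offload to the GPU; 0 keeps everything on CPU
```

If ctransformers was not installed with CUDA support, this setting has no effect and inference stays on the CPU.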

PotatoDestroyer commented 1 year ago

I'm sorry, but I simply cannot understand the process of setting up chatdocs.yml.

Can you explain? All I need is to run on the GPU. https://huggingface.co/TheBloke/Vigogne-Instruct-13B-HF

I need to use huggingface.

marella commented 1 year ago

So basically you should create a chatdocs.yml file in the directory you are running commands from and add the following to it:

llm: huggingface # this makes it use huggingface instead of ctransformers

huggingface:
  model: TheBloke/Vigogne-Instruct-13B-HF # here you can specify any huggingface model
  device: 0 # this makes it run on GPU instead of CPU

Note that you should NOT edit the original chatdocs/data/chatdocs.yml file present in this repo. Instead, create a new chatdocs.yml with the settings you want, and chatdocs will merge it over the defaults.
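As a quick sanity check, the snippet below is a hypothetical illustration (not part of chatdocs itself) of how a huggingface section like the one above maps to a device index under the transformers convention (0 = first GPU, -1 = CPU), and of checking whether PyTorch can see a CUDA device at all:

```python
# Hypothetical illustration: the same settings as the chatdocs.yml above,
# expressed as a dict for clarity.
config = {
    "llm": "huggingface",
    "huggingface": {
        "model": "TheBloke/Vigogne-Instruct-13B-HF",
        "device": 0,
    },
}

# transformers convention: device 0 is the first GPU, -1 means CPU.
# Default to CPU (-1) when no device is specified.
device = config["huggingface"].get("device", -1)
print("selected device index:", device)

# Optional: confirm PyTorch can actually see a CUDA device.
try:
    import torch
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    print("PyTorch not installed; cannot check CUDA here")
```

If `device` ends up as -1 (or CUDA is not available), inference will run on the CPU regardless of the model you pick.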