bclavie / RAGatouille

Easily use and train state of the art late-interaction retrieval methods (ColBERT) in any RAG pipeline. Designed for modularity and ease-of-use, backed by research.
Apache License 2.0

How to do Indexing using from_index() on CPU only? #209

Open turjo-001 opened 2 months ago

turjo-001 commented 2 months ago

What should the n_gpu parameter be set to if I want to run the retriever model and load the index into CPU RAM?

Any help will be appreciated.

bclavie commented 2 months ago

Hey! To run on CPU, you don't actually need to set the n_gpu parameter yourself: it should default to 1 (and then be ignored) or 0... Have you tried this and run into any issues?
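For reference, a minimal sketch of what I mean, assuming an index already built at the default .ragatouille location (the index name below is just a placeholder):

```python
from ragatouille import RAGPretrainedModel

# Load an existing index without setting n_gpu at all; on a machine with
# no visible GPU, the loader should fall back to CPU on its own.
RAG = RAGPretrainedModel.from_index(
    ".ragatouille/colbert/indexes/my_index"  # placeholder index path
)

results = RAG.search("what does late interaction mean?", k=5)
print(results)
```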

turjo-001 commented 2 months ago

> Hey! To run on CPU, you don't actually need to set the n_gpu parameter yourself: it should default to 1 (and then be ignored) or 0... Have you tried this and run into any issues?

Hi @bclavie. I've tried -1, 0, and 1, but nothing seems to work: it keeps loading the model and index into my GPU's VRAM.
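For reference, this is roughly what I'm running (the index path is just an example from my setup):

```python
from ragatouille import RAGPretrainedModel

# Tried n_gpu=-1, n_gpu=0, and n_gpu=1; in every case the model and
# index still end up in GPU memory.
RAG = RAGPretrainedModel.from_index(
    ".ragatouille/colbert/indexes/my_index",  # example path
    n_gpu=0,
)
```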

I should mention that until now I had been running indexing on the GPU flawlessly. To complete my project I now need to run an LLM as well, which leaves barely any VRAM on my GPU for the retriever model and the index.

My conda ragatouille environment is set up to use the GPU packages, which is why I was wondering whether there is any parameter I can change to force the retriever and index to run on my CPU.

Thanks.

bclavie commented 2 months ago

Oh my bad, I was under the impression that you were on a GPU-less machine!

I think the easiest way to stop the retriever from using the GPU at all is to hide the GPU from the script. Are you running it in a way where it's easy for you to set CUDA_VISIBLE_DEVICES="" and/or tell torch to ignore the GPU? There's room for improvement here, but colbert-ai still has a lot of hardcoded .cuda() calls :/
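Something like this, for example. A rough sketch; the key detail is that the environment variable has to be set before torch is imported (the index path is a placeholder):

```python
import os

# Hide all CUDA devices from this process. This must happen before
# torch (or anything that imports torch) is loaded. Alternatively, set
# it when launching: CUDA_VISIBLE_DEVICES="" python your_script.py
os.environ["CUDA_VISIBLE_DEVICES"] = ""

from ragatouille import RAGPretrainedModel

# With no GPU visible, torch.cuda.is_available() returns False and
# loading should stay on CPU.
RAG = RAGPretrainedModel.from_index(
    ".ragatouille/colbert/indexes/my_index"  # placeholder path
)
```

One caveat: if any of the hardcoded .cuda() calls in colbert-ai are actually hit on your code path, they will raise once no device is visible.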

turjo-001 commented 2 months ago

Thanks. I'll try setting CUDA_VISIBLE_DEVICES="" and look into the torch option. I'll update here if I get it working.
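In case it helps anyone else, the torch-side trick I'm planning to try alongside the environment variable is the usual monkeypatch hack (a community workaround, not an official API), so that any library code branching on CUDA availability takes its CPU path:

```python
import torch

# Make torch report no CUDA support before ragatouille / colbert-ai
# are imported, so their is_available() checks take the CPU branch.
# Note: this does NOT neutralise hardcoded .cuda() calls.
torch.cuda.is_available = lambda: False

from ragatouille import RAGPretrainedModel
```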