snexus / llm-search

Querying local documents, powered by LLM

Generate embeddings with CPU only #80

Open vulcano9 opened 10 months ago

vulcano9 commented 10 months ago

Hi @snexus,

Thank you for your support and patient response to my (beginner) questions. I truly appreciate your assistance.

I don't have a dedicated graphics card, but I am eager to create the embeddings. Is there a way to achieve this without one? (Google Colab unfortunately crashes after a certain period of time while creating embeddings for ~13,000 chunks.)

Thank you again for your support, and I look forward to hearing from you.

snexus commented 10 months ago

Hi @vulcano9

Without a GPU it would take a few orders of magnitude longer (I haven't tested it explicitly, but it should be supported out of the box on a CPU-only system).
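
For reference, here is a minimal sketch of CPU-only embedding generation with sentence-transformers. The model name, chunk list, and batch size below are illustrative assumptions, not values taken from llm-search's configuration:

```python
# Minimal sketch: generate embeddings on CPU with sentence-transformers.
# Model name, chunks, and batch size are placeholders for illustration.
from sentence_transformers import SentenceTransformer

# Force the CPU device explicitly; sentence-transformers also falls back to CPU
# automatically when no CUDA device is available.
model = SentenceTransformer("all-MiniLM-L6-v2", device="cpu")

chunks = ["first text chunk", "second text chunk"]  # replace with your ~13k chunks

# A modest batch size keeps peak RAM low, which also helps on free Colab instances.
embeddings = model.encode(chunks, batch_size=32, show_progress_bar=True)
print(embeddings.shape)  # (num_chunks, embedding_dim)
```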

If you only need the embeddings, Google Colab should be able to handle it. What is the error message, and how many documents (and of what format/size) are you trying to parse?