Closed — chozillla closed this issue 1 month ago
Hi,
Not an issue, just a question: what exact hardware were you using when you recorded the demo .gif? I'm running on a potato of a 2017 Mac Pro, so I'd be using the CPU.
Thanks!

Hi,
I ran it on an NVIDIA 3060 with 10 GB of VRAM, using local models with up to 8B parameters (the same limitations apply as to any other software running local models).
If you are using an off-the-shelf model such as OpenAI's, the hardware requirements are much more modest, since you only need local compute for the embeddings part. Even there, you can choose an embeddings model that will run fine on a CPU.

Awesome, will try it out! Thanks!