Open cargecla1 opened 3 months ago
Hello there,
It would be good to add a version of this that uses Ollama to run a local LLM, e.g. Mixtral.
Is there any interest in this? @emrekiciman @amit-sharma @RoseDeSicilia26
Cheers!
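For reference, here is a minimal sketch of what talking to a local model through Ollama could look like. It assumes a default `ollama serve` instance listening on `localhost:11434` and uses Ollama's `/api/chat` REST endpoint with the standard library only; the model name and prompt are just placeholders.

```python
import json
import urllib.request

# Default endpoint of a locally running `ollama serve` instance (assumption).
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build the JSON body expected by Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return one complete response instead of a stream
    }


def ask_local_llm(prompt: str, model: str = "mixtral") -> str:
    """Send a prompt to the local Ollama server and return the reply text."""
    data = json.dumps(build_chat_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


# Requires `ollama serve` running and the model pulled, e.g. `ollama pull mixtral`:
# reply = ask_local_llm("Does smoking cause lung cancer?")
```

Swapping the backend would then mostly be a matter of pointing the existing prompt logic at a function like `ask_local_llm` instead of a hosted API client.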