py-why / pywhy-llm

Experimental library integrating LLM capabilities to support causal analyses
MIT License

Add example with ollama local LLM #21

Open cargecla1 opened 3 months ago

cargecla1 commented 3 months ago

Hello there,

It would be good to add a version of this that uses Ollama to run a local LLM, e.g. Mixtral. A rough sketch of what that could look like is below.
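
A minimal sketch, assuming Ollama is installed, the model has been pulled (`ollama pull mixtral`), and the server is listening on its default port with the OpenAI-compatible endpoint. The model tag and the prompt are placeholders; wiring the response into pywhy-llm's suggesters is the work this issue proposes, not something shown here.

```python
# Sketch: query a local Mixtral model served by Ollama through its
# OpenAI-compatible API. Assumes the Ollama server is running locally
# on the default port (11434).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="mixtral",  # any locally pulled model tag works here
    messages=[
        {
            "role": "user",
            "content": (
                "List plausible confounders for the effect of "
                "smoking on lung cancer."
            ),
        }
    ],
)
print(response.choices[0].message.content)
```

Because Ollama speaks the OpenAI wire protocol, any component that accepts an OpenAI-style client could in principle be pointed at a local model this way; the open question is how to plug that into pywhy-llm's existing suggesters.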

Is there any interest in this? @emrekiciman @amit-sharma @RoseDeSicilia26

Cheers!