Closed dOrgJelli closed 7 months ago
Support running AutoTx against a locally-running LLM. This could be done via an environment variable that, when present, disables the use of OpenAI and uses the local LLM instead.
Relevant Docs: https://docs.crewai.com/how-to/LLM-Connections/#ollama-integration
Completed here: https://github.com/polywrap/AutoTx/pull/123
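The env-var switch described above could be sketched roughly like this. Note this is only an illustration, not the code from the linked PR: the variable names `AUTOTX_LLM_BASE_URL` and `AUTOTX_LLM_MODEL` are hypothetical, and the returned dict stands in for however AutoTx actually wires the model into its agents.

```python
import os

def resolve_llm_config(env: dict) -> dict:
    """Choose between OpenAI and a local Ollama endpoint.

    If the (hypothetical) AUTOTX_LLM_BASE_URL variable is set, OpenAI is
    disabled and the local Ollama-compatible server at that URL is used.
    """
    base_url = env.get("AUTOTX_LLM_BASE_URL")  # hypothetical variable name
    if base_url:
        # Local mode: no OpenAI key required, talk to the Ollama server.
        return {
            "provider": "ollama",
            "base_url": base_url,
            "model": env.get("AUTOTX_LLM_MODEL", "llama2"),  # hypothetical default
        }
    # Default mode: fall back to OpenAI via its API key.
    return {
        "provider": "openai",
        "api_key": env.get("OPENAI_API_KEY", ""),
    }

# Example: running with the local-LLM variable present vs. absent.
local = resolve_llm_config({"AUTOTX_LLM_BASE_URL": "http://localhost:11434"})
cloud = resolve_llm_config({"OPENAI_API_KEY": "sk-..."})
```

Reading the variables from a passed-in dict (rather than `os.environ` directly) keeps the selection logic easy to test; in the real application the dict would simply be `os.environ`.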