Adds support for open source models via Ollama, which allows running LLMs locally. As of the current version of the Ollama API, function calling does not appear to be supported yet. See this issue to track the status of function calling support.
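A rough sketch of how the new chat model might be used, assuming a `ChatOllamaAI` module following the existing chat-model conventions in this library (module name, `new!/1`, and the default local Ollama endpoint are assumptions, not confirmed API):

```elixir
# Hypothetical usage sketch; names and defaults are assumptions.
alias LangChain.ChatModels.ChatOllamaAI
alias LangChain.Chains.LLMChain
alias LangChain.Message

# Point at a locally running Ollama server (default port 11434 assumed).
chat_model =
  ChatOllamaAI.new!(%{
    endpoint: "http://localhost:11434/api/chat",
    model: "llama2"
  })

{:ok, _chain, response} =
  %{llm: chat_model}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_user!("Hello from a local model!"))
  |> LLMChain.run()
```

Note that, per the limitation above, attaching functions to the chain would not work with this model until the Ollama API supports function calling.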
Closes https://github.com/brainlid/langchain/issues/67