Overview
Add support for Ollama so that users can run open-source models locally.
This feature is required for privacy-focused users who do not want to share their code with cloud LLM providers.
Requirements
Integration with the Ollama API via LangChain
Support for multiple open-source models available through Ollama
Seamless switching between cloud and local models using the providers API
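The switching requirement above can be sketched as a small provider-resolution step. This is a hypothetical sketch, not the application's actual providers API: the `ModelConfig` and `resolve_config` names, fields, and defaults are assumptions for illustration. The only sourced detail is Ollama's default local endpoint, `http://localhost:11434`.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical config shape; field names are assumptions, not the app's real API.
@dataclass
class ModelConfig:
    provider: str                     # e.g. "openai" (cloud) or "ollama" (local)
    model: str                        # e.g. "gpt-4o" or "llama3"
    base_url: Optional[str] = None    # only set for local Ollama endpoints

# Ollama serves its HTTP API on port 11434 by default.
DEFAULT_OLLAMA_URL = "http://localhost:11434"

def resolve_config(provider: str, model: str) -> ModelConfig:
    """Return a config for either a cloud model or a local Ollama model."""
    if provider == "ollama":
        return ModelConfig(provider, model, base_url=DEFAULT_OLLAMA_URL)
    return ModelConfig(provider, model)

# Switching between cloud and local is then a one-argument change for callers:
local = resolve_config("ollama", "llama3")
cloud = resolve_config("openai", "gpt-4o")
```

In this shape, downstream code only consumes `ModelConfig`, so adding Ollama does not ripple through callers that already handle cloud providers.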
Technical Details
Implement an Ollama API client
Add configuration options for the Ollama endpoint and model selection
Ensure compatibility with existing application interfaces
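A minimal client sketch for the first two items above, using only the standard library. The endpoint path (`/api/generate`), request fields (`model`, `prompt`, `stream`), and the `response` field in the reply follow Ollama's documented REST API; the class itself is an illustrative assumption, not the planned implementation. In practice LangChain's own Ollama integration (e.g. `ChatOllama`) would likely replace a hand-rolled client like this.

```python
import json
import urllib.request

class OllamaClient:
    """Minimal sketch of a client for Ollama's local HTTP API (default port 11434)."""

    def __init__(self, base_url: str = "http://localhost:11434"):
        # Normalize so path joining below never produces a double slash.
        self.base_url = base_url.rstrip("/")

    def _request(self, path: str, payload: dict) -> dict:
        # POST JSON to the Ollama server and decode the JSON response.
        req = urllib.request.Request(
            f"{self.base_url}{path}",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    def generate(self, model: str, prompt: str) -> str:
        # /api/generate returns {"response": "..."} when stream is disabled.
        body = self._request(
            "/api/generate",
            {"model": model, "prompt": prompt, "stream": False},
        )
        return body["response"]
```

Keeping the endpoint as a constructor argument satisfies the configuration requirement: pointing the client at a non-default host or port is a config change, not a code change.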
Success Criteria
Users can run local models through Ollama
Successful knowledge graph creation and agent execution