monarch-initiative / ontogpt
LLM-based ontological extraction tools, including SPIRES
https://monarch-initiative.github.io/ontogpt/
BSD 3-Clause "New" or "Revised" License · 548 stars · 68 forks
Add support for more models #373
Open

caufieldjh opened 1 month ago

caufieldjh commented 1 month ago
This PR will:

- Switch out the current model loading process for one driven by `litellm`
- Remove the restriction limiting models to those listed in `models.yaml`
- Enable using local models through Ollama (without requiring proxy configuration)
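To illustrate how litellm-driven loading removes the need for a fixed model list: litellm routes requests by a provider prefix on the model string (e.g. `ollama/llama3` for a local Ollama server, bare names defaulting to OpenAI). The helper functions below are a hypothetical sketch of that convention, not ontogpt's actual API.

```python
# Sketch of litellm-style provider routing. resolve_provider and
# completion_kwargs are illustrative helpers, not part of ontogpt or litellm.

def resolve_provider(model: str) -> str:
    """Infer the provider from a litellm-style model string.

    litellm routes "ollama/llama3" to a local Ollama server and
    "azure/my-deployment" to Azure; unprefixed names default to OpenAI.
    """
    if "/" in model:
        return model.split("/", 1)[0]
    return "openai"  # litellm's default provider for unprefixed names


def completion_kwargs(model: str, prompt: str) -> dict:
    """Build keyword arguments for a litellm.completion() call."""
    kwargs = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    # Local Ollama models need an api_base pointing at the local server;
    # Ollama's default port is 11434.
    if resolve_provider(model) == "ollama":
        kwargs["api_base"] = "http://localhost:11434"
    return kwargs
```

With litellm installed and an Ollama server running, these kwargs could be passed straight through, e.g. `litellm.completion(**completion_kwargs("ollama/llama3", "Extract terms from ..."))`.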
caufieldjh commented 1 month ago
Remaining on this PR:

- Implement LiteLLM cache functions
- Select an appropriate embedding model for the operation and model provider, or let the user specify one
- Helper functions for non-OpenAI APIs
  - Also support Azure endpoints (https://docs.litellm.ai/docs/providers/azure) and custom URLs (https://docs.litellm.ai/docs/providers/openai_compatible)
- Helper functions for running local models
  - Focus on using Ollama (https://docs.litellm.ai/docs/providers/ollama#using-ollama-apichat)
- Docs
- Tests
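For the "Implement LiteLLM cache functions" item, one way to think about the caching is a map keyed on the full request, so repeated calls with the same model and prompt skip the API. This is only a sketch of the idea in plain Python; it is not litellm's own cache class or ontogpt's implementation.

```python
import hashlib
import json


class CompletionCache:
    """Minimal in-memory completion cache keyed on (model, messages).

    A hypothetical stand-in for a litellm-backed cache; a real one
    would likely also support Redis or on-disk storage.
    """

    def __init__(self):
        self._store = {}

    def _key(self, model: str, messages: list) -> str:
        # Hash a canonical serialization of the request so equivalent
        # calls share one cache entry.
        raw = json.dumps({"model": model, "messages": messages}, sort_keys=True)
        return hashlib.sha256(raw.encode("utf-8")).hexdigest()

    def get(self, model: str, messages: list):
        """Return a cached response, or None on a cache miss."""
        return self._store.get(self._key(model, messages))

    def set(self, model: str, messages: list, response) -> None:
        """Store a response under the hashed request key."""
        self._store[self._key(model, messages)] = response
```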
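For the Azure and custom-URL items, the linked litellm docs use an `azure/<deployment>` model prefix with `api_base` and `api_version`, and an `openai/<model>` prefix plus `api_base` for any OpenAI-compatible server. A hedged sketch of a helper that assembles those arguments; the function name and the default API version string here are assumptions for illustration:

```python
def endpoint_kwargs(model, api_base, api_version=None):
    """Build litellm call arguments for an Azure or OpenAI-compatible endpoint.

    Hypothetical helper: "azure/<deployment>" targets Azure OpenAI, while
    "openai/<model>" plus api_base targets any OpenAI-compatible server
    (e.g. a local vLLM or LM Studio instance).
    """
    kwargs = {"model": model, "api_base": api_base}
    if model.startswith("azure/"):
        # Azure OpenAI requires an API version alongside the deployment
        # name; the fallback value here is an assumed example.
        kwargs["api_version"] = api_version or "2024-02-15-preview"
    return kwargs
```

Either result could then be expanded into `litellm.completion(**kwargs, messages=...)`.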