devflowinc / trieve

All-in-one infrastructure for search, recommendations, RAG, and analytics offered via API
https://dashboard.trieve.ai

bugfix: support for litellm model calls #1809

Closed cdxker closed 2 months ago

cdxker commented 3 months ago

Description

The server returns {"message": "Error getting topic string: BadRequest: No OpenAI Completion for topic"} on calls to LiteLLM. The LiteLLM login/credentials are in Bitwarden. Use http://llm.mintlifytrieve.com as the base URL. LiteLLM has a UI at http://llm.mintlifytrieve.com/ui, and the password listed in Bitwarden doubles as the API key; if that doesn't work, generate a new API key locally.

To reproduce the error, set LLM_API_KEY to the API key, set the LLM API URL in the dataset config to http://llm.mintlifytrieve.com, and set the LLM default model to claude-3-opus.
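For reference, the reproduction setup above can be sketched as an env/config fragment. The variable and field names here are approximations of the dashboard labels described in this issue, not verified against the codebase:

```sh
# Sketch of the reproduction setup; names are approximate.
export LLM_API_KEY="<litellm-api-key-from-bitwarden>"  # the Bitwarden password doubles as the key

# Dataset config (set via the dashboard):
#   LLM API URL       = http://llm.mintlifytrieve.com
#   LLM default model = claude-3-opus

# Optional sanity check that the key is accepted by the LiteLLM proxy
# (LiteLLM exposes an OpenAI-compatible API):
# curl http://llm.mintlifytrieve.com/v1/models -H "Authorization: Bearer $LLM_API_KEY"
```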

This PR will likely also require a PR to the openai_dive crate. Use our fork, link the openai_dive PR in the pull request for this issue, and point openai_dive at the forked branch.
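Pointing the server at the fork would look roughly like this in Cargo.toml. The git URL and branch name below are placeholders, not the actual fork or PR branch:

```toml
# Placeholder git URL and branch; substitute the real fork and PR branch.
[dependencies]
openai_dive = { git = "https://github.com/devflowinc/openai_dive", branch = "litellm-support" }
```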

(screenshot of the server error attached)

Target(s)

chat

Community channels

Matrix is preferred. Reach out on discord or Matrix for further assistance.

cdxker commented 2 months ago

Fixed by configuring OpenRouter better.