-
## Problem
I'm juggling a lot of model providers (Groq, SambaNova, Cerebras, ...), but when I have to choose the API provider in the Aider-composer parameters, there is only one place to set my en…
-
### Description
I'm attempting to build an API to interact with a RAG chatbot that I've made.
The API is OpenAI-compatible (or should be), but I cannot get big-AGI to display the response that the A…
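For a client like big-AGI to display the reply, an OpenAI-compatible server has to return the chat-completions response shape. A minimal sketch of that shape is below; the field names follow the public OpenAI API, while the `make_chat_response` helper, its id, and the model name are made up for illustration.

```python
import time

def make_chat_response(reply: str, model: str = "my-rag-bot") -> dict:
    """Build a minimal OpenAI-style chat.completion response body."""
    return {
        "id": "chatcmpl-example",          # placeholder id
        "object": "chat.completion",       # clients key off this value
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": reply},
                "finish_reason": "stop",
            }
        ],
        "usage": {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
    }
```

If any of these fields is missing or misnamed (e.g. `choices[0].message.content`), many clients silently show nothing, which matches the symptom described above.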
-
Thanks for your wonderful project! Will you support OpenAI-compatible APIs?
-
### Which API Provider are you using?
OpenAI Compatible
### Which Model are you using?
Qwen/Qwen2.5-Coder-32B-Instruct
### What happened?
DeepInfra has an OpenAI-compatible API that uses https://api…
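Calling such a provider usually just means pointing a standard chat-completions request at its base URL. A stdlib-only sketch is below; `BASE_URL` is a placeholder (the issue's real URL is truncated above), and `build_chat_request` is a hypothetical helper, not part of any provider's SDK.

```python
import json
import urllib.request

# Placeholder endpoint -- substitute the provider's real OpenAI-compatible base URL.
BASE_URL = "https://example-provider.com/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Prepare (but do not send) a chat-completions POST request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The same request shape works for any OpenAI-compatible backend; only `BASE_URL`, the key, and the model id change.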
-
### Feature request
prepend v1 to OpenAI compatible APIs
### Motivation
This would allow us to integrate Infinity into KubeAI the same way as other OpenAI-compatible API engines: https://github.com/subs…
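The request boils down to normalizing the base URL so it always ends in `/v1`, as most OpenAI-compatible engines expect. A minimal sketch, with a hypothetical helper name:

```python
def ensure_v1(base_url: str) -> str:
    """Append /v1 to an OpenAI-compatible base URL unless it is already there."""
    base = base_url.rstrip("/")
    return base if base.endswith("/v1") else base + "/v1"
```

Making the helper idempotent means configs that already include `/v1` keep working unchanged.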
-
I am using a custom OpenAI-compatible proxy with no API key (I put in "noop"). It works fine using the "Continue" VS Code extension.
I don't really understand why Claude Dev is not making any requests …
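The "noop" trick works because most OpenAI clients refuse an empty key but never validate it beyond putting it in the `Authorization` header, which the proxy then ignores. A sketch of the headers such a client sends (the `auth_headers` helper is hypothetical):

```python
def auth_headers(api_key: str = "noop") -> dict:
    """Headers for an OpenAI-style request; a placeholder key keeps the
    Authorization header well-formed even when the backend ignores it."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
```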
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a…
-
```python
import os
import yaml
from loguru import logger
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_community.chat_models import ChatLiteLLMRouter
import li…
```
-
- [x] #123
- [ ] passing api-key
- [ ] auth layer and persistence
-
A number of folks and I use LLM providers that are not OpenAI but are "OpenAI compatible". The difference is that one typically provides
OPENAI_API_BASE_URL=https://some-llm-provider/api
OPENAI_API_…
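In practice this means the client should read the endpoint from the environment and fall back to OpenAI's default only when the variable is unset. A stdlib-only sketch, assuming the `OPENAI_API_BASE_URL` / `OPENAI_API_KEY` variable names mentioned above (`provider_config` itself is a hypothetical helper):

```python
import os

def provider_config() -> dict:
    """Resolve the OpenAI-compatible endpoint and key from the environment."""
    return {
        # Custom providers override this; plain OpenAI needs no change.
        "base_url": os.environ.get("OPENAI_API_BASE_URL",
                                   "https://api.openai.com/v1"),
        "api_key": os.environ.get("OPENAI_API_KEY", ""),
    }
```

With this pattern, switching providers is purely a configuration change rather than a code change.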