-
*This issue is a catch-all for questions about using aider with other or local LLMs. The text below is taken from the [FAQ](https://aider.chat/docs/faq.html#can-i-use-aider-with-other-llms-local-llms-…
-
### System Info
- LangChain version: 0.0.346
- Platform: Mac mini M1 16GB - macOS Sonoma 14.0
- Python version: 3.11
- LiteLLM version: 1.10.6
### Who can help?
@hwchase17
@agola11
### Informa…
-
Right now, again based on our initial, long-obsolete LangChain orientation, we are managing the OpenAI API connection globally. To be fair, the [openai](https://github.com/openai/openai-python) librar…
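For reference, this is roughly what that "global" style looks like with the pre-1.0 openai package; a minimal sketch, with placeholder key, base URL, and model name rather than our actual configuration:

```python
# Module-level ("global") OpenAI configuration, pre-1.0 openai package.
# Placeholder values only; not a real key or endpoint.
import openai

openai.api_key = "sk-..."                      # one key shared by the whole process
openai.api_base = "http://localhost:8000/v1"   # e.g. an OpenAI-compatible local server

# Any call made anywhere in the process now implicitly uses the globals above.
response = openai.ChatCompletion.create(
    model="codellama",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response["choices"][0]["message"]["content"])
```

The per-client configuration introduced in openai v1.0.0 (`OpenAI(api_key=..., base_url=...)`) avoids this shared global state, which is part of what the migration changes.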
-
(Test with `test_chat_completion_TechLead()`, just on codellama)
First response:
```json
{
  "plan": [{
    "description": "Set up the project structure and initialize the necessary dependenci…
```
-
It's weird to me that you're my debugger and key manager. Right now I have an error due to the debugger, but I can't turn it off since all my model keys are also hosted through the same integration.
-
### The Feature
Starting this issue to ensure LiteLLM is compatible with OpenAI v1.0.0
## The main goal of this issue:
If a user has OpenAI v1.0.0, their OpenAI calls through litellm should not…
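As a reference point, a plain call routed through litellm like the sketch below (the model name and prompt are placeholders) is the kind of usage that should keep working whether the installed openai package is pre- or post-v1.0.0:

```python
import litellm

# A basic chat completion routed through litellm; this surface should not
# break when the user upgrades the openai package to v1.0.0.
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```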
-
### What happened?
I was trying to get the OpenAI Proxy Server working on my local machine, following the documentation (a sketch of calling the running proxy is included after the steps):
1. git clone https://github.com/BerriAI/litellm.git
2. Modify template_secrets.tom…
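Once the proxy is up, this is a minimal sketch of how it can be called via the OpenAI v1 client; the host, port, API key, and model name below are assumptions and should be adjusted to however the proxy was actually started and configured:

```python
from openai import OpenAI

# Point the OpenAI client at the local LiteLLM proxy.
# The base_url is an assumption; use whatever address the proxy prints on startup.
client = OpenAI(api_key="anything", base_url="http://0.0.0.0:8000")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # whichever model the proxy is configured to serve
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```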