-
Issue: https://github.com/All-Hands-AI/OpenHands/issues/4894
---
@Belzedar94 Which model are you using? Check out Google Gemini, which has a 2M context window.
-
### Description & Motivation
Given the landscape of inference providers, it seems like a good idea to have a way to interact with them using a relatively 'proven' library. Developing a ModelClient for…
-
### Feature Request
I would like to suggest adding a connector for LiteLLM to LangFlow.
LiteLLM is a lightweight and fast open-source library for integrating various LLM providers with a simplified …
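For context, LiteLLM exposes a single `completion()` call that routes to many providers based on the model string; a minimal sketch (model names are examples, and the corresponding provider API keys are assumed to be set in the environment):
```python
# Minimal sketch of LiteLLM's unified interface.
from litellm import completion

messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

# The call shape stays the same across providers -- only the model string changes.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
gemini_resp = completion(model="gemini/gemini-1.5-pro", messages=messages)

# Responses follow the OpenAI response schema.
print(openai_resp.choices[0].message.content)
print(gemini_resp.choices[0].message.content)
```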
-
I would like to know what makes this project different from litellm. Detailed usage docs, like those of the litellm project, are needed for others to clearly understand how to integrate different LLMs with…
-
Currently we use Haystack to support various LLM providers; however, we found it's still too hard for the community to contribute support for additional LLMs. We plan to use litellm instead; it provides more func…
-
### Describe the bug
liteLLM is installed on my Windows local machine, and I can access it at http://localhost:4000.
I added this to the .env file as below:
OPENAI_LIKE_API_BASE_URL…
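As a sanity check for this kind of setup: the LiteLLM proxy speaks the OpenAI API, so any OpenAI-compatible client can be pointed at the same base URL that goes into the .env file. A hedged sketch (the placeholder API key and model name assume whatever is actually configured on the proxy):
```python
# Sketch: verify the local LiteLLM proxy independently of the app's .env wiring.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # local LiteLLM proxy from the report
    api_key="anything",                # placeholder; use a real key if the proxy enforces one
)

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # must match a model name configured on the proxy
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```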
-
Advanced mode only.
https://github.com/BerriAI/litellm?tab=readme-ov-file
-
Hi,
I am using the app with the recommended settings (the free Google Gemini API key), and after chatting with the system for a bit I get the following error:
```
Give Feedback / Get Help: https…
```
-
I'm attempting to use an Azure-hosted ChatGPT model with curateGPT via the litellm proxy system.
I have the proxy running and tested, and the LLM package underlying curateGPT is successfully using it, …
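For reference, a direct litellm call to an Azure deployment (without the proxy in between) looks roughly like the sketch below; the environment variable names are litellm's Azure settings, while the endpoint, deployment name, and API version are placeholders:
```python
# Hedged sketch: calling an Azure OpenAI deployment through litellm directly.
import os
from litellm import completion

os.environ["AZURE_API_KEY"] = "<your-azure-key>"
os.environ["AZURE_API_BASE"] = "https://<your-resource>.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2024-02-15-preview"

resp = completion(
    model="azure/<your-deployment-name>",  # "azure/" prefix routes to the Azure provider
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
```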
-
In my current setup, I write everything in DSPy, then I extract the prompt from the dspy module. Then, I use that prompt with litellm to stream the output to the user (if the module is chain of thought…
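A minimal sketch of the "extract prompt, then stream with litellm" step described above; the prompt string is assumed to have been produced elsewhere (e.g. from the DSPy module), and the model name is just an example:
```python
# Hedged sketch: stream a completion for a pre-built prompt via litellm.
from litellm import completion

prompt = "..."  # prompt text extracted from the DSPy module

stream = completion(
    model="gpt-4o-mini",  # any litellm-supported model string works here
    messages=[{"role": "user", "content": prompt}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```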