-
### What happened?
Using gemini-pro through a litellm proxy is currently impossible.
Some `llm_providers`, such as `palm`, `gemini`, and `vertex_ai`, ignore the `base_url` argument.
https://github.com/…
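For context, a minimal reproduction would look roughly like the sketch below; the `api_base` keyword and the local proxy URL are illustrative assumptions, not details taken from the report:

```python
# Sketch: attempting to point a Gemini call at a custom endpoint.
# The `api_base` keyword and the URL are illustrative assumptions.
import litellm

response = litellm.completion(
    model="gemini/gemini-pro",
    messages=[{"role": "user", "content": "Hello"}],
    api_base="http://localhost:4000",  # reported as being ignored for these providers
)
print(response.choices[0].message.content)
```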
-
Hi,
I am trying to run the reinforcement learning example using the provided toml config file.
* Platform: Linux (ubuntu 22.04)
* conda environment created using the requirements file (requireme…
-
Create unit tests for the Python module src/fish_ai/engine.py using `pytest` and optionally `unittest.mock` if you need to mock any dependencies.
The Python tests should be put in the file `src/fis…
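As a rough starting point, one way to combine `pytest` with `unittest.mock` is sketched below; `get_config`, `create_completion`, and `get_response` are hypothetical names standing in for whatever engine.py actually exposes:

```python
# Sketch only: the patched attributes and the function under test are
# hypothetical placeholders for the real API of src/fish_ai/engine.py.
from unittest import mock

from fish_ai import engine


def test_get_response_uses_mocked_completion():
    # Replace the (assumed) config lookup and LLM call with canned values.
    with mock.patch.object(engine, "get_config", return_value={"model": "gpt-4"}), \
         mock.patch.object(engine, "create_completion", return_value="ls -la") as completion:
        result = engine.get_response("list all files")

    assert result == "ls -la"
    completion.assert_called_once()
```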
-
### Description of the bug:
Greetings,
While experimenting with the GenerationConfig parameters in the Gemini Pro model, I've noticed unexpected variability in the outputs generated with the t…
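For comparison, pinning the sampling parameters would look roughly like the sketch below; the call shape follows the `google.generativeai` client, and the specific values are illustrative assumptions:

```python
# Sketch: pinning GenerationConfig to reduce output variability (values are illustrative).
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder
model = genai.GenerativeModel("gemini-pro")

response = model.generate_content(
    "Summarize the theory of relativity in one sentence.",
    generation_config=genai.types.GenerationConfig(
        temperature=0.0,   # lower temperature -> less sampling randomness
        top_p=1.0,
        top_k=1,
        candidate_count=1,
    ),
)
print(response.text)
```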
-
**Describe the bug**
Give a clear and concise description of the bug, including:
- What is the problem?
- The client (a bot built by adapting the framework) stops receiving WeChat messages after running for a while, as if it were stuck.
- Continuous chatting seems to work fine; the hang only appears after a long stretch (10+ minutes) with no incoming WeChat messages, after which it can no longer receive messages and the client has to be restarted.
- Did it occur on the first launch after installation?
- No
- Does restarting WeChat resolve it? (Using Task Manager, kill the WeChat proc…
-
For some reason, I have to route that API through a proxy, but the `Base Path` setting doesn't work: the plugin still calls the Google API directly instead of my proxy.
-
### Before You Report a Bug, Please Confirm You Have Done The Following...
- [X] I have updated to the latest version of the packages.
- [X] I have searched for both [existing issues](https://github.…
-
# Bug Report
## Description
**Bug Summary:**
[Provide a brief but clear summary of the bug]
I installed open-webui for the first time after having installed ollama and created a conda env `openweb…
-
### The Feature
Allow adding new vertex models via the `/model/new` endpoint
### Motivation, pitch
Allow users to register Vertex models at runtime instead of having to declare them in the config.
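A rough sketch of what registering a Vertex model through `/model/new` could look like is shown below; the payload field names (`model_name`, `litellm_params`, `vertex_project`, `vertex_location`) and the proxy URL/key are assumptions for illustration, not a confirmed schema:

```python
# Sketch only: request body fields and endpoint behavior are assumed, not confirmed.
import requests

PROXY_URL = "http://localhost:4000"   # placeholder proxy address
MASTER_KEY = "sk-1234"                # placeholder admin key

payload = {
    "model_name": "gemini-pro",
    "litellm_params": {
        "model": "vertex_ai/gemini-pro",
        "vertex_project": "my-gcp-project",   # placeholder
        "vertex_location": "us-central1",     # placeholder
    },
}

resp = requests.post(
    f"{PROXY_URL}/model/new",
    json=payload,
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
)
resp.raise_for_status()
print(resp.json())
```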
### Twitter / LinkedIn details
_No …
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…