-
I'm super confused: I get this error in a Next.js app (latest) on the server side:
```sh
Something went wrong TypeError: Only absolute URLs are supported
builder:dev: at getNodeRequestOptions (webpac…
-
## Description
I installed the Jupyter AI extension in a mamba environment using `pip install jupyter-ai`; JupyterLab was installed automatically with this command. However, when I loaded the …
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a…
-
### Describe the bug
New install of OpenLLM on Python 3.12.2.
Running the command: `TRUST_REMOTE_CODE=True openllm start mistralai/Mistral-7B-Instruct-v0.1`
### To reproduce
1. install pyenv: `brew…
-
There are some edge cases we are missing when the serve logger redirects those errors. Let's look into the issue and fix it.
A full traceback looks like:
```app.py:133 - Request 04155ecd-be7d-4bb4-a0…
-
### Is there an existing issue for the same bug?
- [X] I have checked the existing issues.
### Branch name
main
### Commit ID
497bc1438a212f23c2f6ae029ae4547012b3c823
### Other environment infor…
-
### Describe the bug
It is possible to pass an httpx client to llm_config, like:
```python
import httpx …
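# Below is a minimal, hypothetical sketch of what passing an httpx client through an
# llm_config-style dict might look like; the "http_client" key, model name, and api_key
# are placeholder assumptions, not taken from the original report.
import httpx

client = httpx.Client(timeout=30.0)  # client with custom transport settings, for illustration

llm_config = {
    "config_list": [
        {
            "model": "gpt-4",        # placeholder model entry
            "api_key": "sk-...",     # placeholder key
            "http_client": client,   # assumed key: hand the httpx client to the backend
        }
    ]
}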
-
### System Info
```
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.54.15 Driver Version: 550.54.15 CUDA Version: 12.4…
-
### Describe the bug
Apparently I cannot use Mistral models in a group chat setting. When a Mistral model is set as the manager itself (using 'auto' speaker selection), it fails with the error `open…
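For context, a minimal sketch of the kind of group-chat setup being described, assuming this is AutoGen's `GroupChat`/`GroupChatManager` API; the agent names, placeholder model entry, and message below are illustrative assumptions, not taken from the original report:
```python
import autogen

# Placeholder llm_config; the actual report points the manager at a Mistral model.
llm_config = {"config_list": [{"model": "mistral-large-latest", "api_key": "..."}]}

alice = autogen.AssistantAgent("alice", llm_config=llm_config)
bob = autogen.AssistantAgent("bob", llm_config=llm_config)

# 'auto' speaker selection asks the manager's own model to pick the next speaker.
groupchat = autogen.GroupChat(
    agents=[alice, bob],
    messages=[],
    max_round=4,
    speaker_selection_method="auto",
)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)

alice.initiate_chat(manager, message="Say hello to each other.")
```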
-
```
File "/opt/conda/lib/python3.10/site-packages/cma/evolution_strategy.py", line 4392, in fmin2
res = fmin(objective_function, x0, sigma0,
File "/opt/conda/lib/python3.10/site-packages/cm…