-
Hello everyone, I am using the vLLM OpenAI-compatible API service, but I get a 400 status code (no body) error. How can I fix it? Thanks
vllm:
```
python -m vllm.entrypoints.openai.api_server --model /ho…
```
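A 400 with no body from vLLM's OpenAI-compatible server usually means the request JSON is malformed, the `Content-Type` header is missing, or the `model` field does not match the path passed to `--model`. Below is a minimal client sketch for inspecting the server's validation response; the host, port, and model name are assumptions for illustration:

```python
# Debugging sketch for a 400 from vLLM's OpenAI-compatible endpoint.
# Host, port, and model name are assumptions; replace them with yours.
import json
import urllib.error
import urllib.request


def build_chat_request(model: str, message: str) -> dict:
    # Follows the OpenAI chat-completions schema that vLLM serves.
    return {"model": model, "messages": [{"role": "user", "content": message}]}


def send(url: str, payload: dict) -> None:
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        # A missing Content-Type header is a common cause of a bare 400.
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.read().decode())
    except urllib.error.HTTPError as e:
        # Read the error body (if any) to see the server's validation message.
        print(e.code, e.read().decode())


# Example call against an assumed local server:
# send("http://localhost:8000/v1/chat/completions",
#      build_chat_request("my-model", "hello"))
```

If the printed error body mentions the model name, double-check that the `model` field in the request exactly matches the value given to `--model` when starting the server.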
-
I'd love to use this product, but my issue is how to start it up.
Can someone explain to me what I am supposed to do?
Where do I unzip?
Where do I run these commands?
![image](https://github.com/use…
-
# User Description
chat Webview not initialised within 10 sec.
requestId=92f56bf6-6934-4765-a7b1-24c0a15b1600
# Stack Trace
```
```
# Other Information
```
=== About ===
Build version: …
```
N6REJ updated
3 weeks ago
-
### Bug description
When searching (for text, our only option right now), it is possible to end up with results that show messages adjacently under the same heading (recipients, conversation).
T…
-
Description
Today: Customers have to install the Operator via Helm Chart.
Expectations: Customers can install the Operator from the Operator Hub.
-
Which file do I run to manually set up the AAA sniper for WoW without the application, and how do I start it?
-
Running the script:
```
$ chainlit run chat.py -w
Traceback (most recent call last):
  File "/Volumes/KaliPro/Applications/miniconda3/envs/ranger/bin/chainlit", line 8, in <module>
    sys.exit(cli())
…
```
-
I built a small RAG with a local embedding model using the regular [python-based llamaindex](https://github.com/run-llama/llama_index). How do I use this React-based chat application with the [python-base…
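One common way to connect a React chat UI to a Python-side query engine is to put the engine behind a small HTTP endpoint the UI can POST to. The sketch below uses only the standard library; the `/api/chat` path and the `answer` stub (standing in for a llamaindex query engine call) are assumptions, not part of either project's API:

```python
# Sketch: exposing a local Python answering function over HTTP so a
# React-based chat UI can call it. The /api/chat path and the answer()
# stub are illustrative assumptions, not llamaindex API.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def answer(question: str) -> str:
    # Placeholder for something like: query_engine.query(question).response
    return f"echo: {question}"


class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/api/chat":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps({"answer": answer(payload.get("message", ""))}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


# To run locally (assumed port):
# HTTPServer(("127.0.0.1", 8000), ChatHandler).serve_forever()
```

The React app would then `fetch("/api/chat", {method: "POST", body: JSON.stringify({message})})` instead of talking to llamaindex directly.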
-
Ideally, when we set a field to be configurable, it should be updated accordingly when new configurable values are supplied by per_req_config_modifier.
However, it seems only `model_kwargs` **('user': …
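The expected behavior can be stated as a simple merge rule: every configurable field present in the per-request config should override the chain's default, not only `model_kwargs`. A conceptual sketch of that rule (this is an illustration of the expectation, not langserve's actual implementation):

```python
# Conceptual sketch: per-request configurable values should override the
# defaults for every field, not just model_kwargs. Illustrative only.
def apply_per_req_config(defaults: dict, per_req: dict) -> dict:
    merged = dict(defaults)  # start from the chain's default config
    merged.update(per_req)   # any field supplied per request wins
    return merged
```

Under this rule, a request that supplies `{"temperature": 0.9}` would see that value applied even though `temperature` is not inside `model_kwargs`.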
-
**Bug description**
Trying the new ChatGpt provider, everything was fine at first, but after some time the following error appeared in response to any request:
```
Using ChatGpt provider and…