-
## 🐛 Bug Report
**🔎 Describe the Bug**
I have a FastAPI Uvicorn server which serves multiple concurrent requests. In each call, I am using …
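For context, a minimal sketch of the setup described, assuming a FastAPI app served by Uvicorn; the endpoint name and the shared `httpx` client are hypothetical stand-ins, since the report truncates before saying what each request uses:
```python
# Minimal sketch: a FastAPI app served by Uvicorn handling concurrent
# requests. The /ask endpoint and the shared httpx client are
# hypothetical; the original report cuts off before naming them.
import httpx
from fastapi import FastAPI

app = FastAPI()
client = httpx.AsyncClient()  # shared across concurrent requests

@app.get("/ask")
async def ask(q: str) -> dict:
    # Each concurrent request awaits its own downstream call.
    resp = await client.get("https://example.com", params={"q": q})
    return {"status": resp.status_code}

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
```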
-
**Is your feature request related to a problem? Please describe.**
Collecting evidence for potential rule infractions
**Describe the solution you'd like**
I want to be able to have a file-based c…
-
When trying to use the completions endpoint (rather than chat_completions) on a vLLM RunPod serverless instance, I get a server error. This happens with all models that I've tried. The chat_completions…
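For reference, a minimal sketch of hitting both endpoints through the OpenAI-compatible API that vLLM exposes; the base URL and model name are placeholders, not values taken from the report:
```python
# Sketch: calling a vLLM server via the OpenAI-compatible API. The
# base_url and model are placeholder assumptions; substitute your
# actual RunPod endpoint and served model.
from openai import OpenAI

client = OpenAI(base_url="https://YOUR-ENDPOINT/v1", api_key="EMPTY")

# Plain completions endpoint (the one reported to fail).
completion = client.completions.create(
    model="your-model",
    prompt="Hello, world",
    max_tokens=32,
)
print(completion.choices[0].text)

# Chat completions endpoint (reported to work).
chat = client.chat.completions.create(
    model="your-model",
    messages=[{"role": "user", "content": "Hello, world"}],
)
print(chat.choices[0].message.content)
```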
-
### Link to the coursework
https://github.com/CodeYourFuture/Module-Node/tree/main/chat-server
### Why are we doing this?
In this project, you'll be able to start building out different method endp…
-
### System Info
OS=ubuntu20.04
GPU=4 x V100
model=glm-9b-chat
python=3.11.8
llama-factory=0.8.3
transformers=4.43.3
vllm=0.5.3.post1
### Who can help?
_No response_
### …
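As a point of reference for this environment, a minimal sketch of loading a GLM chat model across the four GPUs with vLLM's Python API; the Hugging Face model id is an assumption, since the report only says glm-9b-chat:
```python
# Sketch: serving a GLM chat model on 4 GPUs with vLLM's offline API.
# The model id "THUDM/glm-4-9b-chat" is an assumption based on the
# "glm-9b-chat" listed above; adjust to your local path or repo.
from vllm import LLM, SamplingParams

llm = LLM(
    model="THUDM/glm-4-9b-chat",
    tensor_parallel_size=4,   # spread across the 4 x V100 cards
    trust_remote_code=True,   # GLM repos ship custom modeling code
)
params = SamplingParams(temperature=0.7, max_tokens=64)
outputs = llm.generate(["Hello"], params)
print(outputs[0].outputs[0].text)
```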
-
I need to look more closely at error handling
-
### Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the [Continue Discord](https://discord.gg/NWtdYexhMs) for questions
- [X] I'm not able to find an [open issue](ht…
-
### Issues Policy acknowledgement
- [X] I have read and agree to submit bug reports in accordance with the [issues policy](https://www.github.com/mlflow/mlflow/blob/master/ISSUE_POLICY.md)
### Where…
-
Got this funny message when using the text-gen plugin:
```
  File "/home/user/workspace/other/llamaindex_rag/chat_server.py", line 81, in chat_with_data
    chat_engine = index.as_chat_engine(
…
```
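For comparison, a minimal sketch of how `as_chat_engine` is typically called in llama_index; the import paths assume a recent llama-index release, and the `data` directory and `chat_mode` value are assumptions, since the traceback cuts off before the arguments:
```python
# Sketch of a typical llama_index chat-engine setup. The "data"
# directory and chat_mode value are assumptions for illustration.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The line the traceback points at: building a chat engine from the index.
chat_engine = index.as_chat_engine(chat_mode="condense_question")
print(chat_engine.chat("What is in these documents?"))
```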
-
Hello,
Is it possible to add a command to toggle staff chat mode, please?
For example, I have this:
```
staff_chat:
  type: "chat"
  enabled: true
  name: "staffchat"
  aliases: [ "sc" ]
  per…
```