-
### Your current environment
The output of `python collect_env.py`
```text
Your output of `python collect_env.py` here
```
### 🐛 Describe the bug
Hello,
On a container env I …
-
### Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
### Is this question already answered in the FAQ?
-
but got a chat error as follows:
```
ERROR: Invalid API-key provided. Please set LLM API-Key in 'User Setting -> Model Providers -> API-Key'
```
Current Repo: ragflow
Commit Id: 89…
-
### System Info
```Shell
Please see
https://github.com/huggingface/peft/issues/484#issue-1718704717
```
### Information
- [ ] The official example scripts
- [X] My own modified scripts
### Tasks…
-
### Describe the bug
Thanks for building this amazing tool. I have been loving it.
While I am using LangChain to process lots of webpages in bulk, I run into a high-CPU issue after the task runs for a fe…
-
This appears to be an issue with the CUDA version requirement of 11.7. However, there are security concerns with this version of CUDA that require me to use version 12.2. Is there any intention of updating vLLM to…
-
Docker container version: ipex-llm-serving-xpu:2.2.0-b2
Start shell script:
```shell
model="/llm/models/Qwen/Qwen2.5-32B-Instruct-AWQ"
served_model_name="Qwen2.5-32B-Instruct-AWQ"
export CCL_WORKER_…
```
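For context, servers launched this way usually expose the standard OpenAI-compatible `/v1/chat/completions` route, and the `model` field in each request must match `served_model_name` exactly. A minimal sketch of such a request payload (the host/port are assumptions, not taken from the script above):

```python
import json

# Hypothetical endpoint for the OpenAI-compatible server; adjust the
# host/port to whatever the launch script actually binds.
BASE_URL = "http://localhost:8000/v1/chat/completions"

# The "model" value must match --served-model-name exactly; otherwise the
# server typically answers with a model-not-found error instead of generating.
payload = {
    "model": "Qwen2.5-32B-Instruct-AWQ",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 32,
}

body = json.dumps(payload)
```

A mismatch between this `model` field and the served model name is a common cause of confusing 404/400 responses when testing a freshly started container.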
-
Very weird: the reply comes back with the function role, but the UserProxy just returns the JSON output to the LLM rather than executing it. Why?
```py
config_list = (
{
"model": "qwen-v2",
    "base_url"…
```
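For reference, an executor agent only runs a tool call whose name it finds in its registered function map; anything unregistered is simply passed back to the LLM as raw JSON, which matches the symptom above. A minimal stdlib sketch of that dispatch idea (the names `function_map` and `dispatch` are illustrative, not AutoGen's internals; also note that `config_list` is normally a Python list of dicts, `[...]`, rather than a parenthesized expression):

```python
import json

def get_weather(city: str) -> str:
    # Toy tool implementation standing in for a real registered function.
    return f"sunny in {city}"

# Executor-side registry: only names listed here are actually executed.
function_map = {"get_weather": get_weather}

def dispatch(llm_message: str) -> str:
    """Execute the requested function if registered, else echo the raw JSON."""
    call = json.loads(llm_message)
    name, args = call["name"], call.get("arguments", {})
    if name in function_map:
        return function_map[name](**args)
    return llm_message  # unregistered: the JSON goes back to the LLM untouched

registered = dispatch('{"name": "get_weather", "arguments": {"city": "Paris"}}')
unregistered = dispatch('{"name": "search", "arguments": {}}')
```

If the UserProxy is echoing JSON instead of executing, the first thing to check is whether the function was registered with the executing agent at construction time.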
-
# bug
I'm using the API of Tongyi Qianwen in proxy mode. I've successfully experimented with it on Lobechat, but I'm encountering network errors with this plugin.
![image](https://github.com/logan…
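One low-level thing worth ruling out with a proxy setup is base-URL path handling: with standard URL joining, a base URL that lacks a trailing slash silently drops its last path segment. A small stdlib sketch (the proxy hostname is a placeholder, not from this report):

```python
from urllib.parse import urljoin
from urllib.request import Request

# Placeholder proxy base URL -- substitute the one configured in the plugin.
base_url = "https://my-proxy.example.com/v1/"

# OpenAI-compatible proxies expose chat completions under this path; note
# that without the trailing slash on base_url, urljoin would discard "/v1".
endpoint = urljoin(base_url, "chat/completions")

req = Request(
    endpoint,
    headers={"Authorization": "Bearer sk-placeholder"},  # placeholder key
    method="POST",
)
```

Sending this request manually (e.g. with `urllib.request.urlopen`) can distinguish a DNS/TLS failure from an HTTP-level error, which narrows down where the plugin's "network error" originates.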
-
### System Info
transformers = 4.42.4
python = 3.10.13
ubuntu = 20.04
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
#…