-
### 提交前必须检查以下项目 | The following items must be checked before submission
- [X] 请确保使用的是仓库最新代码(git pull),一些问题已被解决和修复。 | Make sure you are using the latest code from the repository (git pull), some issue…
-
### Documentation Issue Description
You need to run `pip install langchainhub` first, before:
```python
from langchain import hub
langchain_prompt = hub.pull("rlm/rag-prompt")
```
And also, got …
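The missing-package failure described above can be made more actionable with a small import guard; a minimal sketch, assuming only the standard library (the helper name `require_package` is hypothetical, not a langchain API):

```python
import importlib.util

def require_package(name: str, hint: str) -> bool:
    """Raise a clear ImportError with an install hint if `name` is missing.

    Hypothetical helper: guards an optional dependency such as
    `langchainhub` so that `hub.pull(...)` fails with an actionable
    message instead of an ImportError deep inside langchain.
    """
    if importlib.util.find_spec(name) is None:
        raise ImportError(f"package `{name}` is not installed; {hint}")
    return True

# Example: check before importing the hub client.
# require_package("langchainhub", "run `pip install langchainhub` first")
```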
-
### 起始日期 | Start Date
12/20/2023
### 实现PR | Implementation PR
_No response_
### 相关Issues | Reference Issues
_No response_
### 摘要 | Summary
Gemini
ChatGLM
文心一言4
### 基本示例 | Basic Example
--
…
-
### Dify version
0.3.30
### Cloud or Self Hosted
Self Hosted
### Steps to reproduce
Following the tutorial, I started the model with openllm:
openllm start opt --model_id facebook/opt-125m -p 3333
Then I added it in Dify, and an error was reported.
![image](https:…
-
https://github.com/EvoEvolver/EvoNote/blob/main/evonote/model/openllm.py
-
### Describe the bug
run command: openllm start facebook/opt-2.7b
environment: Mac Pro
OS: Sonoma 14.1.1
openllm 0.4.26
openllm-client …
-
### Describe the bug
openllm start facebook/opt-1.3b
It is recommended to specify the backend explicitly. Cascading backend might lead to unexpected behaviour.
Traceback (most recent call last):
…
-
### Describe the bug
When I load my local model with
openllm start chatglm --model-id /chatglm-6b
I get an error:
openllm.exceptions.OpenLLMException: Model type is not supported yet.
How can I…
-
### Describe the bug
Inside https://github.com/bentoml/BentoML/tree/main/src/bentoml/bentos.py, in `def build`, you find the following comment:
"""
User-facing API for building a Bent…
FBR65 updated 7 months ago
-
### Feature request
I would like to set up OpenLLM on an ARM64-based Linux machine. I tried using Docker, but it seems the aarch64 architecture is not available. Could it be possible to build an aarch64 Doc…