-
Hi,
I have a question, please: assume that I fine-tuned the Falcon model with adapters on a specific dataset and want to upload it to the [openLLM leaderboard](https://huggingface.co/spaces/HuggingFace…
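One route that would fit this question, sketched below under assumptions the thread does not confirm (a PEFT/LoRA adapter, placeholder paths and repo ids), is to merge the adapter into the base Falcon weights and push the merged model to the Hub so the leaderboard can evaluate it as a standalone model:

```
# Minimal sketch: merge a PEFT adapter into the Falcon base model and push the
# merged weights to the Hugging Face Hub. Paths and repo ids are placeholders.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-7b", trust_remote_code=True  # Falcon shipped custom modeling code
)
model = PeftModel.from_pretrained(base, "path/to/your-adapter")  # hypothetical adapter dir
merged = model.merge_and_unload()  # fold the adapter weights into the base model

merged.push_to_hub("your-username/falcon-7b-finetuned")  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")
tokenizer.push_to_hub("your-username/falcon-7b-finetuned")
```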
-
### Describe the bug
I've tried to execute your toy example from the docs with different host and client machines:
```
import openllm
client = openllm.client.HTTPClient('http://some_other_machin…
```
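For reference, the complete toy example from the docs looks roughly like the sketch below; the host address is a placeholder for the machine where `openllm start` is running:

```
import openllm

# Connect to an OpenLLM server running on another machine
# (placeholder address; replace with the actual host and port).
client = openllm.client.HTTPClient("http://192.168.1.42:3000")
print(client.query("Explain the difference between a llama and an alpaca."))
```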
-
It would be great to be able to reproduce the results provided in the paper.
![image](https://user-images.githubusercontent.com/15141326/224653940-cc3afb81-7bb3-43e0-b63d-d51094361695.png)
the z…
-
### Describe the bug
First of all: thank you very much, openllm looks awesome so far 💯
This issue relates to #47. We tried to start an openllm server with the command:
`openllm sta…
-
### Issue you'd like to raise.
Can I load my local model with `chain = LLMChain(llm=chat, prompt=chat_prompt)`?
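A rough sketch of the pattern being asked about, assuming LangChain's `OpenLLM` integration and a placeholder model choice (not an authoritative answer to this issue):

```
from langchain.chains import LLMChain
from langchain.llms import OpenLLM
from langchain.prompts import PromptTemplate

# Load a model locally through the OpenLLM integration
# (the model here is only an example).
llm = OpenLLM(model_name="dolly-v2", model_id="databricks/dolly-v2-3b")

prompt = PromptTemplate(
    template="Question: {question}\nAnswer:",
    input_variables=["question"],
)
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("What is a lemma?"))
```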
### Suggestion:
_No response_
-
### Describe the bug
I'm running through the most basic install. I have created an empty virtualenv with Python 3.11. I've run `pip install openllm`, and I get a crash when I run `openllm start dolly…
-
### Describe the bug
When running the command to load chatglm or chatglm2, as below:
`openllm start chatglm --model-id thudm/chatglm2-6b`
`openllm start chatglm --model-id thudm/chatglm-6b`
Error: No su…
-
> [2023/07] We released the OpenLLMs model series. Among them, OpenChat obtains 80.9% win-rate on AlpacaEval and 105% ChatGPT performance on Vicuna GPT-4 evaluation.
Are you saying your model is ge…
-
### Describe the bug
OpenLLM is not working with LangChain in the Google Colab doc.
### To reproduce
https://colab.research.google.com/drive/11awO0MyCeh0Yi88EoPY_4LBs8IfiN0Iu?usp=sharing
### Logs
_No respo…
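One way to sanity-check the integration outside the notebook, sketched under the assumption that an OpenLLM server is already running and reachable (the URL is a placeholder):

```
from langchain.llms import OpenLLM

# Point LangChain at a running OpenLLM server instead of loading a model
# in-process (placeholder URL; replace with your server's address).
llm = OpenLLM(server_url="http://localhost:3000")
print(llm("What is the difference between a llama and an alpaca?"))
```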
-
**Describe the bug**
`bentoml containerize` with conda options in `bentofile.yaml` fails with `chmod: cannot access '/home/bentoml/bento/env/python/install.sh': No such file or directory`.
`bent…