-
### Describe the bug
I tried OpenLLM and wanted to have the model run on my CPU for testing, because the GPU does not have enough memory.
I tried setting `CUDA_VISIBLE_DEVICES=""` and `--device ""` (f…
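For reference, the usual way to force CPU-only execution (assuming the serving framework honors the standard CUDA environment variable) is to clear `CUDA_VISIBLE_DEVICES` before any CUDA-aware library initializes; a minimal sketch of the semantics:

```python
import os

# Hide all GPUs from CUDA-aware libraries (torch, tensorflow, etc.).
# This must be set before the library initializes CUDA, ideally before import.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

def cuda_devices_visible() -> bool:
    """Return True if any CUDA device would be visible to a CUDA runtime."""
    value = os.environ.get("CUDA_VISIBLE_DEVICES")
    # Unset means "all devices are visible"; an empty string means "none".
    return value is None or value.strip() != ""

print(cuda_devices_visible())  # → False after the assignment above
```

Note the distinction between the variable being *unset* (all devices visible) and set to an *empty string* (no devices visible), which may be relevant to how `--device ""` is interpreted.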
-
### Feature request
Currently, the default context can be modified manually under `$BENTOML_HOME/.yatai.yaml`.
There should be a CLI command that allows users to change this:
```bash
bentoml yatai…
-
### Describe the bug
Following steps from [example](https://github.com/bentoml/BentoML/tree/main/examples/sklearn/pipeline)
`bentoml serve service.py:svc` will produce
2023-06-08T08:24:26+000…
-
This model is ready for testing. If you are assigned to this issue, please try it out using the CLI, Google Colab and DockerHub and let us know if it works!
-
BentoCloud signup page translation
`당신은 딱 한 걸음 멀리 있습니다!` means `you are far from one step`, which I think is awkward.
`You're just one step away! ` should be translated into `한단계만 더 진행하시면 됩니…
-
### Describe the bug
FS uses three special characters (`:`, `@`, and `!`) to infer things about the path. This is perfectly fine when working with remote filesystems; however, locally this can cause …
-
### Describe the bug
Thank you for creating this great repo! I am running a simple `openllm start dolly-v2` and getting a 500 Internal Server Error related to `model_kwargs`. Not sure how to proceed. S…
-
### Describe the bug
Running my code crashes with:
```
Traceback (most recent call last):
  File "****/site-packages/bentoml/_internal/tag.py", line 111, in from_str
    raise BentoMLException(f"…
-
### Week 1 - Get to know the community
- [x] Join the communication channels
- [x] Open a GitHub issue (this one!)
- [x] Install the Ersilia Model Hub and test the simplest model
- [x] Install Docker…
-
### Describe the bug
### Background
When the model loading time is very long (about seven minutes), I am trying to find the point in time at which the runner becomes available to serve external requests.
### resolutio…