### Describe the bug
Hi!
I tried to run an LLM locally using `openllm`. According to openllm, `phi3:3.8b-ggml-q4` happens to be the only model I am able to run locally, so I ran `openl…
-
reference: https://github.com/protectai/ai-exploits/blob/main/bentoml/README.md
I think it is easy to exploit, but I need to find a way to easily create a Python pickle from Java.
-
https://github.com/bentoml/OpenLLM
-
### Feature request
I really appreciated https://docs.bentoml.com/en/latest/guides/testing.html#unit-tests. It would also be helpful to include docs on mocking BentoML-decorated API methods.
…
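As a rough sketch of what such docs could cover, the standard library's `unittest.mock` is often enough on its own. The `Summarizer` class and `shorten` helper below are hypothetical placeholders standing in for a decorated BentoML service and the application code that calls it; this is an assumption about usage, not BentoML's documented API:

```python
from unittest.mock import MagicMock

# Hypothetical stand-in for a BentoML-decorated service class; in a real
# project this would be imported from your service module.
class Summarizer:
    def summarize(self, text: str) -> str:
        # Imagine a heavyweight model call here that we never want
        # to trigger in a unit test.
        raise RuntimeError("model not loaded in unit tests")

def shorten(svc: Summarizer, text: str) -> str:
    # Application code under test: calls the service's API method.
    return svc.summarize(text).strip()

def test_shorten_uses_summary() -> None:
    # spec=Summarizer makes the mock reject calls to nonexistent methods.
    svc = MagicMock(spec=Summarizer)
    svc.summarize.return_value = "  short  "
    assert shorten(svc, "a very long text") == "short"
    svc.summarize.assert_called_once_with("a very long text")
```

The same pattern extends to patching a service attribute on an imported module with `unittest.mock.patch`, so the test never loads the model.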
-
### Describe the bug
Hi folks,
I'm having an issue pushing monitoring logs to Elastic APM through OTLPHttpExporter.
I'm able to send dummy logs with this standalone code and see them in Elasti…
-
Hi.
I ran the command from the README:
`docker run --gpus all -p 3000:3000 ghcr.io/bentoml/ocr-as-a-service:gpu`
and after pulling, it throws an error:
`FileNotFoundError: BentoML config file s…
-
How can I use OpenLLM with a local LoRA model?
-
Hi @sayakpaul @osanseviero, here is the outline proposal for the blog we discussed. Let me know what you think!
1. How the HuggingFace ecosystem (transformers, diffusers, etc.) helps access state-of-t…
-
### Describe the bug
I was running a `bentoml build` and got the following error:
```pytb
Traceback (most recent call last):
File "~/usr/vendors/pyenv/versions/housing_valuation/bin/bentoml", line…