-
Hi there!
I just bundled and deployed a PyTorch image using the `deploy.py` file. Everything seems to deploy fine, but when I invoke the endpoint it crashes because BentoML is using a Flask call th…
-
**Is your feature request related to a problem? Please describe.**
With the recent configuration revamp, building the config file into the Docker image is the last missing link. Without such an option…
-
Hello,
I am trying to deploy the BentoML Docker image on an Inf1 instance and am struggling with the same issue as #176.
My base docker command is:
`docker run -p 80:5000 --env AWS_NEURON_VISIBLE_DEVICES="0" -…
-
**Is your feature request related to a problem? Please describe.**
Current practice in an ML workflow is to train models on GPUs and then convert them to CPU for inference. Most DL frameworks hav…
-
* Add support for loading MLflow's model format
-
**Is your feature request related to a problem? Please describe.**
It would be nice if `bentoml` provided a Pythonic way of using `docker_base_image` with `bentoml/model-server`.
**Describe the …
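A toy sketch of what such a Pythonic option might look like — every name here is hypothetical and this is not BentoML's actual API, just an illustration of the requested ergonomics:

```python
# Hypothetical sketch: a class decorator that records the Docker base
# image on the service class, so a bundling step could read it later.
# None of these names come from BentoML itself.

def docker_base_image(image: str):
    """Attach a Docker base image name to a service class (toy example)."""
    def decorator(cls):
        cls._docker_base_image = image  # stored for a bundler to pick up
        return cls
    return decorator

@docker_base_image("bentoml/model-server")
class MyService:
    pass
```

With this shape, the base image travels with the service definition instead of living in a separate config file.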
-
**Describe the bug**
The remote Yatai client lists the model artifacts but fails to load them. The BentoML client works fine:
`bentoml retrieve IrisClassifier:20210728102908_1CDF04 --target_dir $MYD…
-
Hello,
I tried to deploy a sentence-transformers BERT-like model on an inf1.xlarge instance. I used BentoML as the framework for containerizing the inference service, and when increasing the initial workers up …
-
My model name is
```
test-model:1.0.20210507102237_9234C9
```
which means I've used `bentoml.ver(major=1, minor=0)` and
```
@property
def name(self):
    return "test-model"
```
in the s…
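For reference, the tag above appears to be composed as `<name>:<major>.<minor>.<timestamp>_<hash>`. A rough pure-Python illustration (this is not BentoML's internal code, and `make_tag` is a made-up helper):

```python
# Illustration only: reproduce the visible tag format
# <name>:<major>.<minor>.<timestamp>_<hash> from its parts.
from datetime import datetime

def make_tag(name: str, major: int, minor: int, random_hash: str,
             now: datetime) -> str:
    """Compose a BentoML-style version tag from its visible parts."""
    timestamp = now.strftime("%Y%m%d%H%M%S")
    return f"{name}:{major}.{minor}.{timestamp}_{random_hash}"

tag = make_tag("test-model", 1, 0, "9234C9",
               datetime(2021, 5, 7, 10, 22, 37))
# → "test-model:1.0.20210507102237_9234C9"
```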
-
**Describe the bug**
When I run YataiService with `--web-prefix-path` option, the YataiService gRPC server doesn't have a prefix path. I think https://github.com/bentoml/BentoML/issues/1063 missed …