-
This issue originates from an [issue](https://github.com/bentoml/bentoctl/issues/159) in the Bentoctl repository. I later found out it was more related to the aws-sagemaker operator and I have therefo…
-
**Describe the bug**
A Runner object can be instantiated with a `name` argument. It seems that if this name contains a space separator, the mentioned ValueError is raised when serving in production mode. As …
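A minimal sketch of the kind of setup described above, using a custom Runnable (the `EchoRunnable` class, its `echo` method, and the service name are hypothetical placeholders):
```python
import bentoml


class EchoRunnable(bentoml.Runnable):
    SUPPORTED_RESOURCES = ("cpu",)
    SUPPORTS_CPU_MULTI_THREADING = False

    @bentoml.Runnable.method(batchable=False)
    def echo(self, value):
        return value


# The `name` below contains a space; the report is that serving the
# resulting service in production mode raises the mentioned ValueError.
runner = bentoml.Runner(EchoRunnable, name="echo runner")
svc = bentoml.Service("runner_name_demo", runners=[runner])
```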
-
**Describe the bug**
In the QuickStart guide, calling `bentoml serve --reload` will throw an error:
```
NotImplementedError: Streams are not supported on Windows.
```
**To Reproduce**
Run QuickStart pyt…
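For context, a stand-in service like the sketch below (hypothetical, in place of the actual QuickStart service), started with the `bentoml serve --reload` command from the guide, is enough to hit the reload code path on Windows:
```python
# service.py -- hypothetical stand-in for the QuickStart service
import bentoml
from bentoml.io import JSON

svc = bentoml.Service("quickstart_stub")


@svc.api(input=JSON(), output=JSON())
def classify(payload: dict) -> dict:
    # Echo the payload back; the endpoint body doesn't matter for
    # reproducing the --reload error on Windows.
    return payload
```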
-
**Describe the bug**
Can't pass `__init__` parameters to a Runnable class when creating a Runner.
**To Reproduce**
When running this example from docs:
```python
import bentoml
import torch
…
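# A minimal sketch of the limitation being reported (this is not the elided
# docs example above; the Runnable below and its `threshold` argument are
# hypothetical): the Runnable's __init__ takes a parameter, but
# bentoml.Runner() constructs the Runnable internally, so there is no obvious
# place to supply that parameter when creating the Runner.
class ThresholdRunnable(bentoml.Runnable):
    SUPPORTED_RESOURCES = ("cpu",)
    SUPPORTS_CPU_MULTI_THREADING = True

    def __init__(self, threshold: float):
        self.threshold = threshold

    @bentoml.Runnable.method(batchable=False)
    def predict(self, x: torch.Tensor) -> torch.Tensor:
        return (x > self.threshold).float()


# No __init__ arguments can be passed here:
runner = bentoml.Runner(ThresholdRunnable, name="threshold_runnable")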
-
Describe the problem:
Currently I am working on creating a bento from TensorFlow and ONNX models.
I am creating models with two inputs, x1 and x2. When I custom-train the model and create the b…
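A hedged sketch of how a two-input model is typically exposed in a 1.0 service, using the `Multipart` IO descriptor (the model tag, service name, and the exact runner call signature below are assumptions):
```python
import bentoml
import numpy as np
from bentoml.io import Multipart, NumpyNdarray

# Hypothetical saved model tag.
runner = bentoml.tensorflow.get("two_input_model:latest").to_runner()
svc = bentoml.Service("two_input_service", runners=[runner])


@svc.api(
    input=Multipart(x1=NumpyNdarray(), x2=NumpyNdarray()),
    output=NumpyNdarray(),
)
def predict(x1: np.ndarray, x2: np.ndarray) -> np.ndarray:
    # How the two inputs are passed to the runner depends on how the
    # model was saved; positional tensors here is an assumption.
    return runner.run(x1, x2)
```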
-
Problem statement:
Currently there are a few ways to include a tokenizer in a BentoML service in version 1.0:
1. Load directly from a file and use it in the API service code (see the sketch after this list)
2. Load directly from a …
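A minimal sketch of option 1, assuming a Hugging Face tokenizer loaded directly in the service module (the tokenizer name, service name, and endpoint are placeholders):
```python
import bentoml
from bentoml.io import JSON, Text
from transformers import AutoTokenizer

# Option 1: load the tokenizer directly in the API service code,
# outside of the model store and any Runner.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

svc = bentoml.Service("tokenizer_demo")


@svc.api(input=Text(), output=JSON())
def tokenize(text: str) -> dict:
    # Tokenization happens in the API function itself.
    return dict(tokenizer(text))
```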
-
**Describe the bug**
I am trying this example from the documentation about [mounting custom middleware](https://docs.bentoml.org/en/latest/guides/server.html?highlight=fastapi#mounting-asgi-based-web…
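For reference, a hedged sketch of the documented pattern of mounting an ASGI app (FastAPI here) onto a BentoML service; the service and endpoint names are placeholders:
```python
import bentoml
from bentoml.io import JSON
from fastapi import FastAPI

svc = bentoml.Service("asgi_mount_demo")

fastapi_app = FastAPI()


@fastapi_app.get("/metadata")
def metadata() -> dict:
    # Served by the mounted FastAPI app, alongside the BentoML endpoints.
    return {"service": svc.name}


# Mount the FastAPI app onto the BentoML service.
svc.mount_asgi_app(fastapi_app)


@svc.api(input=JSON(), output=JSON())
def predict(payload: dict) -> dict:
    return payload
```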
-
**Describe the bug**
If `conda env update ...` fails when containerizing a bento that specifies a conda environment, the error is silently ignored during the Docker build and `bentoml containerize`…
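A sketch of the scenario, assuming the Python build API accepts the same `conda` options as `bentofile.yaml` (the service reference and dependencies are placeholders):
```python
import bentoml

# Build a bento that specifies a conda environment; containerizing it runs
# `conda env update ...` inside the Docker build, which is where the failure
# described above is reportedly ignored.
bento = bentoml.bentos.build(
    "service.py:svc",  # hypothetical service reference
    conda={
        "channels": ["conda-forge"],
        "dependencies": ["scikit-learn"],
    },
)
```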
-
- Why:
  - "Problems: bridging the gap between training and serving"
  - "Tying it all together."
- Vision:
  - "Take the boilerplate out of building and serving models"
  - "Minimize code chang…
-
**Describe the bug**
Hello :wave:,
I'm facing a problem with the `bentoml containerize sknn:xu7rnbhfvw2d7mi2` command, at the `Installing pip packages from 'requirements.txt'` stage, when it tries to i…