bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
https://bentoml.com
Apache License 2.0

Bug: 'bentoml containerize' doesn't include models in the image #4702

Closed. mohsenim closed this issue 6 months ago.

mohsenim commented 6 months ago

Describe the bug

'bentoml containerize' doesn't include the model specified in 'bentofile.yaml' in the image. Running the command:

docker run -it --rm -p 3000:3000 predictor:lkpxx2u5o24wpxjr serve

results in this error:

bentoml.exceptions.NotFound: no Models with name 'my_model' exist in BentoML store <osfs '/home/bentoml/bento/models'>

I also checked inside the Docker container and confirmed that the models folder is empty. However, when a runner is used in service.py, as shown in this example, the model is automatically included in the image without needing to add it to 'bentofile.yaml'. My current workaround is to save and load the model myself and use the 'include' option in 'bentofile.yaml' to make sure it ends up in the image. But if that is the intended approach, it raises questions about the purpose and usage of the BentoML Model Store.
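Concretely, the workaround means serializing the model to a plain file myself (e.g. with joblib) and shipping that file via 'include'; roughly like this, with illustrative paths:

# bentofile.yaml (workaround): bundle the serialized model file directly
include:
  - "service.py"
  - "models/my_model.pkl"  # written beforehand with joblib.dump(model, ...)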

To reproduce

1- service.py looks like this:

import bentoml
import numpy as np


@bentoml.service(
    resources={"cpu": "1"},
    traffic={"timeout": 10},
)
class Predictor:
    def __init__(self) -> None:
        # Load the model from the BentoML Model Store inside the Bento
        self.model = bentoml.sklearn.load_model("my_model:latest")

    @bentoml.api
    def predict(self, input: np.ndarray) -> np.ndarray:
        result = self.model.predict(input)
        return result
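For completeness, the model was saved to the local Model Store beforehand with something along these lines (the toy data here is only illustrative):

import bentoml
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data, just so there is a fitted model to save
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])

model = LogisticRegression().fit(X, y)

# Persist the model into the local Model Store under the tag 'my_model'
bentoml.sklearn.save_model("my_model", model)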

2- bentofile.yaml includes this option:

models:
  - "my_model:latest"

3- bentoml build successfully builds the Bento

4- bentoml containerize predictor:latest containerizes the Bento

5- docker run --rm -p 3000:3000 predictor:ep3yr7aiwgczsoaa

6- Any request to the service returns this error

bentoml.exceptions.NotFound: no Models with name 'my_model' exist in BentoML store <osfs '/home/bentoml/bento/models'>
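As a sanity check, the models directory baked into the image (the same path the error message points at) can be listed directly:

docker run --rm --entrypoint ls predictor:ep3yr7aiwgczsoaa -la /home/bentoml/bento/models

In my case it comes back empty.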

Expected behavior

No response

Environment

BENTOML_DEBUG=''
BENTOML_QUIET=''
BENTOML_BUNDLE_LOCAL_BUILD=''
BENTOML_DO_NOT_TRACK=''
BENTOML_CONFIG=''
BENTOML_CONFIG_OPTIONS=''
BENTOML_PORT=''
BENTOML_HOST=''
BENTOML_API_WORKERS=''

System information

bentoml: 1.2.12
python: 3.12.2
platform: Linux-6.5.0-28-generic-x86_64-with-glibc2.35
uid_gid: 1000:1000
conda: 24.3.0
in_conda_env: True

vmallya-123 commented 2 months ago

I am facing a similar issue. How was this resolved? My model is in the Model Store, but I get the same error.

frostming commented 2 months ago

@vmallya-123 this is a legacy bug that has already been fixed in a previous release.
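If you are still on an affected version, upgrading should pick up the fix:

pip install -U bentoml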

Please share your configuration and what you observed so we can investigate.