Closed mohsenim closed 6 months ago
I am facing a similar issue; how was this resolved? I have my model in the model store, but it throws the same error.
@vmallya-123 this is a legacy bug that has been resolved in previous versions.
You should share your configuration and what you observed in order for us to investigate.
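For anyone comparing notes on this issue, it can help to first confirm the model actually exists in the local model store before building. A quick check with the standard BentoML CLI (the model tag below is hypothetical, substitute your own):

```shell
# List all models in the local BentoML model store
bentoml models list

# Inspect the metadata of one model (hypothetical tag)
bentoml models get summarization-model:latest
```

If the model shows up here but the `models` folder inside the image is still empty, the problem is in the build/containerize step rather than in the store itself.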
Describe the bug
`bentoml containerize` doesn't include the model specified in `bentofile.yaml` in the image. Running the command:
results in this error:
I also checked inside the Docker container and noticed the `models` folder is empty. However, when a runner is used in `service.py`, as shown in this example, the model is automatically included in the image without needing to add it to `bentofile.yaml`. Currently, my workaround involves saving and loading the model myself and using the `include` option in `bentofile.yaml` to ensure it's included in the image. But if this is the intended method, it raises questions about the purpose and usage of the BentoML Model Store.

To reproduce
1. `service.py` looks like this:
2. `bentofile.yaml` includes this option:
3. `bentoml build` successfully builds the Bento
4. `bentoml containerize summarization:latest` containerizes the Bento
5. `docker run --rm -p 3000:3000 predictor:ep3yr7aiwgczsoaa` runs the container
6. Any request to the service returns this error
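For context, the two approaches discussed above — declaring the model in the Bento's build configuration versus copying files in via `include` — might look roughly like this in `bentofile.yaml`. This is a hedged sketch with hypothetical names, not the reporter's actual file:

```yaml
service: "service:svc"   # hypothetical service entry point
include:
  - "*.py"               # workaround: copy files into the Bento explicitly
models:
  - "summarization-model:latest"   # hypothetical tag of a model in the local store
```

The `models` field is the intended way to pull a model from the BentoML Model Store into the Bento; the report is that the model listed there does not end up in the containerized image.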
Expected behavior
No response
Environment
System information
`bentoml`: 1.2.12
`python`: 3.12.2
`platform`: Linux-6.5.0-28-generic-x86_64-with-glibc2.35
`uid_gid`: 1000:1000
`conda`: 24.3.0
`in_conda_env`: True
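The environment block above resembles the output of `bentoml env`. For anyone filing a similar report without BentoML at hand, comparable fields can be gathered with the Python standard library — a generic sketch, not BentoML's own implementation:

```python
import platform


def env_summary() -> dict:
    """Collect basic system information similar to the fields reported above."""
    return {
        "python": platform.python_version(),    # e.g. "3.12.2"
        "platform": platform.platform(),        # e.g. "Linux-6.5.0-...-glibc2.35"
        "machine": platform.machine(),          # e.g. "x86_64"
    }


if __name__ == "__main__":
    for key, value in env_summary().items():
        print(f"{key}: {value}")
```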