Hi @sarmientoj24! Hope I can help.
What would be the possible problem on this? I'm not totally sure why "bentoml serve" would work while containerizing it and then running it throws this error. If you reference a model name which you haven't saved, though, it will definitely fail with this error.
Do I need to save the model before building the Bento? Yes, definitely. You need to save it under the same model name that you later load in your service.
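For reference, here is a minimal sketch of that save step, assuming the pre-1.0 bentoml.onnx.save API that matches the load_runner call shown later in this thread; the ONNX file path is just a placeholder:

# save_model.py -- run this once before `bentoml build`
import bentoml
import onnx

# load the exported ONNX graph from disk (placeholder path)
onnx_model = onnx.load("weights/yolov5.onnx")

# store it in the local BentoML model store under the SAME name
# that the service later references in bentoml.onnx.load_runner(...)
tag = bentoml.onnx.save("bipa_detection_onnx", onnx_model)
print(tag)  # e.g. bipa_detection_onnx:<generated-version>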
Doesn't containerizing it also package the model itself?
Are you sure you saved it with consistent naming? If you go look at the bento in ~/bentoml/bentos, is the model present in the bento itself?
If you reference a model name which you haven't saved, it will definitely not work and throw this error, though.
So is there a need to save the model locally first? I am under the impression that a new model is saved automatically when you create the bento using bentoml build, and hence that I do not need to create it prior to building the bento.
My directory tree is this:
├── bentos
│   └── bipa_detection_onnx
│       ├── latest
│       └── sarlh2vvrwnkfump   <--- Here is the model automatically created by bentoml build
│           ├── apis
│           │   └── openapi.yaml
│           ├── bento.yaml
│           ├── env
│           │   ├── conda
│           │   ├── docker
│           │   │   ├── Dockerfile
│           │   │   ├── entrypoint.sh
│           │   │   └── init.sh
│           │   └── python
│           │       ├── requirements.lock.txt
│           │       ├── requirements.txt
│           │       └── version.txt
│           ├── models
│           │   └── bipa_detection_onnx
│           │       ├── latest
│           │       └── toyaauft52menp6x
│           │           ├── model.yaml
│           │           └── saved_model.onnx
│           ├── README.md
│           └── src
I recreated a new bento, this time using my saved model.
Inside the service script:
import bentoml

runner = bentoml.onnx.load_runner(
    "bipa_detection_onnx:toyaauft52menp6x", providers=["CPUExecutionProvider"]
)
service = bentoml.Service("bipa_detection_onnx", runners=[runner])
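(Side note: because the version suffix is hard-coded, the build only succeeds if exactly that model version exists in the local store. A hedged variant, assuming BentoML's usual latest-tag resolution, would load whichever version was saved most recently:)

import bentoml

# resolve the most recently saved version instead of pinning toyaauft52menp6x
runner = bentoml.onnx.load_runner(
    "bipa_detection_onnx:latest", providers=["CPUExecutionProvider"]
)
service = bentoml.Service("bipa_detection_onnx", runners=[runner])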
(yolov5) user@user:~/workspace/user/user$ bentoml build
Wednesday, 06 April, 2022 11:08:01 PM INFO [cli] Building BentoML service "bipa_detection_onnx:licgpmvvxonkfump" from build context
"/home/user/workspace/user/user"
Wednesday, 06 April, 2022 11:08:01 PM INFO [cli] Packing model "bipa_detection_onnx:toyaauft52menp6x" from
"/home/user/bentoml/models/bipa_detection_onnx/toyaauft52menp6x"
Wednesday, 06 April, 2022 11:08:01 PM INFO [cli] Locking PyPI package versions..
Wednesday, 06 April, 2022 11:08:04 PM INFO [cli]
██████╗░███████╗███╗░░██╗████████╗░█████╗░███╗░░░███╗██╗░░░░░
██╔══██╗██╔════╝████╗░██║╚══██╔══╝██╔══██╗████╗░████║██║░░░░░
██████╦╝█████╗░░██╔██╗██║░░░██║░░░██║░░██║██╔████╔██║██║░░░░░
██╔══██╗██╔══╝░░██║╚████║░░░██║░░░██║░░██║██║╚██╔╝██║██║░░░░░
██████╦╝███████╗██║░╚███║░░░██║░░░╚█████╔╝██║░╚═╝░██║███████╗
╚═════╝░╚══════╝╚═╝░░╚══╝░░░╚═╝░░░░╚════╝░╚═╝░░░░░╚═╝╚══════╝
Wednesday, 06 April, 2022 11:08:04 PM INFO [cli] Successfully built Bento(tag="bipa_detection_onnx:licgpmvvxonkfump") at
"/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump/"
$ bentoml list
Tag Service Path Size Creation Time
bipa_detection_onnx:licgpmvvxonkfump cavity_pa_detection_bipa_onnx:service /home/user/bentoml/bentos/bipa_detection_on… 79.75 MiB 2022-04-06 15:08:04
(yolov5) user@user:~/workspace/app/app$ bentoml serve bipa_detection_onnx:licgpmvvxonkfump --production
Wednesday, 06 April, 2022 11:31:00 PM INFO [cli] Service loaded from Bento store: bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump",
path="/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump")
Wednesday, 06 April, 2022 11:31:00 PM INFO [cli] Starting production BentoServer from "bento_identifier" running on http://0.0.0.0:5000 (Press CTRL+C
to quit)
Wednesday, 06 April, 2022 11:31:01 PM INFO [bipa_detection_onnx] Service loaded from Bento store:
bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump",
path="/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump")
Wednesday, 06 April, 2022 11:31:01 PM INFO [api_server] Service loaded from Bento store: bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump",
path="/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump"
(yolov5) user@user:~/workspace/app/app$ bentoml containerize bipa_detection_onnx:latest
Wednesday, 06 April, 2022 11:13:15 PM INFO [cli] Building docker image for Bento(tag="bipa_detection_onnx:licgpmvvxonkfump")...
Wednesday, 06 April, 2022 11:13:39 PM INFO [cli] Successfully built docker image "bipa_detection_onnx:licgpmvvxonkfump
(yolov5) user@user:~/workspace/app/app$ docker run bipa_detection_onnx:licgpmvvxonkfump
04/06/22 15:28:17 INFO [cli] Service loaded from Bento directory: bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump", path="/home/bentoml/bento/")
04/06/22 15:28:17 INFO [cli] Starting production BentoServer from "bento_identifier" running on http://0.0.0.0:5000 (Press CTRL+C to quit)
04/06/22 15:28:18 INFO [bipa_detection_onnx] Service loaded from Bento directory: bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump", path="/home/bentoml/bento/")
04/06/22 15:28:18 INFO [bipa_detection_onnx] Started server process [26]
04/06/22 15:28:18 INFO [bipa_detection_onnx] Waiting for application startup.
04/06/22 15:28:18 ERROR [bipa_detection_onnx] Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/site-packages/starlette/routing.py", line 624, in lifespan
    async with self.lifespan_context(app):
  File "/opt/conda/lib/python3.7/site-packages/starlette/routing.py", line 521, in __aenter__
    await self._router.startup()
  File "/opt/conda/lib/python3.7/site-packages/starlette/routing.py", line 603, in startup
    handler()
  File "/opt/conda/lib/python3.7/site-packages/bentoml/_internal/runner/local.py", line 16, in setup
    self._runner._setup()  # type: ignore[reportPrivateUsage]
  File "/opt/conda/lib/python3.7/site-packages/bentoml/_internal/frameworks/onnx.py", line 334, in _setup
    session_options=session_options,
  File "/opt/conda/lib/python3.7/site-packages/simple_di/__init__.py", line 139, in _
    return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
  File "/opt/conda/lib/python3.7/site-packages/bentoml/_internal/frameworks/onnx.py", line 126, in load
    model = model_store.get(tag)
  File "/opt/conda/lib/python3.7/site-packages/bentoml/_internal/store.py", line 117, in get
    f"{self._item_type.__name__} '{tag}' is not found in BentoML store {self._fs}"
bentoml.exceptions.NotFound: Model 'bipa_detection_onnx:toyaauft52menp6x' is not found in BentoML store <osfs '/home/bentoml/models'>
├── bentos
│   └── bipa_detection_onnx
│       ├── latest
│       └── licgpmvvxonkfump
│           ├── apis
│           │   └── openapi.yaml
│           ├── bento.yaml
│           ├── env
│           │   ├── conda
│           │   ├── docker
│           │   │   ├── Dockerfile
│           │   │   ├── entrypoint.sh
│           │   │   └── init.sh
│           │   └── python
│           │       ├── requirements.lock.txt
│           │       ├── requirements.txt
│           │       └── version.txt
│           ├── models
│           │   └── bipa_detection_onnx
│           │       ├── latest
│           │       └── toyaauft52menp6x
│           │           ├── model.yaml
│           │           └── saved_model.onnx
│           ├── README.md
│           └── src
├── models
│   ├── bipa_detection_onnx
│   │   ├── latest
│   │   └── toyaauft52menp6x
│   │       ├── model.yaml
│   │       └── saved_model.onnx
Doesn't containerizing it also package the model itself?
What I mean is that, if I push the Docker image that I got from bentoml containerize
to, say, Docker Hub, does it also "ship" or contain the model (or at least a way to put the model in the container) when someone pulls it from Docker Hub?
Or is the model shipped separately?
No, the model is not shipped separately.
I haven't seen this error but let me suggest 2 things.
Could you upgrade bentoml to the latest version? We just released a new version last night. You can run "pip install bentoml --pre -U". Then could you attempt to save() the model again, then bentoml build, then bentoml containerize?
If it still doesn't work, could you shell into the docker container and look in the bentos directory to see if the model is there? That should point us in the right direction.
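A complementary sanity check, before or instead of shelling into the container, is to confirm the model is actually present in the local store that bentoml build packs from. A minimal sketch, assuming bentoml.models.list is available in the installed version:

import bentoml

# list every model in the local store (~/bentoml/models)
for model in bentoml.models.list():
    print(model.tag)

# the tag the service loads, e.g. bipa_detection_onnx:toyaauft52menp6x,
# must show up here for `bentoml build` to pack it into the bento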
Alright, it works now on the new release (the first of your suggestions)! Thank you.
But I'd like to add that you cannot directly reach the container from the host with just a plain docker run image:tag
command. You need to publish the port: docker run --rm -it -p 3000:3000 image:tag,
which exposes the container's port to the host. https://docs.docker.com/config/containers/container-networking/
I'm running into the same issue when I dockerize my model. bentoml serve
works fine, but running the docker image gives me this:
$ docker run --rm -p 3000:3000 car_price_prediction:ep3yr7aiwgczsoaa
Error: [serve] `serve` failed: no Models with name 'car_price_model' exist in BentoML store
Also, when I check inside the docker container, the models folder is empty. How do I fix this and get the model into the image as well?
I followed the instructions for building the bento using
bentoml build
with the following bentofile.yaml.
At this point, I tried serving the bento and it worked.
Then I containerized the Bento using the following command, which was pretty successful.
When running the Docker image, this error shows up.