bentoml / BentoML

The easiest way to serve AI apps and models - Build reliable Inference APIs, LLM apps, Multi-model chains, RAG service, and much more!
https://bentoml.com
Apache License 2.0

bentoml.exceptions.NotFound: no Models with name Error #2395

Closed · sarmientoj24 closed this issue 2 years ago

sarmientoj24 commented 2 years ago

I followed the instructions for building the bento using bentoml build with the following bentofile.yaml:

# bentofile.yaml
service: "my__service.py:service"
description: "file: ./README.md"
labels:
  owner: team
  stage: demo
include:
  - "*.py"  # A pattern for matching which files to include in the bento
python:
  packages:
    - onnx
    - onnxruntime
    - numpy
    - opencv-python-headless
    - Pillow
    - pyarrow
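As an aside on the `include` field: it takes glob-style patterns, and only matching files from the build context are copied into the bento. A stdlib-only illustration with `fnmatch` (this is not BentoML's actual implementation, and the file names are made up) shows how a pattern like `"*.py"` selects files:

```python
from fnmatch import fnmatch

# Hypothetical files in a build context; only names matching an
# include pattern would end up inside the bento's src/ directory.
files = ["my__service.py", "utils.py", "notes.md", "weights.onnx"]
include = ["*.py"]

selected = [f for f in files if any(fnmatch(f, pat) for pat in include)]
print(selected)  # -> ['my__service.py', 'utils.py']
```

Note that model weights are deliberately not part of `include`; they travel through the model store, which is what this issue is about.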

At this point, I tried serving the bento, and it works:

$ bentoml serve my_service:latest --production

Then I containerized the bento using the following command:

$ bentoml containerize my_service:latest

which was pretty successful.

When running the Docker container, this error shows up:

04/06/22 09:53:34 INFO     [cli] Service loaded from Bento directory: bentoml.Se
                           rvice(tag="bipa_detection_onnx:sarlh2vvrwnkfump",    
                           path="/home/bentoml/bento/")                         
04/06/22 09:53:34 INFO     [cli] Starting production BentoServer from           
                           "bento_identifier" running on http://0.0.0.0:5000    
                           (Press CTRL+C to quit)                               
04/06/22 09:53:35 INFO     [bipa_detection_onnx] Service loaded from Bento      
                           directory: bentoml.Service(tag="bipa_detection_onnx:s
                           arlh2vvrwnkfump", path="/home/bentoml/bento/")       
04/06/22 09:53:35 INFO     [api_server] Service loaded from Bento directory: ben
                           toml.Service(tag="bipa_detection_onnx:sarlh2vvrwnkfum
                           p", path="/home/bentoml/bento/")                     
04/06/22 09:53:35 INFO     [bipa_detection_onnx] Started server process [26]    
04/06/22 09:53:35 INFO     [bipa_detection_onnx] Waiting for application        
                           startup.                                             
04/06/22 09:53:35 ERROR    [bipa_detection_onnx] Traceback (most recent call    
                           last):                                               
                             File                                               
                           "/opt/conda/lib/python3.7/site-packages/fs/osfs.py", 
                           line 655, in open                                    
                               **options                                        
                           FileNotFoundError: [Errno 2] No such file or         
                           directory:                                           
                           b'/home/bentoml/models/bipa_detection_onnx/latest'   

                           During handling of the above exception, another      
                           exception occurred:                                  

                           Traceback (most recent call last):                   
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/store.py", line 103, in get              
                               _tag.version =                                   
                           self._fs.readtext(_tag.latest_path())                
                             File                                               
                           "/opt/conda/lib/python3.7/site-packages/fs/base.py", 
                           line 692, in readtext                                
                               path, mode="rt", encoding=encoding,              
                           errors=errors, newline=newline                       
                             File                                               
                           "/opt/conda/lib/python3.7/site-packages/fs/osfs.py", 
                           line 655, in open                                    
                               **options                                        
                             File "/opt/conda/lib/python3.7/site-packages/fs/err
                           or_tools.py", line 89, in __exit__                   
                               reraise(fserror, fserror(self._path,             
                           exc=exc_value), traceback)                           
                             File                                               
                           "/opt/conda/lib/python3.7/site-packages/six.py", line
                           718, in reraise                                      
                               raise value.with_traceback(tb)                   
                             File                                               
                           "/opt/conda/lib/python3.7/site-packages/fs/osfs.py", 
                           line 655, in open                                    
                               **options                                        
                           fs.errors.ResourceNotFound: resource                 
                           'bipa_detection_onnx/latest' not found               

                           During handling of the above exception, another      
                           exception occurred:                                  

                           Traceback (most recent call last):                   
                             File "/opt/conda/lib/python3.7/site-packages/starle
                           tte/routing.py", line 624, in lifespan               
                               async with self.lifespan_context(app):           
                             File "/opt/conda/lib/python3.7/site-packages/starle
                           tte/routing.py", line 521, in __aenter__             
                               await self._router.startup()                     
                             File "/opt/conda/lib/python3.7/site-packages/starle
                           tte/routing.py", line 603, in startup                
                               handler()                                        
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/runner/local.py", line 16, in setup      
                               self._runner._setup()  # type:                   
                           ignore[reportPrivateUsage]                           
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/frameworks/onnx.py", line 334, in _setup 
                               session_options=session_options,                 
                             File "/opt/conda/lib/python3.7/site-packages/simple
                           _di/__init__.py", line 139, in _                     
                               return func(*_inject_args(bind.args),            
                           **_inject_kwargs(bind.kwargs))                       
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/frameworks/onnx.py", line 126, in load   
                               model = model_store.get(tag)                     
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/store.py", line 106, in get              
                               f"no {self._item_type.__name__}s with name       
                           '{_tag.name}' exist in BentoML store {self._fs}"     
                           bentoml.exceptions.NotFound: no Models with name     
                           'bipa_detection_onnx' exist in BentoML store <osfs   
                           '/home/bentoml/models'>                              

04/06/22 09:53:35 INFO     [api_server] Started server process [27]             
04/06/22 09:53:35 ERROR    [bipa_detection_onnx] Application startup failed.    
                           Exiting.                                             
04/06/22 09:53:35 INFO     [api_server] Waiting for application startup.        
04/06/22 09:53:35 INFO     [api_server] Application startup complete.           
04/06/22 09:53:36 INFO     [bipa_detection_onnx] Started server process [62]
                           [... the same "no Models with name
                           'bipa_detection_onnx'" traceback repeats for this
                           second server process ...]
  1. What could be the problem here?
  2. Do I need to save the model before building the bento?
  3. Doesn't containerizing it also package the model itself?
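For context on the traceback: the `store.py` frames show that the store resolves a `latest` tag by reading a version string from a plain `latest` file under `models/<name>/` (`_tag.version = self._fs.readtext(_tag.latest_path())`). A stdlib-only sketch of that lookup, with hypothetical paths and a simplified error type:

```python
import os
import tempfile

def resolve_latest(store_root, name):
    """Mimic resolving '<name>:latest' by reading models/<name>/latest."""
    latest_file = os.path.join(store_root, name, "latest")
    try:
        with open(latest_file) as f:
            return f.read().strip()
    except FileNotFoundError:
        # This is the situation in the container: the models directory
        # is empty, so the lookup fails with a NotFound-style error.
        raise LookupError(
            f"no models with name {name!r} exist in store {store_root!r}"
        )

with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "bipa_detection_onnx"))
    with open(os.path.join(root, "bipa_detection_onnx", "latest"), "w") as f:
        f.write("toyaauft52menp6x")
    print(resolve_latest(root, "bipa_detection_onnx"))  # -> toyaauft52menp6x
```

So the error simply means the container's `/home/bentoml/models` store has no entry for that model name at startup.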
timliubentoml commented 2 years ago

Hi @sarmientoj24! Hope I can help.

What could be the problem here? I'm not entirely sure why "bentoml serve" would work while running the containerized version throws this error. However, if you reference a model name that you haven't saved, it will definitely fail with this error.

Do I need to save the model before building the bento? Yes, definitely. You need to save the model with the same name that you later use to load it in your service.

Doesn't containerizing it also packages the model itself?

Are you sure you saved it with consistent naming? If you go look at the bento in ~/bentoml/bentos, is the model present in the bento itself?

sarmientoj24 commented 2 years ago

If you reference a model name that you haven't saved, it will definitely fail with this error.

So is there a need to save the model locally first? I was under the impression that the model is packaged automatically when you create the bento using bentoml build, and hence that I do not need to save it prior to building the bento.

My directory tree is this:

├── bentos
│   └── bipa_detection_onnx
│       ├── latest
│       └── sarlh2vvrwnkfump    <--- Here is the model automatically created by bentoml build
│           ├── apis
│           │   └── openapi.yaml
│           ├── bento.yaml
│           ├── env
│           │   ├── conda
│           │   ├── docker
│           │   │   ├── Dockerfile
│           │   │   ├── entrypoint.sh
│           │   │   └── init.sh
│           │   └── python
│           │       ├── requirements.lock.txt
│           │       ├── requirements.txt
│           │       └── version.txt
│           ├── models
│           │   └── bipa_detection_onnx
│           │       ├── latest
│           │       └── toyaauft52menp6x
│           │           ├── model.yaml
│           │           └── saved_model.onnx
│           ├── README.md
│           └── src
sarmientoj24 commented 2 years ago

I recreated a new bento, this time referencing my saved model.

Inside service script

runner = bentoml.onnx.load_runner(
    "bipa_detection_onnx:toyaauft52menp6x", providers=["CPUExecutionProvider"]
)
service = bentoml.Service("bipa_detection_onnx", runners=[runner])
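Note that the runner here pins an exact model version instead of `latest`. A BentoML-style tag is just a `name:version` string; this stdlib-only sketch (a hypothetical helper, not BentoML's `Tag` class) makes the two parts explicit:

```python
def split_tag(tag):
    """Split a 'name:version' tag into (name, version); version may be absent."""
    name, _, version = tag.partition(":")
    return name, (version or None)

print(split_tag("bipa_detection_onnx:toyaauft52menp6x"))
print(split_tag("bipa_detection_onnx"))  # no version -> resolved as 'latest'
```

Either way, the name part must match a model that actually exists in the store the service runs against.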

Building (using same YAML as before)

(yolov5) user@user:~/workspace/user/user$ bentoml build
Wednesday, 06 April, 2022 11:08:01 PM  INFO     [cli] Building BentoML service "bipa_detection_onnx:licgpmvvxonkfump" from build context                    
                                                "/home/user/workspace/user/user"                                                                           
Wednesday, 06 April, 2022 11:08:01 PM  INFO     [cli] Packing model "bipa_detection_onnx:toyaauft52menp6x" from                                             
                                                "/home/user/bentoml/models/bipa_detection_onnx/toyaauft52menp6x"                                           
Wednesday, 06 April, 2022 11:08:01 PM  INFO     [cli] Locking PyPI package versions..                                                                       
Wednesday, 06 April, 2022 11:08:04 PM  INFO     [cli]                                                                                                       
                                                ██████╗░███████╗███╗░░██╗████████╗░█████╗░███╗░░░███╗██╗░░░░░                                               
                                                ██╔══██╗██╔════╝████╗░██║╚══██╔══╝██╔══██╗████╗░████║██║░░░░░                                               
                                                ██████╦╝█████╗░░██╔██╗██║░░░██║░░░██║░░██║██╔████╔██║██║░░░░░                                               
                                                ██╔══██╗██╔══╝░░██║╚████║░░░██║░░░██║░░██║██║╚██╔╝██║██║░░░░░                                               
                                                ██████╦╝███████╗██║░╚███║░░░██║░░░╚█████╔╝██║░╚═╝░██║███████╗                                               
                                                ╚═════╝░╚══════╝╚═╝░░╚══╝░░░╚═╝░░░░╚════╝░╚═╝░░░░░╚═╝╚══════╝                                               

Wednesday, 06 April, 2022 11:08:04 PM  INFO     [cli] Successfully built Bento(tag="bipa_detection_onnx:licgpmvvxonkfump") at                               
                                                "/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump/"   
$ bentoml list
 Tag                                   Service                                Path                                           Size       Creation Time       
 bipa_detection_onnx:licgpmvvxonkfump  cavity_pa_detection_bipa_onnx:service  /home/user/bentoml/bentos/bipa_detection_on…  79.75 MiB  2022-04-06 15:08:04

Testing serving the Bento (works)

(yolov5) user@user:~/workspace/app/app$ bentoml serve bipa_detection_onnx:licgpmvvxonkfump --production
Wednesday, 06 April, 2022 11:31:00 PM  INFO     [cli] Service loaded from Bento store: bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump",          
                                                path="/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump")                                     
Wednesday, 06 April, 2022 11:31:00 PM  INFO     [cli] Starting production BentoServer from "bento_identifier" running on http://0.0.0.0:5000 (Press CTRL+C  
                                                to quit)                                                                                                    
Wednesday, 06 April, 2022 11:31:01 PM  INFO     [bipa_detection_onnx] Service loaded from Bento store:                                                      
                                                bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump",                                                 
                                                path="/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump")                                     
Wednesday, 06 April, 2022 11:31:01 PM  INFO     [api_server] Service loaded from Bento store: bentoml.Service(tag="bipa_detection_onnx:licgpmvvxonkfump",   
                                                path="/home/user/bentoml/bentos/bipa_detection_onnx/licgpmvvxonkfump"

Containerize (successful)

(yolov5) user@user:~/workspace/app/app$ bentoml containerize bipa_detection_onnx:latest
Wednesday, 06 April, 2022 11:13:15 PM  INFO     [cli] Building docker image for Bento(tag="bipa_detection_onnx:licgpmvvxonkfump")...                        
Wednesday, 06 April, 2022 11:13:39 PM  INFO     [cli] Successfully built docker image "bipa_detection_onnx:licgpmvvxonkfump

Serve (Error)

(yolov5) user@user:~/workspace/app/app$ docker run bipa_detection_onnx:licgpmvvxonkfump
04/06/22 15:28:17 INFO     [cli] Service loaded from Bento directory: bentoml.Se
                           rvice(tag="bipa_detection_onnx:licgpmvvxonkfump",    
                           path="/home/bentoml/bento/")                         
04/06/22 15:28:17 INFO     [cli] Starting production BentoServer from           
                           "bento_identifier" running on http://0.0.0.0:5000    
                           (Press CTRL+C to quit)                               
04/06/22 15:28:18 INFO     [bipa_detection_onnx] Service loaded from Bento      
                           directory: bentoml.Service(tag="bipa_detection_onnx:l
                           icgpmvvxonkfump", path="/home/bentoml/bento/")       
04/06/22 15:28:18 INFO     [bipa_detection_onnx] Started server process [26]    
04/06/22 15:28:18 INFO     [bipa_detection_onnx] Waiting for application        
                           startup.                                             
04/06/22 15:28:18 ERROR    [bipa_detection_onnx] Traceback (most recent call    
                           last):                                               
                             File "/opt/conda/lib/python3.7/site-packages/starle
                           tte/routing.py", line 624, in lifespan               
                               async with self.lifespan_context(app):           
                             File "/opt/conda/lib/python3.7/site-packages/starle
                           tte/routing.py", line 521, in __aenter__             
                               await self._router.startup()                     
                             File "/opt/conda/lib/python3.7/site-packages/starle
                           tte/routing.py", line 603, in startup                
                               handler()                                        
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/runner/local.py", line 16, in setup      
                               self._runner._setup()  # type:                   
                           ignore[reportPrivateUsage]                           
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/frameworks/onnx.py", line 334, in _setup 
                               session_options=session_options,                 
                             File "/opt/conda/lib/python3.7/site-packages/simple
                           _di/__init__.py", line 139, in _                     
                               return func(*_inject_args(bind.args),            
                           **_inject_kwargs(bind.kwargs))                       
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/frameworks/onnx.py", line 126, in load   
                               model = model_store.get(tag)                     
                             File "/opt/conda/lib/python3.7/site-packages/bentom
                           l/_internal/store.py", line 117, in get              
                               f"{self._item_type.__name__} '{tag}' is not found
                           in BentoML store {self._fs}"                         
                           bentoml.exceptions.NotFound: Model                   
                           'bipa_detection_onnx:toyaauft52menp6x' is not found  
                           in BentoML store <osfs '/home/bentoml/models'>

Extra (bentoml tree)

── bentos
│   └── bipa_detection_onnx
│       ├── latest
│       └── licgpmvvxonkfump
│           ├── apis
│           │   └── openapi.yaml
│           ├── bento.yaml
│           ├── env
│           │   ├── conda
│           │   ├── docker
│           │   │   ├── Dockerfile
│           │   │   ├── entrypoint.sh
│           │   │   └── init.sh
│           │   └── python
│           │       ├── requirements.lock.txt
│           │       ├── requirements.txt
│           │       └── version.txt
│           ├── models
│           │   └── bipa_detection_onnx
│           │       ├── latest
│           │       └── toyaauft52menp6x
│           │           ├── model.yaml
│           │           └── saved_model.onnx
│           ├── README.md
│           └── src
├── models
│   ├── bipa_detection_onnx
│   │   ├── latest
│   │   └── toyaauft52menp6x
│   │       ├── model.yaml
│   │       └── saved_model.onnx
sarmientoj24 commented 2 years ago

Doesn't containerizing it also packages the model itself?

What I mean is: if I push the Docker image I got from bentoml containerize to, say, Docker Hub, does it also "ship" or contain the model (or at least a way to put the model in the container) when someone pulls it from Docker Hub?

Or is the model shipped separately?

timliubentoml commented 2 years ago

No, the model is not shipped separately.
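In other words, the image is built from the bento directory, which already contains the `models/` subfolder shown in the trees above, so a correctly built bento ships its model inside the image. Conceptually, the generated Dockerfile amounts to something like this (a hedged sketch with assumed base image and paths, not the exact file BentoML generates):

```dockerfile
# Sketch of what a generated bento Dockerfile amounts to (paths assumed):
FROM python:3.7-slim
WORKDIR /home/bentoml/bento
# The whole bento -- including models/<name>/<version>/ -- is copied in,
# so the model travels with the image when it is pushed to a registry.
COPY . /home/bentoml/bento
RUN pip install -r ./env/python/requirements.txt
```

If the model is missing from the container, the problem is usually that it was absent from the bento at build time, not that containerization dropped it.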

I haven't seen this error but let me suggest 2 things.

  1. Could you upgrade bentoml to the latest version? We just released a new version last night. You can run "pip install bentoml --pre -U". Then could you attempt to save() the model again, then bentoml build, then bentoml containerize?

  2. If it still doesn't work, could you shell into the docker container and look in the bentos directory to see if the model is there? That should point us in the right direction.

sarmientoj24 commented 2 years ago

Alright, it works now on the new release (the first of your suggestions)! Thank you. But I'd like to add that you cannot directly access the service from the host with just a simple docker run image:tag command. You need to expose the port: docker run --rm -it -p 3000:3000 image:tag

aarnphm commented 2 years ago

That is expected; you need to publish the container's port to reach it from the host. https://docs.docker.com/config/containers/container-networking/

mohsenim commented 5 months ago

I'm running into the same issue when I dockerize my model. bentoml serve works fine, but running the docker image gives me this:

$ docker run --rm -p 3000:3000 car_price_prediction:ep3yr7aiwgczsoaa

Error: [serve] `serve` failed: no Models with name 'car_price_model' exist in BentoML store

Also, when I check inside the Docker container, the models folder is empty. How do I fix this and get the model into the image as well?