bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
https://bentoml.com
Apache License 2.0

Using separate model.py file causes ModuleNotFoundError when serving model #1902

Closed: vinhtq115 closed this issue 2 years ago

vinhtq115 commented 3 years ago

Describe the bug When I use a separate model.py file to define a DL model in PyTorch, BentoML can package it but cannot serve it. It always fails with ModuleNotFoundError: No module named 'model'. If the model's definition lives in the packer file itself, everything works fine.

To Reproduce Here is the link to the code.
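
Since the specifics sit behind that link, here is a minimal sketch of the kind of layout that triggers the error (a reconstruction with placeholder names, not the linked code; only ArcFaceService appears in the logs below):

```python
# model.py -- the separate file that defines the network
import torch.nn as nn

class ArcFaceModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 512)  # placeholder layer

    def forward(self, x):
        return self.fc(x)
```

```python
# service.py -- BentoML 0.13-style service definition
import torch
import bentoml
from bentoml.adapters import JsonInput
from bentoml.frameworks.pytorch import PytorchModelArtifact

@bentoml.env(infer_pip_packages=True)
@bentoml.artifacts([PytorchModelArtifact("model")])
class ArcFaceService(bentoml.BentoService):
    @bentoml.api(input=JsonInput())
    def predict(self, parsed_json):
        x = torch.tensor(parsed_json["input"], dtype=torch.float32)
        return self.artifacts.model(x).tolist()
```

```python
# packer.py -- packs and saves the service; run with `python packer.py`
from model import ArcFaceModel
from service import ArcFaceService

svc = ArcFaceService()
svc.pack("model", ArcFaceModel())
print(svc.save())
```

Packing succeeds, but `bentoml serve ArcFaceService:latest` then fails as shown in the logs below.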

Expected behavior Using a separate model.py file should work (according to the documentation on readthedocs).

Screenshots/Logs

(bentoML) vinhtq115@Dell-G7-7588:~/PycharmProjects/bentoML2$ bentoml serve ArcFaceService:latest
[2021-10-18 10:57:21,160] INFO - Getting latest version ArcFaceService:20211018105711_7C5819
[2021-10-18 10:57:21,174] INFO - Starting BentoML API proxy in development mode..
[2021-10-18 10:57:21,176] INFO - Starting BentoML API server in development mode..
[2021-10-18 10:57:21,256] INFO - Your system nofile limit is 1048576, which means each instance of microbatch service is able to hold this number of connections at same time. You can increase the number of file descriptors for the server process, or launch more microbatch instances to accept more concurrent connection.
======== Running on http://0.0.0.0:5000 ========
(Press CTRL+C to quit)
Process Process-1:
Traceback (most recent call last):
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/server/__init__.py", line 76, in _start_dev_server
    bento_service = load_from_dir(saved_bundle_path)
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/saved_bundle/loader.py", line 110, in wrapper
    return func(bundle_path, *args)
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/saved_bundle/loader.py", line 272, in load_from_dir
    return svc_cls()
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/__init__.py", line 490, in __init__
    self._config_artifacts()
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/__init__.py", line 559, in _config_artifacts
    self.artifacts.load_all(artifacts_path)
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/artifacts/__init__.py", line 280, in load_all
    artifact.load(path)
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/artifacts/__init__.py", line 175, in wrapped_load
    ret = original(*args, **kwargs)
  File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/frameworks/pytorch.py", line 123, in load
    model = cloudpickle.load(open(self._file_path(path), 'rb'))
ModuleNotFoundError: No module named 'model'
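
For anyone debugging the same traceback: the last frame restores the artifact with cloudpickle, and cloudpickle serializes a class that lives in an importable module by reference (module name plus qualified name), not by value. The serving process therefore needs a module literally named model on its sys.path to unpickle the artifact. A standalone sketch of that behavior, independent of BentoML (file names hypothetical):

```python
import cloudpickle
from model import ArcFaceModel  # hypothetical model.py, importable here

with open("net.pkl", "wb") as f:
    # stores a reference to "model.ArcFaceModel", not the class body
    cloudpickle.dump(ArcFaceModel(), f)

# In another process where model.py is not importable -- such as the dev
# server loading the saved bundle -- the load fails:
with open("net.pkl", "rb") as f:
    cloudpickle.load(f)  # ModuleNotFoundError: No module named 'model'
```

This also explains why defining the class in the packer file works: cloudpickle pickles classes defined in __main__ by value, so no model module is needed at load time.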

Environment:

parano commented 2 years ago

this issue has been fixed in BentoML 1.0; here's an updated PyTorch sample project that uses a separate model.py file to define the model: https://github.com/bentoml/gallery/tree/main/pytorch
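
For reference, a minimal sketch of the 1.0-style pattern the linked project follows, with the model still in a separate model.py (a hedged reconstruction, not the gallery code; the tag arcface_demo and the API shape are placeholders):

```python
# train.py -- save the trained model into the local BentoML model store
import bentoml
from model import ArcFaceModel  # separate model.py is fine in 1.0

bentoml.pytorch.save_model("arcface_demo", ArcFaceModel())
```

```python
# service.py -- load the saved model through a runner
import bentoml
import numpy as np
from bentoml.io import NumpyNdarray

runner = bentoml.pytorch.get("arcface_demo:latest").to_runner()
svc = bentoml.Service("arcface_demo", runners=[runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
async def embed(arr: np.ndarray) -> np.ndarray:
    return await runner.async_run(arr)
```

Serving with `bentoml serve service:svc` then works because model.py ships with the service source (and is listed in bentofile.yaml's include section when building a bento), so it is importable at load time.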