Describe the bug
When I define a DL model in a separate model.py using PyTorch, BentoML can package it but cannot serve it; it always fails with ModuleNotFoundError: No module named 'model'. If the model's definition is in the packer file instead, everything works fine.
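A minimal, stdlib-only sketch of what I think is going on (file names here are hypothetical; plain pickle is used to mimic cloudpickle's by-reference handling of classes that live in an importable module):

```python
import os
import pickle
import sys
import tempfile
import textwrap

# Write a throwaway "model.py" (hypothetical stand-in for the real model definition).
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "model.py"), "w") as f:
    f.write(textwrap.dedent("""\
        class Net:
            def predict(self, x):
                return x * 2
    """))

sys.path.insert(0, workdir)
import model  # packaging side: 'model' is importable here

# The class is pickled as a *reference* to module 'model', not by value.
blob = pickle.dumps(model.Net())

# Simulate the serving process, where model.py is not on sys.path:
sys.path.remove(workdir)
del sys.modules["model"]

err = None
try:
    pickle.loads(blob)  # same failure mode as `bentoml serve`
except ModuleNotFoundError as exc:
    err = exc
print(err)  # No module named 'model'
```

This matches the symptom: serialization succeeds at packaging time, but deserialization in the server process needs the 'model' module to be importable, which explains why moving the class into the packer file (where it gets serialized by value) avoids the error.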
Expected behavior
Using a separate model.py should work (according to readthedocs).
Screenshots/Logs
(bentoML) vinhtq115@Dell-G7-7588:~/PycharmProjects/bentoML2$ bentoml serve ArcFaceService:latest
[2021-10-18 10:57:21,160] INFO - Getting latest version ArcFaceService:20211018105711_7C5819
[2021-10-18 10:57:21,174] INFO - Starting BentoML API proxy in development mode..
[2021-10-18 10:57:21,176] INFO - Starting BentoML API server in development mode..
[2021-10-18 10:57:21,256] INFO - Your system nofile limit is 1048576, which means each instance of microbatch service is able to hold this number of connections at same time. You can increase the number of file descriptors for the server process, or launch more microbatch instances to accept more concurrent connection.
======== Running on http://0.0.0.0:5000 ========
(Press CTRL+C to quit)
Process Process-1:
Traceback (most recent call last):
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/server/__init__.py", line 76, in _start_dev_server
bento_service = load_from_dir(saved_bundle_path)
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/saved_bundle/loader.py", line 110, in wrapper
return func(bundle_path, *args)
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/saved_bundle/loader.py", line 272, in load_from_dir
return svc_cls()
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/__init__.py", line 490, in __init__
self._config_artifacts()
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/__init__.py", line 559, in _config_artifacts
self.artifacts.load_all(artifacts_path)
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/artifacts/__init__.py", line 280, in load_all
artifact.load(path)
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/service/artifacts/__init__.py", line 175, in wrapped_load
ret = original(*args, **kwargs)
File "/home/vinhtq115/miniconda3/envs/bentoML/lib/python3.8/site-packages/bentoml/frameworks/pytorch.py", line 123, in load
model = cloudpickle.load(open(self._file_path(path), 'rb'))
ModuleNotFoundError: No module named 'model'
To Reproduce
Here is the link to the code.
Environment: