pytorch/serve

Serve, optimize and scale PyTorch models in production
https://pytorch.org/serve/
Apache License 2.0

Official demo "Serving Huggingface Transformers using TorchServe" fails to run #1695

Open choshiho opened 2 years ago

choshiho commented 2 years ago

🐛 Describe the bug

When I run the official demo "Serving Huggingface Transformers using TorchServe" (Sequence Classification example), the workers fail to start; the error logs are below.
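For context: the traceback in the logs shows the demo's `Transformer_handler_generalized.py` doing `from captum.attr import LayerIntegratedGradients`, and captum in turn importing matplotlib, so both must import cleanly in the same Python that the TorchServe workers use. A minimal sanity check (written for this report, not part of the demo) to reproduce the failure outside TorchServe:

```python
import importlib

def check_imports(module_names):
    """Try importing each module; map name -> None on success, error text on failure."""
    results = {}
    for name in module_names:
        try:
            importlib.import_module(name)
            results[name] = None
        except Exception as exc:  # an ImportError here is exactly what kills the worker
            results[name] = f"{type(exc).__name__}: {exc}"
    return results

# The handler imports captum.attr, which pulls in matplotlib:
for mod, err in check_imports(["captum.attr", "matplotlib"]).items():
    print(mod, "OK" if err is None else err)
```

On this machine the matplotlib import presumably fails with the same `CXXABI_1.3.9` ImportError seen in the logs below, which is why the worker process dies before the handler ever loads.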

Error logs

(base) [root@localhost Huggingface_Transformers]# torchserve --start --model-store model_store --models my_tc=BERTSeqClassification.mar --ncs --ts-config config.properties
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
2022-06-19T20:37:14,727 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager...
2022-06-19T20:37:14,806 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.6.0
TS Home: /root/anaconda3/lib/python3.9/site-packages
Current directory: /root/zhaozhifeng/serve/examples/Huggingface_Transformers
Temp directory: /tmp
Number of GPUs: 2
Number of CPUs: 12
Max heap size: 15988 M
Python executable: /root/anaconda3/bin/python3.9
Config file: config.properties
Inference address: http://127.0.0.1:8083
Management address: http://127.0.0.1:8081
Metrics address: http://127.0.0.1:8082
Model Store: /root/zhaozhifeng/serve/examples/Huggingface_Transformers/model_store
Initial Models: my_tc=BERTSeqClassification.mar
Log dir: /root/zhaozhifeng/serve/examples/Huggingface_Transformers/logs
Metrics dir: /root/zhaozhifeng/serve/examples/Huggingface_Transformers/logs
Netty threads: 0
Netty client threads: 0
Default workers per model: 2
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Limit Maximum Image Pixels: true
Prefer direct buffer: false
Allowed Urls: [file://.*|http(s)?://.*]
Custom python dependency for model allowed: false
Metrics report format: prometheus
Enable metrics API: true
Workflow Store: /root/zhaozhifeng/serve/examples/Huggingface_Transformers/model_store
Model config: N/A
2022-06-19T20:37:14,812 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager -  Loading snapshot serializer plugin...
2022-06-19T20:37:14,829 [INFO ] main org.pytorch.serve.ModelServer - Loading initial models: BERTSeqClassification.mar
2022-06-19T20:37:20,112 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Adding new version 1.0 for model my_tc
2022-06-19T20:37:20,113 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model my_tc
2022-06-19T20:37:20,113 [INFO ] main org.pytorch.serve.wlm.ModelManager - Model my_tc loaded.
2022-06-19T20:37:20,113 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: my_tc, count: 2
2022-06-19T20:37:20,133 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/root/anaconda3/bin/python3.9, /root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9000]
2022-06-19T20:37:20,133 [DEBUG] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/root/anaconda3/bin/python3.9, /root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9001]
2022-06-19T20:37:20,134 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2022-06-19T20:37:20,185 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://127.0.0.1:8083
2022-06-19T20:37:20,185 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel.
2022-06-19T20:37:20,186 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://127.0.0.1:8081
2022-06-19T20:37:20,186 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2022-06-19T20:37:20,187 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://127.0.0.1:8082
Model server started.
2022-06-19T20:37:20,332 [WARN ] pool-3-thread-1 org.pytorch.serve.metrics.MetricCollector - worker pid is not available yet.
2022-06-19T20:37:20,749 [WARN ] pool-3-thread-1 org.pytorch.serve.metrics.MetricCollector - Parse metrics failed: Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2022-06-19T20:37:20,749 [WARN ] pool-3-thread-1 org.pytorch.serve.metrics.MetricCollector - Parse metrics failed: NumExpr defaulting to 8 threads.
2022-06-19T20:37:21,075 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9001
2022-06-19T20:37:21,077 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - [PID]30816
2022-06-19T20:37:21,077 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Torch worker started.
2022-06-19T20:37:21,078 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Python runtime: 3.9.7
2022-06-19T20:37:21,078 [DEBUG] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-my_tc_1.0 State change null -> WORKER_STARTED
2022-06-19T20:37:21,082 [INFO ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2022-06-19T20:37:21,082 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9000
2022-06-19T20:37:21,083 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - [PID]30815
2022-06-19T20:37:21,083 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Torch worker started.
2022-06-19T20:37:21,083 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Python runtime: 3.9.7
2022-06-19T20:37:21,083 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-my_tc_1.0 State change null -> WORKER_STARTED
2022-06-19T20:37:21,083 [INFO ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2022-06-19T20:37:21,090 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9000.
2022-06-19T20:37:21,090 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9001.
2022-06-19T20:37:21,092 [INFO ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1655642241092
2022-06-19T20:37:21,092 [INFO ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1655642241092
2022-06-19T20:37:21,120 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - model_name: my_tc, batchSize: 1
2022-06-19T20:37:21,120 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - model_name: my_tc, batchSize: 1
2022-06-19T20:37:22,729 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2022-06-19T20:37:22,729 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2022-06-19T20:37:22,729 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - NumExpr defaulting to 8 threads.
2022-06-19T20:37:22,729 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - NumExpr defaulting to 8 threads.
2022-06-19T20:37:22,928 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Backend worker process died.
2022-06-19T20:37:22,928 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Backend worker process died.
2022-06-19T20:37:22,928 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:22,928 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 100, in load
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 100, in load
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module, function_name = self._load_handler_file(handler)
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module, function_name = self._load_handler_file(handler)
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 162, in _load_handler_file
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 162, in _load_handler_file
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name)
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name)
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:22,929 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:22,929 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap_external>", line 850, in exec_module
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap_external>", line 850, in exec_module
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/tmp/models/fa1d6fc0c6bb4578a14cf963c6cfb4cb/Transformer_handler_generalized.py", line 18, in <module>
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/tmp/models/fa1d6fc0c6bb4578a14cf963c6cfb4cb/Transformer_handler_generalized.py", line 18, in <module>
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from captum.attr import LayerIntegratedGradients
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from captum.attr import LayerIntegratedGradients
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/__init__.py", line 54, in <module>
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/__init__.py", line 54, in <module>
2022-06-19T20:37:22,930 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from captum.attr._utils import visualization  # noqa
2022-06-19T20:37:22,930 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from captum.attr._utils import visualization  # noqa
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/_utils/visualization.py", line 7, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/_utils/visualization.py", line 7, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import pyplot as plt
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import pyplot as plt
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/__init__.py", line 107, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/__init__.py", line 107, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from . import _api, cbook, docstring, rcsetup
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from . import _api, cbook, docstring, rcsetup
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/rcsetup.py", line 26, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/rcsetup.py", line 26, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.colors import Colormap, is_color_like
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.colors import Colormap, is_color_like
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/colors.py", line 82, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/colors.py", line 82, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import _api, cbook, scale
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import _api, cbook, scale
2022-06-19T20:37:22,931 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/scale.py", line 18, in <module>
2022-06-19T20:37:22,931 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/scale.py", line 18, in <module>
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.ticker import (
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.ticker import (
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/ticker.py", line 179, in <module>
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/ticker.py", line 179, in <module>
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import transforms as mtransforms
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import transforms as mtransforms
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/transforms.py", line 46, in <module>
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/transforms.py", line 46, in <module>
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib._path import (
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib._path import (
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - ImportError: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /root/anaconda3/lib/python3.9/site-packages/matplotlib/_path.cpython-39-x86_64-linux-gnu.so)
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - ImportError: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /root/anaconda3/lib/python3.9/site-packages/matplotlib/_path.cpython-39-x86_64-linux-gnu.so)
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - During handling of the above exception, another exception occurred:
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - During handling of the above exception, another exception occurred:
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:22,932 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:22,932 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 210, in <module>
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 210, in <module>
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     worker.run_server()
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     worker.run_server()
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 181, in run_server
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 181, in run_server
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     self.handle_connection(cl_socket)
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     self.handle_connection(cl_socket)
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 139, in handle_connection
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 139, in handle_connection
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     service, result, code = self.load_model(msg)
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     service, result, code = self.load_model(msg)
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 104, in load_model
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 104, in load_model
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     service = model_loader.load(
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     service = model_loader.load(
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 102, in load
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 102, in load
2022-06-19T20:37:22,933 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module = self._load_default_handler(handler)
2022-06-19T20:37:22,933 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module = self._load_default_handler(handler)
2022-06-19T20:37:22,934 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 167, in _load_default_handler
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 167, in _load_default_handler
2022-06-19T20:37:22,934 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name, "ts.torch_handler")
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name, "ts.torch_handler")
2022-06-19T20:37:22,934 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:22,934 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:22,934 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:22,934 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:22,934 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
2022-06-19T20:37:22,934 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:22,935 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:22,935 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:22,935 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
2022-06-19T20:37:22,935 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - ModuleNotFoundError: No module named 'ts.torch_handler.Transformer_handler_generalized'
2022-06-19T20:37:22,935 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:22,935 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:22,935 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
2022-06-19T20:37:22,935 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - ModuleNotFoundError: No module named 'ts.torch_handler.Transformer_handler_generalized'
2022-06-19T20:37:23,168 [INFO ] epollEventLoopGroup-5-2 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2022-06-19T20:37:23,169 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-06-19T20:37:23,172 [INFO ] epollEventLoopGroup-5-1 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2022-06-19T20:37:23,173 [DEBUG] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-06-19T20:37:23,173 [DEBUG] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException: null
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
        at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
        at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:189) [model-server.jar:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
        at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
        at java.lang.Thread.run(Thread.java:834) [?:?]
2022-06-19T20:37:23,170 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException: null
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
        at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
        at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:189) [model-server.jar:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
        at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
        at java.lang.Thread.run(Thread.java:834) [?:?]
2022-06-19T20:37:23,181 [WARN ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: my_tc, error: Worker died.
2022-06-19T20:37:23,181 [WARN ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: my_tc, error: Worker died.
2022-06-19T20:37:23,181 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-my_tc_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2022-06-19T20:37:23,181 [DEBUG] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-my_tc_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2022-06-19T20:37:23,182 [WARN ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-my_tc_1.0-stderr
2022-06-19T20:37:23,182 [WARN ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-my_tc_1.0-stdout
2022-06-19T20:37:23,182 [WARN ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-my_tc_1.0-stderr
2022-06-19T20:37:23,182 [WARN ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-my_tc_1.0-stdout
2022-06-19T20:37:23,182 [INFO ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 1 seconds.
2022-06-19T20:37:23,183 [INFO ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 1 seconds.
2022-06-19T20:37:23,259 [INFO ] W-9000-my_tc_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-my_tc_1.0-stdout
2022-06-19T20:37:23,259 [INFO ] W-9000-my_tc_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-my_tc_1.0-stderr
2022-06-19T20:37:23,268 [INFO ] W-9001-my_tc_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-my_tc_1.0-stdout
2022-06-19T20:37:23,268 [INFO ] W-9001-my_tc_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-my_tc_1.0-stderr
2022-06-19T20:37:24,183 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/root/anaconda3/bin/python3.9, /root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9000]
2022-06-19T20:37:24,183 [DEBUG] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/root/anaconda3/bin/python3.9, /root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9001]
2022-06-19T20:37:25,059 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9000
2022-06-19T20:37:25,059 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - [PID]30991
2022-06-19T20:37:25,059 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Torch worker started.
2022-06-19T20:37:25,060 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Python runtime: 3.9.7
2022-06-19T20:37:25,060 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-my_tc_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2022-06-19T20:37:25,060 [INFO ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2022-06-19T20:37:25,061 [INFO ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1655642245061
2022-06-19T20:37:25,061 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9000.
2022-06-19T20:37:25,070 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9001
2022-06-19T20:37:25,071 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - [PID]30992
2022-06-19T20:37:25,071 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Torch worker started.
2022-06-19T20:37:25,071 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Python runtime: 3.9.7
2022-06-19T20:37:25,071 [DEBUG] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-my_tc_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2022-06-19T20:37:25,071 [INFO ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2022-06-19T20:37:25,076 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - model_name: my_tc, batchSize: 1
2022-06-19T20:37:25,076 [INFO ] W-9001-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1655642245076
2022-06-19T20:37:25,076 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9001.
2022-06-19T20:37:25,091 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - model_name: my_tc, batchSize: 1
2022-06-19T20:37:25,468 [ERROR] Thread-1 org.pytorch.serve.metrics.MetricCollector - Traceback (most recent call last):
  File "/root/anaconda3/lib/python3.9/site-packages/pynvml/nvml.py", line 782, in _nvmlGetFunctionPointer
    _nvmlGetFunctionPointer_cache[name] = getattr(nvmlLib, name)
  File "/root/anaconda3/lib/python3.9/ctypes/__init__.py", line 395, in __getattr__
    func = self.__getitem__(name)
  File "/root/anaconda3/lib/python3.9/ctypes/__init__.py", line 400, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
AttributeError: /lib64/libnvidia-ml.so.1: undefined symbol: nvmlDeviceGetComputeRunningProcesses_v2

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/anaconda3/lib/python3.9/site-packages/ts/metrics/metric_collector.py", line 27, in <module>
    system_metrics.collect_all(sys.modules['ts.metrics.system_metrics'], arguments.gpu)
  File "/root/anaconda3/lib/python3.9/site-packages/ts/metrics/system_metrics.py", line 91, in collect_all
    value(num_of_gpu)
  File "/root/anaconda3/lib/python3.9/site-packages/ts/metrics/system_metrics.py", line 72, in gpu_utilization
    statuses = list_gpus.device_statuses()
  File "/root/anaconda3/lib/python3.9/site-packages/nvgpu/list_gpus.py", line 67, in device_statuses
    return [device_status(device_index) for device_index in range(device_count)]
  File "/root/anaconda3/lib/python3.9/site-packages/nvgpu/list_gpus.py", line 67, in <listcomp>
    return [device_status(device_index) for device_index in range(device_count)]
  File "/root/anaconda3/lib/python3.9/site-packages/nvgpu/list_gpus.py", line 19, in device_status
    nv_procs = nv.nvmlDeviceGetComputeRunningProcesses(handle)
  File "/root/anaconda3/lib/python3.9/site-packages/pynvml/nvml.py", line 2223, in nvmlDeviceGetComputeRunningProcesses
    return nvmlDeviceGetComputeRunningProcesses_v2(handle);
  File "/root/anaconda3/lib/python3.9/site-packages/pynvml/nvml.py", line 2191, in nvmlDeviceGetComputeRunningProcesses_v2
    fn = _nvmlGetFunctionPointer("nvmlDeviceGetComputeRunningProcesses_v2")
  File "/root/anaconda3/lib/python3.9/site-packages/pynvml/nvml.py", line 785, in _nvmlGetFunctionPointer
    raise NVMLError(NVML_ERROR_FUNCTION_NOT_FOUND)
pynvml.nvml.NVMLError_FunctionNotFound: Function Not Found

2022-06-19T20:37:26,686 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2022-06-19T20:37:26,686 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - NumExpr defaulting to 8 threads.
2022-06-19T20:37:26,712 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2022-06-19T20:37:26,712 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - NumExpr defaulting to 8 threads.
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Backend worker process died.
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 100, in load
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module, function_name = self._load_handler_file(handler)
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 162, in _load_handler_file
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name)
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:26,867 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap_external>", line 850, in exec_module
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/tmp/models/fa1d6fc0c6bb4578a14cf963c6cfb4cb/Transformer_handler_generalized.py", line 18, in <module>
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from captum.attr import LayerIntegratedGradients
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/__init__.py", line 54, in <module>
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from captum.attr._utils import visualization  # noqa
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/_utils/visualization.py", line 7, in <module>
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import pyplot as plt
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/__init__.py", line 107, in <module>
2022-06-19T20:37:26,868 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from . import _api, cbook, docstring, rcsetup
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/rcsetup.py", line 26, in <module>
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.colors import Colormap, is_color_like
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/colors.py", line 82, in <module>
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import _api, cbook, scale
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/scale.py", line 18, in <module>
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.ticker import (
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/ticker.py", line 179, in <module>
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import transforms as mtransforms
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/transforms.py", line 46, in <module>
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     from matplotlib._path import (
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - ImportError: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /root/anaconda3/lib/python3.9/site-packages/matplotlib/_path.cpython-39-x86_64-linux-gnu.so)
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - During handling of the above exception, another exception occurred:
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:26,869 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 210, in <module>
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     worker.run_server()
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 181, in run_server
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     self.handle_connection(cl_socket)
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 139, in handle_connection
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     service, result, code = self.load_model(msg)
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 104, in load_model
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     service = model_loader.load(
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 102, in load
2022-06-19T20:37:26,870 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module = self._load_default_handler(handler)
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 167, in _load_default_handler
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name, "ts.torch_handler")
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:26,871 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:26,872 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
2022-06-19T20:37:26,872 [INFO ] W-9000-my_tc_1.0-stdout MODEL_LOG - ModuleNotFoundError: No module named 'ts.torch_handler.Transformer_handler_generalized'
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Backend worker process died.
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 100, in load
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module, function_name = self._load_handler_file(handler)
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 162, in _load_handler_file
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name)
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:26,890 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap_external>", line 850, in exec_module
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/tmp/models/fa1d6fc0c6bb4578a14cf963c6cfb4cb/Transformer_handler_generalized.py", line 18, in <module>
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from captum.attr import LayerIntegratedGradients
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/__init__.py", line 54, in <module>
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from captum.attr._utils import visualization  # noqa
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/captum/attr/_utils/visualization.py", line 7, in <module>
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import pyplot as plt
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/__init__.py", line 107, in <module>
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from . import _api, cbook, docstring, rcsetup
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/rcsetup.py", line 26, in <module>
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.colors import Colormap, is_color_like
2022-06-19T20:37:26,891 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/colors.py", line 82, in <module>
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import _api, cbook, scale
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/scale.py", line 18, in <module>
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib.ticker import (
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/ticker.py", line 179, in <module>
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib import transforms as mtransforms
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/matplotlib/transforms.py", line 46, in <module>
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     from matplotlib._path import (
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - ImportError: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /root/anaconda3/lib/python3.9/site-packages/matplotlib/_path.cpython-39-x86_64-linux-gnu.so)
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - During handling of the above exception, another exception occurred:
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 210, in <module>
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     worker.run_server()
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 181, in run_server
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     self.handle_connection(cl_socket)
2022-06-19T20:37:26,892 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 139, in handle_connection
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     service, result, code = self.load_model(msg)
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_service_worker.py", line 104, in load_model
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     service = model_loader.load(
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 102, in load
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module = self._load_default_handler(handler)
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/site-packages/ts/model_loader.py", line 167, in _load_default_handler
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name, "ts.torch_handler")
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 972, in _find_and_load_unlocked
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - ModuleNotFoundError: No module named 'ts.torch_handler.Transformer_handler_generalized'
2022-06-19T20:37:27,111 [INFO ] epollEventLoopGroup-5-3 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2022-06-19T20:37:27,112 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-06-19T20:37:27,112 [DEBUG] W-9000-my_tc_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException: null
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
        at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
        at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:189) [model-server.jar:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
        at java.lang.Thread.run(Thread.java:834) [?:?]
2022-06-19T20:37:27,113 [WARN ] W-9000-my_tc_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: my_tc, error: Worker died.

Installation instructions

JDK 11; `pip install transformers==4.6.0`; TorchServe

Install dependencies

CUDA is optional

python ./ts_scripts/install_dependencies.py --cuda=cu111

Latest release

pip install torchserve torch-model-archiver torch-workflow-archiver

Model Packaging

Downloaded the pre-trained models: `python Download_Transformer_models.py`

config.properties

inference_address=http://127.0.0.1:8083

Versions


Environment headers

Torchserve branch:

torchserve==0.6.0 torch-model-archiver==0.6.0

Python version: 3.9 (64-bit runtime)
Python executable: /root/anaconda3/bin/python

Versions of relevant python libraries: captum==0.5.0, future==0.18.2, numpy==1.22.4, numpydoc==1.1.0, nvgpu==0.9.0, psutil==5.9.1, pylint==2.9.6, pytest==6.2.4, pytorch-crf==0.7.2, requests==2.28.0, requests-oauthlib==1.3.0, torch==1.9.0+cu111, torch-model-archiver==0.6.0, torch-workflow-archiver==0.2.4, torchaudio==0.9.0, torchserve==0.6.0, torchtext==0.10.0, torchvision==0.10.0+cu111, transformers==4.20.0.dev0, wheel==0.37.1

Java Version:

OS: N/A
GCC version: (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)
Clang version: N/A
CMake version: version 3.14.2

Repro instructions

torch-model-archiver --model-name BERTSeqClassification --version 1.0 --serialized-file Transformer_model/pytorch_model.bin --handler ./Transformer_handler_generalized.py --extra-files "Transformer_model/config.json,./setup_config.json,./Seq_classification_artifacts/index_to_name.json"

mkdir model_store

mv BERTSeqClassification.mar model_store/

torchserve --start --model-store model_store --models my_tc=BERTSeqClassification.mar --ncs --ts-config config.properties

Possible Solution

No response

msaroufim commented 2 years ago

I see two different issues

Unsuccessful NVIDIA driver installation

AttributeError: /lib64/libnvidia-ml.so.1: undefined symbol: nvmlDeviceGetComputeRunningProcesses_v2

So I would suggest running a simpler example with just a basic PyTorch model on GPU and seeing if that works.

Importing a function that does not exist

I also see `2022-06-19T20:37:26,893 [INFO ] W-9001-my_tc_1.0-stdout MODEL_LOG - ModuleNotFoundError: No module named 'ts.torch_handler.Transformer_handler_generalized'`. The handler is expected at examples/Huggingface_Transformers/Transformer_handler_generalized.py. This is not ideal, and I agree that we should make our handlers easier to import.

TODO for us: make all the example handlers importable, e.g. `import ts.torch_handler.Transformer`
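As context for why the real matplotlib ImportError surfaces as a ModuleNotFoundError: the loader in ts/model_loader.py (visible in the traceback above) first tries to load the handler as a file and, if that raises, falls back to treating the name as a built-in handler under ts.torch_handler. A simplified sketch of that fallback, not TorchServe's actual code:

```python
import importlib


def load_handler(handler_name):
    """Simplified sketch of the two-step handler lookup seen in the
    traceback: try the handler as an importable module first, then fall
    back to a built-in handler under ts.torch_handler. The fallback's
    ModuleNotFoundError masks the original ImportError."""
    try:
        # Normally: load the handler file shipped inside the .mar.
        return importlib.import_module(handler_name)
    except ImportError:
        # Fallback path from the traceback: ts.torch_handler.<handler_name>
        return importlib.import_module("." + handler_name, "ts.torch_handler")
```

So a dependency failing inside Transformer_handler_generalized.py (here, matplotlib pulled in via captum) ends up reported as `No module named 'ts.torch_handler.Transformer_handler_generalized'`.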

rbavery commented 2 years ago

Does this section of the docs need to be updated to reflect that you can't import from ts in the pytorch docker images?

https://pytorch.org/serve/custom_service.html

I'm trying to use a custom handler like so and am running into this error:

torch-model-archiver --model-name mdv5 --version 1.0.0 --serialized-file ../models/megadetectorv5/md_v5a.0.0.torchscript --extra-files index_to_name.json --handler ../api/megadetectorv5/mdv5_handler.py
choshiho commented 2 years ago


@msaroufim Thank you for your reply!

BaseHandler: `from ts.torch_handler.base_handler import BaseHandler` is in the official Transformer_handler_generalized.py: https://github.com/pytorch/serve/blob/master/examples/Huggingface_Transformers/Transformer_handler_generalized.py

ImportError: I have now made the official Sequence Classification demo run successfully. The root cause was ImportError: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found. I solved this ImportError by:

  1. `conda install libgcc`
  2. Run your code again. If it still does not work, find the location of libstdc++.so.6 (it is usually in /home/user/anaconda3/lib/) and set the LD_LIBRARY_PATH environment variable, e.g. `export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/miniconda3/lib/`
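To verify whether a given libstdc++.so.6 actually exports CXXABI_1.3.9 before and after the fix, one can scan the library for its version tags. A minimal sketch, assuming a Linux system; the path in the example is an assumption and should be adjusted:

```python
import re


def cxxabi_tags(lib_path):
    """List the CXXABI version tags embedded in a shared library,
    e.g. to check whether CXXABI_1.3.9 is present."""
    with open(lib_path, "rb") as f:
        data = f.read()
    # Version tags appear as plain byte strings inside the ELF file.
    tags = re.findall(rb"CXXABI_\d+\.\d+(?:\.\d+)?", data)
    return sorted({t.decode() for t in tags})


# Example usage (path is an assumption; adjust to your system):
# print(cxxabi_tags("/lib64/libstdc++.so.6"))
```

If CXXABI_1.3.9 shows up in the conda-provided library but not in the one under /lib64, pointing LD_LIBRARY_PATH at the conda lib directory is the right move.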
choshiho commented 2 years ago


@msaroufim Hi, one more question: how can I import a third-party package from a local directory in a custom handler? For example, the pytorch_pretrained directory contains several Python files that need to be imported in my custom_handler.py and model.py.

In my custom_handler.py and model.py, the import is: `from pytorch_pretrained import BertModel, BertTokenizer`

My project directory contains:

- custom_handler.py
- model.pt
- model.py
- pytorch_pretrained/ (a directory with several Python files)

TorchServe command lines:

torch-model-archiver --model-name my_text_classifier --version 1.0 --model-file ./zzf_model/model.py --serialized-file ./zzf_model/model.pt --handler "./zzf_model/custom_handler.py" --extra-files "index_to_name_cm.json,source_vocab.pt"

mkdir model_store && mv my_text_classifier.mar model_store/

torchserve --start --model-store model_store --models my_tc=my_text_classifier.mar --ts-config config.properties

No module named 'pytorch_pretrained': finally, I got the error `ModuleNotFoundError: No module named 'pytorch_pretrained'`, even though pytorch_pretrained is imported in both custom_handler.py and model.py.
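One common workaround for this kind of local-package import in a custom handler (a sketch under assumptions, not a confirmed fix from this thread) is to put the extracted model directory on sys.path at the top of the handler, so an absolute import of the bundled directory can resolve:

```python
import os
import sys

# In a real TorchServe handler this would typically use
# context.system_properties["model_dir"] inside initialize(); the handler
# file's own directory is used here so the sketch is self-contained.
model_dir = os.path.dirname(os.path.abspath(__file__))
if model_dir not in sys.path:
    sys.path.insert(0, model_dir)

# With model_dir on sys.path, an absolute import of a bundled package can
# resolve -- provided the pytorch_pretrained files were actually included
# in the .mar (e.g. listed via --extra-files when archiving):
# from pytorch_pretrained import BertModel, BertTokenizer
```

Whether --extra-files preserves the directory layout depends on how the files are passed, so this is only a sketch of the approach, not a guarantee.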

msaroufim commented 2 years ago

Hi @rbavery I created a separate issue to track your question

@choshiho regarding your "no module named pytorch_pretrained" error: my suspicion is that this has to do with how you're importing the library. Is it a local directory or an installed package? If it's local, you may need something like `from .pytorch_pretrained import ...`, or to link to the right directory.

On the CUDA front it's tricky for me to debug what's going on; to be honest, I mostly spin up a fresh cloud machine on AWS or use Docker containers, otherwise I spend half a day on CUDA driver problems each time.

choshiho commented 2 years ago

@msaroufim My project is on a cloud machine and my package is a local directory. I followed your suggestion with `from .pytorch_pretrained import BertModel, BertTokenizer`, and also tried `from .pytorch_pretrained.modeling import BertModel` and `from .pytorch_pretrained.tokenization import BertTokenizer`; both led to `ImportError: attempted relative import with no known parent package`.

My project directory contains: custom_handler.py, model.pt, model.py, and pytorch_pretrained/ (a directory with several Python files).

Will you please give me some other advice? Thank you in advance.

msaroufim commented 2 years ago

Can you share a zip file with all your files so I can reproduce?

choshiho commented 2 years ago

@msaroufim model.pt: I got the model.pt file from this GitHub repo: https://github.com/649453932/Bert-Chinese-Text-Classification-Pytorch

Step 1. Download the pre-trained bert_Chinese model from https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz, extract bert_config.json and pytorch_model.bin, and put them in the directory Bert-Chinese-Text-Classification-Pytorch-master/bert_pretrain/
Step 2. Download https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt and put vocab.txt in the same bert_pretrain/ directory.
Step 3. Train and test: `python run.py --model bert`
Step 4. This produces THUCNews/saved_dict/bert.ckpt; I renamed the fine-tuned bert.ckpt to model.pt.

model.py and custom_handler.py: the two files are on Baidu Net Disk: https://pan.baidu.com/s/1aqDBNdZmhKNai2lDy34V1w (extract code: ly8l)

pytorch_pretrained: this directory is in the Bert-Chinese-Text-Classification-Pytorch repo.