pytorch / serve

Serve, optimize and scale PyTorch models in production
https://pytorch.org/serve/
Apache License 2.0

MaskRCNN not serving when using PyTorchServe Docker #948

Closed nagamanikandank closed 3 years ago

nagamanikandank commented 3 years ago

Steps that led me to the bug

  1. Downloaded the TorchServe Docker image from https://hub.docker.com/r/pytorch/torchserve
  2. Downloaded https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth to the folder C:/examples on my computer
  3. Downloaded https://github.com/pytorch/serve/tree/master/examples/ to the folder C:/examples on my computer
  4. Created an empty folder C:/model-store
  5. Ran docker run -it -p 8080:8080 -p 8081:8081 --name mar -v C:/model-store:/home/model-server/model-store -v C:/examples:/home/model-server/examples pytorch/torchserve:latest
  6. Ran docker exec -it <container_name> /bin/bash
  7. From inside the container, ran torch-model-archiver --force --model-name maskrcnn --version 1.0 --model-file /home/model-server/examples/object_detector/maskrcnn/model.py --serialized-file /home/model-server/examples/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth --handler object_detector --export-path /home/model-server/model-store --extra-files /home/model-server/examples/object_detector/index_to_name.json
  8. From inside the container, ran torchserve --start --model-store /home/model-server/model-store --models maskrcnn=maskrcnn.mar
  9. Downloaded persons.jpg to my desktop from https://github.com/pytorch/serve/tree/master/examples/object_detector/maskrcnn
  10. From my desktop (host machine), ran curl http://127.0.0.1:8080/predictions/maskrcnn -T persons.jpg
  11. Got the following error: { "code": 404, "type": "ModelNotFoundException", "message": "Model not found: maskrcnn" }
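For what it's worth, the curl call in step 10 (curl's -T flag uploads the file body with an HTTP PUT) can be mirrored from Python using only the standard library. This is just a sketch assuming a TorchServe instance is reachable on localhost:

```python
# Sketch of the curl call in step 10 using only the standard library.
# curl -T sends the raw file bytes in an HTTP PUT, so we mirror that here.
import urllib.request


def build_predict_request(image_bytes: bytes, model: str = "maskrcnn",
                          host: str = "http://127.0.0.1:8080") -> urllib.request.Request:
    """Build the PUT request that curl -T sends to the inference API."""
    return urllib.request.Request(
        f"{host}/predictions/{model}", data=image_bytes, method="PUT"
    )


def predict(image_path: str) -> bytes:
    """Send an image to a running TorchServe instance and return the raw response.

    Raises urllib.error.HTTPError with code 404 when the model is not
    registered -- the ModelNotFoundException seen in step 11.
    """
    with open(image_path, "rb") as f:
        req = build_predict_request(f.read())
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```
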

Note: I followed the same steps for the DenseNet image classifier (from the examples in the repo) and it worked perfectly fine; MaskRCNN, however, does not.

alvarobartt commented 3 years ago

Hi @nagamanikandank, could you please check and share the logs from deploying TorchServe in step 8? I suspect the model is not being registered properly due to some error while generating the MAR, but please share the logs and I'll try to help as much as I can! 👍🏻

dhanainme commented 3 years ago

A good place to start debugging is to list the models and confirm that yours has been registered.

curl -X GET http://127.0.0.1:8081/models should tell you if the model is registered.

nagamanikandank commented 3 years ago

@alvarobartt and @dhanainme , Thanks for your quick response

curl -X GET http://127.0.0.1:8081/models provides the following response

{ "models": [] }

Attaching the logs generated after running torchserve --start --model-store /home/model-server/model-store --models maskrcnn=maskrcnn.mar:

$ torchserve --start --model-store /home/model-server/model-store --models maskrcnn=maskrcnn.mar
TorchServe is already running, please use torchserve --stop to stop TorchServe.
$ torchserve --stop
TorchServe has stopped.
$ torchserve --start --model-store /home/model-server/model-store --models maskrcnn=maskrcnn.mar
$
2021-01-29 23:07:28,803 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.3.0
TS Home: /home/venv/lib/python3.6/site-packages
Current directory: /home/model-server
Temp directory: /home/model-server/tmp
Number of GPUs: 0
Number of CPUs: 2
Max heap size: 492 M
Python executable: /home/venv/bin/python3
Config file: logs/config/20210129230723061-shutdown.cfg
Inference address: http://0.0.0.0:8080
Management address: http://0.0.0.0:8081
Metrics address: http://0.0.0.0:8082
Model Store: /home/model-server/model-store
Initial Models: maskrcnn=maskrcnn.mar
Log dir: /home/model-server/logs
Metrics dir: /home/model-server/logs
Netty threads: 32
Netty client threads: 0
Default workers per model: 2
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Prefer direct buffer: false
Allowed Urls: [file://.*|http(s)?://.*]
Custom python dependency for model allowed: false
Metrics report format: prometheus
Enable metrics API: true
2021-01-29 23:07:28,812 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Started restoring models from snapshot { "name": "20210129230723061-shutdown.cfg", "modelCount": 0, "created": 1611961643061, "models": {} }
2021-01-29 23:07:28,821 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Validating snapshot 20210129230723061-shutdown.cfg
2021-01-29 23:07:28,822 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Snapshot 20210129230723061-shutdown.cfg validated successfully
2021-01-29 23:07:28,822 [WARN ] main org.pytorch.serve.snapshot.SnapshotManager - Model snapshot is empty. Starting TorchServe without initial models.
2021-01-29 23:07:28,826 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2021-01-29 23:07:28,899 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://0.0.0.0:8080
2021-01-29 23:07:28,899 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel.
2021-01-29 23:07:28,902 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://0.0.0.0:8081
2021-01-29 23:07:28,902 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2021-01-29 23:07:28,910 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://0.0.0.0:8082
Model server started.
2021-01-29 23:07:29,194 [INFO ] pool-2-thread-1 TS_METRICS - CPUUtilization.Percent:0.0|#Level:Host|#hostname:e9c7515591a8,timestamp:1611961649
2021-01-29 23:07:29,195 [INFO ] pool-2-thread-1 TS_METRICS - DiskAvailable.Gigabytes:49.893497467041016|#Level:Host|#hostname:e9c7515591a8,timestamp:1611961649
2021-01-29 23:07:29,196 [INFO ] pool-2-thread-1 TS_METRICS - DiskUsage.Gigabytes:9.635597229003906|#Level:Host|#hostname:e9c7515591a8,timestamp:1611961649
2021-01-29 23:07:29,197 [INFO ] pool-2-thread-1 TS_METRICS - DiskUtilization.Percent:16.2|#Level:Host|#hostname:e9c7515591a8,timestamp:1611961649
2021-01-29 23:07:29,199 [INFO ] pool-2-thread-1 TS_METRICS - MemoryAvailable.Megabytes:1169.3359375|#Level:Host|#hostname:e9c7515591a8,timestamp:1611961649
2021-01-29 23:07:29,206 [INFO ] pool-2-thread-1 TS_METRICS - MemoryUsed.Megabytes:672.703125|#Level:Host|#hostname:e9c7515591a8,timestamp:1611961649
2021-01-29 23:07:29,206 [INFO ] pool-2-thread-1 TS_METRICS - MemoryUtilization.Percent:40.6|#Level:Host|#hostname:e9c7515591a8,timestamp:1611961649
2021-01-29 23:07:44,602 [INFO ] epollEventLoopGroup-3-1 ACCESS_LOG - /172.17.0.1:49308 "GET /models HTTP/1.1" 200 4
2021-01-29 23:07:44,603 [INFO ] epollEventLoopGroup-3-1 TS_METRICS - Requests2XX.Count:1|#Level:Host|#hostname:e9c7515591a8,timestamp:null

alvarobartt commented 3 years ago

It looks like the model is not being registered somehow... I'll try to reproduce those steps tomorrow and let you know what the issue is! 👍🏻 And the solution, if possible!

alvarobartt commented 3 years ago

Hi @nagamanikandank and @dhanainme! ✋🏻

I've tested it already, and the error seems to come from the latest TorchServe update: the MaskRCNN model can be served properly using the v0.2.0 release, but with the latest one (v0.3.0) it breaks. The issue occurs when retrieving the handler, as registering the model throws the following error: ModuleNotFoundError: No module named 'object_detector', while loading the module from ts.torch_handler.

Let me know if you want me to dig more in this issue and maybe create a PR if I find the solution! 🔥
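For context on where that ModuleNotFoundError appears, here is a simplified sketch of the two-step handler lookup that ts/model_loader.py performs (the real resolution logic is more involved; this only shows the shape): the handler string is first tried as a plain importable module, and only on failure retried as a built-in ts.torch_handler submodule.

```python
# Simplified sketch of the handler lookup that fails in this issue.
# The handler name is first imported as a plain module; if that raises
# ModuleNotFoundError, it is retried as a submodule of a fallback package
# (ts.torch_handler in TorchServe). If both attempts fail, the worker dies.
import importlib


def load_handler(name: str, fallback_pkg: str = "ts.torch_handler"):
    try:
        # e.g. a custom handler module shipped inside the MAR
        return importlib.import_module(name)
    except ModuleNotFoundError:
        # fall back to a built-in handler under the fallback package
        return importlib.import_module(f"{fallback_pkg}.{name}")
```

To demonstrate the mechanism without TorchServe installed: load_handler("mime", fallback_pkg="email") fails the first import (there is no top-level mime module) and succeeds on the fallback, returning email.mime. In the broken setup, 'object_detector' is found by neither attempt, so the worker raises and exits.
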

alvarobartt commented 3 years ago

Anyway, as a temporary fix for you @nagamanikandank, you can follow these steps, which do work on a plain Ubuntu Docker image:

docker pull ubuntu
docker run -it ubuntu:latest /bin/bash
cd home/
apt-get update && apt-get install -y python3 python3-dev python3-pip openjdk-11-jre-headless git wget curl
python3 -m pip install torch torch-model-archiver torchserve==0.2.0
git clone https://github.com/pytorch/serve
mv serve/examples/object_detector object_detector
rm -rf serve/
wget https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth
torch-model-archiver --model-name maskrcnn --version 1.0 --model-file object_detector/maskrcnn/model.py --serialized-file maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth --handler object_detector --extra-files object_detector/index_to_name.json
mkdir model_store
mv maskrcnn.mar model_store/
torchserve --start --model-store model_store --models maskrcnn=maskrcnn.mar

And then, without closing that Docker container, use the following commands from a new terminal to make sure that the model was registered:

docker ps # Retrieve the ID of the running Docker container
docker exec -it CONTAINER_ID /bin/bash
curl -X GET http://127.0.0.1:8081/models

And it should output something like:

{
  "models": [
    {
      "modelName": "maskrcnn",
      "modelUrl": "maskrcnn.mar"
    }
  ]
}
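If it helps to script that check, here is a small (hypothetical) helper that inspects the JSON body returned by the management API, shown against the two responses seen in this thread:

```python
# Hypothetical helper: check whether a model name appears in the JSON body
# returned by GET http://127.0.0.1:8081/models.
import json


def is_registered(models_response: str, model_name: str) -> bool:
    payload = json.loads(models_response)
    return any(m.get("modelName") == model_name
               for m in payload.get("models", []))


# the empty response reported earlier in this thread:
assert not is_registered('{"models": []}', "maskrcnn")
# the expected response after the workaround:
ok = '{"models": [{"modelName": "maskrcnn", "modelUrl": "maskrcnn.mar"}]}'
assert is_registered(ok, "maskrcnn")
```
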

Hope this helps! 🔥 At least as a workaround while this issue is tackled.

nagamanikandank commented 3 years ago

@alvarobartt Thanks a lot for your reply.

I followed the steps you mentioned, and curl -X GET http://127.0.0.1:8081/models successfully returns the following output:

{
  "models": [
    {
      "modelName": "maskrcnn",
      "modelUrl": "maskrcnn.mar"
    }
  ]
}

But there are a lot of errors in the logs, and when I try to run inference with curl http://127.0.0.1:8080/predictions/maskrcnn -T persons.jpg, the curl command never gets a result and waits indefinitely. Attaching the logs below; the logs up to line 1133 were generated after torchserve --start, and the logs after that were generated when I tried to run inference.

2021-02-04 22:39:07,029 [INFO ] main org.pytorch.serve.ModelServer - 
Torchserve version: 0.2.0
TS Home: /usr/local/lib/python3.8/dist-packages
Current directory: /home
Temp directory: /tmp
Number of GPUs: 0
Number of CPUs: 2
Max heap size: 492 M
Python executable: /usr/bin/python3
Config file: N/A
Inference address: http://127.0.0.1:8080
Management address: http://127.0.0.1:8081
Metrics address: http://127.0.0.1:8082
Model Store: /home/model_store
Initial Models: maskrcnn=maskrcnn.mar
Log dir: /home/logs
Metrics dir: /home/logs
Netty threads: 0
Netty client threads: 0
Default workers per model: 2
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Prefer direct buffer: false
Custom python dependency for model allowed: false
Metrics report format: prometheus
Enable metrics API: true
2021-02-04 22:39:07,043 [INFO ] main org.pytorch.serve.ModelServer - Loading initial models: maskrcnn.mar
2021-02-04 22:39:10,290 [INFO ] main org.pytorch.serve.archive.ModelArchive - eTag 4bb9e5f2038748c4835b0fa1d070e5b7
2021-02-04 22:39:10,303 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Adding new version 1.0 for model maskrcnn
2021-02-04 22:39:10,304 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model maskrcnn
2021-02-04 22:39:10,305 [INFO ] main org.pytorch.serve.wlm.ModelManager - Model maskrcnn loaded.
2021-02-04 22:39:10,305 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: maskrcnn, count: 2
2021-02-04 22:39:10,324 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2021-02-04 22:39:10,465 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9001
2021-02-04 22:39:10,467 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9000
2021-02-04 22:39:10,469 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5325
2021-02-04 22:39:10,469 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5324
2021-02-04 22:39:10,471 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:10,471 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:10,474 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change null -> WORKER_STARTED
2021-02-04 22:39:10,477 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:10,478 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change null -> WORKER_STARTED
2021-02-04 22:39:10,479 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:10,486 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2021-02-04 22:39:10,488 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2021-02-04 22:39:10,516 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://127.0.0.1:8080
2021-02-04 22:39:10,516 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel.
2021-02-04 22:39:10,522 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9000.
2021-02-04 22:39:10,531 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://127.0.0.1:8081
2021-02-04 22:39:10,532 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2021-02-04 22:39:10,535 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9001.
2021-02-04 22:39:10,539 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://127.0.0.1:8082
2021-02-04 22:39:10,604 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:10,605 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:10,605 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:10,605 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:10,605 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:10,606 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:10,606 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:10,606 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:10,606 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:10,606 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:10,607 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:10,607 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:10,607 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:10,608 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:10,608 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:10,608 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:10,608 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:10,608 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:10,609 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:10,612 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:10,612 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:10,613 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:10,614 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:10,615 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:10,615 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:10,617 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:10,618 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:39:10,619 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:10,622 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:10,624 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:39:10,624 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:39:10,625 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:39:10,626 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:39:10,630 [INFO ] epollEventLoopGroup-5-1 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:10,632 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:10,633 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:39:10,634 [INFO ] epollEventLoopGroup-5-2 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:10,640 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:10,636 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 112, in handle_connection
2021-02-04 22:39:10,635 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:10,646 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:10,636 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:39:10,646 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service, result, code = self.load_model(msg)
2021-02-04 22:39:10,644 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:10,649 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:10,650 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:39:10,651 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 112, in handle_connection
2021-02-04 22:39:10,652 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service, result, code = self.load_model(msg)
2021-02-04 22:39:10,653 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 85, in load_model
2021-02-04 22:39:10,653 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:10,654 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:39:10,654 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 85, in load_model
2021-02-04 22:39:10,655 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 88, in load
2021-02-04 22:39:10,656 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:39:10,656 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:10,656 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:10,658 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 88, in load
2021-02-04 22:39:10,658 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name, 'ts.torch_handler')
2021-02-04 22:39:10,658 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:10,659 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:10,661 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name, 'ts.torch_handler')
2021-02-04 22:39:10,661 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:10,662 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:10,662 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:10,664 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:10,664 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:10,666 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:10,666 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:10,666 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:10,667 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:10,669 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:10,671 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 1 seconds.
2021-02-04 22:39:10,674 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 1 seconds.
2021-02-04 22:39:10,743 [ERROR] Thread-1 org.pytorch.serve.metrics.MetricCollector - --- Logging error ---
Traceback (most recent call last):
  File "/usr/lib/python3.8/logging/__init__.py", line 1085, in emit
    self.flush()
  File "/usr/lib/python3.8/logging/__init__.py", line 1065, in flush
    self.stream.flush()
BrokenPipeError: [Errno 32] Broken pipe
Call stack:
  File "ts/metrics/metric_collector.py", line 18, in <module>
    check_process_mem_usage(sys.stdin)
  File "/usr/local/lib/python3.8/dist-packages/ts/metrics/process_memory_metric.py", line 40, in check_process_mem_usage
    logging.info("%s:%d", process, get_cpu_usage(process))
Message: '%s:%d'
Arguments: ('5324', 0)
--- Logging error ---
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/psutil/_common.py", line 447, in wrapper
    ret = self._cache[fun]
AttributeError: _cache

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/psutil/_pslinux.py", line 1576, in wrapper
    return fun(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/psutil/_common.py", line 450, in wrapper
    return fun(self)
  File "/usr/local/lib/python3.8/dist-packages/psutil/_pslinux.py", line 1618, in _parse_stat_file
    with open_binary("%s/%s/stat" % (self._procfs_path, self.pid)) as f:
  File "/usr/local/lib/python3.8/dist-packages/psutil/_common.py", line 711, in open_binary
    return open(fname, "rb", **kwargs)
FileNotFoundError: [Errno 2] No such file or directory: '/proc/5325/stat'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/psutil/__init__.py", line 354, in _init
    self.create_time()
  File "/usr/local/lib/python3.8/dist-packages/psutil/__init__.py", line 710, in create_time
    self._create_time = self._proc.create_time()
  File "/usr/local/lib/python3.8/dist-packages/psutil/_pslinux.py", line 1576, in wrapper
    return fun(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/psutil/_pslinux.py", line 1788, in create_time
    ctime = float(self._parse_stat_file()['create_time'])
  File "/usr/local/lib/python3.8/dist-packages/psutil/_pslinux.py", line 1583, in wrapper
    raise NoSuchProcess(self.pid, self._name)
psutil.NoSuchProcess: psutil.NoSuchProcess process no longer exists (pid=5325)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/ts/metrics/process_memory_metric.py", line 20, in get_cpu_usage
    process = psutil.Process(int(pid))
  File "/usr/local/lib/python3.8/dist-packages/psutil/__init__.py", line 326, in __init__
    self._init(pid)
  File "/usr/local/lib/python3.8/dist-packages/psutil/__init__.py", line 367, in _init
    raise NoSuchProcess(pid, None, msg)
psutil.NoSuchProcess: psutil.NoSuchProcess no process found with pid 5325

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.8/logging/__init__.py", line 1085, in emit
    self.flush()
  File "/usr/lib/python3.8/logging/__init__.py", line 1065, in flush
    self.stream.flush()
BrokenPipeError: [Errno 32] Broken pipe
Call stack:
  File "ts/metrics/metric_collector.py", line 18, in <module>
    check_process_mem_usage(sys.stdin)
  File "/usr/local/lib/python3.8/dist-packages/ts/metrics/process_memory_metric.py", line 40, in check_process_mem_usage
    logging.info("%s:%d", process, get_cpu_usage(process))
  File "/usr/local/lib/python3.8/dist-packages/ts/metrics/process_memory_metric.py", line 22, in get_cpu_usage
    logging.error("Failed get process for pid: %s", pid, exc_info=True)
Message: 'Failed get process for pid: %s'
Arguments: ('5325',)
--- Logging error ---
Traceback (most recent call last):
  File "/usr/lib/python3.8/logging/__init__.py", line 1085, in emit
    self.flush()
  File "/usr/lib/python3.8/logging/__init__.py", line 1065, in flush
    self.stream.flush()
BrokenPipeError: [Errno 32] Broken pipe
Call stack:
  File "ts/metrics/metric_collector.py", line 18, in <module>
    check_process_mem_usage(sys.stdin)
  File "/usr/local/lib/python3.8/dist-packages/ts/metrics/process_memory_metric.py", line 40, in check_process_mem_usage
    logging.info("%s:%d", process, get_cpu_usage(process))
Message: '%s:%d'
Arguments: ('5325', 0)
Exception ignored in: <_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>
BrokenPipeError: [Errno 32] Broken pipe

2021-02-04 22:39:11,753 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9000
2021-02-04 22:39:11,756 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5343
2021-02-04 22:39:11,756 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:11,756 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:11,757 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:11,758 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2021-02-04 22:39:11,763 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9000.
2021-02-04 22:39:11,777 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9001
2021-02-04 22:39:11,777 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5342
2021-02-04 22:39:11,778 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:11,778 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:11,778 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:11,779 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2021-02-04 22:39:11,790 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:11,790 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:11,790 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:11,791 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:11,791 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:11,791 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:11,792 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:11,792 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:11,792 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:11,793 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:11,793 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:11,794 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:11,794 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:11,798 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:11,798 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:39:11,798 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:39:11,798 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:39:11,801 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:39:11,801 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 112, in handle_connection
2021-02-04 22:39:11,801 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service, result, code = self.load_model(msg)
2021-02-04 22:39:11,802 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 85, in load_model
2021-02-04 22:39:11,802 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:39:11,806 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 88, in load
2021-02-04 22:39:11,806 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name, 'ts.torch_handler')
2021-02-04 22:39:11,806 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:11,811 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:11,811 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:11,812 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:11,812 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
2021-02-04 22:39:11,816 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9001.
2021-02-04 22:39:11,816 [INFO ] epollEventLoopGroup-5-3 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:11,816 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
2021-02-04 22:39:11,820 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:11,823 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap_external>", line 783, in exec_module
2021-02-04 22:39:11,824 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:11,836 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:11,824 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
2021-02-04 22:39:11,837 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:11,838 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:11,838 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/object_detector.py", line 4, in <module>
2021-02-04 22:39:11,838 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     import torch
2021-02-04 22:39:11,839 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'torch'
2021-02-04 22:39:11,850 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:11,850 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:11,851 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:11,852 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 1 seconds.
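The two chained `ModuleNotFoundError`s above come from TorchServe's handler resolution: the handler string `object_detector` is first imported as a top-level module, and only after that fails is it resolved against the built-in handlers under `ts.torch_handler` — where the handler's own `import torch` then fails. A minimal sketch of that fallback pattern (the function name, signature, and the stdlib example below are illustrative, not TorchServe's actual API):

```python
import importlib

def load_module_with_fallback(name, fallback_package):
    """Sketch of the handler lookup seen in the traceback
    (ts/model_loader.py): a handler string such as 'object_detector'
    is first imported as a top-level module; if that raises, it is
    retried as a submodule of a package (ts.torch_handler in
    TorchServe). Illustrative only, not TorchServe's API.
    """
    try:
        # First attempt -> the log's "No module named 'object_detector'"
        return importlib.import_module(name)
    except ImportError:
        # Fallback attempt -> here ts.torch_handler.object_detector is
        # found, but its own "import torch" fails, producing the second
        # "No module named 'torch'" error.
        return importlib.import_module("." + name, fallback_package)
```

With the stdlib, `load_module_with_fallback("decoder", "json")` resolves to `json.decoder` via the fallback branch — the same path the worker takes before `import torch` fails.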
[... the same "ModuleNotFoundError: No module named 'object_detector'" / "ModuleNotFoundError: No module named 'torch'" traceback repeats for worker 9001 and on every retry of workers 9000 and 9001; each attempt ends with "Load model failed: maskrcnn, error: Worker died." ...]
2021-02-04 22:39:13,150 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 2 seconds.
2021-02-04 22:39:13,153 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:13,155 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
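The traceback above shows the failure path inside TorchServe's handler loading: `ts/model_loader.py` first tries to import the handler name as a plain module (line 84), and on `ModuleNotFoundError` falls back to resolving it inside the built-in `ts.torch_handler` package (line 88) — which is where `import torch` then blows up. A minimal sketch of that two-step lookup (this is illustrative, not the exact TorchServe code; the fallback package name is taken from the log):

```python
import importlib


def load_handler(handler_name, fallback_package="ts.torch_handler"):
    """Resolve a handler the way the traceback suggests model_loader.load does:
    try a user-supplied top-level module first, then fall back to a built-in
    handler inside `fallback_package`."""
    try:
        # e.g. a custom handler module bundled into the .mar archive
        return importlib.import_module(handler_name)
    except ModuleNotFoundError:
        # e.g. the built-in ts.torch_handler.object_detector,
        # which itself does `import torch` at module scope
        return importlib.import_module("." + handler_name, fallback_package)


# A resolvable top-level name takes the first branch:
mod = load_handler("json")
print(mod.__name__)  # json
```

This explains why the log shows two `ModuleNotFoundError`s per crash: first `No module named 'object_detector'` (the top-level attempt), then `No module named 'torch'` raised while executing the built-in fallback handler.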
2021-02-04 22:39:15,163 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9000
2021-02-04 22:39:15,164 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5360
2021-02-04 22:39:15,164 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:15,165 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:15,166 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2021-02-04 22:39:15,167 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:15,169 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9000.
2021-02-04 22:39:15,172 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:15,173 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:15,174 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:15,174 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:15,177 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:15,178 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:15,174 [INFO ] epollEventLoopGroup-5-3 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:15,175 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:15,175 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:15,176 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:15,176 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:15,177 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:15,177 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:15,179 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 3 seconds.
2021-02-04 22:39:15,179 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:15,185 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:15,235 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9001
2021-02-04 22:39:15,235 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5364
2021-02-04 22:39:15,236 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:15,236 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:15,236 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2021-02-04 22:39:15,237 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:15,239 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9001.
2021-02-04 22:39:15,242 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:15,242 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:15,242 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:15,243 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:15,243 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:15,243 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:15,243 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:15,243 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:15,243 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:15,244 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:15,244 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:15,244 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:15,244 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:15,244 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:15,244 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:39:15,245 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:39:15,246 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:39:15,251 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:39:15,245 [INFO ] epollEventLoopGroup-5-4 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:15,249 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:15,249 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:15,250 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:15,251 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:15,251 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:15,252 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:15,253 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 3 seconds.
2021-02-04 22:39:15,255 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:15,257 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
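Since `No module named 'torch'` means the worker's interpreter (the system Python 3.8, judging by the `/usr/local/lib/python3.8/dist-packages` paths above) cannot resolve PyTorch on its `sys.path`, a quick check from inside the container (step 6 of the report) is to probe importability without paying the full import cost. A minimal sketch:

```python
import importlib.util
import sys


def can_import(name):
    """Return True if `name` is resolvable on this interpreter's sys.path,
    without actually executing the module."""
    return importlib.util.find_spec(name) is not None


print(sys.executable)        # which interpreter the worker would use
print(can_import("torch"))   # False here would reproduce the worker crash
print(can_import("json"))
```

If this prints `False` for `torch` under the same interpreter the server launches workers with, the environment mismatch is confirmed; if it prints `True`, the server is likely spawning workers with a different Python than the one being tested.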
2021-02-04 22:39:18,277 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9000
2021-02-04 22:39:18,278 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5368
2021-02-04 22:39:18,279 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:18,279 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:18,281 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:18,281 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2021-02-04 22:39:18,287 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9000.
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:18,293 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 112, in handle_connection
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service, result, code = self.load_model(msg)
2021-02-04 22:39:18,294 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 85, in load_model
2021-02-04 22:39:18,295 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:39:18,295 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 88, in load
2021-02-04 22:39:18,295 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name, 'ts.torch_handler')
2021-02-04 22:39:18,295 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:18,295 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:18,298 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:18,298 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:18,298 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
2021-02-04 22:39:18,298 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
2021-02-04 22:39:18,298 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap_external>", line 783, in exec_module
2021-02-04 22:39:18,302 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
2021-02-04 22:39:18,311 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/object_detector.py", line 4, in <module>
2021-02-04 22:39:18,299 [INFO ] epollEventLoopGroup-5-1 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:18,301 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:18,301 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:18,302 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:18,302 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:18,303 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:18,303 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:18,303 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 5 seconds.
2021-02-04 22:39:18,308 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:18,311 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:18,427 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9001
2021-02-04 22:39:18,427 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5372
2021-02-04 22:39:18,427 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:18,427 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:18,428 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:18,428 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2021-02-04 22:39:18,431 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9001.
2021-02-04 22:39:18,433 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:18,433 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 112, in handle_connection
2021-02-04 22:39:18,434 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service, result, code = self.load_model(msg)
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 85, in load_model
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 88, in load
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name, 'ts.torch_handler')
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap_external>", line 783, in exec_module
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/object_detector.py", line 4, in <module>
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     import torch
2021-02-04 22:39:18,435 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'torch'
2021-02-04 22:39:18,436 [INFO ] epollEventLoopGroup-5-2 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:18,436 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:18,436 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:18,437 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:18,437 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:18,438 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:18,438 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:18,438 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 5 seconds.
2021-02-04 22:39:18,440 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:18,440 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
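The retry delays logged above (2 s, 3 s, 5 s, then 8 s below) follow a Fibonacci-style backoff between worker restarts, which is why each failed load takes progressively longer to retry. A minimal sketch of that schedule (the generator is illustrative, not TorchServe's actual implementation; the cap value is an assumption):

```python
def fibonacci_backoff(first=1, second=1, cap=60):
    """Yield Fibonacci retry delays in seconds, clamped to `cap`."""
    a, b = first, second
    while True:
        yield min(a, cap)
        a, b = b, a + b


gen = fibonacci_backoff()
delays = [next(gen) for _ in range(6)]
print(delays)  # [1, 1, 2, 3, 5, 8] -- the 2/3/5/8 tail matches the log
```

Because the underlying `ModuleNotFoundError` is deterministic, every retry in this schedule fails the same way, so the `maskrcnn` model never reaches a READY worker and the inference API correctly answers 404 `ModelNotFoundException`.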
2021-02-04 22:39:23,384 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9000
2021-02-04 22:39:23,385 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5376
2021-02-04 22:39:23,385 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:23,385 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:23,386 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2021-02-04 22:39:23,391 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:23,394 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9000.
2021-02-04 22:39:23,396 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:23,397 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:23,398 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:23,398 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:23,399 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:23,400 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:23,405 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:23,397 [INFO ] epollEventLoopGroup-5-3 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:23,401 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:23,401 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:23,403 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:23,404 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:23,405 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:23,406 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:23,407 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 8 seconds.
2021-02-04 22:39:23,406 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:23,407 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:23,519 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9001
2021-02-04 22:39:23,519 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5380
2021-02-04 22:39:23,519 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:23,520 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:23,520 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:23,520 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2021-02-04 22:39:23,523 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9001.
2021-02-04 22:39:23,528 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:23,528 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:23,529 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:23,530 [INFO ] epollEventLoopGroup-5-4 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:23,530 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:23,530 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:23,532 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:23,532 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:23,532 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:23,533 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:23,534 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:23,534 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:23,535 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:23,535 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:23,535 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:23,536 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:23,536 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 8 seconds.
2021-02-04 22:39:23,538 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:23,542 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:31,501 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9000
2021-02-04 22:39:31,501 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5384
2021-02-04 22:39:31,501 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:31,501 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:31,501 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:31,501 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2021-02-04 22:39:31,504 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9000.
2021-02-04 22:39:31,507 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:31,509 [INFO ] epollEventLoopGroup-5-1 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:31,509 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:31,511 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:31,511 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:31,511 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:31,511 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:31,512 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:31,515 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:31,516 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:31,516 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:31,517 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:31,518 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:31,518 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:31,518 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:31,518 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:31,519 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:31,520 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:31,520 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:31,520 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:31,520 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:31,524 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 13 seconds.
2021-02-04 22:39:31,525 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:39:31,525 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
2021-02-04 22:39:31,628 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9001
2021-02-04 22:39:31,628 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5388
2021-02-04 22:39:31,628 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:39:31,628 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:39:31,629 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:39:31,631 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2021-02-04 22:39:31,633 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9001.
2021-02-04 22:39:31,642 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:39:31,643 [INFO ] epollEventLoopGroup-5-2 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2021-02-04 22:39:31,643 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:31,643 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:39:31,644 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:39:31,644 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:31,646 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:31,646 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:31,646 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:31,647 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:39:31,647 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:39:31,647 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:31,647 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:39:31,647 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:39:31,648 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:39:31,648 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:39:31,648 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 112, in handle_connection
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service, result, code = self.load_model(msg)
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 85, in load_model
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 88, in load
2021-02-04 22:39:31,649 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name, 'ts.torch_handler')
2021-02-04 22:39:31,650 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:39:31,650 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:39:31,650 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:39:31,650 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:39:31,650 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
2021-02-04 22:39:31,650 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
2021-02-04 22:39:31,651 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap_external>", line 783, in exec_module
2021-02-04 22:39:31,651 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
2021-02-04 22:39:31,651 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/object_detector.py", line 4, in <module>
2021-02-04 22:39:31,651 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     import torch
2021-02-04 22:39:31,651 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'torch'
2021-02-04 22:39:31,658 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:31,659 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:31,659 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:31,659 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:31,665 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 13 seconds.
2021-02-04 22:39:31,669 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:31,669 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
[... the same load-failure traceback repeats on every worker restart (workers 9000 and 9001), with the retry back-off growing from 8 s to 13 s to 21 s ...]
2021-02-04 22:39:44,792 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:39:44,792 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:39:44,793 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stderr
2021-02-04 22:39:44,793 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:44,794 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 21 seconds.
2021-02-04 22:39:44,794 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:39:44,796 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
2021-02-04 22:39:44,798 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
[... identical retry cycles for workers 9000 and 9001 elided — same ModuleNotFoundError: No module named 'object_detector' traceback as above ...]
2021-02-04 22:40:39,760 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9000
2021-02-04 22:40:39,761 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5424
2021-02-04 22:40:39,762 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:40:39,762 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:40:39,763 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2021-02-04 22:40:39,767 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:40:39,768 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9000.
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:40:39,771 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:40:39,775 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
2021-02-04 22:40:39,775 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'object_detector'
2021-02-04 22:40:39,775 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:40:39,776 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - During handling of the above exception, another exception occurred:
2021-02-04 22:40:39,776 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - 
2021-02-04 22:40:39,776 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:40:39,776 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:40:39,777 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     worker.run_server()
2021-02-04 22:40:39,777 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 148, in run_server
2021-02-04 22:40:39,777 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     self.handle_connection(cl_socket)
2021-02-04 22:40:39,777 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 112, in handle_connection
2021-02-04 22:40:39,777 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service, result, code = self.load_model(msg)
2021-02-04 22:40:39,778 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 85, in load_model
2021-02-04 22:40:39,778 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     service = model_loader.load(model_name, model_dir, handler, gpu, batch_size)
2021-02-04 22:40:39,779 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 88, in load
2021-02-04 22:40:39,779 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name, 'ts.torch_handler')
2021-02-04 22:40:39,779 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
2021-02-04 22:40:39,780 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     return _bootstrap._gcd_import(name[level:], package, level)
2021-02-04 22:40:39,780 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
2021-02-04 22:40:39,780 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
2021-02-04 22:40:39,780 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
2021-02-04 22:40:39,781 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
2021-02-04 22:40:39,781 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap_external>", line 783, in exec_module
2021-02-04 22:40:39,781 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
2021-02-04 22:40:39,781 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/object_detector.py", line 4, in <module>
2021-02-04 22:40:39,781 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     import torch
2021-02-04 22:40:39,781 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'torch'
2021-02-04 22:40:39,782 [INFO ] epollEventLoopGroup-5-3 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2021-02-04 22:40:39,782 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:40:39,783 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:40:39,785 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:40:39,786 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:40:39,786 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:40:39,787 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:40:39,787 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 55 seconds.
2021-02-04 22:40:39,790 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:40:39,790 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
[... subsequent retry cycles elided — both workers keep dying with the same two tracebacks and retrying with increasing backoff ...]
2021-02-04 22:41:34,896 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:41:34,897 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:41:34,897 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stderr
2021-02-04 22:41:34,897 [WARN ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-maskrcnn_1.0-stdout
2021-02-04 22:41:34,897 [INFO ] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 89 seconds.
2021-02-04 22:41:34,899 [INFO ] W-9000-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stderr
2021-02-04 22:41:34,893 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_service_worker.py", line 176, in <module>
2021-02-04 22:41:34,902 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-maskrcnn_1.0-stdout
2021-02-04 22:41:35,081 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Listening on port: /tmp/.ts.sock.9001
2021-02-04 22:41:35,081 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - [PID]5446
2021-02-04 22:41:35,081 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Torch worker started.
2021-02-04 22:41:35,082 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Python runtime: 3.8.5
2021-02-04 22:41:35,082 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2021-02-04 22:41:35,082 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2021-02-04 22:41:35,084 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Connection accepted: /tmp/.ts.sock.9001.
2021-02-04 22:41:35,085 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Backend worker process died.
2021-02-04 22:41:35,086 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Traceback (most recent call last):
2021-02-04 22:41:35,086 [INFO ] epollEventLoopGroup-5-2 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2021-02-04 22:41:35,086 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/model_loader.py", line 84, in load
2021-02-04 22:41:35,087 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2021-02-04 22:41:35,088 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
    at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133)
    at java.base/java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432)
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:129)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
2021-02-04 22:41:35,088 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: maskrcnn, error: Worker died.
2021-02-04 22:41:35,088 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-maskrcnn_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2021-02-04 22:41:35,088 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stderr
2021-02-04 22:41:35,088 [WARN ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-maskrcnn_1.0-stdout
2021-02-04 22:41:35,088 [INFO ] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 89 seconds.
2021-02-04 22:41:35,089 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     module = importlib.import_module(module_name)
2021-02-04 22:41:35,090 [INFO ] W-9001-maskrcnn_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stderr
2021-02-04 22:41:35,091 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-maskrcnn_1.0-stdout
alvarobartt commented 3 years ago

Yeah @nagamanikandank, the error seems to be coming from the object_detector Torch handler, but I'm not sure yet what's causing it... I'll dig into this issue and let you know what I find! 💪🏻 And, in the best-case scenario, a solution.
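For context, `--handler object_detector` names one of TorchServe's built-in handlers, and the traceback above shows the model loader resolving it with `importlib` (first the bare module name, then, judging by the "During handling of the above exception" line, a fallback). A rough, hypothetical sketch of that resolution logic — this is not TorchServe's actual code — which reproduces the `ModuleNotFoundError` whenever the `ts` package is broken or missing:

```python
import importlib


def resolve_handler(handler_name: str):
    """Roughly mimic how a model loader might resolve a built-in handler
    name: try the bare module first, then fall back to the packaged
    ts.torch_handler.<name> module (hypothetical sketch)."""
    try:
        return importlib.import_module(handler_name)
    except ModuleNotFoundError:
        return importlib.import_module(f"ts.torch_handler.{handler_name}")


# Without a working torchserve install, both imports fail, matching the
# "No module named 'object_detector'" seen in the worker log above.
try:
    resolve_handler("object_detector")
    status = "handler found"
except ImportError:  # ModuleNotFoundError is a subclass
    status = "handler missing"
print(status)
```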

alvarobartt commented 3 years ago

Did you install torch, @nagamanikandank? Besides the ModuleNotFoundError for object_detector, there also seems to be another one for torch itself:

2021-02-04 22:40:39,991 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'torch'
alvarobartt commented 3 years ago

Please test this and let me know if it works:

docker pull ubuntu:18.04
docker run -it ubuntu:18.04 /bin/bash
cd home/
apt-get update
apt-get install python3 python3-dev python3-pip openjdk-11-jre-headless git wget curl -y
python3 -m pip install torch torchvision torch-model-archiver torchserve==0.2.0
git clone https://github.com/pytorch/serve
mv serve/examples/object_detector object_detector
rm -rf serve/
wget https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth
torch-model-archiver --model-name maskrcnn --version 1.0 --model-file object_detector/maskrcnn/model.py --serialized-file maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth --handler object_detector --extra-files object_detector/index_to_name.json
mkdir model_store
mv maskrcnn.mar model_store/
torchserve --start --model-store model_store --models maskrcnn=maskrcnn.mar

And then, once deployed, share with me the output of the following commands:

pip show torch
pip show torchserve
ls -la /usr/local/lib/python3.8/dist-packages/ts/torch_handler/

Thank you!

nagamanikandank commented 3 years ago

@alvarobartt Thanks for the quick reply!

You are correct: PyTorch is not getting installed correctly, and I don't know why.

Attaching the commands I am executing to install PyTorch. At the end of the install it shows Killed, and I am not sure why that happens:

root@8652d3dcc21f:/# pip3 install torch
Collecting torch
  Downloading torch-1.7.1-cp38-cp38-manylinux1_x86_64.whl (776.8 MB)
     |████████████████████████████████| 776.8 MB 7.5 MB/s eta 0:00:01Killed
root@8652d3dcc21f:/# pip3 show torch
WARNING: Package(s) not found: torch
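That `Killed` line is usually the Linux kernel's OOM killer terminating pip: downloading and caching the ~777 MB torch wheel can exceed the memory or disk available to a slim container, which is why `--no-cache-dir` helps. Two quick things worth checking (a sketch; `dmesg` may need privileges inside Docker, so it is guarded):

```shell
# Where pip keeps its download cache -- the extra copy that --no-cache-dir skips
python3 -m pip cache dir

# Look for recent OOM kills; may require privileges, so don't fail if empty
(dmesg 2>/dev/null | grep -i 'out of memory' | tail -n 3) || true
```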
alvarobartt commented 3 years ago

Try:

python3 -m pip install torch
python3 -m pip show torch
rm -rf /home/logs

And then start TorchServe again!

nagamanikandank commented 3 years ago

@alvarobartt I re-installed Torch with following command python3 -m pip install torch --no-cache-dir

I started torch serve after that and there was no error!!!

But the prediction is failing with the persons.jpg provided in the examples folder.

The error is:

root@8652d3dcc21f:/home# curl http://127.0.0.1:8080/predictions/maskrcnn -T persons.jpg  
{  
  "code": 503,
  "type": "InternalServerException",
  "message": "Prediction failed"
}

Errors from the log file when trying to run inference:

2021-02-04 23:25:42,526 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/service.py", line 100, in predict
2021-02-04 23:25:42,527 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     ret = self._entry_point(input_batch, self.context)
2021-02-04 23:25:42,527 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/base_handler.py", line 125, in handle
2021-02-04 23:25:42,527 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     data = self.preprocess(data)
2021-02-04 23:25:42,526 [DEBUG] W-9001-maskrcnn_1.0 org.pytorch.serve.wlm.Job - Waiting time ns: 171900, Inference time ns: 96178532
2021-02-04 23:25:42,531 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/vision_handler.py", line 21, in preprocess
2021-02-04 23:25:42,532 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     image = Image.open(io.BytesIO(image))
2021-02-04 23:25:42,533 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/PIL/Image.py", line 2958, in open
2021-02-04 23:25:42,533 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     raise UnidentifiedImageError(
2021-02-04 23:25:42,534 [INFO ] W-9001-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0e39877ea0>
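A `PIL.UnidentifiedImageError` means the bytes reaching PIL do not start with any known image signature — often a sign that the file on disk is not actually a JPEG (for example, an HTML error page saved under a `.jpg` name by a failed download). A quick stdlib check of the magic bytes (`sniff_image_type` is a hypothetical helper; the JPEG/PNG signatures are standard):

```python
def sniff_image_type(data: bytes) -> str:
    """Identify a file's real type from its leading magic bytes
    (hypothetical helper; JPEG and PNG signatures only)."""
    if data[:3] == b"\xff\xd8\xff":
        return "jpeg"
    if data[:8] == b"\x89PNG\r\n\x1a\n":
        return "png"
    return "unknown"


# An HTML error page masquerading as persons.jpg comes back "unknown",
# which is exactly the case that trips PIL's Image.open.
print(sniff_image_type(b"<html><body>404</body></html>"))   # unknown
print(sniff_image_type(b"\xff\xd8\xff\xe0" + b"\x00" * 16))  # jpeg
```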
alvarobartt commented 3 years ago

Can you try:

curl -X POST http://127.0.0.1:8080/predictions/maskrcnn -T persons.jpg

Maybe the documentation is out of date... let me know if that works and I'll open a PR to update it! 🔥

nagamanikandank commented 3 years ago

@alvarobartt ,

Same error when I run curl -X POST http://127.0.0.1:8080/predictions/maskrcnn -T persons.jpg

Attaching the error

2021-02-04 23:39:39,221 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/service.py", line 100, in predict
2021-02-04 23:39:39,223 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     ret = self._entry_point(input_batch, self.context)
2021-02-04 23:39:39,223 [DEBUG] W-9000-maskrcnn_1.0 org.pytorch.serve.wlm.Job - Waiting time ns: 68200, Inference time ns: 4311001
2021-02-04 23:39:39,223 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/base_handler.py", line 125, in handle
2021-02-04 23:39:39,225 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     data = self.preprocess(data)
2021-02-04 23:39:39,225 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/vision_handler.py", line 21, in preprocess
2021-02-04 23:39:39,227 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     image = Image.open(io.BytesIO(image))
2021-02-04 23:39:39,227 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -   File "/usr/local/lib/python3.8/dist-packages/PIL/Image.py", line 2958, in open
2021-02-04 23:39:39,227 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle -     raise UnidentifiedImageError(
2021-02-04 23:39:39,228 [INFO ] W-9000-maskrcnn_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f36112e5e00>
alvarobartt commented 3 years ago

Mmm, that's a PIL error, but if you are using the image provided in the docs you should be fine! Anyway, you can also try this Python code to test whether the error is coming from the Torch handler or from the Inference API.

Install the requirements:

python3 -m pip install opencv-python pillow requests

Run this Python code:

# Download a sample image from the available examples
import urllib.request

url, filename = ("https://raw.githubusercontent.com/pytorch/serve/master/examples/object_detector/persons.jpg", "persons.jpg")
urllib.request.urlretrieve(url, filename)

# Re-encode the image as PNG bytes (ruling out a corrupt source file)
import cv2
from PIL import Image
from io import BytesIO

image = Image.fromarray(cv2.imread(filename))
image2bytes = BytesIO()
image.save(image2bytes, format="PNG")
image2bytes.seek(0)
image_as_bytes = image2bytes.read()

# Send the HTTP POST request to TorchServe
import requests

req = requests.post("http://localhost:8080/predictions/maskrcnn", data=image_as_bytes)
if req.status_code == 200:
    print(req.json())
nagamanikandank commented 3 years ago

@alvarobartt I had to redownload the image and it worked !!!

Attaching all the steps that worked for me. Thanks a lot !!!

docker pull ubuntu:18.04
docker run -it -p 9080:8080 -p 9081:8081 -p 9082:8082 -p 9070:7070 -p 9071:7071 --name maskrcnn_pytorch ubuntu:18.04
cd home/
apt-get update
apt-get install python3 python3-dev python3-pip openjdk-11-jre-headless git wget curl -y
python3 -m pip install torch --no-cache-dir
python3 -m pip install torchvision --no-cache-dir
python3 -m pip install torch-model-archiver --no-cache-dir
python3 -m pip install torchserve==0.2.0 --no-cache-dir
pip3 show torch
pip3 show torchvision
pip3 show torchserve
git clone https://github.com/pytorch/serve
wget https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth
torch-model-archiver --model-name maskrcnn --version 1.0 --model-file /home/serve/examples/object_detector/maskrcnn/model.py --serialized-file /home/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth --handler object_detector --extra-files /home/serve/examples/object_detector/index_to_name.json
mkdir model_store
mv maskrcnn.mar model_store/
python3 -m pip install opencv-python pillow requests --no-cache-dir
torchserve --start --model-store /home/model_store --models maskrcnn=maskrcnn.mar

===================================
In a new Docker CLI session attached to the maskrcnn_pytorch container:
===================================
curl -X GET http://127.0.0.1:8081/models
cd /home/serve/examples/object_detector
curl -X POST http://127.0.0.1:8080/predictions/maskrcnn -T persons.jpg
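The final curl call can also be issued from Python with nothing beyond the standard library — a sketch assuming the same host, port, and model name as the steps above:

```python
import urllib.error
import urllib.request


def predict(image_path: str,
            url: str = "http://127.0.0.1:8080/predictions/maskrcnn") -> str:
    """POST raw image bytes to the TorchServe inference API, mirroring
    `curl -T persons.jpg`. Returns the response body, or an error
    description when no server is listening (sketch; the endpoint is
    assumed from the steps above)."""
    with open(image_path, "rb") as f:
        payload = f.read()
    req = urllib.request.Request(url, data=payload, method="POST")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.read().decode()
    except (urllib.error.URLError, OSError) as exc:
        return f"request failed: {exc}"
```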
alvarobartt commented 3 years ago

Cool! I'm glad I could help you 👍🏻 I think you can close the issue now, or maybe @dhanainme wants to add anything else before you do so.

msaroufim commented 3 years ago

@alvarobartt thank you so much for patiently helping out folks - we really appreciate it!