pytorch / serve

Serve, optimize and scale PyTorch models in production
https://pytorch.org/serve/

Using ArgumentParser in a custom handler raises "unrecognized arguments: --sock-type unix --sock-name /tmp/.ts.sock.9000" #3299

Open james-joobs opened 2 months ago

james-joobs commented 2 months ago

🐛 Describe the bug

  1. Using ArgumentParser in a custom handler derived from BaseHandler raises the error "model_service_worker.py: error: unrecognized arguments: --sock-type unix --sock-name /tmp/.ts.sock.9000".

Error logs

model_service_worker.py: error: unrecognized arguments: --sock-type unix --sock-name /tmp/.ts.sock.9000

Installation instructions

python ./ts_scripts/install_dependencies.py --cuda=cu121

Model Packaging

torchserve --start --ncs

config.properties

Nothing

Versions

torch 2.3.0+cu121
torch-model-archiver 0.11.1
torch-workflow-archiver 0.2.14
torchaudio 2.3.0+cu121
torchmetrics 1.4.1
torchserve 0.11.1

Repro instructions

torchserve --start

Possible Solution

Add exception handling in model_service_worker.py for the case where a handler (or one of its dependencies) creates its own ArgumentParser; at the moment there is nothing there that catches or tolerates those argparse calls.

Or replace the argparse usage with arguments passed in as plain variables.
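For illustration, a rough sketch of that second idea (hypothetical code, not taken from the handler; it just keeps the same default values as plain attributes so nothing tries to parse sys.argv inside the TorchServe worker process):

from types import SimpleNamespace

# Hypothetical stand-in for an argparse-based parse_arguments(): the same
# defaults as plain attributes, with no ArgumentParser touching sys.argv
# inside the TorchServe worker process.
def default_settings():
    return SimpleNamespace(
        trained_model="ocr/full_ocr/craft_tr_ocr/craft_mlt_25k.pth",
        text_threshold=0.7,
        low_text=0.4,
        link_threshold=0.4,
        cuda=True,
        canvas_size=1280,
        mag_ratio=1.5,
        test_folder="test_img",
        poly=False,
    )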

mreso commented 2 months ago

Hi @james-joobs thanks for reporting your issue. For a bit more context, could you add a complete log showing your error? What platform are you on? Any information on your custom handler you can share? Are you able to run one of the examples like this one https://github.com/pytorch/serve/tree/master/examples/image_classifier/mnist?

james-joobs commented 2 months ago
import argparse

from ts.torch_handler.base_handler import BaseHandler

def parse_arguments():
    parser = argparse.ArgumentParser(description='CRAFT Text Detection with TrOCR and LayoutLMv3')
    parser.add_argument('--trained_model', default='ocr/full_ocr/craft_tr_ocr/craft_mlt_25k.pth', type=str, help='pretrained CRAFT model')
    parser.add_argument('--text_threshold', default=0.7, type=float, help='text confidence threshold')
    parser.add_argument('--low_text', default=0.4, type=float, help='text low-bound score')
    parser.add_argument('--link_threshold', default=0.4, type=float, help='link confidence threshold')
    parser.add_argument('--cuda', default=True, type=lambda x: x.lower() in ['true', '1'], help='Use cuda for inference')
    parser.add_argument('--canvas_size', default=1280, type=int, help='image size for inference')
    parser.add_argument('--mag_ratio', default=1.5, type=float, help='image magnification ratio')
    parser.add_argument('--test_folder', default='test_img', type=str, help='folder path to input images')
    parser.add_argument('--poly', default=False, action='store_true', help='enable polygon type')
    args = parser.parse_args()
    return args

class CraftTrOCRHandler(BaseHandler):
    def __init__(self):
        super(CraftTrOCRHandler, self).__init__()

        # default fields
        self.initialized = False
        self.device = None
        self.parser = parse_arguments()

With this argument parser in place in the handler, running the CLI command "torchserve --start --ncs ..." produces the following error:

2024-08-23T10:32:58,399 [WARN ] W-9000-ocr_1.0-stderr MODEL_LOG - usage: model_service_worker.py [-h] [--low_text LOW_TEXT]
2024-08-23T10:32:58,400 [WARN ] W-9000-ocr_1.0-stderr MODEL_LOG - [--text_threshold TEXT_THRESHOLD]
2024-08-23T10:32:58,401 [WARN ] W-9000-ocr_1.0-stderr MODEL_LOG - [--link_threshold LINK_THRESHOLD] [--cuda CUDA]
2024-08-23T10:32:58,401 [WARN ] W-9000-ocr_1.0-stderr MODEL_LOG - [--canvas_size CANVAS_SIZE]
2024-08-23T10:32:58,401 [WARN ] W-9000-ocr_1.0-stderr MODEL_LOG - [--mag_ratio MAG_RATIO]
2024-08-23T10:32:58,401 [WARN ] W-9000-ocr_1.0-stderr MODEL_LOG - [--test_folder TEST_FOLDER] [--poly]
2024-08-23T10:32:58,401 [WARN ] W-9000-ocr_1.0-stderr MODEL_LOG - model_service_worker.py: error: unrecognized arguments: --sock-type unix --sock-name /tmp/.ts.sock.9000 --metrics-config /home/ubuntu/miniforge3/envs/ocr310/lib/python3.10/site-packages/ts/configs/metrics.yaml
2024-08-23T10:33:01,217 [INFO ] W-9000-ocr_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9000, pid=2549830
2024-08-23T10:33:01,218 [INFO ] W-9000-ocr_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9000
2024-08-23T10:33:01,227 [INFO ] W-9000-ocr_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniforge3/envs/ocr310/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-08-23T10:33:01,228 [INFO ] W-9000-ocr_1.0-stdout MODEL_LOG - [PID]2549830
2024-08-23T10:33:01,228 [INFO ] W-9000-ocr_1.0-stdout MODEL_LOG - Torch worker started.

After removing these arguments, the error is no longer raised and everything runs fine.
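(A smaller workaround sketch, untested here: keep the parser but call parse_known_args(), which ignores flags it does not define, such as the --sock-type/--sock-name/--metrics-config flags the worker passes.)

import argparse

def parse_arguments():
    parser = argparse.ArgumentParser(description='CRAFT Text Detection with TrOCR and LayoutLMv3')
    parser.add_argument('--text_threshold', default=0.7, type=float, help='text confidence threshold')
    # ... the remaining add_argument calls stay exactly as above ...
    # parse_known_args() returns (known_args, leftover) instead of erroring out
    # on arguments it does not recognize.
    args, _unknown = parser.parse_known_args()
    return args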

mreso commented 2 months ago

Thanks @james-joobs for the additional information. Now it's clearer to me where the issue is. The BaseHandler (or a class derived from it) is not executed directly from the CLI; it's model_service_worker.py that gets called. If you want to give additional parameters to your handler you can use the model_config.yaml file as described here. It's included in the model packaging step and it contains pre-specified elements (like pt2 and parallelism configs), but you can also add custom parameters to it. The BaseHandler reads the file during initialization, and if you do not call super().initialize() in your handler, you can load the file's contents from the request context like this: https://github.com/pytorch/serve/blob/a2ba1c7127b96f4d14e1d79529e1f973c0fde3ee/ts/torch_handler/base_handler.py#L151-L152
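For example, a minimal sketch (the keys under handler: are made-up names for this use case, not predefined TorchServe options; the YAML file is passed to torch-model-archiver via --config-file model-config.yaml):

# model-config.yaml
handler:
  trained_model: ocr/full_ocr/craft_tr_ocr/craft_mlt_25k.pth
  text_threshold: 0.7
  low_text: 0.4

And in the handler:

from ts.torch_handler.base_handler import BaseHandler

class CraftTrOCRHandler(BaseHandler):
    def initialize(self, ctx):
        super().initialize(ctx)  # also stores the parsed YAML on self.model_yaml_config
        # Or, without the super() call, read the parsed YAML straight from the request context:
        cfg = getattr(ctx, "model_yaml_config", {}).get("handler", {})
        self.trained_model = cfg.get("trained_model", "ocr/full_ocr/craft_tr_ocr/craft_mlt_25k.pth")
        self.text_threshold = cfg.get("text_threshold", 0.7)
        self.low_text = cfg.get("low_text", 0.4)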

Going through the docs right now, I am afraid this needs a bit of polish. I'll self-assign the issue and try to document the model_config.yaml file in the next few days. Let me know if this does not fit your use case or if you have further questions.