Hello, I am trying to deploy a 3D segmentation model that operates on ~9MB binary data inputs, so I appreciate the recent option to increase max_request_size in cb174b08a3a96b60f3fff0a5eb80fd1b86fe381d. I updated my mms installation to 1.0.2, added max_request_size=10000000 to config.properties, and confirmed this setting is recognized in the mms startup log. However, I am still getting the following error trace upon POSTing an inference request:
It looks like it may still be hardcoded in one spot?
https://github.com/awslabs/mxnet-model-server/blob/09c7665d5991727d55b411017257f8d716359448/mms/protocol/otf_message_handler.py#L21
https://github.com/awslabs/mxnet-model-server/blob/09c7665d5991727d55b411017257f8d716359448/mms/protocol/otf_message_handler.py#L125-L129
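For illustration, here is a minimal sketch of the failure mode I suspect: a module-level constant capping the payload size inside the protocol handler, which config.properties never reaches. The names `MAX_BUFFER_SIZE`, `retrieve_buffer`, and the 6553500 default are my assumptions for the sketch, not the actual MMS source.

```python
import io

# Assumed hardcoded cap (~6.5 MB), independent of max_request_size
# in config.properties -- this is the hypothesis, not verified code.
MAX_BUFFER_SIZE = 6553500

def retrieve_buffer(stream, length, max_buffer_size=MAX_BUFFER_SIZE):
    """Read a length-prefixed payload, rejecting oversized requests."""
    if length > max_buffer_size:
        raise ValueError(
            "Payload of %d bytes exceeds limit of %d" % (length, max_buffer_size)
        )
    return stream.read(length)

payload = b"\x00" * 9_000_000  # ~9 MB binary input, like my use case

# Fails against the hardcoded cap even though config allows 10 MB:
try:
    retrieve_buffer(io.BytesIO(payload), len(payload))
except ValueError as e:
    print("rejected:", e)

# Succeeds once the configured limit is actually threaded through:
data = retrieve_buffer(
    io.BytesIO(payload), len(payload), max_buffer_size=10_000_000
)
print("read %d bytes" % len(data))
```

If that hypothesis is right, the fix would be plumbing the configured `max_request_size` value into the handler instead of the constant.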
Thanks for looking into this. -John