triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

Exception: expecting input to have 3 dimensions, model 'densenet_onnx' input has 4 #6529

Closed mahesh11T closed 11 months ago

mahesh11T commented 11 months ago

    python /workspace/client/src/python/examples/image_client.py image_filename=/workspace/images/mug.jpg -m densenet_onnx -b=0 -s=INCEPTION -v


    if len(input_metadata.shape) != expected_input_dims:
        raise Exception(
            "expecting input to have {} dimensions, model '{}' input has {}".format(
                expected_input_dims, model_metadata.name, len(input_metadata.shape)
            )
        )

Line 116 in image_client.py is expecting a 3-dimensional shape (3, 2xx, 2xx), but it is getting a 4-dimensional shape (1, 3, 2xx, 2xx). Why is this happening?
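For context, the check quoted above derives the expected number of input dimensions from the model's `max_batch_size`: when batching is enabled, the batch dimension is reported as part of the input shape, so one extra dimension is expected. A minimal sketch of that logic (function names here are illustrative, not the client's actual helpers):

```python
def expected_rank(max_batch_size: int) -> int:
    # When max_batch_size > 0, the batch dimension is reported as part of
    # the model's input shape, so a rank-4 (N, C, H, W) shape is expected.
    # When max_batch_size == 0, a plain rank-3 (C, H, W) shape is expected.
    return 3 + (1 if max_batch_size > 0 else 0)


def check_shape(shape, max_batch_size):
    # Mirrors the exception raised in image_client.py when the reported
    # input shape does not match the expected rank.
    if len(shape) != expected_rank(max_batch_size):
        raise Exception(
            "expecting input to have {} dimensions, input has {}".format(
                expected_rank(max_batch_size), len(shape)
            )
        )


# The reported situation: a 4-dimensional metadata shape (1, 3, H, W)
# while max_batch_size is 0, so the client expects only 3 dimensions.
# (224x224 is used here only as an illustrative image size.)
check_shape([3, 224, 224], 0)      # passes
# check_shape([1, 3, 224, 224], 0) # would raise the quoted Exception
```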

Yobol commented 11 months ago

Hi. You can modify the 'max_batch_size' param in config.pbtxt of densenet_onnx. The default value is 0; it must be greater than 0. https://github.com/triton-inference-server/server/blob/main/docs/examples/model_repository/densenet_onnx/config.pbtxt
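For reference, `max_batch_size` is a top-level field in config.pbtxt. A minimal sketch of where it sits (field values here are illustrative; see the linked example file for the actual configuration):

```
name: "densenet_onnx"
platform: "onnxruntime_onnx"
max_batch_size: 1   # 0 means the model does not support batching
```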

mahesh11T commented 11 months ago

> Hi. You can modify the 'max_batch_size' param in config.pbtxt of densenet_onnx. The default value is 0; it must be greater than 0. https://github.com/triton-inference-server/server/blob/main/docs/examples/model_repository/densenet_onnx/config.pbtxt

I had set it to 1 itself and it was giving this error.

Yobol commented 11 months ago

I encountered this error yesterday. I solved it after finding that config.pbtxt was missing.
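A quick way to rule out the missing-file case is to check the model repository layout: Triton expects `<repo>/<model>/config.pbtxt` alongside numbered version directories such as `<repo>/<model>/1/model.onnx`. A small sketch (the `has_config` helper is hypothetical, not part of any Triton tooling):

```python
import os


def has_config(model_repo: str, model: str) -> bool:
    # Triton looks for <repo>/<model>/config.pbtxt; if it is absent, the
    # server may fail to load the model or fall back to an auto-generated
    # configuration, depending on how the server was started.
    return os.path.isfile(os.path.join(model_repo, model, "config.pbtxt"))
```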