Open starsky opened 3 years ago
I think this is an issue with your PyTorch model. Each of the following lines causes a cast error:

```python
traced_model(np.random.randint(0, 255, (500, 500, 3)))
traced_model(np.random.randint(0, 255, (200, 200, 3)))
traced_model(np.random.randint(0, 255, (1000, 1000, 3)))
```
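The cast error here comes from passing raw NumPy integer arrays to a traced module, which expects float tensors. A minimal sketch of an invocation that avoids it (the model below is a stand-in, since the original model code is not shown):

```python
import numpy as np
import torch

# Stand-in for the traced model (assumption: the real model takes a
# float NCHW tensor input).
traced_model = torch.jit.trace(torch.nn.AdaptiveAvgPool2d((2, 2)),
                               torch.rand(1, 3, 500, 500))

# A raw HWC integer array must be cast to a float NCHW tensor first.
img = np.random.randint(0, 255, (500, 500, 3))
x = torch.from_numpy(img).permute(2, 0, 1).unsqueeze(0).float()
out = traced_model(x)
print(tuple(out.shape))  # (1, 3, 2, 2)
```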
🐞Describe the bug
When a PyTorch model containing an adaptive pooling layer is converted with flexible input shapes (`flexible_shape_utils`), the resulting tensor sizes obtained from the adaptive pooling layer are incorrect.

To Reproduce
Trace the model:
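The tracing step can be sketched as follows (a minimal model is assumed for illustration, since the original snippet is not shown). Note that in PyTorch itself the traced adaptive layer still produces a 2x2 output for any input size:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Minimal stand-in for the model (assumption: the real model is larger)."""
    def __init__(self):
        super().__init__()
        # Adaptive pooling: kernel and stride are derived from the input
        # size at run time, so the output is always 2x2.
        self.pool = nn.AdaptiveAvgPool2d(output_size=(2, 2))

    def forward(self, x):
        return self.pool(x)

model = Net().eval()
traced_model = torch.jit.trace(model, torch.rand(1, 3, 500, 500))

# The traced graph keeps the adaptive op, so other input sizes still work.
print(tuple(traced_model(torch.rand(1, 3, 200, 200)).shape))  # (1, 3, 2, 2)
```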
Now I run prediction on the model with a (500, 500) input:
This works fine, since I requested `output_size=(2, 2)` in the PyTorch model. But when I use an input size of (200, 200):
The output is wrong: I get a 1x1 tensor instead of 2x2.

The same happens for the bigger size (1000, 1000): the output is wrong, I get a 4x4 tensor instead of 2x2.
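These sizes match what a static pooling layer would produce: tracing with a 500x500 input bakes in kernel = stride = 250, and applying that fixed layer to other input sizes gives exactly the reported 1x1 and 4x4 outputs. A quick check of the arithmetic:

```python
# Kernel and stride baked in when tracing with a 500x500 input
# and output_size=(2, 2): 500 / 2 = 250.
KERNEL = STRIDE = 250

def fixed_pool_out(h, kernel=KERNEL, stride=STRIDE):
    # Spatial output size of a static pooling layer, clamped to at
    # least 1 when the kernel is larger than the input.
    return max((h - kernel) // stride + 1, 1)

print(fixed_pool_out(500))   # 2, as requested
print(fixed_pool_out(200))   # 1, instead of 2
print(fixed_pool_out(1000))  # 4, instead of 2
```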
The problem is that during conversion the adaptive pooling layer is converted to a regular pooling layer, so the kernel size is fixed rather than computed dynamically from the current input size. I also tried a dynamic conversion with `EnumeratedShapes`, but it fails too (see https://github.com/apple/coremltools/issues/976).

System environment (please complete the following information):