Closed: lauracanalini closed this issue 3 years ago
Hi, this is now solved in the develop branch. There was a problem with the UpSampling layer: the ONNX Upsample operator is deprecated, so we export UpSampling as an ONNX Resize operator (its equivalent). When the model is loaded back, the operator found is a Resize, so it is imported as a Resize layer, and its operation is not exactly the same as UpSampling's, which explains the differences in the predictions.
After the fix: the UpSampling layer is now a wrapper around the Resize layer, which performs the correct operation, so there are no longer any differences after importing the model.
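To make the mismatch concrete, here is a minimal numpy sketch (not EDDL code) of nearest-neighbour resizing under two of the `coordinate_transformation_mode` values defined for the ONNX Resize operator. The `asymmetric` mode matches the old Upsample semantics; `half_pixel` is a different convention, and with a floor-style index rounding (used here purely for illustration) it produces an output shifted by one pixel, the same kind of translation reported below:

```python
import numpy as np

def nearest_resize_1d(x, scale, mode):
    # Illustrative nearest-neighbour resize along one axis, comparing two
    # ONNX Resize coordinate_transformation_mode settings.
    # Index rounding is fixed to floor here for simplicity; real Resize
    # implementations also expose a configurable nearest_mode.
    n_out = int(len(x) * scale)
    out = np.empty(n_out, dtype=x.dtype)
    for i in range(n_out):
        if mode == "asymmetric":      # src = i / scale (old Upsample behaviour)
            src = i / scale
        elif mode == "half_pixel":    # src = (i + 0.5) / scale - 0.5
            src = (i + 0.5) / scale - 0.5
        else:
            raise ValueError(mode)
        idx = int(np.floor(src))
        out[i] = x[min(len(x) - 1, max(0, idx))]  # clamp to valid range
    return out

x = np.array([10, 20, 30, 40])
print(nearest_resize_1d(x, 2, "asymmetric"))  # [10 10 20 20 30 30 40 40]
print(nearest_resize_1d(x, 2, "half_pixel"))  # [10 10 10 20 20 30 30 40]
```

The second output is the first one shifted by one position, which is why predictions from the re-imported model looked translated rather than randomly wrong.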
Thanks for spotting the issue!
Describe the bug
Working with the use-case-pipeline, we noticed unexpected behavior, apparently due to the UpSampling layer. We reproduced the issue in EDDL by modifying your test_onnx_upsample2D test. If we train a network, evaluate it, save it to ONNX, then import the ONNX file just saved and re-evaluate it, we obtain different predictions, so the computed metric differs slightly between runs.
If we replace the UpSampling layer with the Resize layer, the re-evaluation gives correct results instead. Of course, we can't really use the Resize layer, because we want to work with scale factors in the network rather than fixed output sizes.
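As a stopgap while the fix was pending, one could in principle translate a scale factor into the absolute output size that a size-based Resize layer expects. This is a hypothetical helper, not part of the EDDL API:

```python
def scaled_size(in_size, scale_factors):
    # Hypothetical helper: convert per-axis scale factors into the absolute
    # output size a size-based Resize layer expects, keeping scale-factor
    # semantics on top of a fixed-size Resize.
    return [int(s * f) for s, f in zip(in_size, scale_factors)]

print(scaled_size([64, 64], [2, 2]))  # [128, 128]
```

The drawback, as noted above, is that the size must be recomputed for every input shape, which is exactly what a scale-factor UpSampling layer avoids.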
We tested both the master and develop EDDL branches, with both CUDA and cuDNN.
To Reproduce Steps to reproduce the behavior:
Screenshots
Qualitatively, in the pipeline, after importing SegNet with UpSampling layers from the ONNX file, the predictions appear to be translated (spatially shifted).