Closed FabriceAuz closed 3 years ago
Hi, this should be fixed in the latest commit. The issue was that the variance of the BatchNorm was < 1e-12 for all output channels, which caused a division by zero. The export should now run correctly. However, a variance < 1e-12 on every channel suggests there may be a problem with the network's parameters. Was the network correctly trained? Closing for now; please feel free to re-open the issue if you feel it is not properly solved.
I tried to export a trained MobileNet network using the command line
./n2d2 "./MobileNet_v1_batchnorm_export_DNeuro.ini" -seed 1 -test -w /dev/null -export DNeuro_V2 -fuse -nbbits 8 -calib -1 -act-clipping-mode KL-Divergence -db-export 10 -export-parameters DNeuro1.ini
but I get a floating point exception. Here is the complete output from n2d2: