patlevin / tfjs-to-tf

A TensorFlow.js Graph Model Converter
MIT License

MobilenetV1 conversion from tfjs graph model to frozen model problem with quantization #26

Closed pjaholkowski closed 3 years ago

pjaholkowski commented 3 years ago

Hi

I have some models (not mine) in the tfjs graph model format: MobilenetV1 in float, plus versions quantized to int16 and int8. The original float model converts fine, but the two quantized ones are converted improperly. You can see this by comparing the original graph model with the generated model in Netron. I've attached the original graph model.

original_model converted_model mobilenet_quant2_075_stride16.zip
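For reference, the conversion in question can be run with this project's command-line tool. This is a sketch of typical usage (the file names are placeholders matching the attachment, not verified paths):

```shell
# Install the converter (assumes a working Python environment)
pip install tfjs-graph-converter

# Convert a TFJS graph model (directory containing model.json and
# weight shards) into a TensorFlow frozen graph
tfjs_graph_converter mobilenet_quant2_075_stride16/model.json frozen_model.pb
```

The resulting frozen graph can then be opened in Netron alongside the original for comparison.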

patlevin commented 3 years ago

This seems to be an issue with tensorflowjs.

The reason for the difference is data loss during de-quantisation of the weights, which leads to all packed bias values being zero after loading. The optimiser will then happily drop the BiasAdd operation, since it detects a no-op due to the bias values being zero.
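To make the failure mode concrete, here is a hedged illustration (not the actual tensorflowjs code) of how affine quantization loses precision: when tiny bias values share a quantization range with much larger weights, the round trip collapses all of them into a single bucket, so the distinct bias values no longer survive de-quantisation.

```python
import numpy as np

def quantize(values: np.ndarray, bits: int = 8):
    """Affine-quantize a float tensor to unsigned integers using
    the tensor's own (min, max) range, as typical uint8 schemes do."""
    lo, hi = float(values.min()), float(values.max())
    scale = (hi - lo) / (2 ** bits - 1)
    q = np.round((values - lo) / scale).astype(np.uint8)
    return q, lo, scale

def dequantize(q: np.ndarray, lo: float, scale: float):
    """Map quantized integers back to floats."""
    return q.astype(np.float32) * scale + lo

# Hypothetical values: weights spanning [-1, 1] packed together with
# biases that are orders of magnitude smaller.
weights = np.array([-1.0, 1.0, 0.5, -0.25], dtype=np.float32)
biases = np.array([1e-4, 2e-4, 3e-4], dtype=np.float32)
packed = np.concatenate([weights, biases])

q, lo, scale = quantize(packed)
restored = dequantize(q, lo, scale)

# The quantization step (~2/255 here) is far larger than the biases,
# so all three distinct bias values land in the same bucket:
print(np.unique(restored[-3:]))
```

The actual tensorflowjs bug was in the loader's de-quantisation logic rather than in the scheme itself, but the effect is the same class of problem: the restored bias tensor no longer carries usable information, which is what lets the optimiser treat BiasAdd as a no-op.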

I will do some further investigating and fix the problem in tensorflowjs if possible.

patlevin commented 3 years ago

I have identified the problem in tensorflowjs and will submit a pull request to fix it.

I'll let you know once the tensorflowjs team has merged the changes. I could add a workaround here, but I feel it's better to fix the problem at its root, especially since that will benefit every user of tensorflowjs.

Thanks for bringing the problem to my attention, though!

pjaholkowski commented 3 years ago

OK, thank you

patlevin commented 3 years ago

The fix for tensorflowjs has been approved and a future version (maybe even the next one?) will solve the issue.

It took a while due to a US holiday, but the issue eventually got sorted. It's not merged into the master branch yet, so I'll keep this issue open.

pjaholkowski commented 3 years ago

That's good news. Thank you for letting me know

patlevin commented 3 years ago

The bugfix has just been merged into TFJS master: Pull Request.

It should be part of the next TFJS release, which will resolve this issue. Alternatively, you can install the TFJS master version to get the fix immediately.