larq / compute-engine

Highly optimized inference engine for Binarized Neural Networks
https://docs.larq.dev/compute-engine
Apache License 2.0

Fix default ranges in saved model converter #671

Closed · Tombana closed 3 years ago

Tombana commented 3 years ago

What do these changes do?

This fixes the default int8 ranges in the saved model converter. I've added comments to the code explaining what causes the issue and how it is solved.
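For context, a minimal sketch of the kind of conversion this code path covers, assuming the LCE Python converter's `convert_saved_model` entry point and its `experimental_default_int8_range` argument (names and exact signatures may differ between versions; the paths and range values are placeholders):

```python
import larq_compute_engine as lce

# Convert a SavedModel to a TFLite flatbuffer. When no calibration data is
# available, the (min, max) tuple below is used as the fallback quantization
# range for ops without recorded statistics; these defaults are what this PR
# fixes in the saved model converter.
tflite_bytes = lce.convert_saved_model(
    "path/to/saved_model",
    experimental_default_int8_range=(-3.0, 3.0),
)

with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)
```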

How Has This Been Tested?

Tested manually.