whizzmobility / models

Models and examples built with TensorFlow. Added support for more recent SOTA, and implementing on private datasets.
Apache License 2.0

[Feature] Quantization aware training #2

Open rehohoho opened 3 years ago

rehohoho commented 3 years ago

Prerequisites

Please answer the following questions for yourself before submitting an issue.

1. The entire URL of the file you are using

https://github.com/tensorflow/models/tree/master/official/vision/beta

2. Describe the feature you request

Add a workflow for quantization-aware training. Available research implementations can be used as reference, but make sure it is based on https://www.tensorflow.org/model_optimization/guide/quantization/training_comprehensive_guide#define_quantization_aware_model

3. Additional context

May yield major speedups (up to 4x) and allow running on the NNAPI or Hexagon delegates. https://www.tensorflow.org/lite/performance/delegates#delegates_by_model_type Tracking from whizzscooters/data-engine#114.
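For context, the NNAPI and Hexagon delegates need an integer-quantized TFLite flatbuffer. A hedged sketch of the conversion step (the model is a placeholder; in practice it would be the trained QAT model, and full-integer conversion would also supply a representative dataset):

```python
import tensorflow as tf

# Placeholder trained model; in practice this is the QAT-trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable quantization
tflite_model = converter.convert()  # flatbuffer bytes for the TFLite runtime

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```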

4. Are you willing to contribute it? (Yes or No)

rehohoho commented 3 years ago

Current difficulty: there is little support for subclassed models, so quantization annotations have to be applied recursively by hand.

rehohoho commented 3 years ago

Think this is better done once TensorFlow supports quantization-aware training more fully. Moving to not urgent, since thermal issues are not that bad now.