mozilla / DeepSpeech

DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Raspberry Pi 4 to high power GPU servers.
Mozilla Public License 2.0

Feature request: Full integer quantization for tflite: Coral edge TPU compatibility #2332

Open jacobjennings opened 5 years ago

jacobjennings commented 5 years ago

Following the instructions from https://coral.withgoogle.com/docs/edgetpu/compiler/, I downloaded the pretrained model for DeepSpeech 0.5.1 and ran:

edgetpu_compiler output_graph.tflite

Feature request

I picked up the Coral USB ML accelerator which can run inference on tflite models with additional restrictions:

https://coral.withgoogle.com/products/accelerator
https://coral.withgoogle.com/docs/edgetpu/models-intro/

"Note: Starting with our July 2019 release (v12 of the Edge TPU runtime), the Edge TPU supports models built with TensorFlow's post-training quantization, but only when using full integer quantization (you must use the TensorFlow 1.15 "nightly" build and set both the input and output type to uint8). Previously, we supported only quantization-aware training, which uses "fake" quantization nodes to simulate the effect of 8-bit values during training. So although you now have the option to use post-training quantization, keep in mind that quantization-aware training generally results in a higher accuracy model because it makes the model more tolerant of lower precision values."


deepspeech/deepspeech-0.5.1-models$ edgetpu_compiler output_graph.tflite
Edge TPU Compiler version 2.0.258810407
INFO: Initialized TensorFlow Lite runtime.
Invalid model: output_graph.tflite
Model not quantized
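The "Model not quantized" error means the compiler only accepts models produced with full-integer quantization. For reference, the TF 1.15 post-training recipe from the Coral docs looks roughly like the sketch below; the input/output array names and the representative-dataset shape are placeholders, not DeepSpeech's actual ones, and this is not guaranteed to work for this model:

```python
import numpy as np
import tensorflow as tf  # TF 1.15

# Hypothetical tensor names -- DeepSpeech's frozen graph uses different ones.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    "output_graph.pb",
    input_arrays=["input_node"],
    output_arrays=["logits"],
)

def representative_dataset():
    # Should yield real feature windows; random data is only a placeholder.
    for _ in range(100):
        yield [np.random.rand(1, 16, 19, 26).astype(np.float32)]

converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force every op to int8 and set uint8 I/O, as the Edge TPU requires.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

open("output_graph_quant.tflite", "wb").write(converter.convert())
```

If any op in the graph has no int8 kernel, `convert()` fails, which is the incompatibility discussed below.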

jacobjennings commented 5 years ago

If it helps, I would be happy to fund a Coral USB for you to try. jacob.r.jennings@gmail.com I think there's potential to cross the real-time barrier on a Pi with this thing. Would be fun for my DIY projects.

lissyx commented 5 years ago

I've already tried to get that working, but the set of ops supported by the Edge TPU doesn't cover our current model, so they're incompatible. Please see the existing threads on Discourse, as well as the NNAPI and GPU delegation issues on GitHub.

jacobjennings commented 5 years ago

Unfortunate. Thanks for the info.

lissyx commented 5 years ago

Yeah, don't worry, I'd like to get it working, so I'll keep testing when I have some spare cycles.

rhamnett commented 4 years ago

Given we are now using TF 1.15, do you think it would be possible to try the quantization again? What is the current output type, is it uint8?

lissyx commented 4 years ago

> Given we are now using TF 1.15, do you think it would be possible to try the quantization again? What is the current output type, is it uint8?

I used 1.15 during my previous experiments