Closed uikilin100 closed 3 years ago
Hi @uikilin100, in 18.08 we added experimental support for Model.relaxComputationFloat32toFloat16,
as documented here: https://source.android.com/devices/interaction/neural-networks. That support was improved in 18.11. The last time I tested it, it still did not pass all of the VTS tests when enabled, but we have been working actively with Google on this issue and I hope it will pass them soon.
There will be further improvements to FP16 in the 19.11 release, both for Model.relaxComputationFloat32toFloat16
and for direct use of FP16 at the NNAPI level.
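For reference, the relaxed FP32-to-FP16 mode is something an application opts into through the NNAPI C API before finishing the model; a minimal sketch, assuming the Android NDK's NeuralNetworks.h header (API level 28+, which is where this call was introduced):

```c
/* Sketch: enabling relaxed FP32->FP16 computation via the NNAPI C API.
 * Requires the Android NDK (API level 28+); error handling abbreviated. */
#include <android/NeuralNetworks.h>

int enable_relaxed_fp16(ANeuralNetworksModel* model) {
    /* Allow drivers to execute FP32 operations with FP16 range and
     * precision. Must be called before ANeuralNetworksModel_finish(). */
    return ANeuralNetworksModel_relaxComputationFloat32toFloat16(model, true);
}
```

Whether the driver actually uses FP16 internally is then up to the backend, which is where the improvements above come in.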
Closing this old issue; I hope this was helpful.
In the 18.08 release notes, I can see FP16 support. What is the difference between this version's FP32 and FP16 support and the previous version's?