Open nonetrix opened 11 months ago
Just got a bare metal Pixel 7 Pro running GrapheneOS (Android 13), and it's the same for me. I assume the main reason is that the model I chose for Local Diffusion is not optimized to run on NNAPI.
The next step to try will be removing the CPU_DISABLED flag for NNAPI; if that doesn't work, a model that runs well on NNAPI will need to be built.
This may also be connected to an onnxruntime issue: https://github.com/microsoft/onnxruntime/issues/12793
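For reference, removing the flag via the ONNX Runtime Java/Kotlin API might look roughly like this. This is just a sketch, not the app's actual code; the `createNnapiSession` function name and the choice of remaining flags are illustrative:

```kotlin
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession
import ai.onnxruntime.providers.NNAPIFlags
import java.util.EnumSet

// Hypothetical helper: builds a session with the NNAPI execution provider
// enabled but WITHOUT NNAPIFlags.CPU_DISABLED, so operators that the
// device's accelerator cannot handle may fall back to NNAPI's CPU
// implementation instead of failing to load the model outright.
fun createNnapiSession(modelPath: String): OrtSession {
    val env = OrtEnvironment.getEnvironment()
    val options = OrtSession.SessionOptions()
    // USE_FP16 is kept here only as an example of another NNAPI flag;
    // whether it helps (or hurts accuracy) depends on the device.
    options.addNnapi(EnumSet.of(NNAPIFlags.USE_FP16))
    return env.createSession(modelPath, options)
}
```

Whether this avoids the crash on Tensor-based Pixels would still need to be verified on-device, since the fallback path depends on the vendor's NNAPI driver.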
It is working on my Pixel 7, Android 14, app version 0.5.2-dev. It works at 384x512, but crashes at 512x512.
I am unable to test if this has been fixed on my device due to https://github.com/ShiftHackZ/Stable-Diffusion-Android/issues/100
i can confirm NNAPI works on the galaxy fold 4.
Still an issue, unfortunately.
I am starting to think it might be a white list or black list issue, not sure though
Describe the bug When attempting to use NNAPI on at least my device (Pixel 6a), it fails to run the model on the target device for whatever reason. I suspect this is due to the Tensor chips found in Pixel devices; it's an obviously rare and unusual CPU/TPU (apparently based on Samsung Exynos, but with a cut-down TPU added), so I would imagine that's the cause. I could be wrong, though, because, as the app notes, this is an experimental feature on an already experimental backend, so it might be unrelated to the TPU.
To Reproduce Steps to reproduce the behavior:
Expected behavior It is able to load the model and, in theory, run much faster.
Screenshots![Screenshot_20230808-173608](https://github.com/ShiftHackZ/Stable-Diffusion-Android/assets/45698918/ddc623fd-3853-4681-b943-2de2cc6c3400)
Smartphone (please complete the following information):