Hi there! Thank you for the great guide, I found it very useful.
After some trial and error I was able to compile Mediapipe 0.8.4 for 64-bit Raspberry Pi OS. I have two questions though:
1) I had to skip the optimizations in order to compile successfully. From what I understand, some of the compilation options are not needed, as the features they enable are already on by default on aarch64, e.g.
'--copt=-mfpu=neon-vfpv3',
'--copt=-mfloat-abi=hard',
Having these two among the bazel build parameters throws errors saying that the flags are not supported by the compiler (see the sketch below).
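For reference, here is a rough sketch of what the bazel options could look like on aarch64; the flags other than the two quoted above are illustrative from my own setup, not necessarily what the guide uses:

```python
# Hypothetical excerpt of the bazel options list used for the Python wheel build.
# The armhf-specific flags are dropped: on aarch64, NEON and hard-float are
# always available, so -mfpu/-mfloat-abi are rejected by the compiler.
bazel_options = [
    'build',
    '--compilation_mode=opt',            # keep optimizations on
    '--define=MEDIAPIPE_DISABLE_GPU=1',  # CPU-only build on the Pi
    '--copt=-march=armv8-a',             # aarch64 baseline; NEON is implied
    # '--copt=-mfpu=neon-vfpv3',         # armv7/armhf only, not valid on aarch64
    # '--copt=-mfloat-abi=hard',         # armv7/armhf only, not valid on aarch64
]
```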
The inference time is still pretty fast: for example, for Face Mesh I get 18.5 FPS with an average inference time of 54 ms per frame, as seen here:
https://twitter.com/HardwareAi/status/1408354889214971904
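In case it helps for comparison, something like the following is enough to measure the average per-frame inference time with the Face Mesh solution from the compiled wheel (just a sketch; the camera index and frame count are assumptions from my setup):

```python
# Rough sketch: time only the FaceMesh inference call over a number of frames.
import time
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)  # assumed camera index

times = []
while len(times) < 100:  # average over 100 frames
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    start = time.perf_counter()
    face_mesh.process(rgb)  # inference only, no drawing
    times.append(time.perf_counter() - start)

avg = sum(times) / len(times)
print(f'avg inference: {avg * 1000:.1f} ms, ~{1 / avg:.1f} FPS')

cap.release()
face_mesh.close()
```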
Have you run tests on your compiled package? What FPS can it achieve?
2) Do you know if the Google MediaPipe team plans to release pip package versions for more Python versions/architectures?