Closed AlexYiningLiu closed 4 years ago
These models are meant to be run offline, meaning on a previously recorded video. To run the code from this repo, see the Binder example. If you need performance, check out the original MediaPipe repo; it's written in C++ and can run on the GPU on Linux.
I tried running the code on a pre-recorded mp4 file, but the result was the same: the frames updated very slowly rather than as a continuous stream. I'm simply executing run.py without changing any code. Is there a particular kind of video it handles best?
No, the process is just slow because, like I mentioned earlier, it runs on CPU only.
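To confirm that CPU-only inference (rather than video decoding) is the bottleneck, it can help to time the per-frame detection step in isolation. A minimal sketch, using a dummy callable standing in for the repo's hand detector (the `detect` hook and frame list here are hypothetical, not names from this repo):

```python
import time

def profile(frames, detect):
    """Time a per-frame callable and report average latency and FPS."""
    start = time.perf_counter()
    for frame in frames:
        detect(frame)
    elapsed = time.perf_counter() - start
    n = len(frames)
    return {"avg_ms": 1000 * elapsed / n, "fps": n / elapsed}

# Dummy workload standing in for the hand-tracking model:
stats = profile(list(range(100)), lambda f: sum(range(1000)))
print(f"{stats['avg_ms']:.2f} ms/frame, {stats['fps']:.1f} FPS")
```

If the per-frame latency stays high even with a trivially small input video, the model itself is the limiting factor, which matches the CPU-only explanation above.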
Alright, thanks for your response. On a side note, I would like to obtain the pixel coordinates of the fingertips. Based on the graph in the code, are those coordinates basically the 4th, 8th, 12th, 16th, and 20th points output by the detector?
Hah, I haven’t worked on this project since 2019, so it’s way easier for you to just check it yourself.
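For anyone checking this later: MediaPipe's hand landmark model outputs 21 points per hand, and indices 4, 8, 12, 16, and 20 are the thumb through pinky tips. Whether this particular repo emits normalized or pixel coordinates would need to be verified against its output; the sketch below assumes normalized (0–1) coordinates and scales them to the frame size:

```python
# Fingertip indices in MediaPipe's 21-point hand landmark layout.
FINGERTIP_IDS = [4, 8, 12, 16, 20]

def fingertip_pixels(landmarks, frame_w, frame_h):
    """Convert normalized (x, y) landmarks to integer pixel coordinates.

    `landmarks` is assumed to be a sequence of 21 (x, y) pairs in [0, 1];
    if the detector already returns pixel coordinates, skip the scaling.
    """
    return [
        (int(landmarks[i][0] * frame_w), int(landmarks[i][1] * frame_h))
        for i in FINGERTIP_IDS
    ]

# Dummy 21-point landmark list with normalized coordinates:
pts = [(i / 21.0, i / 21.0) for i in range(21)]
print(fingertip_pixels(pts, 640, 480))
```

Printing a few raw landmarks from `run.py` should make it obvious which convention the repo uses.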
The frame rate I get when running metalwhale's hand tracking model is extremely slow. The output frames don't form a smooth stream; instead, I see jittery, rough transitions as I move my hand slowly across the camera. My output is significantly slower than the GIF in the repository. Is there a way to fix this? If this is better optimized in this repo, it would be great to try it. Please let me know how to run your model. Thanks.