SutirthaChakraborty opened 3 years ago
Hi all, I am also trying to do a similar activity. I also want to use OpenPose and predict rhythm. Can we get the CSV output for that? Thanks.
That is a very good question! Thanks for mentioning it. I guess you mean tracking the average, observable pulse @SutirthaChakraborty? That should be possible to do based on the QoM values. This is implemented in the (old) Max-version of MGT, but not (yet) implemented in the Python version. The challenging part is to select a good way of doing the thresholding of the signal and window size to average over. Any thoughts on that?
@balintlaczko perhaps you can try to implement a basic pulse tracker (+ pulse clarity). @benediktewallace may also have some ideas about how to do that?
As for rhythm, I think that is more challenging @senemaktas. Do you mean the pulse (as mentioned above) or actual rhythm? I guess it would be possible to try to estimate some kind of rhythmic structure based on beat patterns, but that is not straightforward at all. I guess meter would be easier to estimate (3/4, 4/4, etc.), particularly also if we include a beat tracker.
While I remember it, the dissertation of Carlos Guedes has a nice discussion of challenges related to finding tempo and rhythm from dance. He also implemented an FFT-based tempo tracker in Max that it would be good to look at.
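While not Guedes's implementation, the FFT-based idea can be sketched in a few lines: take the spectrum of a QoM-like time series and read the tempo off the strongest bin within a plausible tempo range. Everything below (frame rate, synthetic signal, tempo range) is a made-up illustration, not code from his dissertation:

```python
import numpy as np

# Synthetic stand-in for a QoM time series: a 2 Hz (120 BPM) movement pulse
# sampled at an assumed 25 fps video frame rate.
fps = 25.0
t = np.arange(0, 20, 1 / fps)
qom = np.clip(np.sin(2 * np.pi * 2.0 * t), 0, None)

# FFT of the mean-removed signal; the strongest bin in a plausible
# tempo range (30-300 BPM, i.e. 0.5-5 Hz) gives the tempo estimate.
spec = np.abs(np.fft.rfft(qom - qom.mean()))
freqs = np.fft.rfftfreq(qom.size, d=1 / fps)
mask = (freqs >= 0.5) & (freqs <= 5.0)
bpm = 60.0 * freqs[mask][np.argmax(spec[mask])]
print(round(bpm))  # 120
```

Restricting the search range is doing real work here: it suppresses the DC component and the harmonics that otherwise cause octave errors.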
Also the work of Federico Visi is relevant, such as this article and his implementation of periodic quantity of motion in Max (through modosc).
Hi Dr. Alexander, thank you for your reply. Yes, I was talking about the pulse beat, a kind of beat tracking. I am very new to this. Can you also suggest a way to do it? Which CSV output would be most useful for it? Thanks,
Senem
The easiest way to get started is to look at the QoM column (quantity of motion) and use a binary operation to check whether the value is above or below a specific threshold value. You may want to do some filtering first (low-pass filter), to remove false triggering. Another improvement would be to average the values over a moving window.
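As a minimal sketch of that recipe on a synthetic QoM signal (the frame rate, cutoff frequency, window size, and test signal below are all assumptions to be tuned against real data):

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Synthetic QoM signal: short movement bursts at 2 Hz plus noise, at an
# assumed 25 fps frame rate. Replace with the QoM column from the csv.
fps = 25.0
t = np.arange(0, 10, 1 / fps)
rng = np.random.default_rng(0)
qom = (np.sin(2 * np.pi * 2.0 * t) > 0.8).astype(float) + 0.1 * rng.normal(size=t.size)

# 1) Low-pass filter to suppress false triggering (3 Hz cutoff is a guess).
b, a = butter(2, 3.0 / (fps / 2), btype="low")
smooth = filtfilt(b, a, qom)

# 2) Moving average over ~0.5 s as an adaptive threshold.
win = int(fps * 0.5)
baseline = np.convolve(smooth, np.ones(win) / win, mode="same")

# 3) Binary operation: rising edges of (smooth > baseline) mark pulse onsets.
onsets = np.flatnonzero(np.diff((smooth > baseline).astype(int)) == 1)

# Median inter-onset interval -> pulse rate in BPM (near 120 for this signal).
bpm = 60.0 * fps / np.median(np.diff(onsets))
```

Using the moving average itself as the threshold (rather than a fixed constant) makes the trigger adapt to slow drifts in overall movement intensity.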
Hi Dr. @alexarje, thank you for your detailed replies. I will study the QoM values in detail.
No problem. I am reopening the issue so that we can remember to include an example of how to do this. Great if you would like to share your code @SutirthaChakraborty, if you figure out how to do it. Otherwise, we can add an example when we have time for it.
Thank you very much for the advice. I will share it once I have applied it.
Hi all! Thanks for the suggestions! To get the quantity of motion, you can do for example:
```python
from musicalgestures import *

vid = MgObject('/path/to/vid.mp4')
vid.motiondata()
```
This will output a CSV file where every row is: `<time in ms> <quantity of motion> <centroid of motion normalized X coordinate> <centroid of motion normalized Y coordinate>`
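For illustration, parsing such a file could look like this with pandas. The delimiter and the column labels are my own assumptions (the file may ship without a header row), and the sample values are made up:

```python
import io
import pandas as pd

# A few rows in the format described above (the numbers are invented).
csv_text = "0 0.00 0.50 0.50\n40 0.12 0.51 0.49\n80 0.30 0.53 0.47\n"

# Hypothetical column labels matching the documented row layout.
cols = ["time_ms", "qom", "com_x", "com_y"]
df = pd.read_csv(io.StringIO(csv_text), sep=r"\s+", names=cols)

qom = df["qom"].to_numpy()                    # the signal to use for pulse tracking
fps = 1000.0 / df["time_ms"].diff().median()  # frame rate recovered from timestamps
```

Recovering the frame rate from the timestamp column keeps any later tempo math independent of the source video's metadata.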
To get the normalized coordinates of all body keypoints using OpenPose:
```python
from musicalgestures import *

vid = MgObject('/path/to/vid.mp4')
vid.pose()  # save_video=False if you only want the csv
```
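Once the keypoint coordinates are in hand, one common trick (not a built-in MGT function) is to collapse them into a single movement envelope by summing frame-to-frame displacements, which can then be fed into the same pulse-tracking pipeline as the QoM column. A sketch on synthetic data:

```python
import numpy as np

# Synthetic keypoint data shaped like an OpenPose export: (frames, joints, 2)
# normalized x/y coordinates. BODY_25 has 25 keypoints; the rest is made up.
rng = np.random.default_rng(1)
frames, joints = 200, 25
pose = 0.5 + 0.01 * rng.normal(size=(frames, joints, 2)).cumsum(axis=0)

# A keypoint-based analogue of quantity of motion: the summed frame-to-frame
# displacement of all keypoints, usable as a movement envelope for beat work.
disp = np.linalg.norm(np.diff(pose, axis=0), axis=2)  # shape (frames - 1, joints)
envelope = disp.sum(axis=1)
```

In practice you may want to weight or select joints (e.g. hands and head for a musician) before summing, since not all body parts move with the beat.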
As of now there is no built-in beat- or pulse-detection included in the package, but I will add this feature to the list. :-)
Hi, this is my code: I am trying to find a correlation between the beat phase of the musician's body movement and the audio beats. Colab link
@balintlaczko I don't know why `pose` was not working. Let me know if I am going in the right direction, or of any other way to reach my goal.
@SutirthaChakraborty that is because these methods only exist in this repo, and will be available through the pip install very soon when we release the updated package. Until that time feel free to fork this repo and test things out, though it is not fully tested yet. Stay tuned... :)))
ah! Okay. I will clone the repo. Thanks.
I got an error when downloading the pose model. Any clue?
Hi @SutirthaChakraborty, it looks like the shell script was not allowed to run ("Permission denied"). This function is still not completely implemented and is currently only tested on Windows. As a workaround for now, try manually executing _getMPIhere.sh with elevated privileges (sudo). I will mark this issue as related to #160. Generally, there is a lot of testing and bug-fixing happening on the repo these days (:-) so I recommend forking it again in a few days, or just waiting for the full release (also soon).
I wonder whether making a tempogram function for video (in addition to the audio one currently there) could be a first step to pulse detection from the video. It could be run on the QoM data to start with and then we can look for something more advanced later (as mentioned above).
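As a proof of concept for that idea, here is a bare-bones autocorrelation tempogram computed directly on a synthetic QoM-like envelope with plain NumPy (the frame rate, window, hop, and lag range are all assumptions, and this is not the MGT's audio tempogram):

```python
import numpy as np

# Synthetic QoM envelope: a 2.5 Hz pulse (150 BPM) at an assumed 25 fps.
fps = 25.0
t = np.arange(0, 20, 1 / fps)
qom = np.clip(np.sin(2 * np.pi * 2.5 * t), 0, None) ** 2

win, hop = 200, 25          # 8 s analysis windows, hopped every 1 s
lags = np.arange(6, 19)     # lag range covering roughly 80-240 BPM
frames = range(0, qom.size - win, hop)

# tempogram[i, j]: autocorrelation of window j at lag lags[i]
tg = np.array([[np.dot(qom[s:s + win - l] - qom[s:s + win].mean(),
                       qom[s + l:s + win] - qom[s:s + win].mean())
                for s in frames] for l in lags])

# Strongest lag averaged over time -> global tempo estimate in BPM.
tempo = 60.0 * fps / lags[np.argmax(tg.mean(axis=1))]
```

Keeping the time axis (rather than averaging it away, as the last line does) would give exactly the tempogram view: a lag-versus-time map showing how the movement pulse drifts over the video.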
Hi Dr. @alexarje and @balintlaczko, it all started from here: we published our work on Beat Estimation from Musician Visual Cues. Thank you so much for this wonderful library. Looking forward to doing more similar work.
Great, thanks for sharing the article @SutirthaChakraborty. Nice work! Feel free to set up a pull request if you have any code that you think would be useful to embed in the MGT.
Sure, thanks
Hi, thank you for developing this tool. I was trying to find the tempo from the body movements of the musician. Another thing I could not figure out was how to use the OpenPose feature. Any code example for that?
Thanks, Regards, Suti