fourMs / MGT-python

Musical Gestures Toolbox for Python
https://www.uio.no/ritmo/english/research/labs/fourms/downloads/software/musicalgesturestoolbox/mgt-python/index.html
GNU General Public License v3.0

More motion analysis #210

Open alexarje opened 3 years ago

alexarje commented 3 years ago

It would be interesting to explore some more analysis methods, including:

joachimpoutaraud commented 1 year ago

According to Cross et al., 2021, motion energy is computed from a difference image for each pair of consecutive frames in a video, so that any pixel with more than 10 units of luminance change is classified as "moving". The mean number of moving pixels per frame is then summed over the movie to give a motion energy (ME) index for that video.
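As a minimal sketch of that description (one reading of it, with the function name and frame representation my own assumptions), the per-pair thresholded difference image could be computed like this:

```python
import numpy as np

def motion_energy_index(frames, threshold=10):
    """Illustrative ME index: for each consecutive frame pair, count the
    pixels whose luminance changes by more than `threshold` units, then
    average the per-pair counts over the clip."""
    moving_pixels = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        # signed difference needs a wider dtype than uint8
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        moving_pixels.append(np.count_nonzero(diff > threshold))
    return float(np.mean(moving_pixels))

# toy example: two 4x4 "luminance" frames, one pixel jumps by 50 units
a = np.zeros((4, 4), dtype=np.uint8)
b = a.copy()
b[0, 0] = 50
print(motion_energy_index([a, b, b]))  # one moving pixel, then none -> 0.5
```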

Using ffmpeg to display the average difference between the Y (luminance) plane of each frame and that of the previous frame, I found that motion energy can be interpreted as the quantity of motion. Here is a simple Python example:

import subprocess
import shlex
import re
import matplotlib.pyplot as plt
import numpy as np
import cv2

# Print the per-frame YDIF metric (mean absolute luminance difference
# between consecutive frames) and discard the transcoded output
command = 'ffmpeg -i input_files/video/test.mp4 -vf "signalstats,metadata=print:key=lavfi.signalstats.YDIF" -an -f null -'
process = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
out, err = process.communicate()

# Extract the value from every "lavfi.signalstats.YDIF=..." token
ydif = re.split(r'\s', out)
matching = [float(s.split('=')[1]) for s in ydif if "lavfi.signalstats.YDIF" in s]

cap = cv2.VideoCapture('input_files/video/test.mp4')
fps = int(cap.get(cv2.CAP_PROP_FPS))
cap.release()

# Plot the normalized per-frame motion energy against time in seconds
# (the first YDIF value compares the first frame with itself, so skip it)
plt.figure(figsize=(12, 2))
plt.bar(np.arange(len(matching) - 1) / fps, np.asarray(matching[1:]) / max(matching[1:]))
plt.show()

However, I will not implement it in the toolbox, as it produces the same result as the QoM. On the other hand, I found an interesting project for extracting motion energy features from video using a pyramid of spatio-temporal Gabor filters. I will now focus on extracting motion smoothness.

joachimpoutaraud commented 1 year ago

Motion smoothness

I have implemented a new velocity parameter in the dense optical flow function.

When set to True, it computes the dense optical flow velocity based on the distance in meters from the camera to the image plane (related to the focal length), returning flow in meters per second. The distance parameter defaults to None, but it is useful to set it if known in advance. An additional angle_of_view parameter, which defaults to 0, can also be set for reporting flow in meters per second. As a result, it is now possible to compute motion smoothness using the number of velocity peaks per meter (NoP) as an index, as described here.

The main drawback of this new parameter is that it relies on dense optical flow in OpenCV, which takes a long time to process a video.

More information on how to implement it can be found in the MGT wiki documentation.
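The pixel-to-meters conversion itself is not spelled out above; as a hedged sketch (the function name and exact scaling are my assumptions, not the toolbox's API), one way to scale per-frame pixel displacements into meters per second from distance and angle_of_view is:

```python
import numpy as np

def flow_to_meters_per_second(flow_px, width_px, fps, distance, angle_of_view):
    """Hypothetical conversion behind a `velocity` parameter: scale a
    per-frame pixel displacement to meters per second, given the
    camera-to-subject distance (m) and horizontal angle of view (degrees)."""
    # real-world width of the imaged plane at the given distance
    width_m = 2 * distance * np.tan(np.radians(angle_of_view) / 2)
    meters_per_pixel = width_m / width_px
    # pixels/frame -> meters/frame -> meters/second
    return flow_px * meters_per_pixel * fps

# e.g. a 10 px/frame displacement in a 1920 px wide frame at 25 fps,
# filmed 3.5 m from the camera with an 80-degree angle of view
v = flow_to_meters_per_second(10, 1920, 25, distance=3.5, angle_of_view=80)
```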

joachimpoutaraud commented 1 year ago

Motion entropy

Based on the velocity parameter, it is also possible to compute the acceleration of motion between consecutive frames as follows:

    def get_acceleration(self, velocity, fps):
        """Frame-to-frame acceleration from a velocity array."""
        velocity = np.abs(velocity)
        acceleration = np.zeros(len(velocity))
        for i in range(len(acceleration) - 1):
            # finite difference: velocity change divided by frame duration
            acceleration[i] = (velocity[i + 1] - velocity[i]) / (1 / fps)
        return acceleration[:-1]

That way, if the distance and angle_of_view parameters are filled out accurately, one can get a precise idea of the acceleration of motion (expressed in meters per second squared). Finally, the entropy of the acceleration array is calculated to obtain the motion entropy, as described here.
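As a rough sketch of that last step (an assumption on my part: Shannon entropy over a histogram of acceleration values — the toolbox's exact entropy measure may differ), it could look like:

```python
import numpy as np

def motion_entropy(acceleration, bins=32):
    """Shannon entropy (bits) of the distribution of acceleration values."""
    counts, _ = np.histogram(acceleration, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins (0 * log 0 is taken as 0)
    return float(-np.sum(p * np.log2(p)))

# constant acceleration -> zero entropy; noisy acceleration -> higher entropy
rng = np.random.default_rng(0)
print(motion_entropy(np.ones(100)))               # 0.0
print(motion_entropy(rng.normal(size=100)) > 0)   # True
```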

Here is an overview of the results obtained for the dance.avi video with experimental parameters set to distance=3.5 and angle_of_view=80. A precise angle of view for computing optical flow velocity can be calculated from the camera's effective focal length. Here is more information on how to calculate it.
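For reference, the standard pinhole-camera relation between sensor width, focal length, and angle of view can be computed directly (a generic helper, not part of the toolbox):

```python
import math

def angle_of_view(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view in degrees from sensor width and
    effective focal length: AOV = 2 * atan(w / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# a full-frame 36 mm sensor with a 24 mm lens gives roughly 73.7 degrees
aov = angle_of_view(36, 24)
```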

(figure: velocity plot for dance.avi)

alexarje commented 1 year ago

Very cool! I wonder whether MV Tractus could be a way to get motion vectors à la optical flow without OpenCV?