balintlaczko / MGT-python

Musical Gestures Toolbox for Python
https://www.uio.no/ritmo/english/research/labs/fourms/downloads/software/musicalgesturestoolbox/mgt-python/index.html
GNU General Public License v3.0

Greyscale mode? #33

Closed alexarje closed 4 years ago

alexarje commented 4 years ago

I cannot recall whether there is a color/greyscale function in the Python version of the toolbox. If there is, it should also be documented in the notebook; if not, it should be implemented, probably at the preprocessing stage.

One important benefit of greyscale videos is that processing takes about 1/4 of the time, and they are sometimes easier to look at than the color versions.
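To illustrate why greyscale processing is cheaper: collapsing the three color planes into one luminance plane means every subsequent per-pixel operation touches a third of the data. A minimal sketch, using a NumPy stand-in for a video frame and the standard Rec. 601 luma weights (which is what OpenCV's `COLOR_BGR2GRAY` conversion applies); the frame shape and function name here are illustrative, not part of the toolbox:

```python
import numpy as np

def to_grayscale(frame):
    """Collapse a (H, W, 3) BGR frame into a single luminance plane
    using the Rec. 601 weights (the same weighting cv2.COLOR_BGR2GRAY uses)."""
    b, g, r = frame[..., 0], frame[..., 1], frame[..., 2]
    return (0.114 * b + 0.587 * g + 0.299 * r).astype(frame.dtype)

# Stand-in for one decoded video frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
gray = to_grayscale(frame)

print(frame.size, gray.size)  # 921600 307200 -- one third of the data per frame
```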

balintlaczko commented 4 years ago

It is partly implemented as a flag for motion() and motionhistory(), but I haven't really tested it since I started working on the repo (the code looks reasonable, though). Right now the implementation is a bit in-between: self.color is a boolean attribute of MgObject, but nothing ever happens with it unless you call motion() or motionhistory(), which pick it up and do the conversion internally. It might be clearer to implement it at the preprocessing stage and upgrade all processes (motion(), history(), etc.) to simply adapt to the planecount of their input. Will focus on this now.
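To make the "in-between" state concrete, here is a hypothetical minimal sketch (class and method names are illustrative, not the toolbox's actual API): the `color` flag lives on the object, but only `motion()` ever acts on it, while every other method would ignore it entirely:

```python
import numpy as np

class MgObjectSketch:
    """Hypothetical sketch of the current design: `color` is stored on
    the object, but only motion() picks it up and converts internally."""

    def __init__(self, color=True):
        self.color = color

    def motion(self, frame):
        # Only this method honors the flag; a crude channel average
        # stands in for the real grayscale conversion.
        if not self.color:
            return frame.mean(axis=2).astype(frame.dtype)
        return frame

obj = MgObjectSketch(color=False)
frame = np.full((4, 4, 3), 90, dtype=np.uint8)
print(obj.motion(frame).shape)  # (4, 4) -- single plane
```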

balintlaczko commented 4 years ago

So after some research I found out that (just as in Max) rendered video files always have 3 planes, so there is no way to automatically infer the planecount from the source video. We can, however, implement grayscale mode in the preprocessing and processing modules, as is already done in motion() and motionhistory(). Here are my notes on the progress of the grayscale implementation across the repo:

| Module | Status | Notes |
| --- | --- | --- |
| preprocessing | needs implementation | to get a grayscale video with the trimmed, cropped, etc. video |
| motion() | done | |
| motionhistory() | done | |
| history() | needs implementation | now it always works on 3 planes |
| flow.sparse() | needs implementation | only to render the colored dots and tracks onto a grayscale video (making the tracks and dots grayscale doesn't make sense in my opinion) |
| flow.dense() | doesn't make sense | since the colors themselves are meant to represent the optical flow |
| average() | needs implementation | now it always works on 3 planes |
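The "adapt to the planecount of the input" idea above could look something like the following sketch. Everything here is hypothetical (the toolbox's actual history() works on video files, not arrays): a process checks whether its input frame has 1 or 3 planes and normalizes its internal shape accordingly, instead of assuming 3 planes:

```python
import numpy as np

def planecount(frame):
    """Number of color planes in a frame: 1 for grayscale, 3 for BGR."""
    return 1 if frame.ndim == 2 else frame.shape[2]

def history(frame, buffer):
    """Hypothetical history-style process that adapts to the planecount
    of its input instead of always assuming 3 planes."""
    if planecount(frame) == 1:
        frame = frame[..., np.newaxis]  # unify shapes internally
    buffer.append(frame)
    # Average the buffered frames; squeeze drops the dummy plane
    # again for grayscale input.
    return np.mean(buffer, axis=0).astype(np.uint8).squeeze()

buf = []
gray = np.full((2, 2), 100, dtype=np.uint8)
print(history(gray, buf).shape)  # (2, 2) -- grayscale in, grayscale out
```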