Closed alexarje closed 2 years ago
Here is the visbeat GitHub repo designed by Abe Davis: https://github.com/abedavis/visbeat
Here is the video-synchronization GitHub repo designed by Brian Levis: https://github.com/brianlevis/video-synchronization
Cool, did you try to make it work?
For the moment, I managed to compute directograms and design impact envelopes with peak detection. I am now starting to work on synchronizing the audio with the video frames. I can send a notebook to you and Balint tomorrow!
I have been trying to run the main.py file of Brian Levis's repository on my terminal in order to test the video synchronization, but I first ran into a MemoryError. I then tried changing the video and audio files in main.py (synchronize_video('low_audio.wav', 'low_video.mp4', 'test.mp4')) to see if a smaller file size would work, but I still get a RuntimeError because the SoundFile package fails to load the synchronized audio file.
I'll be very interested to see if it works on your side :)
I see that the media files are hard-coded in main.py. I put an audio file in a folder called input_files/audio/ and that seemed to work. Perhaps try another audio file? I have a 44k, 8-bit file.
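If you want to reproduce my working setup without hunting for media, something like this generates a 44.1 kHz, 8-bit mono WAV with only the standard library (the filename is arbitrary):

```python
import math
import wave

# One-second 440 Hz tone as 44.1 kHz, 8-bit unsigned PCM,
# matching the format of the file that worked for me.
sr = 44100
with wave.open('test_tone.wav', 'wb') as w:
    w.setnchannels(1)
    w.setsampwidth(1)   # 1 byte per sample = 8-bit
    w.setframerate(sr)
    w.writeframes(bytes(
        int(128 + 100 * math.sin(2 * math.pi * 440 * n / sr))
        for n in range(sr)
    ))

# Verify what was written.
with wave.open('test_tone.wav', 'rb') as w:
    rate, width, nframes = w.getframerate(), w.getsampwidth(), w.getnframes()
print(rate, width, nframes)  # → 44100 1 44100
```

Alternatively, re-encoding an existing file with something like `ffmpeg -i input.mp3 -ar 44100 -acodec pcm_u8 out.wav` should produce an equivalent format.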
On my system (Ubuntu) I now get an error with OpenCV:
A new function to render directograms is now available here. You can find more information on how to implement it in the wiki page.
It would be cool to implement something like the directograms and related tempograms for both audio and video, as described in this paper: http://www.abedavis.com/files/papers/VisualRhythm_Davis18.pdf
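As a starting point for the tempogram side, a Fourier tempogram over an onset/impact envelope could look like this (a plain-NumPy sketch under my own parameter choices, not code from the paper; the same function would apply to an audio onset envelope or a video impact envelope):

```python
import numpy as np

def tempogram(envelope, win_len=64, hop=8):
    """Fourier tempogram of an onset/impact envelope.

    Each column is the magnitude spectrum of one Hann-windowed stretch
    of the envelope; with an envelope sampled at fps, frequency bin k
    corresponds to a tempo of k * fps / win_len * 60 BPM.
    """
    env = envelope - envelope.mean()   # drop the DC component
    win = np.hanning(win_len)
    cols = [np.abs(np.fft.rfft(env[s:s + win_len] * win))
            for s in range(0, len(env) - win_len + 1, hop)]
    return np.array(cols).T  # shape: (win_len // 2 + 1, n_windows)

# Usage: a synthetic envelope pulsing at 2 Hz (120 BPM), 30 fps, 10 s.
fps = 30
t = np.arange(fps * 10) / fps
env = (np.sin(2 * np.pi * 2 * t) > 0.95).astype(float)
tg = tempogram(env)
```

Computing this for both the audio onset envelope and the video impact envelope would give us comparable tempo representations to align, as in the paper.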