Closed: hewh16 closed this issue 3 years ago
Hi @hewh16
We used vid2e (https://github.com/uzh-rpg/rpg_vid2e) to generate synthetic events from the original 30 fps Vimeo90K dataset. Vid2e first upsamples the video (using Super SloMo) and then uses ESIM (an event camera simulator) to generate events from the high-speed video.
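For intuition about what ESIM does under the hood, here is a toy NumPy sketch of the core idea: an event fires when the log intensity at a pixel drifts past a contrast threshold. The function name and thresholds are illustrative, not the real esim_torch API, and the actual simulator also interpolates event timestamps between frames and models a refractory period.

```python
import numpy as np

def simulate_events(frames, timestamps, c_pos=0.2, c_neg=0.2):
    # Toy ESIM-style event generation (sketch only): an event
    # (t, x, y, polarity) fires each time the log intensity at a pixel
    # crosses the contrast threshold relative to the last event there.
    log_frames = np.log(frames.astype(np.float64) / 255.0 + 1e-5)
    ref = log_frames[0].copy()  # per-pixel reference log intensity
    events = []
    for log_img, t in zip(log_frames[1:], timestamps[1:]):
        diff = log_img - ref
        for pol, c in ((1, c_pos), (-1, c_neg)):
            crossed = pol * diff >= c
            # number of threshold crossings per pixel since last event
            n = np.floor(pol * diff[crossed] / c).astype(int)
            for (y, x), k in zip(np.argwhere(crossed), n):
                events.extend([(t, x, y, pol)] * k)
            # advance the reference by the thresholds that fired
            ref[crossed] += pol * c * n
    return events
```

With a 0.2 threshold, a pixel jumping from 128 to 255 changes its log intensity by about 0.69 and therefore fires three positive events in this sketch.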
Could you provide some code or a Colab file? I got stuck converting an mp4 video file to events. Thanks a lot.
The instructions for upsampling the mp4 are here: https://github.com/uzh-rpg/rpg_vid2e/blob/master/upsampling/README.md
and here to generate events: https://github.com/uzh-rpg/rpg_vid2e/blob/master/esim_torch/README.md
Let me know if there is something unclear at that repo.
Got it, but when I converted the videos using ffmpeg, it didn't generate any timestamp or fps file.
Thanks :)
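ffmpeg only extracts the frames; the frame-rate/timestamp metadata has to be written separately. As a minimal sketch, assuming the upsampler reads the frame rate from an `fps.txt` file next to the frames (check the vid2e upsampling README for the exact file names and layout it expects), you could generate it like this:

```python
def write_fps_and_timestamps(n_frames, fps, fps_path="fps.txt",
                             ts_path="timestamps.txt"):
    # Hypothetical helper: writes the frame rate to fps.txt and a
    # per-frame timestamp (in seconds) to timestamps.txt. The file
    # names here are assumptions, not vid2e's documented interface.
    with open(fps_path, "w") as f:
        f.write(f"{fps}\n")
    timestamps = [i / fps for i in range(n_frames)]
    with open(ts_path, "w") as f:
        f.writelines(f"{t:.9f}\n" for t in timestamps)
    return timestamps
```

For a 30 fps sequence this produces timestamps 0, 1/30, 2/30, ... matching the extracted frame order.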
Is there any chance you could share the event-augmented Vimeo90K dataset? This would encourage consistency between the research done by different people.
Hi ~
It's really nice work, but I have several questions. I found that in the paper you used the Middlebury and Vimeo90K datasets. However, unlike the Adobe240 or GoPro datasets, Middlebury and Vimeo90K are captured at 30 fps. So how did you simulate the events for these datasets? Did you use the 30 fps original videos directly?
Thanks a lot!