zzh-tech / BiT

[CVPR2023] Blur Interpolation Transformer for Real-World Motion from Blur
https://zzh-tech.github.io/BiT/
MIT License

Questions about Pre-BiT++ #4

Closed — AIVFI closed this issue 1 year ago

AIVFI commented 1 year ago

Many thanks for the RBI dataset with real motion blur. In my opinion, this is a real revolution! It will finally be possible to train Joint Video Deblurring and Frame Interpolation models on a dataset with real motion blur. Also thanks for developing the BiT models and making them available for download.

I am creating Video Frame Interpolation Rankings and Video Deblurring Rankings on GitHub, where each ranking includes only the single best model per method.

I now intend to add rankings based on the RBI dataset, and I make no secret that these will be the most important rankings in my repository. The best results based on your paper were achieved by the Pre-BiT++ model. I have a couple of questions about it:

  1. Did I understand correctly that this Pre-BiT++ model is the model trained on Adobe240 and then on RBI?

  2. Does Pre-BiT++ also achieve better results visually compared to BiT++(RBI) on the RBI dataset? I mean, does Pre-BiT++ not introduce artifacts such as those shown in Figure 5 of your paper, as BiT++(Adobe240) does?

  3. If you were to apply your method to a real movie to be judged by human vision, would you choose Pre-BiT++ instead of BiT++(RBI)?

zzh-tech commented 1 year ago

Thanks for your interest!

Did I understand correctly that this Pre-BiT++ model is the model trained on Adobe240 and then on RBI?

Yes.
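For clarity, the two-stage schedule behind Pre-BiT++ (pretrain on the synthetic-blur Adobe240 data, then continue training the same weights on the real-blur RBI data) can be sketched in miniature. This is not the actual BiT training code; the datasets are stand-ins, and the one-parameter model and all names are illustrative only:

```python
# Illustrative sketch (NOT the BiT repository's code) of the
# pretrain-then-finetune schedule that produces Pre-BiT++:
# stage 1 trains from scratch on synthetic-blur data (Adobe240),
# stage 2 continues from those weights on real-blur data (RBI).

def train(w, data, lr=0.1, steps=200):
    """One-parameter gradient descent on squared error for y = w * x."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

# Toy stand-ins for the two datasets; the synthetic and real domains
# follow slightly different ground-truth mappings (w = 2.0 vs 2.2).
adobe240_like = [(x, 2.0 * x) for x in (0.5, 1.0, 1.5, 2.0)]
rbi_like = [(x, 2.2 * x) for x in (0.5, 1.0, 1.5, 2.0)]

w_pretrained = train(0.0, adobe240_like)     # stage 1: pretrain
w_finetuned = train(w_pretrained, rbi_like)  # stage 2: fine-tune on RBI

print(round(w_pretrained, 3), round(w_finetuned, 3))  # → 2.0 2.2
```

The point of the sketch is only the weight handoff: stage 2 starts from the stage-1 parameters rather than from scratch, which is what distinguishes Pre-BiT++ from BiT++(RBI) trained on RBI alone.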

Does Pre-BiT++ also achieve better results visually compared to BiT++(RBI) on the RBI dataset? I mean, does Pre-BiT++ not introduce artifacts such as those shown in Figure 5 of your paper, as BiT++(Adobe240) does?

Yes.

If you were to apply your method to a real movie to be judged by human vision, would you choose Pre-BiT++ instead of BiT++(RBI)?

I would choose Pre-BiT++.

AIVFI commented 1 year ago

Thank you very much for your answers to my questions.