tarun005 / FLAVR

Code for FLAVR: A fast and efficient frame interpolation technique.
Apache License 2.0
430 stars 69 forks

16x or higher factor trained model #32

Closed · Youngtrue closed this issue 2 years ago

Youngtrue commented 2 years ago

Hi Tarun,

It is remarkable that you make inference for 16x or higher factors faster than Super SloMo while still performing well. Will you publish the trained models for 16x and higher factors and add support for them afterward?

tarun005 commented 2 years ago

For the 16x model, we did not train a new model due to a lack of training data. Instead, we cascade an (8x, 2x) pair of models to form a 16x model, and an (8x, 8x) pair to form a 64x model. Note that it is still entirely possible to train an end-to-end 64x interpolation model; we just did not do it because we do not have enough data to train one.
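A minimal sketch of the cascading idea described above. The `interpolate` function here is a hypothetical stand-in for a trained FLAVR model (it uses linear interpolation on scalar "frames" purely for illustration); the point is only the chaining: an 8x pass followed by a 2x pass yields 16x overall, and two 8x passes yield 64x, since N input frames at factor M produce (N-1)*M + 1 frames.

```python
def interpolate(frames, factor):
    """Stand-in for a learned Mx interpolation model (hypothetical):
    inserts factor-1 linearly interpolated values between each
    consecutive pair of frames. Frames are floats here for clarity."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, factor):
            out.append(a + (b - a) * k / factor)
    out.append(frames[-1])
    return out

def cascade(frames, factors):
    """Chain interpolation stages: factors=(8, 2) gives a 16x result,
    factors=(8, 8) gives a 64x result, as in the cascade above."""
    for f in factors:
        frames = interpolate(frames, f)
    return frames

# Two input frames at (8x, 2x) -> 16x: (2-1)*16 + 1 = 17 frames total.
frames_16x = cascade([0.0, 1.0], (8, 2))
```

In a real pipeline each stage would be a forward pass of the corresponding pretrained network over frame tensors rather than scalar arithmetic, but the composition of factors works the same way.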

Astrostellar commented 2 months ago

Hi, could you please share the pretrained models? The current links seem to be unavailable. Thanks a lot!