hzwer / Practical-RIFE

We are developing a more practical frame interpolation approach.
MIT License

Do you still use block_tea while you train it? (not any issues, but just a question) #47

Closed masahiroteraoka closed 8 months ago

masahiroteraoka commented 8 months ago

Thank you for this awesome application. This is just a question (not an issue at all): do you still use block_tea in the training phase, as seen in IFNet of your original RIFE repository? And if so, did you train block_tea outside of those codes? According to your paper, I thought you had been using some kind of guided mask, so I assumed you trained block_tea first, prior to training the main model flownet. Is this correct?

hzwer commented 8 months ago

https://github.com/hzwer/Practical-RIFE#model-training You may check our code. I haven't manipulated block_tea before training; in my view, it just implements some kind of deep supervision. BTW, deleting block_tea from the current framework only very slightly reduces performance.
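For readers unfamiliar with the term, "deep supervision" generally means attaching an auxiliary loss to an intermediate prediction in addition to the final output. The sketch below is a generic illustration of that idea, not RIFE's actual training code; the function name and weighting are hypothetical.

```python
import numpy as np

# Generic sketch of deep supervision (illustrative names, not RIFE's API):
# the teacher branch's intermediate prediction gets its own loss term, so
# earlier stages of the network receive a direct supervision signal.
def deep_supervision_loss(student_pred, teacher_pred, target, tea_weight=0.5):
    student_loss = np.mean((student_pred - target) ** 2)  # main objective
    teacher_loss = np.mean((teacher_pred - target) ** 2)  # auxiliary objective
    return student_loss + tea_weight * teacher_loss

loss = deep_supervision_loss(np.zeros(4), np.zeros(4), np.ones(4))
```

Dropping the auxiliary term (as hzwer notes for block_tea) leaves the main objective intact, which is consistent with only a slight performance change.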

masahiroteraoka commented 8 months ago

Thank you so much! Oh, you've already released it for the v4 series; I hadn't noticed. I looked into it a bit and I understand now. Thank you!

Siziff commented 6 months ago

Add this function to train.py

    import numpy as np

    def flow2rgb(flow_map_np):
        # Convert an (h, w, 2) optical-flow map to an RGB visualization in [0, 1].
        h, w, _ = flow_map_np.shape
        rgb_map = np.ones((h, w, 3)).astype(np.float32)
        # Normalize by the largest absolute displacement; the small epsilon
        # avoids division by zero for an all-zero flow map.
        normalized_flow_map = flow_map_np / (np.abs(flow_map_np).max() + 1e-8)

        rgb_map[:, :, 0] += normalized_flow_map[:, :, 0]
        rgb_map[:, :, 1] -= 0.5 * (normalized_flow_map[:, :, 0] + normalized_flow_map[:, :, 1])
        rgb_map[:, :, 2] += normalized_flow_map[:, :, 1]
        return rgb_map.clip(0, 1)