Hi, I'm worried about a problem that appears when using privileged distillation: flow_teacher will model part of the noise. IFRNet seems to address this by assigning a mask. We further found that as long as we construct a stronger teacher so that every earlier student block can be supervised, we get most of the benefit. The teacher in Practical-RIFE is actually a conf-weighted fusion of the outputs of the student blocks. The advantage is that the teacher requires no extra computation.
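For illustration, here is a minimal sketch of what such a conf-weighted fusion could look like, assuming each student block returns a flow estimate and a per-pixel confidence map; the function name, tensor shapes, and softmax weighting are assumptions for clarity, not the actual Practical-RIFE v4.15 code.

```python
import torch
import torch.nn.functional as F

def fuse_teacher(flows, confs):
    """Illustrative conf-weighted fusion of per-block student flows.

    flows: list of [B, 4, H, W] tensors, one per student block (coarse to fine)
    confs: list of [B, 1, H, W] per-pixel confidence logits, one per block

    Returns a "teacher" flow built by softmax-weighting the block outputs,
    so no extra network forward pass is needed for the teacher.
    """
    conf_stack = torch.stack(confs, dim=0)            # [N, B, 1, H, W]
    weights = F.softmax(conf_stack, dim=0)            # normalize across blocks
    flow_stack = torch.stack(flows, dim=0)            # [N, B, 4, H, W]
    flow_teacher = (weights * flow_stack).sum(dim=0)  # conf-weighted fusion
    return flow_teacher

# Each earlier block can then be supervised toward the fused result, e.g.
# loss_distill = (flow_teacher.detach() - flow_i).abs().mean() per block.
```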
I see, thank you for the response. What does `conf` stand for? Is it configuration?
Moreover, does that mean that the teacher is no longer privileged and can thus be used at test time as well? What would be the reason to not use the teacher at inference time instead of the student at this point?
Got it. Thanks a lot!
Hello RIFE authors! Thank you for sharing your training code, very helpful.
I am trying to better understand the training code of Practical-RIFE (here is the link to v4.15), and I am struggling to understand the changes to the teacher compared to the original paper.
From the paper we read:
In the original repository, it is pretty straightforward to observe this in the code: link
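For reference, here is a minimal sketch of what "privileged" means in the original setup, assuming the paper's design where a teacher block additionally receives the ground-truth middle frame `gt` and refines the student flow; the class name, channel counts, and layer layout are illustrative, not the repository's actual code.

```python
import torch
import torch.nn as nn

class PrivilegedTeacherBlock(nn.Module):
    """Sketch of a privileged teacher: unlike the student blocks, it also
    sees the ground-truth middle frame, so it can only run at training time."""

    def __init__(self, in_ch=13, hidden=64):
        super().__init__()
        # in_ch is illustrative: img0(3) + img1(3) + gt(3) + student_flow(4)
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 3, padding=1), nn.PReLU(hidden),
            nn.Conv2d(hidden, hidden, 3, padding=1), nn.PReLU(hidden),
            nn.Conv2d(hidden, 4, 3, padding=1),  # residual flow update
        )

    def forward(self, img0, img1, gt, student_flow):
        x = torch.cat([img0, img1, gt, student_flow], dim=1)
        flow_teacher = student_flow + self.net(x)  # gt-informed refinement
        return flow_teacher
```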
However, in the new training code, we have this snippet instead, where `gt` is never used:

It seems to me that the teacher no longer has the privilege to access the intermediate frame when computing the flows. Instead, it seems to leverage the `conf` tensor to refine the flows of the student. The ground truth does not seem to be leveraged when computing this `conf` tensor.

Am I missing something, or is this teacher no longer privileged as before? Also, what is the meaning of the `conf` tensor? Why is it