JihyongOh / XVFI

[ICCV 2021, Oral 3%] Official repository of XVFI

Questions about Shared Parameters #10

Open nemoHy opened 2 years ago

nemoHy commented 2 years ago

Hi! Congratulations! I have a question: why do you share parameters between those sub-networks? Is there any motivation other than reducing the number of parameters, such as a theory, explanation, or experiments on it? I would appreciate a reply as soon as you can.

nemoHy commented 2 years ago

Thank you for your reply! @JihyongOh Actually, I was wondering whether parameters can be shared among those sub-networks. The results show that this method is useful, so I think there should be a reason why those parameters can be shared. Maybe this method can also be applied to other similar tasks.

JihyongOh commented 2 years ago

@nemoHy To clarify, the three sub-networks (BiFlownet, TFlownet, Refinement Block) in Fig. 4 do not share parameters with each other, but each of them can be shared across scale levels as in Fig. 3. You can also verify in the provided PyTorch code that the three sub-networks are independent (not shared).
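For readers following along, here is a minimal PyTorch sketch of the pattern described above, not the repository's actual code; the module names (`SubNet`, `SharedAcrossScales`) are hypothetical. The three sub-network instances are independent of one another, but each instance is invoked at every scale level, so its parameters are shared across scales.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SubNet(nn.Module):
    """Stand-in for one sub-network (e.g., a flow-estimation block)."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)

class SharedAcrossScales(nn.Module):
    def __init__(self, ch=32, num_scales=3):
        super().__init__()
        self.num_scales = num_scales
        # Three independent sub-networks: parameters are NOT shared
        # among bi_flow, t_flow, and refine.
        self.bi_flow = SubNet(ch)
        self.t_flow = SubNet(ch)
        self.refine = SubNet(ch)

    def forward(self, feat):
        outs = []
        # Each instance is reused at every scale level (coarse to fine),
        # so its parameters ARE shared across scales.
        for s in reversed(range(self.num_scales)):
            x = F.interpolate(feat, scale_factor=1 / 2 ** s,
                              mode='bilinear', align_corners=False)
            x = self.refine(self.t_flow(self.bi_flow(x)))
            outs.append(x)
        return outs

# Usage: one set of weights per sub-network, applied at three scales.
model = SharedAcrossScales(ch=32, num_scales=3)
outputs = model(torch.randn(1, 32, 64, 64))
print([o.shape for o in outputs])
```

One practical consequence of this design is that gradients from every scale level update the same weights, and the number of scale levels becomes a runtime choice rather than a fixed architectural one.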

nemoHy commented 2 years ago

Thank you for your reply again! Sorry for my imprecise wording. @hjSim @JihyongOh I know that the three sub-networks (BiFlownet, TFlownet, Refinement Block) in Fig. 4 are not shared with each other; my question is about the sharing across scale levels. I know it saves parameters and works well in practice, but is there any theoretical explanation for why parameters can be shared across scales and why it works so well?