MarcoForte / FBA_Matting

Official repository for the paper F, B, Alpha Matting
MIT License
464 stars, 95 forks

No train code #1

Closed xup16 closed 4 years ago

xup16 commented 4 years ago

Hi~ Thank you for the wonderful work. I found your work on alphamatting.com, and I am very glad that you open-sourced your code so quickly. However, I do not find the training code in the repository. Would you also open-source the training code?

MarcoForte commented 4 years ago

Hi, unfortunately I cannot release the training code at this time. I've updated the readme with some key tips. I encourage people to try applying these and our network architecture to existing open-source training codes. For example,

ucb-pb commented 4 years ago

Thanks for everything you have provided. Very impressive. What do you mean by clipping the alpha? What do you do?

MarcoForte commented 4 years ago

torch.clamp(alpha, 0, 1)
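A minimal sketch of where that clamp would sit in a training step. The tensor names and the L1 loss here are illustrative stand-ins, not the repository's actual training code:

```python
import torch

# Stand-in for the raw alpha channel of the network output; unconstrained,
# so it may fall outside the valid [0, 1] alpha range.
alpha_pred = torch.rand(4, 1, 32, 32) * 2 - 0.5

# Clip the predicted alpha into [0, 1] before computing the loss.
alpha_pred = torch.clamp(alpha_pred, 0, 1)

# Compare against a stand-in ground-truth alpha with a simple L1 loss.
alpha_gt = torch.rand(4, 1, 32, 32)
loss = torch.nn.functional.l1_loss(alpha_pred, alpha_gt)
```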


ucb-pb commented 4 years ago

Ok, I'm unclear about where that would happen. Is it applied to the output alpha before the loss is calculated?

MarcoForte commented 4 years ago

Hi, yes, it's applied to the network's output for the alpha channel before the loss is calculated.

Occasionally this can cause issues at the beginning of training, where the output defaults to either all zeros or all ones. If this happens, disable the clamping for the first few hundred iterations and then switch it back on.
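The warm-up idea above could be sketched like this. `CLAMP_WARMUP_ITERS` and `postprocess_alpha` are illustrative names, not from the repository; the threshold of a few hundred iterations follows the comment above:

```python
import torch

# Assumed warm-up length ("first few hundred iterations").
CLAMP_WARMUP_ITERS = 300

def postprocess_alpha(raw_alpha, iteration):
    # During warm-up, pass the raw prediction through so gradients are not
    # flattened by a saturated clamp; afterwards, clip to the valid range.
    if iteration < CLAMP_WARMUP_ITERS:
        return raw_alpha
    return torch.clamp(raw_alpha, 0, 1)

early = postprocess_alpha(torch.tensor([-0.2, 0.5, 1.3]), iteration=10)
late = postprocess_alpha(torch.tensor([-0.2, 0.5, 1.3]), iteration=1000)
```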


ucb-pb commented 4 years ago

Thank you. I see the clamping and group normalization are already incorporated into your network code, which is great. Is the weight standardization happening too? I can't tell where that would/should happen at first glance, if it is.
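For reference, weight standardization (Qiao et al., which the paper pairs with group normalization) operates on the convolution *weights* during the forward pass, so it would live inside training-time conv layers rather than as a separate post-processing op. A hedged, illustrative module, not the repository's implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WSConv2d(nn.Conv2d):
    """Conv2d whose filters are standardized on the fly (illustrative sketch)."""
    def forward(self, x):
        w = self.weight
        # Standardize each output filter to zero mean, unit variance.
        mean = w.mean(dim=(1, 2, 3), keepdim=True)
        std = w.std(dim=(1, 2, 3), keepdim=True) + 1e-5
        return F.conv2d(x, (w - mean) / std, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

conv = WSConv2d(3, 8, kernel_size=3, padding=1)
out = conv(torch.randn(1, 3, 16, 16))
```

Because the standardization is folded into the layer's forward pass, spotting it requires looking at the conv classes themselves rather than the training loop.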

ucb-pb commented 4 years ago

Actually, I see clamping in a few places. Which one(s) do you disable?