Closed · robinchm closed this issue 6 months ago
https://github.com/johndpope/MegaPortrait-hack/blob/6540a70b938aab7ec24972240eeb1c64ece960e5/model.py#L1021
Here `w_em_s2c` already includes `w_rt_s2c` (it is added inside `WarpGenerator`), so we have effectively added `w_rt_s2c` twice. It should not really matter in practice, since the dense network can usually absorb such a scaling.
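For illustration, here is a minimal sketch of the pattern being described. The names, shapes, and the internal structure of `WarpGenerator` are assumptions for demonstration only, not the actual code from `model.py`:

```python
import torch

# Hypothetical stand-ins for the real warp fields (shapes are illustrative):
# w_rt_s2c - rotation/translation warp, source -> canonical
# w_em_s2c - expression warp produced by WarpGenerator, which (per this issue)
#            already has w_rt_s2c added to it internally
B, D, H, W = 1, 16, 64, 64
w_rt_s2c = torch.randn(B, 3, D, H, W)
w_em_delta = torch.randn(B, 3, D, H, W)   # "pure" expression component (assumed)
w_em_s2c = w_em_delta + w_rt_s2c          # WarpGenerator output already contains w_rt_s2c

# Pattern flagged in the issue: adding w_rt_s2c again outside WarpGenerator,
# so the rotation/translation component contributes twice.
w_s2c_double = w_rt_s2c + w_em_s2c

# One possible alternative (an assumption, not the confirmed intent of the repo):
# use w_em_s2c directly, since it already includes the rt component.
w_s2c_single = w_em_s2c

# The difference between the two is exactly one extra copy of w_rt_s2c.
print(torch.allclose(w_s2c_double - w_s2c_single, w_rt_s2c))  # True
```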
Thanks Robin, you're right. There was a lot of code flux last week moving parts of the logic around to match the paper. I also came across this: https://github.com/johndpope/MegaPortrait-hack/issues/15