Qiulin-W / SAFA

Official Pytorch Implementation of 3DV2021 paper: SAFA: Structure Aware Face Animation.

role of 'param loss' in pre-train stage #12

Open yjsong-alchera opened 2 years ago

yjsong-alchera commented 2 years ago

Hello. I love your great work.

I have a question.

When I tried to pre-train the 3DMM estimator, I found the 'param loss' (line 305 in model.py):

https://github.com/Qiulin-W/SAFA/blob/189856a0cb94bf744c8f333658fa84ec5ce21609/modules/model.py#L305

But I couldn't understand the role of this loss term.

Could you explain it in more detail? (How does the param loss work?)

For the 3DMM parameters (shape, expression), how can we regularize these parameters when there is no GT (ground truth)? (I understand 'ldmk_loss' because we prepare the landmark GT before training.)

Please forgive me

Thank you.

Qiulin-W commented 2 years ago

Hi, the ldmk loss is a re-projection loss, so for a given set of landmarks on the 2D images there are countless solutions for the 3DMM parameters (shape, exp, pose). This is where the param loss comes in: it regularizes the 3DMM parameters so that they, and the resulting 3D faces, stay plausible.
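To make the interplay concrete, here is a minimal sketch (not the repo's actual code) assuming the param loss is a plain L2 penalty on the shape/expression coefficients; the function name and the weights `w_ldmk`, `w_shape`, `w_exp` are made up for illustration:

```python
import torch

def reprojection_and_param_loss(pred_ldmk_2d, gt_ldmk_2d, shape_params, exp_params,
                                w_ldmk=1.0, w_shape=1e-3, w_exp=1e-3):
    """Illustrative sketch of combining the two losses.

    ldmk_loss: re-projection error between projected 3DMM landmarks and GT
    2D landmarks. Many (shape, exp, pose) combinations can produce the same
    2D landmarks, so this term alone is under-constrained.
    param_loss: L2 penalty on the shape/expression coefficients, pulling them
    toward 0, i.e. toward the statistical mean face of the 3DMM basis.
    """
    # mean over batch and landmarks of squared 2D distance
    ldmk_loss = torch.mean(torch.sum((pred_ldmk_2d - gt_ldmk_2d) ** 2, dim=-1))
    # penalize large coefficients to keep the reconstructed face plausible
    param_loss = w_shape * torch.mean(shape_params ** 2) + w_exp * torch.mean(exp_params ** 2)
    return w_ldmk * ldmk_loss + param_loss
```

With the penalty present, a degenerate solution that matches the landmarks using extreme shape/expression coefficients costs more than a near-mean face that matches them equally well.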

yjsong-alchera commented 2 years ago

Thanks for the quick reply :D

Then the mean of the 'shape' parameters and the mean of the 'exp' parameters should be as small as possible, right?

That is, the role of the 'param loss' is to keep the 'shape' and 'exp' parameters close to 0?
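For context, a sketch of my own (not from this repo): since 3DMM shape/expression bases usually come from PCA, the coefficients are approximately zero-mean Gaussian with variances given by the basis eigenvalues, so the penalty is on each coefficient's squared magnitude (often scaled per-dimension), not on the average of the coefficients. The eigenvalue scaling below is a common choice, not necessarily what SAFA uses:

```python
import torch

def gaussian_prior_loss(coeffs, eigenvalues):
    """Negative-log Gaussian prior (up to constants) on PCA coefficients.

    Dividing each squared coefficient by its eigenvalue allows directions
    with large natural variance to move more, while strongly damping
    directions where the training faces barely varied.
    """
    return torch.mean((coeffs ** 2) / eigenvalues)
```

So minimizing it drives each coefficient toward 0 (the mean face), weighted by how much variation that direction actually has in the face model.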