USTC3DV / FlashAvatar-code

[CVPR 2024] The official repo for FlashAvatar
MIT License

The result begins to corrupt around 140,000 iterations. #22

Closed aixiaodewugege closed 3 months ago

aixiaodewugege commented 4 months ago

[image attached] What is the reason?

xiangjun-xj commented 4 months ago

Yes, this happens occasionally. Methods like learning-rate decay may help stabilize the training process, and adding more regularization terms on the Gaussians, such as a penalty on their scaling, should also help. A direct workaround is to reduce the iteration count, since the earlier results are already good enough. If you find the exact cause or a solution, please let us know.
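The scaling regularization suggested above could be sketched as an extra loss term that penalizes oversized Gaussians, which are a common visual symptom of this kind of corruption. This is a minimal PyTorch sketch, not the repo's code; the function name `scaling_reg_loss`, the threshold `max_scale`, and the weight `lambda_scale` are all hypothetical and would need tuning:

```python
import torch

def scaling_reg_loss(scales: torch.Tensor, max_scale: float = 0.01) -> torch.Tensor:
    # `scales` are the post-activation Gaussian scales, shape (N, 3).
    # Penalize only the excess over `max_scale` (a hypothetical threshold),
    # so well-behaved Gaussians contribute zero loss.
    return torch.clamp(scales - max_scale, min=0.0).mean()

# Hypothetical use inside the training loop:
# loss = rgb_loss + lambda_scale * scaling_reg_loss(gaussians.get_scaling)
```

A hinge-style penalty like this leaves small Gaussians untouched, unlike an L2 term on all scales, which would shrink everything uniformly.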

aixiaodewugege commented 4 months ago

I think it is caused by the different learning-rate decay schedules of the Gaussian model and the deformation model. Does that sound reasonable?