Maclory / Deep-Iterative-Collaboration

Pytorch implementation of Deep Face Super-Resolution with Iterative Collaboration between Attentive Recovery and Landmark Estimation (CVPR 2020)
MIT License

About the pretrained HG model #12

Closed yukichou closed 4 years ago

yukichou commented 4 years ago

Is it possible to change the number of key points for training DIC without any extra work? I want to run some cross-domain experiments with the network you proposed, but my dataset has only 20 landmarks rather than 68, so I can't use the HG pre-trained model, which was trained on 68 key points. Can I train DIC without the HG pre-trained model, or do I need to train the HG model from scratch with my own key points? Thanks for your reply.

Steve-Tod commented 4 years ago

I suggest training the HG model from scratch with your own key points. Training DIC with a pre-trained HG is more stable than training it without one. For HG training, you can refer to the repo.
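The mismatch comes from the hourglass network's output head, which predicts one heatmap channel per landmark, so a 68-point checkpoint cannot be loaded directly into a 20-point model. A minimal sketch of one workaround (using a toy stand-in network, not the real DIC/HG architecture, and assuming the checkpoint key names match): copy only the shape-compatible weights into the new model and let the 20-channel head train from scratch.

```python
import torch
import torch.nn as nn

class TinyHG(nn.Module):
    """Toy stand-in for an HG landmark estimator (not the real architecture).
    The final 1x1 conv emits one heatmap per keypoint, so changing the
    keypoint count changes only this head's shape."""
    def __init__(self, num_keypoints=68, feat=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, feat, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(feat, num_keypoints, kernel_size=1)

    def forward(self, x):
        return self.head(self.body(x))

# Pretend model68 holds the released 68-point weights.
model68 = TinyHG(num_keypoints=68)
model20 = TinyHG(num_keypoints=20)

# Keep only entries whose shapes match the 20-point model; this drops the
# 68-channel head but preserves the shared body weights. Note that
# load_state_dict(strict=False) alone is NOT enough: it still raises on
# same-named keys with mismatched shapes, hence the filtering step.
target = model20.state_dict()
source = model68.state_dict()
filtered = {k: v for k, v in source.items()
            if k in target and v.shape == target[k].shape}
model20.load_state_dict(filtered, strict=False)

out = model20(torch.randn(1, 3, 8, 8))
print(out.shape)  # one heatmap per keypoint: (1, 20, 8, 8)
```

Even with this warm start, the new head (and ideally the whole network) still needs fine-tuning on the 20-point annotations before it is useful inside DIC.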