lmb-freiburg / contra-hand

Code in conjunction with the publication 'Contrastive Representation Learning for Hand Shape Estimation'

Clarification on checkpoints #7

Closed andreaziani closed 2 years ago

andreaziani commented 2 years ago

Hi, first of all, thank you for this great work and dataset! I truly believe this will help the community explore semi/self-supervised approaches for HPE.

I have a question about the released checkpoints: which contrastive approach was used to produce contra-hand-ckpt/ckpt/model_moco.pth? Is it the multi-view approach described in the paper? Also, would it be possible to release the checkpoints for all the contrastive training runs mentioned in the paper?

Thanks in advance, Andrea

zimmerm commented 2 years ago

Hi, you are exactly right: model_moco.pth is what we refer to as "Ours-Multi view" in Table 1 of our paper. Releasing the other checkpoints would require some digging in our internal codebase, but theoretically it is possible. What are you planning to do with these inferior initialisation checkpoints?
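For anyone landing on this issue who just wants to reuse model_moco.pth as an initialisation, here is a minimal sketch of loading it into a backbone. Note the assumptions: it presumes a torchvision ResNet-50 and the common MoCo key layout (weights possibly wrapped in a "state_dict" entry and prefixed with "module.encoder_q."); the actual key names inside the released file may differ, so inspect the checkpoint first.

```python
# Minimal sketch, not the official loading code. Assumes a ResNet-50 backbone
# and MoCo-style key prefixes; check the checkpoint's actual keys before use.
import torch
import torchvision.models as models

ckpt = torch.load("contra-hand-ckpt/ckpt/model_moco.pth", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)  # unwrap if stored under "state_dict"

cleaned = {}
for k, v in state_dict.items():
    # strip common MoCo prefixes so keys match a plain ResNet-50; keep others as-is
    for prefix in ("module.encoder_q.", "encoder_q."):
        if k.startswith(prefix):
            k = k[len(prefix):]
            break
    cleaned[k] = v

backbone = models.resnet50(weights=None)
missing, unexpected = backbone.load_state_dict(cleaned, strict=False)
print("missing keys:", len(missing), "| unexpected keys:", len(unexpected))
```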

andreaziani commented 2 years ago

Hi, thanks for the reply!

I'm working on my Master's thesis, and I'm interested in visualizing the main qualitative and quantitative differences between your different pre-training strategies.

That's why it would be great to have all of the checkpoints 😄