facebookresearch / UmeTrack

UmeTrack: Unified multi-view end-to-end hand tracking for VR

Training repetition #11

Open f1yfisher opened 1 year ago

f1yfisher commented 1 year ago

I have tried many ways to reproduce your method, but my results are always quite different from those in the paper. For the full model with the pose loss, I get 19.214mm for known hands and 19.756mm for unknown hands. Can you provide more details about training?

  1. Did you adopt any learning rate schedulers during training?
  2. Could you provide the specific hyper-parameters used in data augmentation?
  3. Did you use only 2-view images for training?
  4. Don't you need to compute the mean of the first term of the pose loss in eq. 4?
  5. Did you directly sum the pose losses of known hands and unknown hands without any weighting hyper-parameters?
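To make questions 4 and 5 concrete, here is a minimal NumPy sketch of what I mean. All names and the keypoint-distance form of the loss are my own assumptions for illustration, not taken from the paper or this repo:

```python
import numpy as np

def pose_loss(pred_kpts, gt_kpts, reduction="mean"):
    """L2 error between predicted and ground-truth 3D keypoints.

    `reduction` is the crux of question 4: whether the per-keypoint
    term in eq. 4 is averaged ("mean") or summed ("sum").
    """
    per_kpt = np.linalg.norm(pred_kpts - gt_kpts, axis=-1)  # shape (N, K)
    return per_kpt.mean() if reduction == "mean" else per_kpt.sum()

# Hypothetical batches: 2 samples, 21 keypoints, 3D coordinates.
rng = np.random.default_rng(0)
gt = rng.normal(size=(2, 21, 3))
pred_known = gt + 0.01 * rng.normal(size=gt.shape)
pred_unknown = gt + 0.02 * rng.normal(size=gt.shape)

loss_known = pose_loss(pred_known, gt)
loss_unknown = pose_loss(pred_unknown, gt)

# Question 5: direct (unweighted) sum vs. a weighted combination.
# `w_unknown` is a made-up hyper-parameter, not from the paper.
total_direct = loss_known + loss_unknown
w_unknown = 0.5
total_weighted = loss_known + w_unknown * loss_unknown
```

The choice of reduction and weighting changes the effective loss scale considerably, which is why I am asking which variants were used in training.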
guker commented 1 year ago

do you just use umetrack_data to train model?

f1yfisher commented 1 year ago

> do you just use umetrack_data to train model?

Yes, I use the real and synthetic training data from umetrack_data for training.

zhanxiaopan commented 1 year ago

Hi @f1yfisher @guker, could you share or open source your training Python script? We are now trying to re-train the model too. Maybe we can help each other if the author has no plans to open source the training code.

email: panzi2017@gmail.com

guker commented 1 year ago

Can you provide the training code for me? Thanks very much. email: huqunwei2607@gmail.com

cumtchenchang commented 3 days ago

Can you provide the training code for me? Thanks very much. email: chen1chang2@gmail.com