hgaurav2k / hop

Hand-object interaction Pretraining From Videos

The pretraining and finetuning time #7

Open 1024AILab opened 5 days ago

1024AILab commented 5 days ago

Hello, I am very curious about how long the pretraining takes. I ran the finetuning on two 4090s for 100k epochs, which took almost three days. What type of GPUs do you use?

The 100k-epoch finetuning results look good!

https://github.com/user-attachments/assets/50c7c427-d35e-4aaa-8775-3b1c895486b6

1024AILab commented 5 days ago

I doubled the mini_batchsize and num_envs. Will this shorten the finetuning time? Thank you~

1024AILab commented 5 days ago

These are the visualization results. Do they look correct?

[Plots: reward_analysis_AllegroXarmGrasping_Finetuned, success_analysis_AllegroXarmGrasping_Finetuned]

hgaurav2k commented 3 days ago

The finetuning (~600M agent steps with a total batch size, i.e. number of parallel agents, of 4096) takes us less than 12 hours on 8 A5000s. For pretraining, any checkpoint beyond 50K gradient steps is sufficient for the finetuning stage.
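For a rough sense of what those numbers mean per environment, here is a back-of-envelope sketch in Python. The total step count and num_envs come from this thread; the horizon_length (rollout steps per PPO iteration) is an assumption I am making for illustration, not a value stated here, so the iteration count is only indicative.

```python
# Back-of-envelope arithmetic for the finetuning run described above.
total_agent_steps = 600_000_000   # ~600M agent steps quoted above
num_envs = 4096                   # total batch size (number of agents) quoted above
horizon_length = 16               # ASSUMPTION: rollout length per PPO iteration

steps_per_env = total_agent_steps / num_envs
ppo_iterations = total_agent_steps / (num_envs * horizon_length)

print(f"env steps per agent: {steps_per_env:,.0f}")        # ~146,484
print(f"PPO iterations (assumed horizon): {ppo_iterations:,.0f}")  # ~9,155
```

Under a fixed agent-step budget like this, doubling num_envs halves the number of collection iterations, so it can shorten wall-clock time, but only to the extent that the simulator and GPUs are not already saturated.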