Status: Closed (by pengzhi1998, 3 months ago)
Your config looks correct to me. What version of robosuite are you using, and which branch? It might be worth trying the offline_study branch of robosuite, along with v0.2 of robomimic (you would need to re-download the data though, since the image observations will look different). See this note.
Thank you so much for your response and help! After training with the image_v141.hdf5 dataset, performance has improved significantly.
I noticed that the model I previously trained using the image-based demonstrations downloaded directly from robomimic v0.2 performed poorly. What differences between the two datasets (the image.hdf5 file from robomimic v0.2 and the image_v141.hdf5 extracted from demo_v141.hdf5 in robomimic v0.3) might be causing this discrepancy? Do they use different observations?
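One way to check whether the two datasets expose different observations is to compare their observation keys and shapes directly. Below is a minimal sketch assuming the standard robomimic hdf5 layout (`data/demo_*/obs/<key>`); the file paths in the commented example are placeholders for your own files, and `obs_summary` is a hypothetical helper, not a robomimic API.

```python
# Sketch: summarize observation keys/shapes for a robomimic-style hdf5 dataset.
# Assumes the conventional "data/demo_*/obs" layout; paths below are placeholders.
import h5py

def obs_summary(path):
    """Return {obs_key: shape} for the first demo in a robomimic-style hdf5 file."""
    with h5py.File(path, "r") as f:
        demo = sorted(f["data"].keys())[0]  # inspect one demo; layout is shared
        obs = f["data"][demo]["obs"]
        return {k: tuple(obs[k].shape) for k in obs.keys()}

# Example usage (placeholder paths):
# old = obs_summary("image.hdf5")        # downloaded from robomimic v0.2
# new = obs_summary("image_v141.hdf5")   # extracted from demo_v141.hdf5
# print(set(old) ^ set(new))             # keys present in only one dataset
```

Keys that exist in only one file, or the same key with a different shape, would point to the observation mismatch.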
Additionally, I found your tutorial on using pretrained models. However, I only located the link for the lift task (http://downloads.cs.stanford.edu/downloads/rt_benchmark/model_zoo/lift/bc_rnn/lift_ph_low_dim_epoch_1000_succ_100.pth). Where can I find pretrained models with image inputs for the other tasks?
Thank you so much again for your guidance!
Re (1): some textures in the environment have changed between robosuite v1.2 and v1.4 - this would explain why the model performance degraded (since you were training with image observations from v1.2 and evaluating with image observations from v1.4).
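A quick way to sanity-check this kind of train/eval visual gap is to render the same states in both robosuite versions and measure how different the resulting images are. The sketch below uses synthetic arrays purely for illustration; `pixel_gap` is a hypothetical helper, and the shapes are just example values, not anything from robomimic.

```python
import numpy as np

def pixel_gap(train_imgs, eval_imgs):
    """Mean absolute per-pixel difference between two batches of images.

    A large gap between renderings of the same states (e.g. robosuite v1.2
    vs v1.4) would indicate the texture change described above.
    """
    a = np.asarray(train_imgs, dtype=np.float32)
    b = np.asarray(eval_imgs, dtype=np.float32)
    return float(np.mean(np.abs(a - b)))

# Synthetic illustration: identical batches have zero gap,
# a uniformly brightened batch has a gap equal to the offset.
imgs = np.zeros((4, 84, 84, 3), dtype=np.uint8)
print(pixel_gap(imgs, imgs))       # 0.0
print(pixel_gap(imgs, imgs + 10))  # 10.0
```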
Re (2): The links should be here - but these were trained on robosuite v1.2.
Thank you so much for your help!
Dear Authors,
Thank you so much again for providing such a great and helpful repo!!
I have been following this page to reproduce the results of Square (Nut Assembly Task). Here is the generated config file I'm using:
However, despite multiple evaluations during training and testing, the highest success rates I achieved were between 0.4 and 0.64. I have not been able to reach the 0.82 reported in the paper. Could you advise on how to achieve this success rate? Did I make a mistake?
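Part of the spread in numbers like these comes from evaluation variance: a success rate estimated from a limited number of rollouts moves noticeably across seeds. The sketch below illustrates this with a fake policy; `eval_success_rate` and `run_rollout` are placeholders, not the robomimic evaluation API, and the 50-rollout count is just an example.

```python
import random

def eval_success_rate(run_rollout, n_rollouts=50, seed=0):
    """Fraction of successful rollouts.

    run_rollout(rng) -> bool is a placeholder for an actual policy/env rollout.
    """
    rng = random.Random(seed)
    return sum(bool(run_rollout(rng)) for _ in range(n_rollouts)) / n_rollouts

# Illustration with a fake policy whose true success probability is 0.6:
# the spread across seeds shows how single evaluations can scatter widely
# around the true rate.
fake_policy = lambda rng: rng.random() < 0.6
rates = [eval_success_rate(fake_policy, n_rollouts=50, seed=s) for s in range(5)]
print(min(rates), max(rates))
```

Averaging over more rollouts and evaluating several checkpoints (rather than a single one) narrows this spread.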
Thank you so much for your help and your time! I look forward to your reply!
Best regards, Pengzhi