AlextheEngineer opened 11 months ago
Hi,
Thanks for your interest. The .obj files are produced using the script step4_load_mano_diffbg.py.
Thanks for the response! For starters, I'm trying to go through the rendering steps (including the steps before step7).
I downloaded the InterHand 5 fps data and ran:

```
python utils/dataset_gen/interhand.py --data_path PATH_OF_INTERHAND2.6M --save_path ./interhand2.6m/ --gen_anno 1
python utils/dataset_gen/interhand.py --data_path ./interhand2.6m/ --gen_anno 0
```

This gave me the "img", "anno" and "ori_handdict" folders in the "interhand2.6m/train" directory.
Then I tried to run "rendering_code/step2_remove_duplicate_pose.py" with the "path" variable pointing to "interhand2.6m/train/anno". I had to modify L218 to load .pkl files instead of .npy, because the pickle files in "interhand2.6m/train/anno" have the structure shown in L56~L58, while the .npy files don't. However, there is no "all_cam" key as expected at L59; there is only a "camera" key. So I tried to run with just one camera, but got stuck at L80 because the required MANO pose variable should have 48 values while the loaded one has only 45.
I'm trying to connect the dots for the complete rendering pipeline... What am I doing incorrectly in step2? Thanks!
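The 45-vs-48 mismatch is likely the usual MANO split: 45 axis-angle values cover the 15 hand joints (15 × 3), while the full 48-dim pose additionally carries 3 global-orientation values in front. A minimal sketch of bridging the gap, assuming the annotation stores the global rotation separately (or not at all) — the helper name is hypothetical:

```python
import numpy as np

def to_full_mano_pose(hand_pose_45, global_orient=None):
    """Prepend the 3 global-orientation axis-angle values to the
    45 articulation parameters (15 joints x 3) to obtain the 48-dim
    pose that MANO pipelines commonly expect."""
    hand_pose_45 = np.asarray(hand_pose_45, dtype=np.float64).reshape(45)
    if global_orient is None:
        # Fall back to an identity rotation if none is stored.
        global_orient = np.zeros(3)
    return np.concatenate([np.asarray(global_orient, dtype=np.float64).reshape(3),
                           hand_pose_45])

pose48 = to_full_mano_pose(np.zeros(45))
print(pose48.shape)  # (48,)
```

Whether the repo's annotations keep the global rotation under a separate key (or fold it into the camera extrinsics) would need to be checked against the actual files.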
Hi Alex,
Sorry for the late response. Since the InterHand dataset contains many interacting hands in the same pose seen from different camera views, I collect all the hand meshes with the same pose into a single '.pkl', with the per-view camera parameters stored under the key "all_cam". You could take the files in the "nodup_sample" folder of materials.zip as an example. Thanks.
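Reading between the lines, each deduplicated '.pkl' holds one shared pose plus an "all_cam" dict keyed by camera id. A self-contained sketch of what consuming such a file could look like — the inner field names here are assumptions for illustration, only "all_cam" comes from the thread:

```python
import io
import pickle

def iter_views(sample):
    """Yield (camera_id, camera_params) pairs from one nodup sample.
    The key 'all_cam' follows the author's description; the layout of
    the per-view params is assumed."""
    for cam_id, cam_params in sample["all_cam"].items():
        yield cam_id, cam_params

# Round-trip a fake sample the way step2 would read a real '.pkl'.
fake = {"mano_pose": [0.0] * 48,
        "all_cam": {"cam400002": {"focal": [1500, 1500]},
                    "cam400004": {"focal": [1500, 1500]}}}
buf = io.BytesIO(pickle.dumps(fake))
sample = pickle.load(buf)
print(sorted(cam for cam, _ in iter_views(sample)))
```

Comparing this against an actual file from "nodup_sample" should confirm the real field names.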
Hi, I'm the author of Ego3DHands. Great work! May I ask how to render the same poses from new egocentric view angles (I probably need to figure out how to define "egocentric" first)? It would be much appreciated if instructions for these steps could be provided!