-
Hi! I have a question about evaluation.
Human3.6M Protocol 2 evaluates on subjects 9 and 11 with 64-frame steps,
but I think your code samples at 5-frame steps,
e.g. _000001.jpg, _000006.jpg, _000011.jpg, ...
…
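If it helps to pin down the discrepancy, the two sampling rates can be sketched as a simple frame-index generator. This is a hypothetical helper for illustration, not the repo's actual sampling code:

```python
def sample_frames(num_frames, step):
    """Return the 1-based frame indices kept when sampling every `step` frames."""
    return list(range(1, num_frames + 1, step))

# A 5-frame step yields indices 1, 6, 11, ... matching filenames like
# _000001.jpg, _000006.jpg, _000011.jpg, while Protocol 2 evaluation is
# commonly reported with a 64-frame step (indices 1, 65, 129, ...).
print(sample_frames(15, 5))    # [1, 6, 11]
print(sample_frames(200, 64))  # [1, 65, 129, 193]
```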
-
Thanks for your great work!
When I try to run on internet videos, the 'imgname' in "h36m_random_sample_center_10_10.pt" is needed. However I extract the images of h36m by "python process_data.py --da…
-
Hello. Thanks for your great work!
Could you provide the preprocessing script that converts the Human3.6M videos to images named like `s_01_act_02_subact_01_ca_01*.jpg` and generates the corresponding json …
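For reference, the filename pattern above can be reproduced with a small hypothetical helper. The zero-padding widths are inferred from `s_01_act_02_subact_01_ca_01*.jpg` and the `_000001.jpg` frame suffix mentioned elsewhere in these issues, not taken from the repo's code:

```python
def h36m_image_name(subject, action, subaction, camera, frame):
    # Pattern inferred from s_01_act_02_subact_01_ca_01*.jpg; the 6-digit
    # frame padding is an assumption based on filenames like _000001.jpg.
    return (f"s_{subject:02d}_act_{action:02d}_subact_{subaction:02d}"
            f"_ca_{camera:02d}_{frame:06d}.jpg")

print(h36m_image_name(1, 2, 1, 1, 1))
# s_01_act_02_subact_01_ca_01_000001.jpg
```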
-
Hello. Thanks for your great work!
It seems that I am missing the preprocessing script that converts the Human3.6M videos to images named like `s_01_act_02_subact_01_ca_01*.jpg` and generates the correspon…
-
Error pre-processing data
`python h36m.py /datasets/Human36M/ ../../../mmdetection/data/h36m/rcnn --split=train`
```
Traceback (most recent call last):
  File "h36m.py", line 135, in <module>
    h36…
```
-
Hi, I want to evaluate on Human3.6M, but I can't find the annotation file 'Sample_64_test_Human36M_protocol_2.json'. Have you provided this file?
-
Your json file provides the H36M joint coordinates obtained from the SMPL GT parameters.
But I found that your provided joints3D data differs from the data given on the Human3.6M official site.
…
Z-Z-J updated 3 years ago
-
Hi,
I noticed something peculiar in the augmentation part, so I'd be grateful if you could clarify this for me. Basically, in your dataset class (for example, in Human36M.py), when you do rot…
-
You said "The ground truth SMPL parameters in Human3.6M are generated by applying MoSH [34] to the sparse 3D MoCap marker data, as done in Kanazawa et al. "
I have the angle files and 3D points of Human3.6M.…
-
Thank you for your code. How can I get the dataset?