Hello. Thanks for sharing your great work.
I'm trying to train the stacked hourglass model on the Human3.6M dataset. I wanted to visualize the predictions on images, so I saved the `preds` array that is passed to the `mpii_eval` function:
https://github.com/princeton-vl/pytorch_stacked_hourglass/blob/ed91059e874f35089dd3a8e692fa895929785b91/test.py#L230
The saved `preds` are quite different from what I expected. For example:

```
array([[array([[ 2.57000000e+02,  6.73000000e+02,  2.11715072e-01],
               [ 2.52000000e+02,  2.44000000e+02, -0.00000000e+00],
               [ 2.57000000e+02,  6.73000000e+02,  1.27586707e-01],
               [ 2.57000000e+02,  6.73000000e+02,  1.27375469e-01],
               [ 2.52000000e+02,  2.44000000e+02, -0.00000000e+00], ...
```

Could you please tell me how to get the predicted joint coordinates and save them?
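In case it helps frame the question: the usual way to turn hourglass heatmaps into joint coordinates is an argmax over each joint's heatmap, then scaling from heatmap resolution back to the original image. Below is a minimal NumPy sketch of that standard decoding. The function name, shapes, and the `(x, y, confidence)` layout are my own assumptions and may not match this repo's exact post-processing:

```python
import numpy as np

def heatmaps_to_joints(heatmaps):
    """Decode joint coordinates from a stack of heatmaps.

    heatmaps: array of shape (num_joints, H, W), one heatmap per joint.
    Returns an array of shape (num_joints, 3) holding (x, y, confidence),
    where (x, y) are in heatmap-pixel units (scale them up to the
    original image resolution afterwards, e.g. multiply by 4 for a
    256x256 input with 64x64 heatmaps).
    """
    num_joints, h, w = heatmaps.shape
    joints = np.zeros((num_joints, 3))
    for j in range(num_joints):
        # Location of the peak activation for this joint.
        flat_idx = np.argmax(heatmaps[j])
        y, x = np.unravel_index(flat_idx, (h, w))
        joints[j] = (x, y, heatmaps[j, y, x])
    return joints

# Tiny synthetic example: two 64x64 heatmaps with one hot pixel each.
hm = np.zeros((2, 64, 64))
hm[0, 10, 20] = 0.9   # joint 0 peaks at (x=20, y=10)
hm[1, 30, 40] = 0.5   # joint 1 peaks at (x=40, y=30)
print(heatmaps_to_joints(hm))
```

If the saved `preds` already look like `(x, y, score)` triples (as in the dump above), then the remaining question is just which coordinate frame they are in, since heatmap-space coordinates must be rescaled through the same crop/resize transform used at input time.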