Open · Adilrn opened this issue 4 years ago
Check the file which is being loaded for 'data'. If you used an automatic landmark detector like OpenPose, there can be empty poses in some frames, so this error would appear for those json files.
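As a quick check, something like the sketch below lists the frames where nothing was detected. The directory path is just a placeholder for your own OpenPose --write_json output folder.

import glob
import json
import os

json_dir = "openpose_output"  # placeholder: point this at your --write_json directory

for path in sorted(glob.glob(os.path.join(json_dir, "*.json"))):
    with open(path) as f:
        data = json.load(f)
    if not data.get("people"):          # empty list -> no pose detected in this frame
        print("no pose in:", os.path.basename(path))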
Hi @AdRever, @jgoenetxea already pointed it out. An interpolation script is needed to fill in these missing poses. The code can handle missing body parts, but not a completely missing 2D skeleton. Alternatively, you can treat the missing poses as cuts and run the 3D generation on the resulting segments separately.
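For illustration only, a minimal interpolation sketch could look like the following. It is not the repository's script; it assumes the per-frame keypoints have already been loaded into (N, 2) numpy arrays, with None standing in for frames where nothing was detected.

import numpy as np

def interpolate_missing(poses):
    # poses: list with one (N, 2) array per frame, or None for frames with no detection
    filled = list(poses)
    detected = [i for i, p in enumerate(poses) if p is not None]
    if not detected:
        return filled  # nothing to interpolate from
    for i, p in enumerate(poses):
        if p is not None:
            continue
        prev = max((j for j in detected if j < i), default=None)
        nxt = min((j for j in detected if j > i), default=None)
        if prev is None or nxt is None:
            # missing frames at the start or end: copy the nearest detection
            filled[i] = poses[prev if nxt is None else nxt].copy()
        else:
            # linear interpolation between the nearest detected neighbours
            t = (i - prev) / (nxt - prev)
            filled[i] = (1 - t) * poses[prev] + t * poses[nxt]
    return filled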
Thank you very much. But it seems there is another issue when running this script, @ArashHosseini! The Stacked Hourglass detections are no longer available, so the function read_2d_prediction() in data_utils blocks the whole process. Any thoughts on how to get around that?
Yes, I have the same problem. I am using the "openpose_3dpose_sandbox_realtime.py" sample to load the 2D joints, but the landmark normalisation values (data_mean_2d, data_std_2d, data_mean_3d and data_std_3d) are missing.
In my opinion, the normalisation values should be included with the pre-trained model, because they are directly tied to it.
Could somebody include those values as text files, for example?
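Something along these lines would be enough; the file names and helper functions here are only a suggestion, not part of the repository. The stats would be written once, wherever they are computed, and read back in the realtime sandbox.

import numpy as np

def save_stats(out_dir, data_mean_2d, data_std_2d, data_mean_3d, data_std_3d):
    # dump the normalisation vectors as plain text next to the checkpoint
    np.savetxt(out_dir + "/data_mean_2d.txt", data_mean_2d)
    np.savetxt(out_dir + "/data_std_2d.txt", data_std_2d)
    np.savetxt(out_dir + "/data_mean_3d.txt", data_mean_3d)
    np.savetxt(out_dir + "/data_std_3d.txt", data_std_3d)

def load_stats(out_dir):
    # load them back in the same order they were saved
    return (np.loadtxt(out_dir + "/data_mean_2d.txt"),
            np.loadtxt(out_dir + "/data_std_2d.txt"),
            np.loadtxt(out_dir + "/data_mean_3d.txt"),
            np.loadtxt(out_dir + "/data_std_3d.txt"))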
@jgoenetxea were you able to overcome the issue in the end?
Hello @ArashHosseini! I have been experiencing issues running your repository. My operating system is Linux (Ubuntu), my TensorFlow version is 2.0 (I have tried replacing the import of TensorFlow with tensorflow.compat.v1), and my Python version is 3.7. The error I get when I run openpose_3dpose_sandbox.py over the directory in which my json files are stored is the following:

File "src/openpose_3dpose_sandbox.py", line 230, in main
    smoothed = read_openpose_json()
File "src/openpose_3dpose_sandbox.py", line 56, in read_openpose_json
    _data = data["people"][0]["pose_keypoints_2d"] if "pose_keypoints_2d" in data["people"][0] else data["people"][0]["pose_keypoints"]
IndexError: list index out of range
This is what one of the json files contains: {"people": [{"pose_keypoints_2d": [603, 149, 621, 188, 529, 192, 463, 297, 446, 184, 717, 184, 800, 290, 817, 176, 573, 458, 507, 622, 0.0, 0.0, 686, 454, 752, 622, 0.0, 0.0, 586, 133, 621, 133, 568, 133, 647, 133]}]}
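The file shown above does contain a person, so the IndexError most likely comes from another json file in the same directory whose "people" list is empty. A guard like the following (a sketch, not the repository's code) skips such frames instead of crashing, so they can be interpolated afterwards:

import json

def load_frame(path):
    with open(path) as f:
        data = json.load(f)
    people = data.get("people", [])
    if not people:
        return None  # no detection in this frame; skip and fill in later
    person = people[0]
    # newer OpenPose output uses "pose_keypoints_2d", older versions "pose_keypoints"
    return person.get("pose_keypoints_2d", person.get("pose_keypoints"))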