SizheAn / mRI

Data repo for mRI: Multi-modal 3D Human Pose Estimation Dataset using mmWave, RGB-D, and Inertial Sensors
Creative Commons Zero v1.0 Universal

How to visualize the 3D pose? #9

Open tsbiosky opened 1 year ago

tsbiosky commented 1 year ago

May I ask how to visualize the 3D pose as in the demo? Also, in the mmWave radar results, is "random_split_protocol1.npy" the pose prediction result? Which pretrained model should I load to reproduce it? Thanks.

SizheAn commented 1 year ago

Under model/mmWave/results, the .npy files contain only the MPJPE and PA-MPJPE metric results. To get the 3D keypoints, check aligned_data/pose_labels/subjectx.cpl. For example,

import pickle

# Load the aligned pose labels for subject 1
temp = pickle.load(open('aligned_data/pose_labels/subject1_all_labels.cpl', 'rb'))
print(temp['radar_est_kps'].shape)

You should then see all the radar-estimated 3D keypoint results. To plot them, you can use any 3D scatter plot tool in Python or MATLAB.
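For instance, matplotlib's 3D scatter plot can render one frame of keypoints. A minimal sketch, assuming `radar_est_kps` has shape `(num_frames, num_joints, 3)` (check the printed shape on your machine first); the random array below is a placeholder for one frame loaded from the .cpl pickle:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line for an interactive window
import matplotlib.pyplot as plt

# Placeholder standing in for temp['radar_est_kps'][frame_idx];
# replace with a frame from the pickle loaded as shown above.
kps = np.random.rand(17, 3)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(kps[:, 0], kps[:, 1], kps[:, 2])
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_zlabel("z")
fig.savefig("pose_frame.png")
```

To animate a sequence, loop over the frame index and redraw (or save one image per frame and stitch them into a video).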

tsbiosky commented 1 year ago


Thank you for your reply! But I'm still confused about how to get the predicted 3D pose from the radar point-cloud input. I tried eval.py and got eval_result.pkl, but it contains only 'video-id', 't-start', 't-end', 'label', and 'score'.