YoungSeng / UnifiedGesture

UnifiedGesture: A Unified Gesture Synthesis Model for Multiple Skeletons (ACM MM 2023 Oral)
BSD 2-Clause "Simplified" License

Evaluation code #3

Closed: 223d closed this issue 10 months ago

223d commented 1 year ago

Hello! Is there code for the evaluation metrics?

YoungSeng commented 1 year ago

Yes, you can refer here: https://github.com/YoungSeng/ReprGesture#45-evaluation or to the code for the objective evaluation of metrics calculations for the GENEA Challenge.

223d commented 1 year ago

Hi,

May I ask how to obtain these files? I ran:

```
python calc_cca.py --cond_dir "<..your path/GENEA/genea_challenge_2022/baselines/Tri/output/infer_sample/output_2_new_wavlm/npy/>" --gt_dir "<..your path/GENEA/genea_challenge_2022/dataset/v1_18_1/val/npy/>"
```

And running `python train_AE.py` gives:

```
FileNotFoundError: [Errno 2] No such file or directory: '/ceph/hdd/yangsc21/Python/My_3/GT_Gesture_npy/'
```

YoungSeng commented 1 year ago

You first need to convert the GT BVH files to joint positions for training the AE, and then use the GT positions together with the positions of the generated results to calculate distances and the other metrics. For the specific details, I strongly recommend referring to and understanding the original GENEA Challenge code for calculating objective metrics: https://github.com/genea-workshop/genea_numerical_evaluations. The BVH-to-position conversion differs between datasets, so the calculation details will also differ.
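Once both the GT and the generated motion have been exported as position `.npy` files, the distance step reduces to comparing matching files. The sketch below is a minimal, hypothetical example of that comparison (a mean per-joint L2 error over paired files, assuming arrays of shape frames × joints × 3); it is not the GENEA Challenge implementation, which should be taken from the repository linked above.

```python
import glob
import os

import numpy as np


def average_position_error(gt_dir, gen_dir):
    """Mean per-joint L2 distance between matching GT and generated
    position files. Assumes each .npy array has shape
    (frames, joints, 3) and that filenames match across directories."""
    errors = []
    for gt_path in sorted(glob.glob(os.path.join(gt_dir, "*.npy"))):
        gen_path = os.path.join(gen_dir, os.path.basename(gt_path))
        if not os.path.exists(gen_path):
            continue  # skip clips without a generated counterpart
        gt = np.load(gt_path)
        gen = np.load(gen_path)
        n = min(len(gt), len(gen))  # clip to the shorter sequence
        # Euclidean distance per joint per frame, then average
        dist = np.linalg.norm(gt[:n] - gen[:n], axis=-1)
        errors.append(dist.mean())
    return float(np.mean(errors))
```

The per-file averaging keeps long clips from dominating the score; the real GENEA scripts also handle details such as frame-rate alignment and joint selection, which this sketch omits.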