OpenMotionLab / MotionGPT

[NeurIPS 2023] MotionGPT: Human Motion as a Foreign Language, a unified motion-language generation model using LLMs
https://motion-gpt.github.io
MIT License

Questions about testing results #49

Open weleen opened 11 months ago

weleen commented 11 months ago

Thank you for your great work! I have tried to reproduce the results and encountered some issues.

Following the instructions, I evaluated the provided checkpoint downloaded from Hugging Face.

I run the following commands:

python -m test --cfg configs/config_h3d_stage3.yaml --task t2m
python -m test --cfg configs/config_h3d_stage3.yaml --task m2t

The evaluation results are not consistent with those reported in the paper. The log and metrics are attached below.

t2m results:

[screenshot: t2m evaluation metrics]

log_2023-10-04-19-56-23_test.log

[screenshot: metrics]

Would you happen to have any idea about what's wrong with the configuration?

weleen commented 11 months ago

For the m2t task, the testing process gets stuck at the 4th replication and then receives a SIGTERM signal.

[screenshot: m2t testing log]

As with t2m, the testing results fall behind those reported in the paper, especially Bleu@4 and CIDEr, which are only around 6 and 7.

[screenshot: m2t evaluation metrics]

I would appreciate it if you have time to help fix my issue.😄

LinghaoChan commented 10 months ago

@weleen hi! Has this issue been resolved? We met the same issue.

weleen commented 10 months ago

> @weleen hi! Has this issue been resolved? We met the same issue.

@LinghaoChan I think there are some mistakes in get_motion_embeddings.

In m2t.py https://github.com/OpenMotionLab/MotionGPT/blob/0499f16df4ddde44dfd72a7cbd7bd615af1b1a94/mGPT/metrics/m2t.py#L325-L329

In t2m.py https://github.com/OpenMotionLab/MotionGPT/blob/0499f16df4ddde44dfd72a7cbd7bd615af1b1a94/mGPT/metrics/t2m.py#L251-L254

In both files, m_lens is divided twice.
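
To make the suspected problem concrete, here is a minimal standalone sketch (not the repository's code; unit_len = 4 and the frame lengths are made-up values) showing how dividing m_lens twice shrinks the lengths handed to the motion encoder:

```python
import torch

unit_len = 4  # HumanML3D-style unit length; value assumed for illustration
m_lens = torch.tensor([196, 120, 64])  # motion lengths in frames (made up)

# Intended behaviour: convert frame lengths to token lengths once.
m_lens_once = torch.div(m_lens, unit_len, rounding_mode="floor")
print(m_lens_once)   # tensor([49, 30, 16])

# Suspected bug: get_motion_embeddings applies the same division again,
# so the evaluator's motion encoder receives much shorter lengths.
m_lens_twice = torch.div(m_lens_once, unit_len, rounding_mode="floor")
print(m_lens_twice)  # tensor([12, 7, 4])
```

If that is indeed what the linked lines do, removing one of the two divisions would be the fix, although, as noted below, that alone did not recover the paper's numbers for me.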

However, even after I fix these errors, the results are still different. Have you solved this issue?

Spark001 commented 9 months ago

same issue

GuangtaoLyu commented 2 months ago

> @weleen hi! Has this issue been resolved? We met the same issue.

Hi, me too.