lmb-freiburg / freihand

A dataset for estimation of hand pose and shape from single color images.
https://lmb.informatik.uni-freiburg.de/projects/freihand/

Can you publish the eval code? #29

Open liwenssss opened 2 years ago

liwenssss commented 2 years ago

Hi, I want to test my results on your CodaLab, but something seems to be wrong and I can't get the score. Can you publish the evaluation code so that I can compute my eval results myself?

clashroyaleisgood commented 2 years ago

Hi, I ran into the same problem.

Maybe you can try to use the evaluation code provided by freihand:

freihand/eval.py, https://github.com/lmb-freiburg/freihand/blob/master/eval.py

And the evaluation ground truth has also been released:

https://github.com/lmb-freiburg/freihand#evaluate-on-the-dataset (see the "Update" note)

Note that the code needed some modifications in my case.

If you have an import problem with from utils.fh_utils import *, you can try this:

import sys
sys.path.append('..')  # make the repo root importable from a subdirectory
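
For reference, the score reported there is (as far as I understand) a mean per-joint error after rigid alignment to the ground truth. A minimal numpy sketch of that kind of aligned error, my own illustration rather than the repo's exact code:

import numpy as np

def procrustes_align(pred, gt):
    # Similarity-align pred (N, 3) to gt (N, 3): center both point sets, then
    # get the optimal rotation and scale from the SVD of the cross-covariance.
    mu_p, mu_g = pred.mean(0), gt.mean(0)
    p, g = pred - mu_p, gt - mu_g
    U, S, Vt = np.linalg.svd(p.T @ g)
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:  # avoid reflections
        Vt[-1] *= -1
        S[-1] *= -1
        R = (U @ Vt).T
    scale = S.sum() / (p ** 2).sum()
    return scale * p @ R.T + mu_g

def aligned_mean_joint_error(pred_xyz, gt_xyz):
    # Mean Euclidean distance per joint after alignment, in the GT's units.
    aligned = procrustes_align(pred_xyz, gt_xyz)
    return np.linalg.norm(aligned - gt_xyz, axis=1).mean()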
liumc14 commented 1 year ago

Can I have a look at your pred.py file? Thank you

clashroyaleisgood commented 1 year ago

In fact, I didn't write a pred.py myself. I use code (and a model) adapted from https://github.com/SeanChenxy/HandMesh, with the MobRecon model.

Here is the snippet that dumps the results: https://github.com/clashroyaleisgood/HandMesh/blob/8fbd3a89fa655e095fedcf2c29baec01a9e54666/mobrecon/runner.py#L349-L358
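
As far as I can tell from the official pred.py, the expected pred.json is just a two-element JSON list (joint predictions, then vertex predictions); a minimal sketch of such a dump helper:

import json

def dump_predictions(pred_out_path, xyz_pred_list, verts_pred_list):
    # xyz_pred_list: one (21, 3) array per image; verts_pred_list: one (778, 3) array per image
    xyz_pred_list = [x.tolist() for x in xyz_pred_list]
    verts_pred_list = [v.tolist() for v in verts_pred_list]
    with open(pred_out_path, 'w') as fo:
        json.dump([xyz_pred_list, verts_pred_list], fo)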

liumc14 commented 1 year ago

OK, thank you. But may I ask how to get the corresponding MANO parameters for the images in the test set?

clashroyaleisgood commented 1 year ago

I'm not sure, but maybe it's in the official eval zip file FreiHAND_pub_v2_eval.zip, in evaluation_mano.json.


Update: oh, I just found it: https://github.com/lmb-freiburg/freihand/blob/5ea4ab9763fea0eec988a52bfa563333cb16523f/view_samples.py#L28 Maybe you can try this.

liumc14 commented 1 year ago

But that is the _mano file in the training directory. The training set has this file, but only the images plus the K.json and scale.json files are available in the evaluation directory. What bothers me is how to use these files to predict xyz.json. If you have time, please help me. Thank you.

clashroyaleisgood commented 1 year ago

No no no! FreiHAND has already released its evaluation annotations on the official dataset website: https://lmb.informatik.uni-freiburg.de/resources/datasets/FreihandDataset.en.html#:~:text=Download%20FreiHAND%20Dataset%20v2%20%2D%20Evaluation%20set%20with%20annotations%20(724MB)

The zip file contains the _mano, _verts, _xyz, ... annotations for the evaluation set.
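
Loading them is plain JSON; a minimal sketch (the unzip path, and file names following the training set's naming, are my assumptions):

import json, os

base_path = 'FreiHAND_pub_v2_eval'  # wherever you unzipped the eval set

def json_load(p):
    with open(p) as f:
        return json.load(f)

xyz_gt = json_load(os.path.join(base_path, 'evaluation_xyz.json'))      # 21 x 3 joints per sample
verts_gt = json_load(os.path.join(base_path, 'evaluation_verts.json'))  # 778 x 3 vertices per sample
mano_gt = json_load(os.path.join(base_path, 'evaluation_mano.json'))    # MANO parameters per sample
print(len(xyz_gt), len(verts_gt), len(mano_gt))  # 3960 evaluation samples each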

liumc14 commented 1 year ago

But pred.py requires you to write your own prediction code. Thank you for your advice. The official repo does provide the template, but I would like to ask how to predict xyz myself: https://github.com/lmb-freiburg/freihand/blob/master/pred.py#:~:text=%23%20TODO%3A%20Put%20your%20algorithm%20here%2C%20which%20computes%20(metric)%203D%20joint%20coordinates%20and%203D%20vertex%20positions

clashroyaleisgood commented 1 year ago

Sorry, I'm not getting your point. Do you mean how to predict xyz from a single RGB image, or how to combine your prediction algorithm with the pred.py code you linked?


What HandMesh does is simply copy the part of pred.py that converts prediction results to JSON into its own code, so it never needs to call or edit pred.py's pred_template().
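
In other words, filling in the template boils down to a loop like this (a rough sketch; load_eval_images and my_model are hypothetical placeholders for your own data loading and network):

import numpy as np

xyz_pred_list, verts_pred_list = [], []
for img in load_eval_images('FreiHAND_pub_v2_eval'):  # hypothetical image loader
    xyz, verts = my_model(img)  # hypothetical network: (21, 3) joints, (778, 3) vertices, in meters
    xyz_pred_list.append(np.asarray(xyz))
    verts_pred_list.append(np.asarray(verts))
dump_predictions('pred.json', xyz_pred_list, verts_pred_list)  # dump helper sketched earlier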

liumc14 commented 1 year ago

My question is: how do I predict xyz from a single RGB image? If you have any suggestions, please explain them to me. Thank you.

clashroyaleisgood commented 1 year ago

There is a lot of research on hand pose estimation from a single RGB image, all trying to make the predictions more and more accurate.
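
At its simplest, it's a regression problem; a toy baseline sketch (my own illustration, not any particular paper's method):

import torch
import torchvision

# A ResNet backbone whose final layer regresses 21 root-relative 3D joints
# from a cropped, normalized hand image.
model = torchvision.models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, 21 * 3)

img = torch.randn(1, 3, 224, 224)    # stand-in for one RGB hand crop
xyz = model(img).reshape(-1, 21, 3)  # predicted joint coordinates

Real methods such as MobRecon add much more structure on top of this, but that is the basic shape of the task.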

liumc14 commented 1 year ago

OK, thank you for your suggestion. May I ask about the model in this project: is model.py trained? Can pose_hand directly predict xyz and verts? What I don't understand here is pose_hand. What is the MANO hand parameter? Is it a 61-dimensional MANO parameter? How is it obtained from a given image? https://github.com/lmb-freiburg/freihand/blob/master/utils/model.py#:~:text=def%20pose_hand(mano%2C%20K%2C%20use_mean_pose%3DTrue)%3A

clashroyaleisgood commented 1 year ago

I think it's far away from the original issue title, so this is my last reply.

May I ask about the model in this project: is model.py trained?

If this means "is the model in this project already trained?", the answer is yes, but I don't know which method was used to train this renderer.

Can pose_hand directly predict xyz and verts? What I don't understand here is pose_hand.

Sorry, I really can't understand this one...

What is the MANO hand parameter? Is it a 61-dimensional MANO parameter?

You can look at https://github.com/hassony2/manopth or other online resources to get a better understanding of what a MANO parameter is; I don't fully understand it myself. As far as I know, it's a 61-d feature vector containing 10 shape parameters, 48 rotation parameters, and 3 global_t parameters:

- shape describes how fat (or thin) the hand is.
- rotation gives the rotation angles of each joint (15*3), plus an additional 3 for the rotation of the whole hand.
- global_t is the wrist joint position in camera coordinates.
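
In code, that split would look something like this (the ordering is my assumption from the description above; check how the repo's utils actually unpack the vector before relying on it):

import numpy as np

theta = np.zeros(61)   # one MANO parameter vector from the annotation file
shape = theta[:10]     # 10 shape coefficients (how fat/thin the hand is)
pose = theta[10:58]    # 48 = 3 global rotation + 15 joints * 3, axis-angle
global_t = theta[58:]  # 3 values: wrist position in camera coordinates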

How is it obtained from a given image?

I guess it can only be regressed from ground-truth hand vertices.
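
A rough sketch of such a regression (an assumption on my side, not FreiHAND's actual pipeline), using the manopth layer to optimize the parameters until the layer's vertices match a ground-truth mesh:

import torch
from manopth.manolayer import ManoLayer  # needs the MANO model files from the MANO website

mano_layer = ManoLayer(mano_root='mano/models', use_pca=False, flat_hand_mean=False)

# target: one (778, 3) ground-truth vertex array, e.g. an entry of evaluation_verts.json
target = torch.as_tensor(verts_gt[0]).float().unsqueeze(0)
pose = torch.zeros(1, 48, requires_grad=True)   # global + per-joint axis-angle
shape = torch.zeros(1, 10, requires_grad=True)
trans = torch.zeros(1, 3, requires_grad=True)   # global_t
opt = torch.optim.Adam([pose, shape, trans], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    verts, _ = mano_layer(pose, shape)  # manopth outputs millimeters
    loss = ((verts / 1000.0 + trans[:, None] - target) ** 2).mean()
    loss.backward()
    opt.step()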