CalciferZh / minimal-hand

A minimal solution to hand motion capture from a single color camera at over 100fps. Easy to use, plug to run.
MIT License
978 stars · 172 forks

How can I use the model's output quaternions in Unity? #66

Closed wangtss closed 3 years ago

wangtss commented 3 years ago

Thank you for your great work! I'm trying to use the model output to animate a virtual hand in Unity. I tried assigning the quaternions to Unity's localRotation, but it did not work. Could you share some insight into how I can achieve that?

CalciferZh commented 3 years ago

I don't know Unity very well. Guessing from the name, localRotation is the rotation of the child joint relative to its parent, while the output quaternions are global rotations. You may need to convert them to relative quaternions first.
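A minimal sketch of that global-to-relative conversion (assuming unit quaternions in (w, x, y, z) order and a hypothetical `parents` list with `None` for the root — adjust both to the conventions of your own pipeline):

```python
import numpy as np

def quat_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    # For unit quaternions the conjugate equals the inverse.
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def abs_to_rel(quats_abs, parents):
    """q_rel[i] = conj(q_abs[parent[i]]) * q_abs[i]; the root keeps its global rotation."""
    rel = []
    for i, q in enumerate(quats_abs):
        p = parents[i]
        rel.append(q if p is None else quat_mul(quat_conj(quats_abs[p]), q))
    return np.array(rel)
```

The relative quaternions can then be assigned as each joint's local rotation in the engine, walking the hierarchy root-first.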

yunho-c commented 3 years ago

Thanks for releasing this great model! I want to use it to animate a hand in Blender (a 3D modeling program), but I keep running into difficulties, so I'm continuing this thread here :)

Visualizing model output in Blender

The problem I encounter is that every finger points in almost the same direction. This happens both when I use the output quaternions from wrappers.ModelPipeline.process() and when I incorporate parts of hand_mesh.py to use the joint_xyz values (which I interpreted as absolute XYZ Euler rotations). Multiplying joint_xyz by some integer value before feeding it to Blender, I get something like this:

[comparison] [comparison (multiplied by 15)]

but the same does not hold for other poses; I suspect this was either coincidental or that this particular pose happened to be close to the reference pose. By the way, I have set up the joints in Blender to be independent of each other.

I tried many things and came to believe that the model's output is not the absolute orientation of each joint, and that set_abs_quat() in hand_mesh.py does some additional processing to make it work. My programming skill isn't quite there to take this further, so any help would be much appreciated!

Attaching some relevant files below: question.zip
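(A note for later readers: in the snippet further down the thread, joint_xyz is passed to vc.joints_to_mesh together with MANOHandJoints.parents, which suggests these are 3D joint positions rather than Euler rotations. If only positions are available, a per-bone orientation can be estimated, up to twist about the bone axis, as the shortest-arc rotation between the rest-pose and posed bone directions. A sketch in (w, x, y, z) order — none of this is taken from the repo itself:)

```python
import numpy as np

def shortest_arc_quat(v_from, v_to):
    """Unit quaternion (w, x, y, z) rotating direction v_from onto v_to
    along the shortest arc. Twist about the bone axis is unconstrained."""
    a = np.asarray(v_from, dtype=float)
    b = np.asarray(v_to, dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    w = 1.0 + np.dot(a, b)
    if w < 1e-8:
        # Nearly opposite vectors: any axis perpendicular to `a` works.
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        q = np.concatenate([[0.0], axis])
    else:
        q = np.concatenate([[w], np.cross(a, b)])
    return q / np.linalg.norm(q)
```

Applied per bone (posed child position minus posed parent position, versus the same difference in the rest pose), this gives rough bone orientations from positions alone.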

CalciferZh commented 3 years ago

@k2m5t2 Hi, thanks for your interest. I guess you have a bug in your skeleton visualization. I tested the image pose3.jpg on my side and the skeleton looks different from yours.

[image: skeleton rendered from pose3.jpg]
yunho-c commented 3 years ago

@CalciferZh I appreciate the response. Could you share some details about how you visualized it? Did you set the output quaternions as the absolute orientation of each bone, or is there some other processing to be done? (And what software/library is shown in the image?)

I am attaching the quaternion outputs (MANO) from pose3.jpg; I'd appreciate it if you could compare them with yours. Thanks again! pose3_quaternion_output.txt
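One pitfall when comparing two quaternion dumps like this: q and -q encode the same rotation, so element-wise comparison can report a mismatch where there is none. A small sketch of a sign-insensitive comparison (works for any consistent component order):

```python
import numpy as np

def quat_angle_diff(q1, q2):
    """Angular difference in radians between two unit quaternions,
    treating q and -q as the same rotation."""
    d = np.clip(abs(np.dot(q1, q2)), 0.0, 1.0)
    return 2.0 * np.arccos(d)
```

Summing or maximizing quat_angle_diff over all joints gives a single number for how far two outputs actually disagree.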

CalciferZh commented 3 years ago

Does this work for you? @k2m5t2

The quaternions are different. Maybe you need to flip the image.

import numpy as np

from config import *
from kinematics import *
from wrappers import *

import vctoolkit as vc
from hand_mesh import HandMesh

if __name__ == '__main__':
  img = vc.load_img('./pose3.jpg')
  img = vc.imresize(img, (128, 128))
  # flip horizontally to mirror the hand before feeding the model
  img = np.flip(img, axis=1).copy()
  model = ModelPipeline()

  _, theta_mpii = model.process(img)
  theta_mano = mpii_to_mano(theta_mpii)

  hand_mesh = HandMesh(HAND_MESH_MODEL_PATH)
  # remember to modify hand_mesh.py to also return joint_xyz
  joint_xyz, _ = hand_mesh.set_abs_quat(theta_mano)

  v, f = vc.joints_to_mesh(joint_xyz, MANOHandJoints.parents)
  vc.save_obj('./xyz.obj', v, f)