GuyTevet / motion-diffusion-model

The official PyTorch implementation of the paper "Human Motion Diffusion Model"
MIT License

How do I export the generated actions as a bvh file? #32

Closed bojin-nwafu closed 1 year ago

bojin-nwafu commented 1 year ago

This is outstanding work! But how can I export the generated motions as a .bvh file? Looking forward to your reply!

GuyTevet commented 1 year ago

Hello @bojin-nwafu! I ran into these: https://github.com/KosukeFukazawa/smpl2bvh and https://github.com/akanazawa/human_dynamics/issues/50. I haven't had the chance to try them myself yet. Please let us know if either of them works for you. If so, I invite you to send us a pull request integrating one of them into our code.

Thanks,

sigal-raab commented 1 year ago

Here is how I convert SMPL to BVH:

  1. Clone https://github.com/sigal-raab/Motion.
  2. Read the npy file that sample/generate.py in this repo outputs, extract its 'motion' component, and use it in the method below. The method expects an npy file, so either change it or save the extracted 'motion' component as an npy file (a sketch of this step follows the code below).
  3. Run this method:

```python
import numpy as np

from Motion.InverseKinematics import animation_from_positions
from Motion import BVH


def smpl2bvh():
    npy_file = 'smpl_3D_joints.npy'
    motion_path = f'/path/{npy_file}'
    pos = np.load(motion_path)
    pos = pos.transpose(0, 3, 1, 2)  # samples x joints x coord x frames ==> samples x frames x joints x coord
    parents = [-1, 0, 0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 9, 9, 12, 13, 14, 16, 17, 18, 19, 20, 21]
    bvh_path = motion_path[:-4] + 'anim{}.bvh'
    SMPL_JOINT_NAMES = [
        'Pelvis',      # 0
        'L_Hip',       # 1
        'R_Hip',       # 2
        'Spine1',      # 3
        'L_Knee',      # 4
        'R_Knee',      # 5
        'Spine2',      # 6
        'L_Ankle',     # 7
        'R_Ankle',     # 8
        'Spine3',      # 9
        'L_Foot',      # 10
        'R_Foot',      # 11
        'Neck',        # 12
        'L_Collar',    # 13
        'R_Collar',    # 14
        'Head',        # 15
        'L_Shoulder',  # 16
        'R_Shoulder',  # 17
        'L_Elbow',     # 18
        'R_Elbow',     # 19
        'L_Wrist',     # 20
        'R_Wrist',     # 21
        'L_Hand',      # 22
        'R_Hand',      # 23
    ]
    for i, p in enumerate(pos):
        print(f'starting anim no. {i}')
        anim, sorted_order, _ = animation_from_positions(p, parents)
        BVH.save(bvh_path.format(i), anim, names=np.array(SMPL_JOINT_NAMES)[sorted_order])
```
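For step 2, a minimal extraction sketch (file names are placeholders; the dict layout matches what later comments in this thread show):

```python
# Pull the 'motion' array out of the results.npy written by sample/generate.py
# and save it as the npy file that smpl2bvh() expects.
import numpy as np

data = np.load('results.npy', allow_pickle=True).item()
np.save('smpl_3D_joints.npy', data['motion'])  # samples x joints x 3 x frames
```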
bojin-nwafu commented 1 year ago

Thank you very much for your reply!

Siziff commented 1 year ago

Regarding https://github.com/GuyTevet/motion-diffusion-model/issues/32#issuecomment-1321549699: there are only 22 joints in my npy file, but your list has 24. Where can I get the 2 more?

sigal-raab commented 1 year ago

Sorry for the late reply. The code above is correct when using the HumanAct12 and UESTC datasets. If you are using the HumanML3D dataset, you need to drop the last two items from the parents and SMPL_JOINT_NAMES arrays. Let me know if this helped.
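In code, that adjustment is just (a sketch, reusing the names from the script above):

```python
# HumanML3D uses a 22-joint skeleton: drop the trailing 'L_Hand'/'R_Hand' entries.
parents = parents[:-2]
SMPL_JOINT_NAMES = SMPL_JOINT_NAMES[:-2]
```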

lassemt commented 1 year ago

@sigal-raab The export script works great, but is it supposed to be missing the head from the bone hierarchy? [screenshot: Blender bone hierarchy without a Head joint]

sigal-raab commented 1 year ago

@lassemt, end effectors are not shown in the topology tree in Blender. More precisely, joints defined as "End Site" in the BVH file will not be shown in Blender. The end effectors' locations are still visible in Blender because they are the endpoints of the bones defined by their parents. There is a trick to make end effectors show up in Blender: define the end effector as a regular joint and add an additional "End Site" with offsets of 0. I'm not too fond of this trick, but some BVH.save() methods use it.
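For illustration, the trick looks like this in a BVH file (a hand-written fragment; the offset values are made up). Ordinarily the head would end the chain as an anonymous End Site; here it stays a named joint with its own channels, followed by a dummy End Site at zero offset:

```
JOINT Head
{
    OFFSET 0.0 0.089 0.0
    CHANNELS 3 Zrotation Yrotation Xrotation
    End Site
    {
        OFFSET 0.0 0.0 0.0
    }
}
```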

lassemt commented 1 year ago

@sigal-raab Aha! Thank you for clarifying.

Jolinbaby commented 1 year ago

> Here is how I convert SMPL to BVH: […]

In step 2, does "the npy file" refer to the results.npy created by sample/generate.py, or to sample08_rep00_smpl_params.npy? I used results.npy and extracted the 'motion' component into a new npy:

```python
import numpy as np

data = np.load('D:/text-to-motion-main/motion-diffusion-model/save/render_test/render_obj/sample08_rep00_smpl_params.npy', allow_pickle=True)
data_dict = data.item()
motion_data = data_dict['motion']

output_path = './my/smpl_motion_data.npy'
np.save(output_path, motion_data)
```

I then ran your code and got the following error:

```
starting anim no. 0
Traceback (most recent call last):
  File "smpl2bvh.py", line 47, in <module>
    smpl2bvh()
  File "smpl2bvh.py", line 42, in smpl2bvh
    anim, sorted_order, _ = animation_from_positions(p, parents)
  File "D:\text-to-motion-main\motion-diffusion-model\Motion\InverseKinematics.py", line 554, in animation_from_positions
    new_anim = ik()
  File "D:\text-to-motion-main\motion-diffusion-model\Motion\InverseKinematics.py", line 105, in __call__
    assert np.allclose(Quaternions.from_angle_axis(angles, np.cross(jdirs, ddirs)) * jdirs, ddirs)
AssertionError
```

sigal-raab commented 1 year ago

Hi @Jolinbaby, please comment out line 105 in D:\text-to-motion-main\motion-diffusion-model\Motion\InverseKinematics.py ("assert np.allclose...") and run again. Let me know if you get a nice BVH file.

[Explanation: the default tolerance of np.allclose is sometimes too strict for this comparison. Since the code in that method is already debugged, there is no need for this assertion.]
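If you would rather keep a sanity check than delete it, one option is to pass an explicit, looser tolerance; a sketch, with the atol value picked arbitrarily:

```python
# Replace the bare assertion with an explicitly loosened tolerance;
# 1e-3 is a guess and may need tuning for your data.
recon = Quaternions.from_angle_axis(angles, np.cross(jdirs, ddirs)) * jdirs
assert np.allclose(recon, ddirs, atol=1e-3)
```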

Jolinbaby commented 1 year ago

> Hi @Jolinbaby, please comment out line 105 in D:\text-to-motion-main\motion-diffusion-model\Motion\InverseKinematics.py ("assert np.allclose...") and run again. […]

Thanks a lot!!! I commented out lines 104 and 105, ran again, and got the BVH files successfully!!! I will check whether the BVH files are valid. Thank you again!

Jolinbaby commented 1 year ago

https://user-images.githubusercontent.com/60809535/234281453-f4388d15-dd98-4c9c-8046-ba6f35bc2c69.mp4

xnmt9 commented 1 year ago

Hey @sigal-raab, I'm trying this on my end. After commenting out lines 104 and 105, I get the following:

```
starting anim no. 0
[BasicInverseKinematics] Iteration 1 Error: 0.008360
```

Any idea where I should look to start debugging this? I've extracted the 'motion' component of my results.npy. (I tried sample00_rep00_smpl_params.npy, but got only 3 axes in my motion npy.)

(Not sure if it matters, but this is how I'm extracting the motion:

```
python -c "import numpy as np; data = np.load('sample00_rep00_smpl_params.npy', allow_pickle=True).item(); motion = data['motion']; np.save('motion_result.npy', motion)"
```

which gives me an .npy with shape (1, 22, 3, 410), which seems off.)

Thanks for the amazing work!!

lassemt commented 1 year ago

> starting anim no. 0
> [BasicInverseKinematics] Iteration 1 Error: 0.008360

@naeemtee I believe this printout indicates the residual "difference" between the positions before and after inverse kinematics. So if this is all you get while running the above script, the BVH should have been saved successfully, unless you got other errors?

sigal-raab commented 1 year ago

@naeemtee, I confirm what @lassemt wrote (thanks). An error of 0.008 is actually very good. Have you browsed the output BVH file? Does it look right?

tshrjn commented 1 year ago

It seems the conversion has only 17 joints, and the 5 End Sites/end effectors ['L_Foot', 'R_Foot', 'Head', 'L_Wrist', 'R_Wrist'] are just names with offsets and carry no rotation data. This is for SMPL generated from the HumanML3D dataset, hence 22 (= 24 - 2) joints.

Is this a bug in the conversion or is this the expected behavior?

Edit:

It seems this export has some error regarding end effectors. I used SMPL-to-FBX, then converted the FBX to BVH using Blender, and got the expected 22 joints (24 - 2 for HumanML3D).

sigal-raab commented 1 year ago

@tshrjn, apologies for the late response, I was on vacation. Can you share the two BVH files that you got (one using the direct conversion from locations to BVH, and the other using SMPL-to-FBX and then converting to BVH with Blender)? I am guessing the latter adds dummy end effectors with an offset of zero, but I need to see your files to be sure.

dj-kefir-siorbacz commented 9 months ago

I think there's a problem with this IK method: it keeps failing (even with a high iteration count, or after changing the BasicInverseKinematics class to BasicJacobianIK) on animations that change their root orientation by almost 360 degrees.

https://github.com/GuyTevet/motion-diffusion-model/assets/135321152/7f0c48dc-84a2-4d50-9b1b-4c429fcfa216

Is there some solution to this issue? Or must I find some other IK method?

sigal-raab commented 9 months ago

It appears you've identified a bug in the IK method. Note that exporting motions to BVH is not within the scope of the MDM package; in this issue we've shared our team's insights and techniques for how to accomplish it.

The class we employ, BasicInverseKinematics, is built upon an existing package, with additional enhancements from our end. You're welcome to rectify the identified bug and submit a pull request to https://github.com/sigal-raab/Motion, or suggest an alternative IK package that may better suit your needs.

dj-kefir-siorbacz commented 9 months ago

By the way, would it be possible to get the rotations (thetas) from here: https://github.com/GuyTevet/motion-diffusion-model/blob/af061ca7c7077fb144c0094a5a72932b967647b6/visualize/simplify_loc2rot.py#L45C8-L45C8

then convert them from angle-axis to Euler angles or quaternions, and transfer these rotations to the generated .bvh animation?

What's not clear to me is how to convert a rotation relative to the SMPL rest pose into a rotation relative to my 3D model's rest pose. With https://github.com/sigal-raab/Motion I can arbitrarily override the rest pose of the .bvh, but I can't really override the rest pose of my 3D model, I think.
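The angle-axis conversion itself is straightforward; a sketch using SciPy, assuming `thetas` holds per-joint rotation vectors in radians:

```python
from scipy.spatial.transform import Rotation as R

rot = R.from_rotvec(thetas.reshape(-1, 3))  # axis-angle rotation vectors
quats = rot.as_quat()                       # quaternions, (x, y, z, w) order
eulers = rot.as_euler('ZYX', degrees=True)  # intrinsic Z-Y-X Euler angles
```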

sigal-raab commented 9 months ago

Yes, you can try that. It would be great if you could let us know whether it works better for root orientation changes of almost 360 degrees. As for converting rotations that are relative to the SMPL rest pose into ones relative to yours, I can suggest a workflow:

  1. Extract the SMPL rest pose (aka the offsets) using the following code:

    ```python
    smpl_model = SMPL().eval().to(device)
    rest = smpl_model()
    rest_joints = rest['smpl'][0]
    rootindex = 0
    # per-joint offsets are positions relative to the parent joint
    # (parents as defined in the smpl2bvh script above);
    # the root keeps its absolute position
    offsets_smpl = rest_joints - rest_joints[parents]
    offsets_smpl[rootindex] = rest_joints[rootindex]
    offsets_smpl = offsets_smpl.detach().cpu().numpy()
    ```

  2. Use linear algebra to convert the rotations from the SMPL rest pose to yours.
dj-kefir-siorbacz commented 9 months ago

> 2. Use linear algebra to convert the rotations from the SMPL rest pose to yours.

Would love to, but I don't really know the equations :( If you know some resource that describes such an operation, I would be really glad.

Is it as simple as:

  1. calculate, for each joint j, a transformation T_j from the current rest transform to the new rest transform;
  2. then, for each joint j, multiply its pose matrix (= the matrix relative to j's rest matrix) by T_j (left-sided matrix multiplication: pose matrix x T_j)?

I assume that the rest transform is defined with respect to world space.

Would that work?

sigal-raab commented 9 months ago

It might be more complicated, because the rotation matrix for each joint is relative to its parent, so you also need to take the parent's rotation (and its rest pose) into account, then the parent's parent, and so forth. I don't have handy code for this task.
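To make the chain dependence concrete, here is a minimal sketch of the accumulation step only (not the full rest-pose retargeting; names are hypothetical). It assumes the parents list is topologically ordered, i.e. each parent index precedes its children, as in the SMPL parents array above:

```python
import numpy as np

def global_rotations(local_rots, parents):
    """local_rots: (J, 3, 3) rotations, each relative to the joint's parent;
    parents[j] < 0 marks the root. Returns (J, 3, 3) global rotations."""
    glob = [None] * len(parents)
    for j, p in enumerate(parents):
        # chain the parent's accumulated rotation with the joint's local one
        glob[j] = local_rots[j] if p < 0 else glob[p] @ local_rots[j]
    return np.stack(glob)
```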

rprodan1 commented 9 months ago

> @sigal-raab Aha! Thank you for clarifying.

Did you manage to export an animation in BVH format with all joints? Currently, when I export animations, some joints are missing and seem to be treated as end effectors.

sigal-raab commented 9 months ago

Yes, I managed to export animations to BVH with all their joints. Normally, some of the joints are considered end effectors, so it sounds like your BVH is fine.

rprodan1 commented 9 months ago

> Yes, I managed to export animations to BVH with all their joints. Normally, some of the joints are considered end effectors, so it sounds like your BVH is fine.

Can you show a piece of code for how you did that?

sigal-raab commented 9 months ago

I used the code at the beginning of this issue, here. Or maybe I misunderstood your question? What kind of animations are you trying to export, and in what format are they given?

rprodan1 commented 9 months ago

Thanks for your reply!

I use animations generated by this repo. I managed to convert SMPL to BVH using your script, but excluded the last two joints from the SMPL_JOINT_NAMES list. When I open the converted animation in Blender, I notice some joints are missing (head, right/left foot, right/left hand), although the SMPL file includes these joints. From what I understand, the BVH script treats these joints as end effectors, which is why I don't see them in Blender. You mentioned there is a way to save animations so that these missing joints are visible in Blender. Could you provide the code for this?

sigal-raab commented 9 months ago

Naturally, any joint at the end of a kinematic chain (feet, hands, head) is set as an end effector in the BVH format. Such end effectors are still visible in Blender's graphic visualization; however, you will not see the end effector's joint name in the outliner window, in the textual list of armatures.

There is a trick, which I don't like, where you convert the end effectors to regular joints and add a dummy end effector with offsets 0,0,0 after each such joint. If you use this trick, the former end effectors will also be visible in the textual list in Blender.

I hope this answers your questions. Since this discussion is not related to the MDM project, let's continue outside of GitHub. As a fellow researcher, I am happy to assist. You are welcome to contact me at sigal.raab@gmail.com.