Hello @24kbijin ! I ran into this: https://github.com/KosukeFukazawa/smpl2bvh and this: https://github.com/akanazawa/human_dynamics/issues/50 but haven't had the chance to try them myself. Please let us know if either of them works for you. If so, I invite you to send us a pull request integrating one of them into our code.
Thanks,
Thank you very much for your reply!
https://github.com/GuyTevet/motion-diffusion-model/issues/32#issuecomment-1321549699 - but there are only 22 points in the npy file and you have 24. Where can I get the 2 more?
Sorry for the late reply. The code above is correct when using the HumanAct12 and UESTC datasets. If you are using the HumanML3D dataset, you need to drop the last two items from the parents and SMPL_JOINT_NAMES arrays. Let me know if this helps.
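For HumanML3D, a minimal sketch of that truncation, assuming the parents and SMPL_JOINT_NAMES arrays from the linked code (the two dropped entries are the hand joints):

```python
# HumanML3D uses 22 joints: drop the two hand joints, 'L_Hand' (22) and 'R_Hand' (23)
parents = parents[:-2]
SMPL_JOINT_NAMES = SMPL_JOINT_NAMES[:-2]
```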
@sigal-raab The export script works great, but is it supposed to omit the head from the bone hierarchy?
@lassemt, the end effectors are not shown in the topology tree in Blender. More accurately, joints defined as "End Site" in the bvh file will not be shown in Blender. The end effectors' locations are still shown in Blender because they are the endpoints of the armature segments defined by their parents. There is a trick to make the end effectors show up in Blender: define the end effector as a regular joint and add an additional "End Site" with offsets of 0. I'm not fond of this trick, but some BVH.save() methods use it.
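For illustration, this is roughly what that trick looks like in a BVH hierarchy (the joint name, offset, and channels here are made up): the former end effector becomes a regular JOINT with rotation channels, followed by a dummy End Site with zero offsets:

```
JOINT Head
{
    OFFSET 0.000000 10.000000 0.000000
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
        OFFSET 0.000000 0.000000 0.000000
    }
}
```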
@sigal-raab Aha! Thank you for clarifying.
Here is how I convert smpl to bvh:
- Clone https://github.com/sigal-raab/Motion.
- Read the npy file that sample/generate.py in this repo outputs. Extract the 'motion' component out of it and use it for the following method. The method expects an npy file, so you can either change it or save the aforementioned 'motion' component as an npy file.
- Run this method:

```python
import numpy as np

def smpl2bvh():
    from Motion.InverseKinematics import animation_from_positions
    from Motion import BVH
    npy_file = 'smpl_3D_joints.npy'
    motion_path = f'/path/{npy_file}'
    pos = np.load(motion_path)
    pos = pos.transpose(0, 3, 1, 2)  # samples x joints x coord x frames ==> samples x frames x joints x coord
    parents = [-1, 0, 0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 9, 9, 12, 13, 14, 16, 17, 18, 19, 20, 21]
    bvh_path = motion_path[:-4] + 'anim{}.bvh'
    SMPL_JOINT_NAMES = [
        'Pelvis',      # 0
        'L_Hip',       # 1
        'R_Hip',       # 2
        'Spine1',      # 3
        'L_Knee',      # 4
        'R_Knee',      # 5
        'Spine2',      # 6
        'L_Ankle',     # 7
        'R_Ankle',     # 8
        'Spine3',      # 9
        'L_Foot',      # 10
        'R_Foot',      # 11
        'Neck',        # 12
        'L_Collar',    # 13
        'R_Collar',    # 14
        'Head',        # 15
        'L_Shoulder',  # 16
        'R_Shoulder',  # 17
        'L_Elbow',     # 18
        'R_Elbow',     # 19
        'L_Wrist',     # 20
        'R_Wrist',     # 21
        'L_Hand',      # 22
        'R_Hand',      # 23
    ]
    for i, p in enumerate(pos):
        print(f'starting anim no. {i}')
        anim, sorted_order, _ = animation_from_positions(p, parents)
        BVH.save(bvh_path.format(i), anim, names=np.array(SMPL_JOINT_NAMES)[sorted_order])
```
In step2, does "the npy file"refer to "results.npy" which is created by sample/generate.py? or the "sample08_rep00_smpl_params.npy"? I used the "results.npy" and extract the 'motion' as a new npy, import numpy as np
data = np.load('D:/text-to-motion-main/motion-diffusion-model/save/render_test/render_obj/sample08_rep00_smpl_params.npy', allow_pickle=True)
data_dict = data.item()
motion_data = data_dict['motion']
output_path='./my/smpl_motion_data.npy'
np.save(output_path, motion_data)
and then I ran your code. The error info is as follows:

```
starting anim no. 0
Traceback (most recent call last):
  File "smpl2bvh.py", line 47, in <module>
    smpl2bvh()
  File "smpl2bvh.py", line 42, in smpl2bvh
    anim, sortedorder, = animation_from_positions(p, parents)
  File "D:\text-to-motion-main\motion-diffusion-model\Motion\InverseKinematics.py", line 554, in animation_from_positions
    new_anim = ik()
  File "D:\text-to-motion-main\motion-diffusion-model\Motion\InverseKinematics.py", line 105, in __call__
    assert np.allclose(Quaternions.from_angle_axis(angles, np.cross(jdirs, ddirs)) * jdirs, ddirs)
AssertionError
```
Hi @Jolinbaby , please comment out line 105 in D:\text-to-motion-main\motion-diffusion-model\Motion\InverseKinematics.py ("assert np.allclose...") and run it again. Let me know if you receive a nice bvh file.
[Explanation: Sometimes the threshold when using np.allclose should be smaller than the default. Since the code in that method is already debugged, there is no need for this assertion.]
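Alternatively, instead of deleting the assertion, its tolerance could be relaxed. A sketch of that line with an explicit tolerance (the atol value is an illustrative guess, not a tested setting):

```python
# np.allclose defaults to atol=1e-8; a looser bound keeps the sanity check in place
assert np.allclose(Quaternions.from_angle_axis(angles, np.cross(jdirs, ddirs)) * jdirs,
                   ddirs, atol=1e-4)
```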
Thanks a lot!!! I commented out lines 104 and 105 and ran it again, and I got the bvh files successfully!!! I will check whether the bvh files are valid. Thank you again!
Hey @sigal-raab - I'm trying this on my end; after commenting out lines 104 and 105, I get the following:
starting anim no. 0
[BasicInverseKinematics] Iteration 1 Error: 0.008360
Any idea where I should look to start debugging this? I've extracted the 'motion' component of my 'results.npy' (I also tried sample00_rep00_smpl_params.npy, but got only 3 axes in my motion npy).
(Not sure if it matters, but this is how I'm extracting the motion:

```
python -c "import numpy as np; data = np.load('sample00_rep00_smpl_params.npy', allow_pickle=True).item(); motion = data['motion']; np.save('motion_result.npy', motion)"
```

which gives me an .npy with shape (1, 22, 3, 410), which seems off.)
Thanks for the amazing work!!
> starting anim no. 0
> [BasicInverseKinematics] Iteration 1 Error: 0.008360
@naeemtee I believe this indicates the "difference" between positions before vs. after inverse kinematics has been resolved. So if you get this while running the above script the BVH should have been successfully saved unless you get other errors?
@naeemtee, I confirm what @lassemt wrote (thanks). An error of 0.008 is actually very good. Have you browsed the output BVH file? Does it look right?
Seems like the conversion only has 17 joints, and the 5 End Sites / end effectors ['L_Foot', 'R_Foot', 'Head', 'L_Wrist', 'R_Wrist'] are just names with offsets and carry no rotation data. This is in regard to SMPL generated for the HumanML3D dataset, hence 22 (= 24 - 2) joints.
Is this a bug in the conversion, or is this the expected behavior?
Edit:
Seems like this export has some error with regard to end effectors. I used SMPL-to-FBX, then converted the FBX to BVH using Blender, and got the expected 22 joints (24 - 2 for HumanML3D).
@tshrjn, apologies for the late response, I was on vacation. Can you share the two bvh files that you got (one using the direct conversion from locations to bvh, and the other using SMPL-to-FBX and then converting to bvh using Blender)? I am guessing that the latter adds dummy end effectors with an offset of zero, but I need to see your files to be sure.
I think there's a problem with this IK method: it keeps failing on animations that change their root orientation by almost 360 degrees (even with a high iteration count, or after changing the BasicInverseKinematics class to BasicJacobianIK).
Is there some solution to this issue? Or must I find some other IK method?
It appears you've identified a bug in the IK method. It's important to note that exporting motions to BVH is not within the scope of the MDM package. In addressing this issue, we've shared our team's insights and techniques on how to accomplish it.
The class we employ, BasicInverseKinematics, is built upon an existing package, with additional enhancements from our end. You're welcome to rectify the identified bug and submit a pull request to https://github.com/sigal-raab/Motion, or suggest an alternative IK package that may better suit your needs.
btw, would it be possible to get the rotations (thetas) from here: https://github.com/GuyTevet/motion-diffusion-model/blob/af061ca7c7077fb144c0094a5a72932b967647b6/visualize/simplify_loc2rot.py#L45C8-L45C8 then convert them from angle-axis to Euler angles or quaternions, and then transfer these rotations to the generated .bvh animation?
What's not clear to me is how to convert rotations relative to the SMPL rest pose into rotations relative to my 3D model's rest pose, hmmm. With https://github.com/sigal-raab/Motion I can arbitrarily override the rest pose of the .bvh; however, I don't think I can really override the rest pose of my 3D model.
Yes, you can try that. It would be great if you let us know whether it works better for changes of root orientation by almost 360 degrees. As for how to convert the rotations that are relative to the SMPL rest pose to ones relative to yours, I can suggest a workflow:

- Compute the SMPL rest-pose offsets:

```python
smpl_model = SMPL().eval().to(device)
rest = smpl_model()
rest_joints = rest['smpl'][0]
root_index = 0
offsets_smpl = rest_joints - rest_joints[parents]   # each joint's offset from its parent
offsets_smpl[root_index] = rest_joints[root_index]  # the root keeps its absolute position
offsets_smpl = offsets_smpl.detach().cpu().numpy()
```

- Use linear algebra to convert the rotations from the SMPL rest pose to yours.
Would love to, but I don't really know the equations :( If you know some resource that describes such an operation, I would be really glad.
Is it as simple as this: for each joint j, given a transformation T_j from the current rest transform to the new rest transform of j, I would multiply its pose matrix (= the matrix relative to j's rest matrix) by T_j (left-sided matrix multiplication: pose matrix x T_j)? I assume that the rest transform is defined with respect to world space.
Would that work?
It might be more complicated because the rotation matrix for each joint is relative to its parent, so you should also take the parent rotation (and its rest pose) into account. Then you need to consider the parent's parent and so forth. I don't have a handy code for this task.
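To make the dependency concrete: a joint's world-space rotation is the composition of all local rotations along its chain, so changing one joint's rest transform affects every descendant, which is why a single per-joint multiplication is generally not enough. A minimal, hypothetical sketch of that composition (not code from the Motion repo):

```python
import numpy as np

def local_to_world(local_rots, parents):
    """Compose per-joint local rotation matrices, shaped (num_joints, 3, 3),
    into world-space rotations. Assumes parents[j] < j, with -1 for the root."""
    world = np.empty_like(local_rots)
    for j, p in enumerate(parents):
        world[j] = local_rots[j] if p == -1 else world[p] @ local_rots[j]
    return world
```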
@sigal-raab Aha! Thank you for clarifying.
Did you manage to export an animation in BVH format with all joints? Currently, when I export animations, some joints are missing and are treated as a kind of end effector.
Yes, I managed to export animations to BVH with all their joints. Normally, some of the joints are considered end effectors, so it sounds like your BVH is fine.
Can you show a piece of code for how you did that?
I used the code at the beginning of this issue, here. Or maybe I misunderstood your question? What kind of animations are you trying to export, and in what format are they given?
Thanks for your reply!
I use animations generated by this repo. I managed to convert SMPL to BVH using your script but excluded the last two joints from the SMPL_JOINT_NAMES list. When I open the converted animation in Blender, I notice some joints are missing (head, right/left foot, right/left hands), although the SMPL file includes these joints. From what I understand, the BVH script considers these joints as end effectors, which is why I don't see them in Blender. You mentioned that in the BVH script, there is an ability to save animations in a way that these missing joints will be visible in Blender. Could you provide the code for this?
Naturally, any joint at the end of a kinematic chain (feet, hands, head) is set as an end effector in a BVH format. Such end effectors should be visible in Blender, in the graphic visualization. However, you will not see the end effector joint name in the outliner window, in the textual list of armatures. There is a trick that I don't like, where you can convert the end effectors to regular joints, and add a dummy end effector with offsets 0,0,0 after each such joint. If you use this trick, the previous end effectors will be visible also in the textual list in Blender. I hope this answers your questions. Since this discussion is not related to the MDM project, let's continue outside of this github site. As a fellow researcher, I am happy to assist. You are welcome to contact me at sigal.raab@gmail.com.
This is an outstanding job! But how do I export the generated motion as a .bvh file? Looking forward to your reply!