GuyTevet / motion-diffusion-model

The official PyTorch implementation of the paper "Human Motion Diffusion Model"
MIT License
3.17k stars 348 forks

Feature Request: SMPL format export #5

Open jaykup1 opened 2 years ago

jaykup1 commented 2 years ago

The exported .npy has the SMPL parameters, but it would be great if it were also exported as a .pkl file with SMPL parameters.

GuyTevet commented 2 years ago

Hi @jaykup1 ,

You can replace https://github.com/GuyTevet/motion-diffusion-model/blob/14d0f7976e15ed232775688a19ff38f4b2926c08/visualize/vis_utils.py#L66

with pickle.dump(*), if that was your intention.

jjp9624022 commented 2 years ago

This is a Blender addon for the SMPL model: https://github.com/Meshcapade/SMPL_blender_addon. The latest version allows importing animations in .npz format. How can the export be converted? It would be great if it could work.

jaykup1 commented 2 years ago

Hi @jaykup1 ,

You can replace

https://github.com/GuyTevet/motion-diffusion-model/blob/14d0f7976e15ed232775688a19ff38f4b2926c08/visualize/vis_utils.py#L66

with pickle.dump(*), if that was your intention.

Not exactly: https://meshcapade.wiki/SMPL


Any way to export with this skeleton layout? (So we can use https://github.com/softcat477/SMPL-to-FBX to convert it to FBX for game engines.)

TREE-Ind commented 2 years ago

Definitely going to try out the SMPL to FBX converter later tonight. As a workaround, we've had great success simply running the rendered mesh sequence through AI mocap solutions (Plask, Deepmotion, etc.) to get game-ready rigged animations.

jjp9624022 commented 2 years ago

Hi @jaykup1 , You can replace https://github.com/GuyTevet/motion-diffusion-model/blob/14d0f7976e15ed232775688a19ff38f4b2926c08/visualize/vis_utils.py#L66

with pickle.dump(*), if that was your intention.

Not exactly: https://meshcapade.wiki/SMPL


Any way to export with this skeleton layout? (So we can use https://github.com/softcat477/SMPL-to-FBX to convert it to FBX for game engines.)

It doesn't work; the shapes differ. results.npy motion: (3, 22, 3, 120); SMPL-to-FBX sample.pkl motion: (127, 72).

jaykup1 commented 2 years ago

It doesn't work; the shapes differ. results.npy motion: (3, 22, 3, 120); SMPL-to-FBX sample.pkl motion: (127, 72).

The export file has to be rearranged into the SMPL skeleton layout before the FBX conversion; help would be greatly appreciated.

GuyTevet commented 2 years ago

I'll try to clarify, please let me know if it helps:

results.npy - only includes joint positions - this is not what you are looking for.

After running visualize.render_mesh you will get sample##_rep##_smpl_params.npy with the SMPL parameters. In detail:

            'motion': [25, 6, frames] - both the SMPL thetas and the root translation concatenated; you can ignore it.
            'thetas': [24, 6, frames] - SMPL thetas represented as 6d rotations
            'root_translation': [3, frames] - root translation
            'faces': - SMPL faces list
            'vertices': - SMPL vertex locations per frame
            'text': - text prompt
            'length': - number of frames
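As a side note, one way to inspect that file is a small helper like the sketch below (the helper name is hypothetical; the keys are as listed above, and the file stores a single pickled dict, hence allow_pickle/.item()):

```python
import numpy as np

def summarize_smpl_params(path):
    """Return key -> shape (or type name) for a sample##_rep##_smpl_params.npy file.

    The file stores a single pickled dict, so it must be loaded with
    allow_pickle=True and unwrapped with .item().
    """
    data = np.load(path, allow_pickle=True).item()
    return {key: getattr(value, "shape", type(value).__name__)
            for key, value in data.items()}
```

For example, `summarize_smpl_params("sample00_rep00_smpl_params.npy")` after running visualize.render_mesh should report the shapes listed above.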

Can you detail here the data structure that the SMPL addon is expecting, and we will try to code an adapter between the two formats (and of course welcome you to send a pull request if you figure it out yourself).

Hope this helps,

jaykup1 commented 2 years ago

Thank you for the reply,

GuyTevet commented 2 years ago

Cool! If that's the case, the thetas and root_translation have all the information you need. You will just need to convert the 6d rotations into axis-angles. One way to do so is to chain these two functions from our repo:

https://github.com/GuyTevet/motion-diffusion-model/blob/14d0f7976e15ed232775688a19ff38f4b2926c08/utils/rotation_conversions.py#L513-L534

https://github.com/GuyTevet/motion-diffusion-model/blob/14d0f7976e15ed232775688a19ff38f4b2926c08/utils/rotation_conversions.py#L434-L447
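The linked functions are PyTorch implementations; for illustration, the same math can be sketched in NumPy (this is a reimplementation of the 6d representation from Zhou et al., not the repo's code, and the 180-degree edge case of the axis-angle conversion is left unhandled):

```python
import numpy as np

def rotation_6d_to_matrix(d6):
    """Convert a 6d rotation (Zhou et al., CVPR 2019) to a 3x3 rotation matrix.

    d6: (..., 6) array; returns (..., 3, 3) with b1, b2, b3 as rows.
    """
    a1, a2 = d6[..., :3], d6[..., 3:]
    b1 = a1 / np.linalg.norm(a1, axis=-1, keepdims=True)
    # Gram-Schmidt: remove the b1 component from a2, then normalize
    b2 = a2 - np.sum(b1 * a2, axis=-1, keepdims=True) * b1
    b2 = b2 / np.linalg.norm(b2, axis=-1, keepdims=True)
    b3 = np.cross(b1, b2)
    return np.stack((b1, b2, b3), axis=-2)

def matrix_to_axis_angle(R):
    """Convert one 3x3 rotation matrix to an axis-angle vector (angle * unit axis).

    Note: angles very close to pi are not handled in this sketch.
    """
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return angle * axis / (2.0 * np.sin(angle))

# sanity check: the identity 6d representation maps to a zero axis-angle
assert np.allclose(
    matrix_to_axis_angle(rotation_6d_to_matrix(np.array([1., 0., 0., 0., 1., 0.]))), 0.0)
```

Applied per joint and per frame to the [24, 6, frames] thetas, this yields the (frames, 24, 3) axis-angle poses that SMPL tooling expects.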

Tony2371 commented 2 years ago

So, will there be a script that converts sample##_rep##_smpl_params.npy to FBX or something like that?

BTW, awesome project, guys! Thanks a lot for your efforts.

amb commented 2 years ago

results.npy - only includes joint positions - this is not what you are looking for.

after running visualize.render_mesh you will get sample##_rep##_smpl_params.npy with smpl parameters. In details:

Does this process add new information? I've been building armatures and animations just from the results.npy manually with some quaternion magic. Would sample##_rep##_smpl_params.npy yield better results?

TREE-Ind commented 2 years ago

@amb very cool! Are you able to share any insight or code snippet as to how you accomplished this? We're getting close but it would help us greatly, no worries either way thx.

amb commented 2 years ago

@TREE-Ind Sure, it's just a proof-of-concept Blender script though. Here you go: https://gist.github.com/amb/69cc8396bc61cc59a7e819aed6b21f34 The script has "import bpy", so technically it's also GPL3-licensed.

TREE-Ind commented 2 years ago

@amb Wow nice work :D

jjp9624022 commented 2 years ago

@TREE-Ind Sure, it's just a proof of concept Blender script tho. Here you go: https://gist.github.com/amb/69cc8396bc61cc59a7e819aed6b21f34 The script has "import bpy" so technically it's also GPL3 license.

Nice work! But, like the "# TODO: Is this really the best way to create armature in Blender?" comment says, you could transfer the pose from the skeleton's initial rest posture, something like Rigify.

jaykup1 commented 2 years ago

Work in progress... [animation GIF]

@GuyTevet How can I get the 6d rotations out of 'thetas': self.motions['motion'][0, :-1, :, :self.real_num_frames] for rotation_6d_to_matrix?

GuyTevet commented 2 years ago

@jaykup1 those are the 6d rotations, but in the SMPL convention. I can make a guess why this is not working: (1) it could be an issue with the conversion; (2) SMPL angles are relative to the SMPL rest pose (the T-pose, I think), and maybe the rest pose of your character is different. In that case, I suggest you put the SMPL character (using the SMPL addon) side by side with your character and align them.

You said rotation_6d_to_matrix doesn't work well, what exactly is the issue there?

GuyTevet commented 2 years ago

@jaykup1 There is knowledge regarding how to convert SMPL angles to other characters (not in my brain, unfortunately). I suggest contacting the SMPL team or watching their tutorials on youtube. If you manage to resolve this, I will be very happy to know about it!

jaykup1 commented 2 years ago

@jaykup1 those are the 6d rotations, but in SMPL convention. I can make a guess why this is not working - (1) it can be an issue with the conversion (2) SMPL angles are relative to SMPL rest pose (T-pose I think), maybe the rest pose of your character is different. In this case, I can suggest, you put the SMPL character (using SMPL addon) side by side with your character, and align between them.

You said rotation_6d_to_matrix doesn't work well, what exactly is the issue there?

@GuyTevet I only get TypeErrors and AttributeErrors (because I don't know what I'm doing). I just reshaped it (reshape(self.real_num_frames, 72) and reshape(self.real_num_frames, 3)) to see if it works.

GuyTevet commented 2 years ago

The shape of thetas is [24, 6, frames]; 6 stands for the 6d representation. You can look at our code, or at the ACTOR or MotionCLIP repos, for references on how to use those functions.
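Given that [24, 6, frames] layout, the reordering needed before converting per-frame [24, 6] blocks of 6d rotations might look like this sketch (the axis order is taken from the shapes quoted in this thread):

```python
import numpy as np

# thetas as stored in sample##_rep##_smpl_params.npy: [24 joints, 6d rotation, frames]
thetas = np.zeros((24, 6, 120))

# move the frame axis to the front so each frame is one [24, 6] block of
# per-joint 6d rotations, ready for a rotation_6d_to_matrix-style conversion
per_frame = np.transpose(thetas, (2, 0, 1))  # -> [frames, 24, 6]
assert per_frame.shape == (120, 24, 6)
```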

mmdrahmani commented 2 years ago

Hi, thank you for the excellent work. I have tried the code and it works just fine. I even used the mesh object sequence in Blender and that was also fine (really interesting results). I only had a basic question regarding how to use the npy file with the Blender SMPL or SMPL-X addon in order to create animations (similar to the animations for the paper). I think this is mostly my fault because I know little about Blender. I appreciate any information or links to Blender tutorials; I watched several SMPL/SMPL-X Blender tutorials on YouTube, but none of them addressed my problem. All the best, Mohammad

mmdrahmani commented 2 years ago

The script has "import bpy" so technically it's also GPL3 license.

Thanks for the code. This is amazing! How should I use the SMPL neutral body instead of the armature? Thanks

alextitonis commented 2 years ago

@amb I tried to use the script and exported the FBX, but the result was a moving orb with spikes around it. Is there a different way to export it? bpy.ops.export_scene.fbx(filepath="./tata.fbx", use_selection=True, add_leaf_bones=False)

jaykup1 commented 2 years ago

[animation GIF]

Conversion of data to .pkl file for 'softcat477/SMPL-to-FBX/'

@jjp9624022 @TREE-Ind @Tony2371 @amb @mmdrahmani @alextitonis tell me if there are any problems, thanks
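The conversion script itself was shared as an attachment and is not reproduced here. As a rough sketch, the .pkl that SMPL-to-FBX consumes follows the AIST++ layout; the key names and shapes below are assumptions based on that repo, not the original script:

```python
import pickle

import numpy as np

def save_smpl_to_fbx_pkl(thetas_aa, root_translation, out_path, scaling=1.0):
    """Write an AIST++-style .pkl for softcat477/SMPL-to-FBX.

    thetas_aa:        (frames, 24, 3) SMPL poses as axis-angle vectors
    root_translation: (frames, 3) root translation
    The key names ('smpl_poses', 'smpl_trans', 'smpl_scaling') follow the
    AIST++ motion files that SMPL-to-FBX was written against (an assumption).
    """
    frames = thetas_aa.shape[0]
    data = {
        "smpl_poses": np.asarray(thetas_aa).reshape(frames, 72),  # 24 joints x 3
        "smpl_trans": np.asarray(root_translation),
        "smpl_scaling": np.array([scaling]),
    }
    with open(out_path, "wb") as f:
        pickle.dump(data, f)
```

This would turn the axis-angle thetas and root translation from the earlier comments into the (frames, 72) pose array the converter reported expecting.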

alextitonis commented 2 years ago

It works like charm awesome @jaykup1

TREE-Ind commented 2 years ago

Getting the FBX Python SDK set up correctly was a pain, but we can confirm the conversion works for us as well.


mmdrahmani commented 2 years ago


Conversion of data to .pkl file for 'softcat477/SMPL-to-FBX/'

@jjp9624022 @TREE-Ind @Tony2371 @amb @mmdrahmani @alextitonis tell me if there are any problems, thanks

This is cool! Thank you. I ran the code and converted and saved the data into a pickle file. But I am still not sure how to use this pickle file in Blender. I have SMPL and SMPL-X addon installed on blender, but I don't see how to import animations saved in the pickle file. Thank you for your support.

jaykup1 commented 2 years ago

@mmdrahmani use the pkl file with this https://github.com/softcat477/SMPL-to-Fbx

Change the 'FbxTime.eFrames60' occurrences in SMPL-to-FBX/FbxReadWriter.py to 'FbxTime.eFrames30'.

Known issue: exports glitch below 30 fps in 'softcat477/SMPL-to-FBX/' (20 fps would be the expected rate).

TREE-Ind commented 2 years ago

https://user-images.githubusercontent.com/30479526/195868811-80a23002-9a77-4b22-81ef-233d0c6621f8.mp4

Importing motion diffusion animations into UE5 at runtime now thx to the conversion code :D

jaykup1 commented 2 years ago

Importing motion diffusion animations into UE5 at runtime now thx to the conversion code :D

You can halve the export time by reducing 'self.num_smplify_iters = 150' in simplify_loc2rot.py to 50 (30 seemed fine). I will probably make a simple Gradio GUI and push it.

ivansabolic commented 2 years ago

The conversion code works awesome, thanks!

Any suggestions on the best approach to bring down the file size of the generated .fbx? Quality loss is acceptable up to a point; I'm just not sure whether I should be playing with the SMPL parameters or something else.

jaykup1 commented 2 years ago

The conversion code works awesome, thanks!

Any suggestion what would be the best approach to bring down the file size of generated .fbx? Quality loss is acceptable till some point, I am just not sure if I should be playing with SMPL parameters or something else.

Not possible (as far as I know); reducing the imported fbx size breaks the animation calculation. If you are using a game engine, you can export the animation from there. In Unity, the animation is ~4 MB (0.3 MB if you zip it).

TREE-Ind commented 2 years ago

@babotrojka Not sure if this will work for your setup but we had good success converting the fbx to gltf while retaining the animation data which reduces the file size substantially.

mmdrahmani commented 2 years ago

Thanks again. I tried to work with https://github.com/softcat477/SMPL-to-Fbx as you suggested, but setting up Python FBX is quite complicated; I have not been able to configure it on my MacBook, and the FBX-Python instructions are very cryptic. I followed the instructions and installed the FBX SDK on my MacBook, then the SMPL-to-FBX requirements in a new conda environment. Then, while running python Convert.py, I got this error (without more info):

['gBR', 'gPO', 'gLO', 'gMH', 'gLH', 'gHO', 'gWA', 'gKR', 'gJS', 'gJB'] ['sBM', 'sFM'] ['0', '1', '2', '3', '4', '5'] Segmentation fault: 11

Do you have any idea why I am getting this error? My guess is the FBX SDK configuration was not done properly, although I followed the instructions provided. Thanks

yotaro-shimose commented 2 years ago

Thank again. I tried to work with this https://github.com/softcat477/SMPL-to-Fbx as you suggested. But setting up python fbx is quite complicated. I have not been able to configure it on my macbook. The FBX-python instructions are very cryptic. I followed the instructions and installed fbx sdk on my macbook and then the smpl-to-fbx requirements in a new conda environment. Then while running the Python Convert.py I got this error (without more info):

['gBR', 'gPO', 'gLO', 'gMH', 'gLH', 'gHO', 'gWA', 'gKR', 'gJS', 'gJB'] ['sBM', 'sFM'] ['0', '1', '2', '3', '4', '5'] Segmentation fault: 11

Do you have any ideas why I am getting this error? My guess the FBX sdk configuration was not done properly, although I followed the instructions provided. Thanks

The first three lines are the output of these lines

I'm not confident about the Segmentation fault: 11, but if you haven't tried it yet, make sure you installed the correct version of Python.

yh675 commented 1 year ago

@mmdrahmani use the pkl file with this https://github.com/softcat477/SMPL-to-Fbx

Change 'FbxTime.eFrames60's in SMPL-to-FBX/FbxReadWriter.py to 'FbxTime.eFrames30'

Known issues: Glitches below 30 fps exports in 'softcat477/SMPL-to-FBX/' (20 fps expected)

@jaykup1 In line 84 of the conversion script

smpl_trans = np.array(trans_raw).reshape(self.real_num_frames, 3)*np.array([100, 1, 100])

How come the y coordinate is only scaled by 1 and not 100?

yh675 commented 1 year ago

I am loading animations into Unreal Engine using the conversion to pkl and SMPL-to-FBX repo.

However, I would like to control the relative location of the foot joints to the ground floor in the simulator.

I tried to use the coordinates given in results.npy for one animation, according to which the left foot is at a height of 0.388 cm in the first frame (I multiplied the scalars in results.npy by 100).

However, when I load the animation into Unreal it looks like this:

image

Here the feet are clearly 8-9 cm above the animation's origin (m_avg_root) in its local coordinate frame.

How can I get the accurate local xyz coordinates of the joints in the animations? Did anyone else have a similar issue?

yh675 commented 1 year ago

I resolved the issue by mapping the translation of m_avg_root to m_avg_Pelvis.

Fixed by changing motion-diffusion-model.visualize.vis_utils from: smpl_trans = np.array(trans_raw).reshape(self.real_num_frames, 3)*np.array([100, 1, 100]) -> smpl_trans = np.array(trans_raw).reshape(self.real_num_frames, 3)*np.array([100, 100, 100])

And by changing the line in SMPL-to-FBX/FbxReadWriter.py from: name = "m_avg_root" -> name = "m_avg_Pelvis"

tommyshelby4 commented 1 year ago

I made the change you suggested but still see an offset of 20 cm on the vertical axis; there is no foot-ground contact.

yh675 commented 1 year ago

I made the change you suggested but still see an offset of 20cm in the vertical axis, the foot-ground contact does not exist

You also have to take into account the lowest joint location in results.npy, which may not be at 0. Then load the mesh into Unreal Engine with the lowest joint location offset from the floor height.
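That offset can be sketched as follows, assuming the per-sample joints from results.npy are shaped (num_joints, 3, frames) with y as the vertical axis (both assumptions based on the (3, 22, 3, 120) shape quoted earlier in this thread):

```python
import numpy as np

def floor_offset(joints):
    """Lowest vertical joint coordinate over a whole clip, in the clip's units.

    joints: (num_joints, 3, frames) positions for one sample; axis 1 is (x, y, z),
    with y assumed to be up (check your data). Shift the character down by this
    value (or set the engine's floor height to it) to plant the lowest joint
    on the ground.
    """
    return float(joints[:, 1, :].min())
```

For example, with a (3, 22, 3, 120) motion array, `floor_offset(motion[0]) * 100` would give the first sample's offset in centimeters, if the data is in meters.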

LEEYEONSU commented 1 year ago

Importing motion diffusion animations into UE5 at runtime now thx to the conversion code :D

Hello, I was wondering how you import the motion-diffusion-model output into Unreal Engine.

I am now struggling with retargeting to UE 5.0.

If you have code, could you share it or explain it?

Thank you for reading.

fwwucn commented 1 year ago

I made the change you suggested but still see an offset of 20cm in the vertical axis, the foot-ground contact does not exist

You also have to take into account the lowest joint location in results.npy which may not be at 0. Then load the mesh into Unreal Engine with the lowest joint location offset from the floor height.

I have the same issue, but I don't think I should always adjust the lowest joint location to 0, because some frames may contain a jumping action (both feet off the ground). Is there a per-frame flag indicating whether the lowest joint location of the current frame should be on the ground plane or not?

tshrjn commented 1 year ago

Thank again. I tried to work with this https://github.com/softcat477/SMPL-to-Fbx as you suggested. But setting up python fbx is quite complicated. I have not been able to configure it on my macbook. The FBX-python instructions are very cryptic. I followed the instructions and installed fbx sdk on my macbook and then the smpl-to-fbx requirements in a new conda environment. Then while running the Python Convert.py I got this error (without more info): ['gBR', 'gPO', 'gLO', 'gMH', 'gLH', 'gHO', 'gWA', 'gKR', 'gJS', 'gJB'] ['sBM', 'sFM'] ['0', '1', '2', '3', '4', '5'] Segmentation fault: 11 Do you have any ideas why I am getting this error? My guess the FBX sdk configuration was not done properly, although I followed the instructions provided. Thanks

The first three lines are the output of these lines

For Segmentation fault: 11, I'm not confident. But if not tried, make sure you installed the correct version of "Python".

I seem to be having the same issue.

tshrjn commented 1 year ago

@mmdrahmani @yotaro-shimose I was able to resolve Segfaults as mentioned in this PR.

tshrjn commented 1 year ago

@TREE-Ind You mentioned converting fbx to gltf - could you share how you did this? And also, how can we transfer the animation to other meshes?

@babotrojka Not sure if this will work for your setup but we had good success converting the fbx to gltf while retaining the animation data which reduces the file size substantially.

tshrjn commented 1 year ago

@mmdrahmani @yh675 @jaykup1 @GuyTevet After we get the FBX file, how are you retargeting that motion to other assets, like the one shown in the video?

kruzel commented 1 year ago

hi, I'm looking for a way to create the animation without going through visualize.render_mesh.

Does anyone know how to get the root pose and rotations from the model output here in generate.py?

dj-kefir-siorbacz commented 5 months ago

@TREE-Ind Sure, it's just a proof of concept Blender script tho. Here you go: https://gist.github.com/amb/69cc8396bc61cc59a7e819aed6b21f34 The script has "import bpy" so technically it's also GPL3 license.

If I see correctly, this script creates a Blender armature based on positional data from .npy. Then, for every frame it aligns the pose bones of this armature to match the positions of .npy joints.

There are three issues with this method:

TL;DR: this method is worthless for retargeting/driving other 3D models; it fakes the rotational information. You are better off treating the positional data of the original results.npy as end-effector targets for inverse kinematics.

@amb Tried to use the script and exported the fbx, but the result was an orb with spykes around moving Is there a different way to export it? bpy.ops.export_scene.fbx(filepath="./tata.fbx", use_selection=True, add_leaf_bones=False)

Blender's FBX export is pretty bad. I wouldn't rely on it. You are better off exporting to .bvh using Blender, then importing this .bvh to some software that can both import .bvh and export .fbx well (for example, Autodesk software, e.g. Motion Builder). That way you are sure your .fbx will work in other software. If you're fine with .gltf or .glb, then Blender does a fine job exporting them.

ciroanni commented 1 month ago

Hi, I am trying to convert MDM model output to FBX files so that I can import them into Unity. I have tried several methods: starting from the npy file, I used the smpl2bvh script to get the mocap data and then exported it from Blender to FBX, but I get strange effects, as shown here:

https://github.com/user-attachments/assets/c849de2d-7638-4399-b5b4-fadab7e14923

I assume this is because the npy file only contains the joint positions frame by frame, not the rotations, so for more complicated animations where the character has to rotate, turn, etc., it just tries to move the joints to those positions. So I tried the repo you previously linked, i.e. SMPL-to-FBX. This definitely works better, but it is very slow (it takes me at least 3 minutes), and I still have small glitches, as you can see here:

https://github.com/user-attachments/assets/270f865a-9de5-4e7d-b81e-4d274a4460a0

So, paradoxically, for an easy input like "Walk forward" the first method comes out better; I don't get those little head movements, for example, and it is much faster. Could you tell me why I have those problems with the second method? Also, do you have a way to implement the first method while also computing the rotations, but avoiding the SMPLify algorithm, which takes a lot of time? Thank you very much in advance.