mkocabas / VIBE

Official implementation of CVPR2020 paper "VIBE: Video Inference for Human Body Pose and Shape Estimation"
https://arxiv.org/abs/1912.05656

Get animated mesh as .fbx #1

Closed Dene33 closed 3 years ago

Dene33 commented 4 years ago

Is there a fast way of getting estimated animation as .fbx (or any other format)? To import to 3d software. Thanks.

mkocabas commented 4 years ago

Hi @Dene33,

Thanks for your interest.

Output format is shown here. Currently, we output the mesh, pose, shape and other related parameters and save them as a pickle file. I don't have experience with animated .fbx, but it might be straightforward to get it from the information we provide. If you know any good resources or pointers for animated .fbx, I would love to give it a try.
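For reference, loading that pickle with joblib and inspecting it might look like the sketch below. The key names ('pose', 'betas', 'verts', 'frame_ids') and the per-person-id layout are assumptions for illustration, not a guaranteed schema, and the mock dict stands in for a real vibe_output.pkl:

```python
import joblib
import numpy as np

# Stand-in for a real vibe_output.pkl: one tracked person, 2 frames.
mock_output = {
    1: {  # person id
        'pose':  np.zeros((2, 72)),       # per-frame SMPL pose (24 joints x 3 axis-angle)
        'betas': np.zeros((2, 10)),       # per-frame SMPL shape coefficients
        'verts': np.zeros((2, 6890, 3)),  # per-frame posed mesh vertices
        'frame_ids': np.array([0, 1]),
    }
}
joblib.dump(mock_output, '/tmp/vibe_output_mock.pkl')

# This is the part you would run on the real file:
output = joblib.load('/tmp/vibe_output_mock.pkl')
for person_id, data in output.items():
    print(person_id, data['pose'].shape, data['betas'].shape)
```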

MichaelJBlack commented 4 years ago

I’m cc’ing Joachim Tesch. We can output SMPL in FBX and Joachim is the expert.

Michael


spha-code commented 4 years ago

Any progress on this? How can we extract the meshes into an .fbx file? Ideally the camera perspective of the .fbx should match the source video. Any hint on how to achieve this? Thanks.

mkocabas commented 4 years ago

@spha-code we are working on a solution, I will update here.

spha-code commented 4 years ago

@mkocabas , thanks for the response, best of luck, I am looking forward to your update.

ikvision commented 4 years ago

In the meantime, I found an SMPL method to save the mesh to an .obj file; it looks pretty simple: https://github.com/CalciferZh/SMPL/blob/f7a2eb30edfb99eb977c5dccdfcde25344a7f545/smpl_np.py#L188
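The linked method boils down to writing vertex and face lines by hand. A minimal self-contained sketch (the toy triangle below stands in for SMPL's 6890-vertex mesh; with real VIBE output you would pass its vertices and faces instead):

```python
import numpy as np

def save_obj(path, verts, faces):
    """Write a mesh to Wavefront .obj. verts: (N, 3) floats, faces: (M, 3) zero-based ints."""
    with open(path, 'w') as f:
        for v in verts:
            f.write('v %.6f %.6f %.6f\n' % (v[0], v[1], v[2]))
        for face in faces:
            # .obj face indices are 1-based
            f.write('f %d %d %d\n' % (face[0] + 1, face[1] + 1, face[2] + 1))

# Toy data: a single triangle
verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
faces = np.array([[0, 1, 2]])
save_obj('/tmp/triangle.obj', verts, faces)
```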

mkocabas commented 4 years ago

@ikvision we also provide an option to save the results as an .obj file. Check here:

https://github.com/mkocabas/VIBE/blob/8e57b9179250ff919b380d0a4c77534e83209a2a/demo.py#L378

zheyangshi commented 4 years ago

Hi! Is there any progress on this issue? Thanks.

ga74kud commented 4 years ago


I tried to unpickle the output file directly, but it did not work for me.

Solution: load it with joblib (see the README.md).

I will try it with the '--save_obj' option...


ga74kud commented 4 years ago

Is there a fast way of getting the estimated animation as .fbx (or any other format)? To import into 3d software. Thanks.

You have different possibilities, see https://stackoverflow.com/questions/34132474/convert-obj-to-fbx-with-python-fbx-sdk

The laborious way is to convert it by hand with Blender or MeshIO, but for huge .obj sequences this makes no sense...

8secz-johndpope commented 4 years ago

At first glance: to achieve FBX export / animations you need to determine the skeleton, rigging, skinning weights, and motion transfer. It seems the .obj file is only going to export the model. (Side note: it may be possible to drop it into mixamo.com to get it to dance around, with its form detected from a particular frame.)

To get the animations (which I'm interested in as well) you need to determine the rigging and then bind the mesh.

If there's interest, @zhan-xu has a repo: https://github.com/zhan-xu/AnimSkelVolNet

There's also another one coming called RigNet: https://zhan-xu.github.io/rig-net/ https://github.com/zhan-xu/RigNet

Skeleton-Aware Networks for Deep Motion Retargeting: https://arxiv.org/pdf/2005.05732.pdf

You would need to feed the output of VIBE into one of these to determine the skeleton.

Q) Can the fbx animations be isolated without the model?

This might get things moving in the right direction: https://github.com/zhan-xu/AnimSkelVolNet/blob/master/maya_bind.py

Related: https://github.com/zhan-xu/AnimSkelVolNet/issues/1

(You don't need skinning weights; you can use Geodesic Voxel Bind.) https://www.youtube.com/watch?v=ZyytEiB-1Ug

ikvision commented 4 years ago

@8secz-johndpope thank you for all the information and thoughtful links. I am not an expert in the fbx or glTF file formats, but I hope I can help identify the different components that VIBE can generate for an animation. I think that in the case of the SMPL parametric model used in VIBE, a few data structures are already known and do not need an ML model (such as RigNet) to generate them from the final mesh:

  1. Blend shapes - given by the 10 betas and shape_disps
  2. Skeleton - I suppose this relates to the kinematic tree
  3. Skinning weights - I think these are given by W
  4. 3D joint positions - the 24 joints used to define the skin

Given all the above, there is a detailed technical report on how to convert the SMPL model into Unity. It shows that adapting between different formats may be challenging, but feasible. I hope these details can help us progress on this important task.
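Schematically, components 2-4 above combine through linear blend skinning (the skinning SMPL uses): each vertex is a weighted sum of its bone transforms applied to the rest-pose vertex. A toy sketch (toy sizes and identity bone transforms for illustration, not SMPL's real data):

```python
import numpy as np

n_verts, n_joints = 4, 2
rest_verts = np.random.rand(n_verts, 3)      # rest-pose mesh (after blend shapes)
W = np.random.rand(n_verts, n_joints)        # skinning weights
W /= W.sum(axis=1, keepdims=True)            # each row sums to 1
G = np.tile(np.eye(4), (n_joints, 1, 1))     # per-joint 4x4 world transforms (identity here)

rest_h = np.hstack([rest_verts, np.ones((n_verts, 1))])  # homogeneous coordinates
# Blend the joint transforms per vertex, then apply to each vertex.
blended = np.einsum('vj,jab->vab', W, G)                 # (n_verts, 4, 4)
posed = np.einsum('vab,vb->va', blended, rest_h)[:, :3]

# With identity bone transforms the mesh must come out unchanged.
assert np.allclose(posed, rest_verts)
```

Posing the model then amounts to replacing the identity transforms in `G` with the per-joint transforms implied by VIBE's per-frame pose parameters.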

8secz-johndpope commented 4 years ago

Hi @ikvision - to clarify:

"VIBE... predicts the parameters of the SMPL body model, a parametrized 3d human body mesh, for each frame of an input video, based on Video Inference for Body Pose and Shape Estimation." Just imagine 24 fps for 10 seconds: you're going to spit out 240 frames of body mesh / shape estimations. While this may suit motion video, for animated 3d models that have rigging you need to deduce the skeleton and make a t-pose model.

I didn't use SMPL with Unity; I might dig into SMPL with Maya. https://psfiles.is.tuebingen.mpg.de/downloads/smpl/SMPL_maya-zip (may need to login)

Inside there's a readme

The script displays a UI to apply SMPL's shape and pose blendshapes and to adjust the skeleton to new body shapes.
Load this plugin into Maya. It will create a window with 3 options:

1- Apply Pose Blend Shapes to Current Frame: 
    If you repose the model in Maya, then click this to 
    compute and apply the pose blend shapes in the current frame. 
    You can also choose whether or not to set the keyframes for the 
    pose blendshapes. Check the 'Reset Keyframes' checkbox if you 
    would like to lock blendShape values at given frame by setting 
    a keyframe. 

2- Apply Pose Blend Shapes to Frames in above Range: 
    Specify a range of frames in an animation and then compute/apply 
    the pose blendshapes for all the frames in range. Check the 
    'Reset Keyframes' checkbox if you would like to lock blendShape 
    values at given frame range by setting a keyframe at each frame in the 
    given range.

3- Set Mesh to Bind-Pose & Recompute Skeleton: 
    When you edit the shape blend shapes to change body shape the 
    skeleton will no longer be correct.  Click first button to set the 
    mesh into the bind-pose. Next, click this to 'Recompute Skeleton' 
    to recompute the skeleton rig to match the new body shape.

Always make sure to click on the mesh in the 3D view to select it before 
using any of the functions in the plugin. Select only the mesh of the model 
you want to update and then click the appropriate button on the UI.

This will give you a bunch of shapes, it looks like, which would suffice for a video inside a game, but not for rigging (making the character walk left / right).

(screenshot attached)

UPDATE: related motion + deep learning library; it works with BVH / Blender motion files. https://github.com/DeepMotionEditing/deep-motion-editing

8secz-johndpope commented 4 years ago

This probably doesn't help, but given you can get an .obj file (the model extracted from one frame of a video), you could use this latest development by Google to get that posed body into different positions. https://arxiv.org/pdf/2003.07254.pdf https://github.com/jiashunwang/Neural-Pose-Transfer

8secz-johndpope commented 4 years ago

Related: BVH is Blender's motion file format. https://github.com/HW140701/VideoTo3dPoseAndBvh

UPDATE: https://twitter.com/jimei_yang/status/1286819624319819776?s=21

ujjawalcse commented 4 years ago

@mkocabas Any update on how to get the animation as an .fbx file for further use in 3D software? If someone has tried and done it successfully, please share. Thanks

acoMCMXCVI commented 4 years ago

Hi! Is there any progress on this issue? Thanks.

8secz-johndpope commented 4 years ago

https://meshcapade.com/infopages/licensing.html

(screenshot attached)

mkocabas commented 4 years ago

Hi everyone,

We provide a script to convert VIBE output into FBX/glTF format with this commit https://github.com/mkocabas/VIBE/commit/fda0950d639be8d8847e4d91af7ebde465638bd7.

You can give it a try. Please follow the instructions listed in the readme: https://github.com/mkocabas/VIBE#fbx-and-gltf-output-new-feature.

ghost commented 4 years ago


Hello @mkocabas. I've tried the implementation, but it looks like the conda package for the Blender Python API is only available for Python 3.8 and above, and there are compatibility issues between the current code and its packages and the Blender Python API. More specifically, if I change the Python version in install_conda.sh to 3.8, multi_person_tracker can't be imported. Any leads on this?

AlleUndKalle commented 4 years ago

@gayatriprasad, I think an immediate solution would be setting up separate virtual environments for VIBE and Blender. Otherwise, we can try to find the snippet preventing multi_person_tracker from working with Python 3.8.

ujjawalcse commented 3 years ago

Thanks, @mkocabas. I tried fbx_output.py on Python 3.7 and Blender 2.83, and the code ran without any error. But the output is always a 36.2 MB file with only a few frames of animation (not the complete animation seen in the output video), as I tested in Blender. Any suggestions on how to resolve this issue?
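Before blaming the converter, one quick sanity check is whether the pickle itself covers every frame. A sketch, assuming the output dict is keyed by person id with a 'frame_ids' array per person (an assumption from this thread, not a guaranteed schema; the mock file below stands in for a real vibe_output.pkl):

```python
import joblib
import numpy as np

# Mock in place of a real vibe_output.pkl: person 1 is missing frames 3 and 4.
mock = {1: {'frame_ids': np.array([0, 1, 2, 5]), 'pose': np.zeros((4, 72))}}
joblib.dump(mock, '/tmp/vibe_check.pkl')

# The check you would run on the real file:
output = joblib.load('/tmp/vibe_check.pkl')
for pid, data in output.items():
    ids = data['frame_ids']
    missing = ids.max() + 1 - len(ids)   # gaps where tracking dropped the person
    print('person %d: %d frames tracked, %d missing' % (pid, len(ids), missing))
```

If frames are missing here, the short animation comes from tracking dropouts in the pickle, not from the FBX conversion step.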

wine3603 commented 3 years ago

hi @ujjawalcse @mkocabas, when I run the fbx code on Python 3.7, Blender 2.80, I get this error:

python lib/utils/fbx_output.py     --input output/sample_video/vibe_output.pkl     --output output/sample_video/fbx_output.fbx \ # specify the file extension as *.glb for glTF
Traceback (most recent call last):
  File "lib/utils/fbx_output.py", line 295, in <module>
    if not input_path.startswith(os.path.sep):
NameError: name 'input_path' is not defined
ujjawalcse commented 3 years ago

Hi @wine3603, actually I was using the code for my own use case, so I removed all the command line arguments and hard-coded those variables. You can see the changes in main below:

if __name__ == '__main__':
    try:
        if bpy.app.background:

            '''
            parser = argparse.ArgumentParser(description='Create keyframed animated skinned SMPL mesh from VIBE output')
            parser.add_argument('--input', dest='input_path', type=str, required=True,
                                help='Input file or directory')
            parser.add_argument('--output', dest='output_path', type=str, required=True,
                                help='Output file or directory')
            parser.add_argument('--fps_source', type=int, default=fps_source,
                                help='Source framerate')
            parser.add_argument('--fps_target', type=int, default=fps_target,
                                help='Target framerate')
            parser.add_argument('--gender', type=str, default=gender,
                                help='Always use specified gender')
            parser.add_argument('--start_origin', type=int, default=start_origin,
                                help='Start animation centered above origin')
            parser.add_argument('--person_id', type=int, default=1,
                                help='Detected person ID to use for fbx animation')

            args = parser.parse_args()
            '''

            #input_path = args.input_path
            #output_path = args.output_path
            input_path='C:\\Users\\UKS\\wowexpai\\motion_tracking3D\\VIBE\\output\\videoplayback\\vibe_output.pkl'
            output_path='C:\\Users\\UKS\\wowexpai\\motion_tracking3D\\VIBE\\output\\videoplayback\\fbx_output.fbx'

            if not os.path.exists(input_path):
                print('ERROR: Invalid input path')
                sys.exit(1)

            #fps_source = args.fps_source
            #fps_target = args.fps_target
            fps_source=25
            fps_target=25

            #gender = args.gender
            gender='female'

            #start_origin = args.start_origin
            start_origin=1
            person_id=1

        # end if bpy.app.background

        startTime = time.perf_counter()

        # Process data
        cwd = os.getcwd()
        #rest of the code below

It worked for my use case at least. Cheers.

wine3603 commented 3 years ago

@ujjawalcse thanks for the quick reply! I tried to use your code but still get the same error. It looks like Blender did not run in the background and bpy.app.background returns False. Maybe the problem is with my Blender Python API. May I ask how you installed bpy? In my case, pip install bpy gives a big error:

pip install bpy
Collecting bpy
  Using cached bpy-2.82.1.tar.gz (19 kB)
    ERROR: Command errored out with exit status 1:
     command: /home/zhang-u16/VIBE/vibe-env/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-1u5gxhj6/bpy/setup.py'"'"'; __file__='"'"'/tmp/pip-install-1u5gxhj6/bpy/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-il3is4lg
         cwd: /tmp/pip-install-1u5gxhj6/bpy/
    Complete output (16 lines):
    Traceback (most recent call last):
      File "/usr/lib/python3.7/tokenize.py", line 397, in find_cookie
        codec = lookup(encoding)
    LookupError: unknown encoding: future_fstrings

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/usr/lib/python3.7/tokenize.py", line 449, in open
        encoding, lines = detect_encoding(buffer.readline)
      File "/usr/lib/python3.7/tokenize.py", line 436, in detect_encoding
        encoding = find_cookie(second)
      File "/usr/lib/python3.7/tokenize.py", line 405, in find_cookie
        raise SyntaxError(msg)
    SyntaxError: unknown encoding for '/tmp/pip-install-1u5gxhj6/bpy/setup.py': future_fstrings
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

so I did pip install fake-bpy-module-2.80, and then running python lib/utils/fbx_output.py gave NameError: name 'input_path' is not defined

I also tried installing joblib into Blender's Python 3.7 and running blender --python /lib/utils/fbx_output.py, but I get the same error:

Traceback (most recent call last):
  File "lib/utils/fbx_output.py", line 295, in <module>
    if not input_path.startswith(os.path.sep):
NameError: name 'input_path' is not defined
nopeiyu commented 3 years ago

Add the following to fbx_output.py, so argparse only sees the arguments after Blender's -- separator:

    sys.argv = sys.argv[sys.argv.index("--") + 1:]
    args = parser.parse_args(sys.argv)

and run:

    blender -b -P lib/utils/fbx_output.py -- --input output/sample_video/vibe_output.pkl --output output/sample_video/vibe_output.fbx --fps_source 30 --fps_target 30 --gender male

ujjawalcse commented 3 years ago

Hi @wine3603, I have installed Blender and set up a new conda Python 3.7 environment that serves as Python for Blender. Open a command prompt at the path where your blender.exe is present, then execute: blender.exe --background --python myscript.py (provide the full path to myscript.py). I usually follow this. Thanks.

wine3603 commented 3 years ago

@ujjawalcse hi, thanks for your help. I figured out the Blender bpy bug came from the Nvidia driver. I updated my driver to 450 and Blender works now. I haven't tried the fbx output interface yet.

CrossLee1 commented 3 years ago

Hello everybody, is there a way to parse an fbx file to get the SMPL parameters, like weights, meshes, and faces?

Thanks~

yangtao19920109 commented 3 years ago

Hello everybody, is there a way to parse fbx file to get the parameters for SMPL, like weights, meshes, faces?

Thanks~

I also want to get this ~~~

DenseInL2 commented 3 years ago

I got this whole fascinating project working on my Windows machine, but in this last step of going from .pkl to .fbx (or .glb), I get files which seem to have no root motion. All figures are anchored at the hip. Is this something I've done incorrectly, a limitation of the converter, or something else?

carlosedubarreto commented 3 years ago

The same happens to me; it looks like that's just the way it works. I'm looking through the code to see if there is some way to get the hips to move.

liuhaorandezhanghao commented 3 years ago

I got this whole fascinating project working on my Windows machine, but in this last step of going from .pkl to .fbx (or .glb), I get files which seem to have no root motion. All figures are anchored at the hip. Is this something I've done incorrectly, a limitation of the converter, or something else?

This requires computing the rotation and translation parameters.

xiaoyang333 commented 2 years ago

I exported my own fbx entirely from the command line, without relying on the Blender GUI. On Ubuntu I installed the 2.8 .tar.bz2 package directly, extracted it, found the blender launcher script, and ran: ./blender -b -P /data6t/jin/projects/VIBE-master/lib/utils/fbx_output.py -- --input /data6t/jin/projects/VIBE-master/output/dance/vibe_output.pkl --output /data6t/jin/projects/VIBE-master/output/dance/vibe_output.fbx --fps_source 30 --fps_target 30 --gender female Note that you need to install joblib into Blender's bundled Python environment; the method I used is described here: https://blog.csdn.net/G2yyyy/article/details/113820119

wang-zm18 commented 2 years ago

(quoting @wine3603's report above of NameError: name 'input_path' is not defined when running fbx_output.py)

I also met this problem, have you solved it?

lucasjinreal commented 2 years ago

Hi, does anyone know why the animated FBX orientation is wrong while the original FBX model file is correct?

(screenshot attached)
lucasjinreal commented 2 years ago

I have solved it.

(screenshot attached)

Mirandl commented 2 years ago

@xiaoyang333 Hi, I downloaded and extracted 2.8 from the official site but could not find the blender.sh script, so I can't run the later commands. Where did you download blender.sh from? Thank you

akk-123 commented 2 years ago

@jinfagang, can you tell me how you solved it?

Mirandl commented 2 years ago

@akk-123 hi, you can refer to this to remove the extra points: https://youtu.be/w1biKeiQThY . Hope it helps.

lucasjinreal commented 2 years ago

https://www.youtube.com/watch?v=w1biKeiQThY&ab_channel=CEBStudios this doesn't solve anything; the whole body still moves along with the hip: (screenshot attached)

carlosedubarreto commented 2 years ago

Hello @jinfagang, it's been a long time since I used VIBE, but I think the best way to make it work properly is to go to the root bone (maybe the hip bone), remove all the animation on it, and rotate it properly, usually 90° on the x or y axis.

lucasjinreal commented 2 years ago

@carlosedubarreto I can rotate it and put it on the ground, but it seems the animation still starts from the hip root, which is not natural. Is there a way to fix this in the script?

carlosedubarreto commented 2 years ago

@carlosedubarreto I can rotate it and put it on the ground, but it seems the animation still starts from the hip root, which is not natural. Is there a way to fix this in the script?

Oh, that problem I've never solved.

I ended up using another solution: EasyMocap. I think it deals better with creating animations from video.

lucasjinreal commented 2 years ago

@carlosedubarreto as far as I know, EasyMocap uses multi-view images for estimation, doesn't it?

akk-123 commented 2 years ago

@carlosedubarreto @jinfagang fbx_output.py sets trans to zeros, so the pelvis never moves and the animation is not natural. Do you know how to solve it?
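One possible direction (a hedged sketch, not a confirmed fix): derive a per-frame root translation from VIBE's weak-perspective camera parameters and key it onto the pelvis instead of zeros. The (s, tx, ty) layout, the focal length, and the crop size below are illustrative assumptions, not values confirmed by the repo:

```python
import numpy as np

def weak_perspective_to_translation(pred_cam, focal=5000.0, img_size=224.0):
    """pred_cam: (n_frames, 3) array of per-frame (scale, tx, ty).

    Converts a weak-perspective camera into a camera-space translation:
    the scale implies a depth, and tx/ty carry over as lateral offsets.
    """
    s, tx, ty = pred_cam[:, 0], pred_cam[:, 1], pred_cam[:, 2]
    tz = 2.0 * focal / (img_size * s)        # depth implied by the scale
    return np.stack([tx, ty, tz], axis=1)    # candidate per-frame root trans

# Toy two-frame example
pred_cam = np.array([[1.0, 0.1, -0.2],
                     [0.9, 0.2, -0.1]])
trans = weak_perspective_to_translation(pred_cam)
print(trans.shape)  # (2, 3)
```

Feeding something like `trans` into the converter in place of the zero translation would at least let the hips drift with the subject, though getting world-consistent motion from a single moving camera is a harder problem.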

carlosedubarreto commented 2 years ago

@carlosedubarreto as far as I know, EasyMocap uses multi-view images for estimation, doesn't it?

That's correct, you need at least 2 cameras. If you want to try it, I made a Blender version and addon to make the process easier.

If you like, you can get it for free at https://carlosedubarreto.gumroad.com/l/ceb_easymocap_blender (you must use a coupon code that is in the description; because of the size of the files I could not list it as free).

There are also some videos showing how to use it. It took me 3 months of study, and I baked what I learned into a solution so other people would not need to suffer like I did.

But depending on your GPU it may not work. I built it on an RTX 2060 (I know the RTX 3090 does not work with this solution).

carlosedubarreto commented 2 years ago

@carlosedubarreto @jinfagang fbx_output.py sets trans to zeros, so the pelvis never moves and the animation is not natural. Do you know how to solve it?

I gave up on it and moved to EasyMocap; it does not have this problem.

lucasjinreal commented 2 years ago

@carlosedubarreto EasyMocap uses multi-view input, which is not suitable for in-the-wild inference.

tianyu06030020 commented 2 years ago

hi @carlosedubarreto, thanks for your great work. I want to try your EasyMocap addon but I do not know how to download it. Is there a download link at https://carlosedubarreto.gumroad.com/l/ceb_easymocap_blender?