caizhongang / SMPLer-X

Official Code for "SMPLer-X: Scaling Up Expressive Human Pose and Shape Estimation"
https://caizhongang.github.io/projects/SMPLer-X/

bone point data of the human body #20

Closed ghx2757 closed 12 months ago

ghx2757 commented 1 year ago

This is really exciting work. Can I directly apply the skeleton keypoint data from the model's inference results to my own self-built human body model? I visualized the keypoint data in the inference results, but the output did not look like a skeleton. @caizhongang

caizhongang commented 1 year ago

Hi @ghx2757, we use SMPL-X, which is a parametric mesh model, as the human representation. Currently we do not output joints, but you may intercept the keypoints here.
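
For illustration, this is roughly how the SMPL-X joints can be recovered from predicted parameters with the official smplx package (a minimal sketch, not our exact demo code; the model path is a placeholder):

```python
import torch
import smplx

# 'models' is a placeholder path to a folder containing the SMPL-X model files.
model = smplx.create('models', model_type='smplx', gender='neutral')
output = model(
    global_orient=torch.zeros(1, 3),   # root pose (Pelvis), axis-angle
    body_pose=torch.zeros(1, 63),      # 21 body joints x 3, axis-angle
)
joints = output.joints.detach().numpy()  # (1, num_joints, 3) 3D keypoints
print(joints.shape)
```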

Please let us know if this resolves your problem. :)

ghx2757 commented 1 year ago

Thank you very much for your reply! Following your hint, I imitated the mesh visualization scheme and successfully visualized 25 keypoints of the human body! However, it is still unclear to me whether the order of these points matches the order of orig_joints_name, since root_pose and body_pose here are 22 rotation vectors. My goal is to use these raw rotation vectors to drive my own digital human. Thank you again.

1286710929 commented 1 year ago

> (quoting @ghx2757's comment above)

Hello, I have also recently been studying the joints this work outputs. May I ask whether the 22 output rotation vectors are consistent with the original SMPL-X ones?

ziyuanding commented 12 months ago

> (quoting @ghx2757's comment above)

Hi @ghx2757, hope my reply is not too late. joint_cam contains 25 joints, ordered like pos_joints_name.

root_pose and body_pose are different from joint_cam (joint_cam has no spine joints, while the poses do). Concatenating these poses gives you something ordered like orig_joints_name, with root_pose being the Pelvis. So you cannot simply treat joint data and pose data as the same thing; they are two different kinds of representations.
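
For example (a rough sketch with assumed shapes; SMPL-X has one global orientation plus 21 body joints, 22 axis-angle vectors in total):

```python
import numpy as np

root_pose = np.zeros((1, 3))    # global orientation (Pelvis), axis-angle
body_pose = np.zeros((21, 3))   # 21 body joints, axis-angle

# Stacked in SMPL-X kinematic order (Pelvis first), i.e. ordered like orig_joints_name.
full_pose = np.concatenate([root_pose, body_pose], axis=0)  # (22, 3)
```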

To make use of the pose data to drive a digital human, I have a rough and naive way:

  1. Modify the SMPLer-X code to make it output rotation matrices and export them as a pickle file; the detailed code change can be found here: compare (see the sketch below).
  2. Use the blender addon from HybrIK to import the pickle file into Blender.
  3. I think this step should be something like rigging or retargeting the armature? Sadly I don't know much about this, but at least you can see the animation in Blender, and I guess it would be easy to export a bvh file from there :)

(GIF: export_to_blender)
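
The gist of step 1 is something like this (a simplified sketch, not the exact diff; the 'pred_thetas' key is my guess at what the add-on reads, so check its load code):

```python
import pickle
import numpy as np
from scipy.spatial.transform import Rotation as R

# Placeholder poses: one frame of 24 SMPL joints in axis-angle form.
pose_aa = np.zeros((24, 3))

# Axis-angle -> 3x3 rotation matrices, flattened to one row per frame.
rotmat = R.from_rotvec(pose_aa).as_matrix().reshape(1, -1)  # (1, 216)

with open('smplerx_poses.pk', 'wb') as f:
    pickle.dump({'pred_thetas': rotmat}, f)
```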

ghx2757 commented 12 months ago

Thank you very much for your reply; you explained it very clearly. I will try to follow the ideas you provided. Thank you again, your reply is extremely important to me. I wish you a happy life! @zacida

AWangji commented 12 months ago

> (quoting @ziyuanding's reply above)

Hi, thanks for your solution, but I met a strange problem. When I try to import the pk file, it says:

    Python: Traceback (most recent call last):
      File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\__init__.py", line 69, in execute
        load_bvh(res_db, root_path, gender)
      File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 198, in load_bvh
        ob, obname, arm_ob = init_scene(scene, root_path, gender)
      File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 68, in init_scene
        cam_ob = bpy.data.objects['Camera']
    KeyError: 'bpy_prop_collection[key]: key "Camera" not found'

Why does that happen?

ziyuanding commented 12 months ago

> (quoting @AWangji's question above)

Do not delete everything in the scene at the start! Keep at least one camera.
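
If you already wiped the default scene, you can add a camera back from Blender's Python console before running the import (a quick sketch):

```python
import bpy

# The add-on looks up bpy.data.objects['Camera'], so recreate one if it is missing.
if 'Camera' not in bpy.data.objects:
    cam = bpy.data.objects.new('Camera', bpy.data.cameras.new('Camera'))
    bpy.context.scene.collection.objects.link(cam)
```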

AWangji commented 12 months ago

> (quoting @ziyuanding's reply above)

Hi, sorry to bother you again, but after that I met another weird error:

    Python: Traceback (most recent call last):
      File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\__init__.py", line 69, in execute
        load_bvh(res_db, root_path, gender)
      File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 230, in load_bvh
        apply_trans_pose_shape(
      File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 160, in apply_trans_pose_shape
        mrots, bsh = rodrigues2bshapes(pose)
      File "C:\Users\Administrator\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\hybrik_blender_addon\convert2bvh.py", line 144, in rodrigues2bshapes
        rod_rots = np.asarray(pose).reshape(24, 3)
    ValueError: cannot reshape array of size 216 into shape (24,3)

As the log says, the pose array has size 216 but is being reshaped to (24, 3). Looking at the original code (screenshot omitted), when the pose size is 216 it should take the branch above and should not hit this error.
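
For reference, 216 = 24 × 3 × 3, i.e. 24 flattened 3×3 rotation matrices rather than 24 axis-angle vectors, so I would expect a branch roughly like this to be taken (my sketch from memory, not the actual add-on source):

```python
import numpy as np

pose = np.zeros(216)            # 24 flattened 3x3 rotation matrices
if np.asarray(pose).size == 24 * 9:
    # Rotation-matrix input: reshape to (24, 3, 3).
    rod_rots = np.asarray(pose).reshape(24, 3, 3)
else:
    # Axis-angle input: 24 * 3 values.
    rod_rots = np.asarray(pose).reshape(24, 3)
```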

ziyuanding commented 12 months ago

@AWangji, unfortunately I cannot reproduce your issue... maybe you could ask HybrIK's maintainer.

However, if you meet another error like "numpy64 cannot iterate", you can try removing all the code under the "# apply shape blendshapes" comment; that is a separate issue.

AWangji commented 12 months ago

> (quoting @ziyuanding's reply above)

Hi, thanks for your reply. I have solved all the problems above, but the resulting animation jitters badly. What causes that? Can I smooth it?

ziyuanding commented 12 months ago

Hi, @AWangji ,

I don't know, I'm new to this area. As I said before, it's just a rough and naive way. If you want the best results, I think you have to write the code yourself or wait for the authors of SMPLer-X to finish such a feature.
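
That said, one generic thing you could try (not specific to SMPLer-X) is smoothing the pose sequence over time, e.g. with a Savitzky-Golay filter; note that naive channel-wise smoothing of axis-angle values can misbehave near ±π sign flips, so treat this as a rough sketch:

```python
import numpy as np
from scipy.signal import savgol_filter

poses = np.zeros((300, 22, 3))            # (frames, joints, 3) axis-angle, placeholder
flat = poses.reshape(poses.shape[0], -1)  # (frames, 66)

# Smooth each channel independently over time; tune window_length/polyorder.
smoothed = savgol_filter(flat, window_length=11, polyorder=3, axis=0)
poses_smooth = smoothed.reshape(poses.shape)
```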

AWangji commented 11 months ago

> (quoting @ziyuanding's reply above)

Excuse me, do your converted results also jitter? I want to know whether it has something to do with the range of motion in my video.

zxx123518 commented 7 months ago

> (quoting @ziyuanding's reply above)

    Traceback (most recent call last):
      File "inference.py", line 333, in <module>
        main()
      File "inference.py", line 160, in main
        out = demoer.model(inputs, targets, meta_info, 'test')
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
        return forward_call(*input, **kwargs)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
        return self.gather(outputs, self.output_device)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 181, in gather
        return gather(outputs, output_device, dim=self.dim)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 78, in gather
        res = gather_map(outputs)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in gather_map
        return type(out)((k, gather_map([d[k] for d in outputs]))
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in <genexpr>
        return type(out)((k, gather_map([d[k] for d in outputs]))
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 73, in gather_map
        return type(out)(map(gather_map, zip(*outputs)))
    TypeError: expected a sequence of integers or a single integer, got '<map object at 0x7a29e0579c10>'

Excuse me, I got this error after running your compare code. Do you know what caused it? Thank you very much!

zxx123518 commented 7 months ago

> (quoting @ziyuanding's reply above)

    load checkpoint from local path: ../pretrained_models/mmdet/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth
      0%|          | 0/275 [00:07<?, ?it/s]
    Traceback (most recent call last):
      File "inference.py", line 213, in <module>
        main()
      File "inference.py", line 136, in main
        out = demoer.model(inputs, targets, meta_info, 'test')
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
        return forward_call(*input, **kwargs)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
        return self.gather(outputs, self.output_device)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 181, in gather
        return gather(outputs, output_device, dim=self.dim)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 78, in gather
        res = gather_map(outputs)
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in gather_map
        return type(out)((k, gather_map([d[k] for d in outputs]))
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 69, in <genexpr>
        return type(out)((k, gather_map([d[k] for d in outputs]))
      File "/home/bruce/anaconda3/envs/smplerx/lib/python3.8/site-packages/torch/nn/parallel/scatter_gather.py", line 73, in gather_map
        return type(out)(map(gather_map, zip(*outputs)))
    TypeError: expected a sequence of integers or a single integer, got '<map object at 0x73350d3719a0>'

Hello! First of all, thank you very much for your guidance. I encountered this problem when running your code. Do you know how to solve it? Thank you very much!

xmf1620367664 commented 4 months ago

> (quoting @zxx123518's question above)

Change

    out['my_body_pose_mat'] = my_body_pose_mat

to

    out['my_body_pose_mat'] = torch.tensor(my_body_pose_mat).to("cuda:0")
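
For context (my reading of the traceback, not verified against the SMPLer-X internals): nn.DataParallel tries to gather every entry of the model's output dict across devices, and gather_map only knows how to handle tensors and standard containers. A raw NumPy array falls through to the type(out)(map(gather_map, zip(*outputs))) branch, where np.ndarray's constructor expects a shape (a sequence of integers), hence the TypeError. Converting the array to a torch tensor before putting it into out lets gather handle it normally.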

chuxiang93 commented 2 months ago

> (quoting @ziyuanding's reply above)

I met this error. How do I remove all the "# apply shape blendshapes" code?

ziyuanding commented 2 months ago

@chuxiang93, hi. It's been a long time since I last looked at this project, so my solution may no longer apply. If I remember correctly, "# apply shape blendshapes" is a comment line in the Blender add-on provided by the HybrIK project. You probably need to delete the code that this comment describes yourself, package the result as a Blender add-on, and then install and use your modified version.

This solution is very rudimentary; I don't work in this area, I just got interested by chance. If you use my changes, you may also have to fix the finger motion: I remember I simply applied the wrist data to all the fingers, as I wanted a quick sanity check at the time.

I keep hoping the repository authors will ship the plugin they mentioned soon; the official solution will surely be better than my half-baked one.