
Pose-NDF: Modeling Human Pose Manifolds with Neural Distance Fields

This repository contains the official implementation of the ECCV 2022 paper Pose-NDF: Modeling Human Pose Manifolds with Neural Distance Fields (Project Page)


Installation:

Please follow INSTALL.md

Training and Dataset

1. Download AMASS: store it in a folder called "amass_raw". You can train the model for SMPL, SMPL+H, or SMPL+X.

https://amass.is.tue.mpg.de/

One also has to download the SMPL model from https://smpl.is.tue.mpg.de/. Currently I am using "version 1.1.0 for Python 2.7 (female/male/neutral, 300 shape PCs)".
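To sanity-check the download, you can try loading the model with the smplx package (a minimal sketch; the directory layout and the package choice are assumptions, not how this repository loads the model):

import smplx

# Assumed layout: ./body_models/smpl/SMPL_NEUTRAL.pkl (from the SMPL download)
model = smplx.create("./body_models", model_type="smpl", gender="neutral")
output = model()                    # forward pass with default pose and shape
print(output.vertices.shape)        # (1, 6890, 3) for SMPL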

2.1 Sample poses from AMASS:

This data preparation step is based on the VPoser data preparation. If you have already processed the data, you can skip this step.

python -m data.sample_poses

By default, the raw AMASS data is assumed to be in the ./amass_raw directory and the output is stored in ./amass_samples. One can change this behaviour by providing additional arguments.

If you would like to use only a subset of data from AMASS, you should change the predefined variable amass_splits in the data/data_splits.py script.
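For example, the splits dictionary might look like the following (a sketch assuming the VPoser-style layout; the dataset names are illustrative and must match the subfolders of your AMASS download):

# data/data_splits.py (illustrative values)
amass_splits = {
    "train": ["CMU", "BMLmovi", "EKUT"],
    "vald": ["HumanEva", "MPI_HDM05"],
    "test": ["SSM_synced", "Transitions_mocap"],
}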

TODO: Figure out why the following sentence was here in the original repo.

2.2 Create a script for generating training data:

In this step, a bash script train_data.sh for generating the training data is created in the project root directory.

python -m data.prepare_data

By default, the input data is assumed to be in the ./amass_samples directory (default value from the previous step) and the generated training data will be stored in ./training_data.

One can change the default behaviour by providing additional arguments.

TODO: Deal with the original instructions about using SLURM.

2.3 Create training data:

Run the bash script (adapting it to your shell if needed):

bash train_data.sh

TODO: Clarify these instructions.

3. Edit configs/<>.yaml for a different experimental setup:

experiment:
    root_dir: directory for training data/models and results
model:      # Network architecture
    ......
training:   # Training parameters
    ......
data:       # Training sample details
    .......

The root directory will contain the dataset, trained models, and results.
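Since the config is plain YAML, a quick way to check that your edits parse is a few lines of Python (a sketch using PyYAML; the trainer has its own loading logic):

import yaml

with open("configs/amass.yaml") as f:
    cfg = yaml.safe_load(f)

# Keys follow the skeleton above, e.g. the training root directory:
print(cfg["experiment"]["root_dir"])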

4. Training Pose-NDF:

python trainer.py --config=configs/amass.yaml

amass.yaml contains the configs used for the pretrained model.

5. Download the pre-trained model: Pretrained model

The latest model is in the version2/ folder. You can also find the corresponding config file in the same folder.

Inference

Pose-NDF is a continuous model for plausible human poses based on neural distance fields (NDFs). It can be used to project non-manifold points onto the learned manifold and hence acts as a prior for downstream tasks.
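To make the projection idea concrete, here is a minimal sketch of projecting poses onto the manifold by gradient descent on the predicted distance. The callable pose_ndf, the (batch, 21, 4) per-joint quaternion layout, and the step rule are assumptions for illustration, not the repository's exact implementation:

import torch

def project_to_manifold(pose_ndf, poses, steps=100, step_size=0.1):
    # poses: (batch, 21, 4) per-joint quaternions (assumed layout).
    # pose_ndf(poses) is assumed to return a (batch,) tensor of distances.
    poses = poses.clone().detach().requires_grad_(True)
    for _ in range(steps):
        dist = pose_ndf(poses)
        (grad,) = torch.autograd.grad(dist.sum(), poses)
        with torch.no_grad():
            # Step along the normalized negative gradient, scaled by the
            # current distance, then re-normalize each joint quaternion.
            grad = grad / (grad.norm(dim=-1, keepdim=True) + 1e-8)
            poses -= step_size * dist[:, None, None] * grad
            poses /= poses.norm(dim=-1, keepdim=True)
    return poses.detach()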

Pose generation

A pose is generated in two steps:

  1. Assign random values to joints
  2. Project the resulting pose onto the manifold
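In code, step 1 amounts to sampling random unit quaternions per joint and step 2 to running the projection, e.g. with the hypothetical project_to_manifold sketched above:

import torch

noisy = torch.randn(16, 21, 4)                      # 16 random raw poses
noisy = noisy / noisy.norm(dim=-1, keepdim=True)    # normalize to unit quaternions
plausible = project_to_manifold(pose_ndf, noisy)    # pose_ndf: trained model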

You can generate random plausible poses with:

python -m experiments.sample_poses --config={} --ckpt-path={} --num-poses={} --poses-fpath={} --max-projection-distance={} --max-projection-steps={} --render --save-projection-steps

where --config and --ckpt-path specify the model config and checkpoint, --num-poses the number of poses to generate, --poses-fpath a file with poses, and --max-projection-distance / --max-projection-steps the limits of the projection procedure; --render and --save-projection-steps optionally render the results and save the intermediate projection steps.

Pose interpolation

python -m experiments.interpolation --config={} --ckpt-path={} --poses_fpath={} --num-steps={} --step-size={} --max-projection-distance={} --max-projection-steps={} --save-interpolation-steps

where --config and --ckpt-path specify the model config and checkpoint, --poses_fpath a file with the poses to interpolate between, --num-steps and --step-size the interpolation schedule, and --max-projection-distance / --max-projection-steps the projection limits; --save-interpolation-steps saves the intermediate poses.
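As a rough illustration, interpolation can be implemented by blending two quaternion poses and projecting each intermediate pose back onto the manifold (a sketch reusing the hypothetical project_to_manifold from above; the actual script may differ):

import torch

def interpolate(pose_ndf, pose_a, pose_b, num_steps=10):
    # pose_a, pose_b: (21, 4) quaternion poses assumed to lie on the manifold.
    steps = []
    for t in torch.linspace(0.0, 1.0, num_steps):
        blend = (1 - t) * pose_a + t * pose_b          # linear blend
        blend = blend / blend.norm(dim=-1, keepdim=True)
        steps.append(project_to_manifold(pose_ndf, blend[None])[0])
    return torch.stack(steps)                          # (num_steps, 21, 4)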

Motion denoising

TODO: This section has not been revised, because I do not have the noisy data. This still has to be figured out.

python experiment/motion_denoise.py --config=configs/amass.yaml --motion_data=<motion data folder> --ckpt_path={} --outpath_folder={} --bm_dir_path={}

The motion data file is an .npz file which contains "body_pose", "betas", and "root_orient". It is generated using https://github.com/davrempe/humor/tree/main/humor/datasets. bm_dir_path is the path to the SMPL body model.
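For reference, such a motion file can be inspected like this (the file name is hypothetical; the exact array shapes depend on the HuMoR preprocessing):

import numpy as np

data = np.load("motion.npz")
body_pose = data["body_pose"]      # per-frame SMPL body pose parameters
betas = data["betas"]              # SMPL shape coefficients
root_orient = data["root_orient"]  # per-frame global root orientation
print(body_pose.shape, betas.shape, root_orient.shape)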

Image-based 3D pose estimation

TODO: Didn't cover this section, because it is not of the main interest for us currently.

 1. Run OpenPose (https://github.com/CMU-Perceptual-Computing-Lab/openpose) to generate 2D keypoints for the given image.
 2. python experiment/image_fitting.py --config=configs/amass.yaml  --image_dir=<image data dir>

Both the image and the corresponding keypoints should be in the same directory, with .jpg and .json being the image and the 2D keypoints file respectively.
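The OpenPose .json file stores a flat list of (x, y, confidence) triples per detected person; a minimal way to read it (the file name is hypothetical):

import json
import numpy as np

with open("image.json") as f:
    data = json.load(f)

person = data["people"][0]                       # first detected person
kps = np.array(person["pose_keypoints_2d"])      # flat [x, y, confidence, ...]
keypoints = kps.reshape(-1, 3)                   # (num_joints, 3)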

Visualization

You can also install viser to run interactive visualizations. I have prepared a script based on the SMPLX example from the viser project. It allows some basic visualization. Note that this is still work in progress.

Projection algorithm

If you have run the experiments/sample_poses.py script with an option to save projection steps

def project(self, noisy_poses, iterations=100, save_projection_steps=True)

you should have created a file with the consecutive poses generated by the projection algorithm. Currently these are saved in the projections_steps folder inside the current experiment directory. With this file ready, you can run:

python -m utils.trajectory_visualization --model-path={} --poses-path={} --config={} --checkpoint-path={}

where --model-path and --poses-path point to the body model and the saved poses file, and --config and --checkpoint-path to the Pose-NDF config and checkpoint used to compute the displayed distances.

Open the link presented by the CLI in a browser. You can play the animation in a loop by selecting Playing. You can also control the pose index with a slider or the next / previous pose buttons.

Note that there is a read-only field which shows the distance to the manifold for the current pose.

AMASS raw

Similarly to the paragraph above, you can visualize movement in the raw AMASS data. You just have to specify a different --poses-path.

Citation:

@inproceedings{tiwari22posendf,
    title = {Pose-NDF: Modeling Human Pose Manifolds with Neural Distance Fields},
    author = {Tiwari, Garvita and Antic, Dimitrije and Lenssen, Jan Eric and Sarafianos, Nikolaos and Tung, Tony and Pons-Moll, Gerard},
    booktitle = {European Conference on Computer Vision ({ECCV})},
    month = {October},
    year = {2022},
}

Troubleshooting