This is a repository for the paper "ContourCraft: Learning to Resolve Intersections in Neural Multi-Garment Simulations" (SIGGRAPH 2024).

It is based on and fully includes the code for the paper "HOOD: Hierarchical Graphs for Generalized Modelling of Clothing Dynamics".

NOTE: This repo precisely follows the structure and includes all the functionality of the HOOD repo. The main difference is the added code and model for ContourCraft. You can run inference of the ContourCraft model using Inference.ipynb or Inference_from_any_pose.ipynb (more details below), same as in HOOD.

TODO: training code for ContourCraft and more functionality specific to multi-layer outfits will be added soon.
The installation follows the procedure for HOOD, but you will also need to install several extra libraries (see the end of this section).

We provide a conda environment file, hood.yml, to install all the dependencies.
You can create and activate the environment with the following commands:

```bash
conda env create -f hood.yml
conda activate hood
```
If you want to build the environment from scratch, install the following dependencies:

- A custom CUDA library for collision detection and response: install it using the instructions in its README.
- NVIDIA Warp: `pip install warp-lang`
- The remaining dependencies: install them following their instructions.
Download the auxiliary data for HOOD using this link.
Unpack it anywhere you want and set the HOOD_DATA environment variable to the path of the unpacked folder. Also, set the HOOD_PROJECT environment variable to the path you cloned this repository to:

```bash
export HOOD_DATA=/path/to/hood_data
export HOOD_PROJECT=/path/to/this/repository
```
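The code downstream relies on these two variables being set. As a quick pre-flight check before launching the notebooks, you could resolve them in Python like this (a sketch; `resolve_hood_paths` is a hypothetical helper, not part of the repo):

```python
from pathlib import Path

def resolve_hood_paths(env):
    """Return (hood_data, hood_project) paths from an environment mapping.

    `env` is typically os.environ; a plain dict works too.
    """
    try:
        return Path(env["HOOD_DATA"]), Path(env["HOOD_PROJECT"])
    except KeyError as missing:
        raise RuntimeError(
            f"Set the {missing.args[0]} environment variable first"
        ) from missing

# Example with a plain dict standing in for os.environ:
data, project = resolve_hood_paths(
    {"HOOD_DATA": "/path/to/hood_data", "HOOD_PROJECT": "/path/to/this/repository"}
)
print(data, project)
```

In the notebooks themselves you would pass `os.environ` instead of a literal dict.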
Download the SMPL models using this link. Unpack them into the $HOOD_DATA/aux_data/body_models/smpl folder.

If you want to use SMPL-X models, download them and unpack into $HOOD_DATA/aux_data/body_models/smplx.

In the end, your $HOOD_DATA folder should look like this:
```
$HOOD_DATA
|-- aux_data
|   |-- datasplits          // directory with csv data splits used for training the model
|   |-- body_models
|   |   |-- smpl            // directory with SMPL models
|   |   |   |-- SMPL_NEUTRAL.pkl
|   |   |   |-- SMPL_FEMALE.pkl
|   |   |   |-- SMPL_MALE.pkl
|   |   |-- smplx           // directory with SMPL-X models
|   |       |-- SMPLX_NEUTRAL.pkl
|   |       |-- SMPLX_FEMALE.pkl
|   |       |-- SMPLX_MALE.pkl
|   |-- garment_meshes      // folder with .obj meshes for the garments used in HOOD
|   |-- garments_dict.pkl   // dictionary with garment meshes and their auxiliary data used for training and inference
|   |-- smpl_aux.pkl        // dictionary with indices of SMPL vertices that correspond to the hands, used to disable hands during inference to avoid body self-intersections
|-- trained_models          // directory with trained HOOD models
    |-- cvpr_submission.pth // model used in the CVPR paper
    |-- postcvpr.pth        // model trained with refactored code with several bug fixes after the CVPR submission
    |-- fine15.pth          // baseline model denoted as "Fine15" in the paper (15 message-passing steps, no long-range edges)
    |-- fine48.pth          // baseline model denoted as "Fine48" in the paper (48 message-passing steps, no long-range edges)
```
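After unpacking, you can sanity-check the layout by verifying that a few key files from the tree above exist (a sketch; `missing_files` is a hypothetical helper, and `EXPECTED` covers only a subset of the files shown):

```python
from pathlib import Path

# A few of the files the tree above expects under $HOOD_DATA
EXPECTED = [
    "aux_data/garments_dict.pkl",
    "aux_data/smpl_aux.pkl",
    "aux_data/body_models/smpl/SMPL_NEUTRAL.pkl",
    "trained_models/postcvpr.pth",
]

def missing_files(hood_data_root):
    """Return the expected files that are absent under hood_data_root."""
    root = Path(hood_data_root)
    return [rel for rel in EXPECTED if not (root / rel).exists()]

# Usage: missing_files(os.environ["HOOD_DATA"]) should return [] after setup.
```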
The Jupyter notebook Inference.ipynb contains an example of how to run inference of a trained HOOD model given a garment and a pose sequence. It also has code for adding a new garment from an .obj file.

To run inference starting from an arbitrary garment pose and an arbitrary body mesh sequence, refer to the InferenceFromMeshSequence.ipynb notebook.

See RepoIntro.md for more details on the repository structure.
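The notebooks read garment geometry from $HOOD_DATA/aux_data/garments_dict.pkl. A quick way to see which garments are available is to load the pickle and list its top-level keys. The sketch below demonstrates this on a toy dictionary written to a temporary file, since the exact contents of each entry are not assumed here:

```python
import pickle
import tempfile
from pathlib import Path

def list_garments(garments_dict_path):
    """Return the garment names (top-level keys) stored in a garments_dict pickle."""
    with open(garments_dict_path, "rb") as f:
        garments = pickle.load(f)
    return sorted(garments.keys())

# Demo with a toy dictionary standing in for the real garments_dict.pkl
toy_path = Path(tempfile.mkdtemp()) / "toy_garments_dict.pkl"
with open(toy_path, "wb") as f:
    pickle.dump({"tshirt": {}, "pants": {}}, f)
print(list_garments(toy_path))  # ['pants', 'tshirt']
```

Pointing `list_garments` at the real file under $HOOD_DATA would print the garment names usable in the notebooks.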
If you use this repository in your paper, please cite:
@inproceedings{grigorev2022hood,
  author    = {Grigorev, Artur and Thomaszewski, Bernhard and Black, Michael J. and Hilliges, Otmar},
  title     = {{HOOD}: Hierarchical Graphs for Generalized Modelling of Clothing Dynamics},
  booktitle = {Computer Vision and Pattern Recognition (CVPR)},
  year      = {2023},
}
@inproceedings{grigorev2024contourcraft,
  title     = {{ContourCraft}: Learning to Resolve Intersections in Neural Multi-Garment Simulations},
  author    = {Grigorev, Artur and Becherini, Giorgio and Black, Michael and Hilliges, Otmar and Thomaszewski, Bernhard},
  booktitle = {ACM SIGGRAPH 2024 Conference Papers},
  pages     = {1--10},
  year      = {2024}
}