

Dynamic Hair Modeling

Source code of the networks for our paper "Dynamic hair modeling from monocular videos using deep neural networks" (SIGGRAPH Asia 2019).

We propose a novel framework for dynamic hair modeling from monocular videos. It uses two networks, HairSpatNet and HairTempNet, to separately predict hair geometry and hair motion; the complete pipeline is described in the paper, and a rough sketch of how the two networks fit together is given below.
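The following is a minimal, hypothetical sketch of how the two networks could be composed over a frame sequence: HairSpatNet maps each frame to per-frame hair geometry, and HairTempNet maps each adjacent frame pair to hair motion. Only the network names come from the paper; the tensor shapes, layer choices, and output channel layout below are illustrative assumptions, not the actual implementation in this repository.

```python
# Hypothetical composition sketch (PyTorch). Shapes, layers, and channel
# layouts are assumptions for illustration; see the repository code and the
# paper for the real architectures.
import torch
import torch.nn as nn


class HairSpatNet(nn.Module):
    """Per-frame geometry: image -> occupancy + 3D orientation volume (assumed layout)."""

    def __init__(self, res=32):
        super().__init__()
        self.res = res
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Decode a latent code into 1 occupancy channel + 3 orientation channels per voxel.
        self.decoder = nn.Linear(32, 4 * res ** 3)

    def forward(self, image):                       # image: (B, 3, H, W)
        z = self.encoder(image).flatten(1)          # (B, 32)
        vol = self.decoder(z).view(-1, 4, self.res, self.res, self.res)
        occupancy, orientation = vol[:, :1], vol[:, 1:]
        return occupancy, orientation


class HairTempNet(nn.Module):
    """Per-frame-pair motion: two images -> 3D motion volume (assumed layout)."""

    def __init__(self, res=32):
        super().__init__()
        self.res = res
        self.encoder = nn.Sequential(
            nn.Conv2d(6, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.decoder = nn.Linear(16, 3 * res ** 3)  # one 3D motion vector per voxel

    def forward(self, image_t, image_t1):
        pair = torch.cat([image_t, image_t1], dim=1)    # (B, 6, H, W)
        z = self.encoder(pair).flatten(1)               # (B, 16)
        return self.decoder(z).view(-1, 3, self.res, self.res, self.res)


# Toy usage: geometry per frame, motion per adjacent frame pair.
frames = [torch.randn(1, 3, 128, 128) for _ in range(3)]
spat_net, temp_net = HairSpatNet(), HairTempNet()
geometry = [spat_net(f) for f in frames]
motion = [temp_net(a, b) for a, b in zip(frames[:-1], frames[1:])]
```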

Improvements

Prerequisites

Getting Started

Related Works

Citation

If you find this useful for your research, please cite the following papers.

@inproceedings{wu2022neuralhdhair,
  title={NeuralHDHair: Automatic High-fidelity Hair Modeling from a Single Image Using Implicit Neural Representations},
  author={Wu, Keyu and Ye, Yifan and Yang, Lingchen and Fu, Hongbo and Zhou, Kun and Zheng, Youyi},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={1526--1535},
  year={2022}
}

@article{yang2019dynamic,
  title={Dynamic hair modeling from monocular videos using deep neural networks},
  author={Yang, Lingchen and Shi, Zefeng and Zheng, Youyi and Zhou, Kun},
  journal={ACM Transactions on Graphics (TOG)},
  volume={38},
  number={6},
  pages={1--12},
  year={2019}
}