# MonoGaussianAvatar: Monocular Gaussian Point-based Head Avatar

[Yufan Chen](https://yufan1012.github.io/)†,1, [Lizhen Wang](https://lizhenwangt.github.io/)2, [Qijing Li](https://www.liuyebin.com/student.html)2, [Hongjiang Xiao](https://www.semanticscholar.org/author/Hongjiang-Xiao/2747760)3, [Shengping Zhang](http://homepage.hit.edu.cn/zhangshengping)*,1, [Hongxun Yao](http://homepage.hit.edu.cn/yaohongxun)1, [Yebin Liu](http://www.liuyebin.com)2

1Harbin Institute of Technology   2Tsinghua University   3Communication University of China
*Corresponding author   †Work done during an internship at Tsinghua University

### [Paper](https://dl.acm.org/doi/abs/10.1145/3641519.3657499) | [YouTube Video](https://www.youtube.com/embed/3UvBkyPc-oc) | [Project Page](https://yufan1012.github.io/MonoGaussianAvatar)

## Getting Started

If you'd like to generate your own dataset, please follow the instructions in the IMavatar repo.

Link the dataset folder to `./data/datasets` and the experiment output folder to `./data/experiments`.
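
For example, on Linux the links can be created with symlinks; the source paths below are placeholders and should point at your actual dataset and output locations:

```bash
# Illustrative only: replace the source paths with your own locations.
mkdir -p ./data
ln -s /path/to/your/datasets ./data/datasets
ln -s /path/to/your/experiments ./data/experiments
```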

## Pre-trained model

Download a pretrained model from . Uncompress it and put it into the experiment folder `./data/experiments`.
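
Assuming the download is a zip archive (the filename below is hypothetical), it can be extracted in place:

```bash
# Hypothetical archive name; substitute the file you actually downloaded.
unzip pretrained_model.zip -d ./data/experiments
```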

## Training

```bash
python scripts/exp_runner.py --conf ./confs/subject1.conf [--is_continue]
```
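
For example, to resume a previous run (assuming `--is_continue` picks up the latest saved checkpoint in the experiment folder):

```bash
# Resume training of subject1 instead of starting from scratch.
python scripts/exp_runner.py --conf ./confs/subject1.conf --is_continue
```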

## Evaluation

Set the `--is_eval` flag for evaluation. Optionally set `--checkpoint` (if not set, the latest checkpoint will be used) and `--load_path`:

```bash
python scripts/exp_runner.py --conf ./confs/subject1.conf --is_eval [--checkpoint 60] [--load_path ...]
```
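
For instance, to evaluate a specific checkpoint (the checkpoint number is taken from the example above; `--load_path` is presumably only needed when loading from a non-default location):

```bash
# Evaluate checkpoint 60 of subject1; omit --checkpoint to use the latest one.
python scripts/exp_runner.py --conf ./confs/subject1.conf --is_eval --checkpoint 60
```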

## GPU requirement

We train our models on a single NVIDIA RTX 3090 GPU (24 GB).

## Citation

If you find our code or paper useful, please cite as:

```bibtex
@inproceedings{chen2024monogaussianavatar,
  title={Monogaussianavatar: Monocular gaussian point-based head avatar},
  author={Chen, Yufan and Wang, Lizhen and Li, Qijing and Xiao, Hongjiang and Zhang, Shengping and Yao, Hongxun and Liu, Yebin},
  booktitle={ACM SIGGRAPH 2024 Conference Papers},
  pages={1--9},
  year={2024}
}
```