You can finetune your own GFPGAN-1024 model on your own dataset! inputs: 512 -> outputs: 1024
You can use my pretrained model as a starting point for training. It contains everything you need!
original | gfpgan | gfpgan-1024
git clone git@github.com:LeslieZhoa/LVT.git
cd LVT
pip install -r requirements.txt
cd process; python get_roi.py
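get_roi.py produces the landmark files the training config points at. Conceptually, turning a set of 2D facial landmarks into a square crop box looks like the hypothetical sketch below (the function name, margin, and logic are illustrative assumptions, not the repo's actual implementation):

```python
import numpy as np

def landmarks_to_roi(landmarks: np.ndarray, margin: float = 0.2) -> tuple:
    """Return a square (x0, y0, x1, y1) box around 2D landmarks with a margin.
    Hypothetical helper for illustration; process/get_roi.py may differ."""
    x0, y0 = landmarks.min(axis=0)
    x1, y1 = landmarks.max(axis=0)
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2          # box center
    half = max(x1 - x0, y1 - y0) * (1 + margin) / 2  # half side, padded
    return (int(cx - half), int(cy - half), int(cx + half), int(cy + half))

pts = np.array([[100, 120], [180, 115], [140, 200]], dtype=np.float32)
print(landmarks_to_roi(pts))  # (89, 106, 191, 208) — a 102x102 square
```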
Refer to GFPGAN to download the pretrained models.
Put these models into pretrained_models.
Change the dataset paths in model/config.py:
self.img_root -> FFHQ data root
self.train_hq_root -> your own 1024 HQ data root
self.train_lq_root -> your own LQ data root
self.train_lmk_base -> train landmarks generated by get_roi.py
self.val_lmk_base -> val landmarks generated by get_roi.py
self.val_lq_root -> val LQ data root
self.val_hq_root -> val HQ data root
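The path settings above can be sketched as the following config fragment. The attribute names come from this README; the example paths are placeholders you must replace with your own:

```python
class Config:
    """Illustrative fragment of model/config.py; paths are placeholders."""
    def __init__(self):
        self.img_root = '/data/ffhq'             # FFHQ data root
        self.train_hq_root = '/data/train_1024'  # your own 1024 HQ data root
        self.train_lq_root = '/data/train_lq'    # your own LQ data root
        self.train_lmk_base = '/data/train_lmk'  # train landmarks from get_roi.py
        self.val_lmk_base = '/data/val_lmk'      # val landmarks from get_roi.py
        self.val_lq_root = '/data/val_lq'        # val LQ data root
        self.val_hq_root = '/data/val_hq'        # val HQ data root
```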
set self.mode = 'decoder' in model/config.py
Train until you think the results are good enough.
set self.mode = 'encoder' and self.pretrain_path from stage 1 in model/config.py
Train until you think the results are good enough.
set self.mode = 'encoder' and self.pretrain_path from stage 2 in model/config.py
Use early stopping.
stage 1 && stage 2 -> python train.py --batch_size 2 --scratch --dist
stage 3 -> python train.py --batch_size 2 --early_stop --dist
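The `--early_stop` flag in stage 3 presumably halts training once a validation metric stops improving. A minimal early-stopping helper (an illustration of the idea, not the repo's implementation) could look like:

```python
class EarlyStopper:
    """Signal a stop when validation loss hasn't improved for `patience` checks.
    Illustrative sketch; train.py's actual early-stop logic may differ."""
    def __init__(self, patience: int = 5, min_delta: float = 0.0):
        self.patience = patience
        self.min_delta = min_delta  # required improvement to reset the counter
        self.best = float('inf')
        self.bad_checks = 0

    def step(self, val_loss: float) -> bool:
        """Record one validation result; return True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_checks = 0
        else:
            self.bad_checks += 1
        return self.bad_checks >= self.patience
```

Call `step()` after each validation pass and break out of the training loop when it returns True.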
Supports multi-node, multi-GPU training.
Supports multi-batch training.
python utils/convert_pt.py
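A conversion step like utils/convert_pt.py typically re-saves the training checkpoint in a form suitable for inference, e.g. stripping the DDP `module.` prefix from weight names. The sketch below is a hypothetical illustration of that pattern (the `'generator'` key is an assumption; the actual script may differ):

```python
def convert_state_dict(ckpt: dict) -> dict:
    """Strip the DistributedDataParallel 'module.' prefix from weight names.
    Hypothetical sketch; utils/convert_pt.py's actual behavior may differ."""
    state = ckpt.get('generator', ckpt)  # assumed checkpoint key
    return {k.removeprefix('module.'): v for k, v in state.items()}

# Toy checkpoint standing in for a real torch.load(...) result:
ckpt = {'generator': {'module.conv1.weight': 1, 'module.conv1.bias': 2}}
print(convert_state_dict(ckpt))  # {'conv1.weight': 1, 'conv1.bias': 2}
```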