Zhendong-Wang / Diffusion-GAN

Official PyTorch implementation for the paper: Diffusion-GAN: Training GANs with Diffusion
MIT License

how to lower fid? #26

Open octadion opened 1 year ago

octadion commented 1 year ago

I'm currently trying to generate images of textile motifs with Diffusion-StyleGAN2, using transfer learning from an LSUN Bedroom checkpoint, the paper256 config, and the same diffusion configuration as in the example. The FID I get is around 111 at convergence. Are there any tips for lowering the FID? Right now I'm searching for a suitable learning rate; besides the learning rate, are there other configurations that play an important role? Thank you.
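For context, here is roughly what I believe the preset I'm using looks like, assuming Diffusion-StyleGAN2's train.py keeps the cfg_specs table from stylegan2-ada-pytorch (the field names and exact values might differ in this fork); the learning rate and R1 gamma seem to be the main knobs it fixes:

```python
# Hypothetical sketch of the paper256 preset, assuming the cfg_specs layout of
# stylegan2-ada-pytorch; field names/values may differ in Diffusion-StyleGAN2.
cfg_specs = {
    'paper256': dict(
        ref_gpus=8, kimg=25000, mb=64, mbstd=8, fmaps=0.5,
        lrate=0.0025,  # learning rate fixed by the preset; lowering it (e.g. 0.001-0.002)
                       # is a common tweak when fine-tuning from a pretrained checkpoint
        gamma=1,       # R1 regularization weight; usually worth sweeping together with the lr
        ema=20,        # half-life of the generator weight EMA, in thousands of images
        ramp=None, map=8,
    ),
}
```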

wangyp33 commented 1 year ago

@octadion @Zhendong-Wang

I have the same question. I trained with the command:

python train.py --outdir=training-runs --data="cifar10.zip" --gpus=1 --batch 16 --batch-gpu=16 --cfg fastgan --kimg 50000 --target 0.45 --d_pos first --noise_sd 0.5

This produces a network named "best_model.pkl" in the outdir "training-runs". I then sample and evaluate with that network, using the following metric command:

python calc_metrics.py --metrics=fid50k_full --data=cifar10.zip --mirror=1 --network=best_model.pkl

But the FID is too high. Can you tell me why, if you know the reason?

{"results": {"fid50k_full": 13.780802051702606}, "metric": "fid50k_full", "total_time": 205.13619828224182, "total_time_str": "3m 25s", "num_gpus": 1, "snapshot_pkl": "best_model.pkl", "timestamp": 1685179674.1273081}

Zhendong-Wang commented 1 year ago

Hi there,

I reran my code and it reproduces the good results. I guess you are using the Diffusion-ProjectedGAN code. Here is what I did:

conda env create -f environment.yaml
conda activate pg
python train.py --outdir=training-runs --data="~/cifar10.zip" --gpus=4 --batch 64 --batch-gpu=16 --cfg fastgan --kimg 50000 --target 0.45 --d_pos first --noise_sd 0.5

I've attached the training output I got below. It converges fast.

[Screenshot: training output, 2023-05-27]
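If you want to spot-check samples from the converged model, a minimal sketch along these lines should work; it assumes the usual StyleGAN-style pickle layout with a 'G_ema' entry exposing z_dim and c_dim, and the path is just a placeholder:

```python
import pickle
import torch
import torchvision

# Run this from the repo root so the modules referenced inside the pickle
# (dnnlib, torch_utils, ...) are importable.
with open('training-runs/best_model.pkl', 'rb') as f:  # hypothetical path
    G = pickle.load(f)['G_ema'].cuda().eval()

z = torch.randn(16, G.z_dim, device='cuda')   # latent codes
c = torch.zeros(16, G.c_dim, device='cuda')   # unconditional CIFAR-10 run: empty labels
with torch.no_grad():
    img = G(z, c)                             # NCHW output, roughly in [-1, 1]

grid = (img.clamp(-1, 1) + 1) / 2             # rescale to [0, 1] for saving
torchvision.utils.save_image(grid, 'samples.png', nrow=4)
```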