naver-ai / StyleMapGAN

Official PyTorch implementation of StyleMapGAN (CVPR 2021)
https://www.youtube.com/watch?v=qCapNyRA_Ng

Can you add an example for Colab? #1

Open · molo32 opened this issue 3 years ago

molo32 commented 3 years ago

Please add an example for Colab.

blandocs commented 3 years ago

Unfortunately, I don't have enough time for now. I'd really appreciate it if other people post it.

cyrilzakka commented 3 years ago

I have a working Colab in case you need assistance.

molo32 commented 3 years ago

@cyrilzakka, do you have a working Colab? Could you share it?

cyrilzakka commented 3 years ago
!git clone https://github.com/naver-ai/StyleMapGAN.git

# Note: "!cd" does not persist across Colab cells, so use the "%cd" magic below instead.
# Also, the PyPI package is "torch", not "pytorch".
!pip install torch==1.4.0 torchvision==0.5.0
!pip install numpy==1.18.1 scikit-image==0.16.2 tqdm
!pip install lmdb==0.98 opencv-python==4.2.0.34 munch==2.5.0
!pip install -U scikit-image==0.15.0 scipy==1.2.1 matplotlib scikit-learn
!pip install flask==1.0.2 pillow==7.0.0
!pip install Ninja

%cd StyleMapGAN/
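
As a quick sanity check that the pinned PyTorch build installed and the GPU is visible (assuming a GPU runtime is selected in Colab):

import torch  # confirm the pinned build imports cleanly

# Print the installed version and whether CUDA can see the Colab GPU
print(torch.__version__, torch.cuda.is_available())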

To download the CelebA-HQ dataset (as LMDB) and the pretrained checkpoints:

# Download CelebA-HQ and create the LMDB datasets used for training and evaluation
!bash download.sh create-lmdb-dataset celeba_hq

# Download the pretrained network (256x256) 
!bash download.sh download-pretrained-network-256 celeba_hq # 20M-image-trained models
!bash download.sh download-pretrained-network-256 celeba_hq_5M # 5M-image-trained models used in our paper for comparison with other baselines and for ablation studies.

# Download the pretrained network (1024x1024 image / 16x16 stylemap / Light version of Generator)
!bash download.sh download-pretrained-network-1024 ffhq_16x16
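
If the downloads finish without errors, the checkpoint files should show up under the experiment folder; a quick way to confirm (the expr/checkpoints path is my assumption about where download.sh places them, so adjust if your tree differs):

# List the downloaded checkpoints (path is an assumption; adjust if needed)
!ls -lh expr/checkpoints/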

And training is as easy as:

!python train.py --dataset celeba_hq --train_lmdb data/celeba_hq/LMDB_train --val_lmdb data/celeba_hq/LMDB_val
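
Before launching a long run, it doesn't hurt to confirm that the LMDB folders referenced by the command above actually exist:

import os

# Check the LMDB folders passed to train.py above
for path in ["data/celeba_hq/LMDB_train", "data/celeba_hq/LMDB_val"]:
    print(path, "exists:", os.path.isdir(path))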

Don't forget to preprocess your data if you're using a custom dataset (rough sketch below).
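
For a custom dataset, the preprocessing step looks roughly like the following; the script path, flags, and folder layout are my assumptions (modeled on the LMDB-creation step that download.sh runs), so check the repository's preprocessor code for the exact interface:

# Sketch only: convert raw images into the LMDB format train.py expects.
# Script path, flags, and folder layout are assumptions; verify against the repo.
!python preprocessor/prepare_data.py --out data/custom_dataset/LMDB_train --size 256 data/custom_dataset/raw_images/train
!python preprocessor/prepare_data.py --out data/custom_dataset/LMDB_val --size 256 data/custom_dataset/raw_images/val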

ArijZouaoui commented 2 years ago

Hello @cyrilzakka, could you please send me a link to your Colab notebook so I can adapt it to my dataset? I have been trying to prepare my data and feed it to generate.py in order to test the model, but I'm running into some problems. Thank you in advance!
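
For reference, testing a pretrained model generally comes down to pointing generate.py at a checkpoint and a test LMDB, roughly as below; the checkpoint filename, the test LMDB path, and the --mixing_type value are my assumptions, so check generate.py and the repo README for the exact arguments:

# Sketch only: reconstruct test images with a downloaded checkpoint.
# The checkpoint filename and LMDB path are assumptions; use whatever download.sh saved locally.
!python generate.py --mixing_type reconstruction \
    --ckpt expr/checkpoints/celeba_hq_256_8x8_20M_revised.pt \
    --test_lmdb data/celeba_hq/LMDB_test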