XingangPan / GAN2Shape

Code for GAN2Shape (ICLR2021 oral)
https://arxiv.org/abs/2011.00844
MIT License

How do I get latent files? #6

Open · h1-ti opened this issue 3 years ago

h1-ti commented 3 years ago

Hi, thanks for sharing!! I confirmed that this code works well on Windows (you just need to replace the path separator). How do I prepare the latent files (.pt) if I want to train on my own images?

XingangPan commented 3 years ago

@h1-ti Thanks for your interest in our work. To train on your own images, you need to perform the so-called "GAN inversion" process to obtain the latent codes. There are several tools for this; some good ones are https://github.com/rosinality/stylegan2-pytorch (projector.py) and https://github.com/genforce/idinvert_pytorch.
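
Roughly, the inversion optimizes a latent code until the generator reproduces a given image. Below is a minimal sketch of that idea, not the actual projector.py: the `generator` object and its call signature are assumptions, and real tools also add a perceptual (LPIPS) loss and start from the mean latent rather than a random one.

```python
# Minimal sketch of optimization-based GAN inversion (illustrative only).
# Assumption: `generator` maps a (1, 512) W-space latent to an RGB image tensor
# with the same shape as `target`; adapt this to your StyleGAN2 implementation.
import torch
import torch.nn.functional as F

def invert_image(generator, target, steps=1000, lr=0.01, device="cuda"):
    """Optimize a single (1, 512) latent so the generator reproduces `target`."""
    generator = generator.to(device).eval()
    target = target.to(device)

    latent = torch.randn(1, 512, device=device, requires_grad=True)
    optimizer = torch.optim.Adam([latent], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        fake = generator(latent)           # assumed call signature
        loss = F.mse_loss(fake, target)    # pixel loss only, for brevity
        loss.backward()
        optimizer.step()

    return latent.detach().cpu()

# Each inverted image then gets its own latent file:
# torch.save(invert_image(generator, target_image), "my_image.pt")
```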

h1-ti commented 3 years ago

Thank you for your reply! I've checked https://github.com/genforce/idinvert_pytorch and obtained latent codes, but their dimensions (14, 512) differ from yours (1, 512). I eventually found this issue: https://github.com/genforce/idinvert_pytorch/issues/5, but I still don't know how to reduce the dimensions. Could you give me a clue? Thank you.

XingangPan commented 3 years ago

@h1-ti I haven't tried idinvert_pytorch before, but https://github.com/rosinality/stylegan2-pytorch (projector.py) works for me.
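
If it helps, here is a rough sketch of the final post-processing step once an inversion tool has given you one W-space code per image. The input file name and its name-to-latent layout are assumptions (check what your tool actually saves); only the (1, 512) shape per .pt file matches the latents discussed in this thread.

```python
# Hypothetical post-processing: write one (1, 512) latent .pt file per image.
# "projection_output.pt" and its {image_name: latent} layout are assumptions;
# adapt them to whatever your inversion tool actually produces.
import os
import torch

results = torch.load("projection_output.pt")  # assumed: {image_name: latent tensor}
os.makedirs("latents", exist_ok=True)

for name, latent in results.items():
    latent = latent.reshape(1, 512)  # the repo's latents are (1, 512) per image
    torch.save(latent, os.path.join("latents", f"{name}.pt"))
```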

FrancisYu2020 commented 3 years ago

I can only see the pretrained model for the FFHQ dataset in https://github.com/rosinality/stylegan2-pytorch. Does that mean that if we want to try other images, say dogs, we first need to train a StyleGAN2 model on our own dog images and use that checkpoint to do the GAN inversion? And then finally use the latents from the StyleGAN2 inversion together with the images to train your code from scratch for the 3D reconstruction?

XingangPan commented 3 years ago

@FrancisYu2020 Yes, you are right.

FrancisYu2020 commented 3 years ago

Thank you for the awesome pipeline and example scripts for this interesting idea. They are more than helpful!

XingangPan commented 3 years ago

@FrancisYu2020 Thanks for your interest. Hope you are having fun with it : )