-
Will the repo add code to train a BigGAN from scratch? Thank you.
-
Thank you for your great work.
I was wondering if you used only fake samples when testing with the CNNSpot dataset. Here are the A.P. results of my tests, and almost all of the real samples were det…
-
-
Would it be possible to finetune this BigGAN implementation to a custom dataset, in order to generate new classes of images?
-
Version 2 of the BigGAN paper introduced a new architecture which made each upscaling block deeper but thinner. This BigGAN-deep was reportedly strictly superior: fewer parameters, less memory-use, an…
gwern updated 3 years ago
-
### Link to the paper
[[arXiv:2009.13829] TinyGAN: Distilling BigGAN for Conditional Image Generation](https://arxiv.org/abs/2009.13829)
### Authors and affiliations
Ting-Yun Chang, Chi-Jen Lu
- Institute of Informa…
-
Why does the program run fine during training but get stuck when saving weights?
Saving weights to /data1/code/BigGAN-PyTorch/weights/BigGAN_I512_seed0_Gch96_Dch96_bs8_nDa8_nGa8_Glr1.0e-04_Dlr4.0e-04_Gnlinplac…
-
Looking at the official BigGAN implementation in Tensorflow, I found they use ConvTranspose2d for Upsample and Conv2d for Downsample in the ResNet block (e.g. [https://github.com/taki0112/BigGAN-Tenso…
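Both upsampling styles double the spatial resolution, which is why implementations can swap one for the other. A minimal sketch (not from either repo) checking this with the standard PyTorch output-size formulas, under assumed hyperparameters: `ConvTranspose2d` with kernel 4, stride 2, padding 1, versus nearest-neighbor ×2 upsampling followed by a 3×3 convolution with padding 1:

```python
# Compare the output spatial size of two common upsampling styles,
# using the standard convolution size formulas.

def conv_transpose_out(size, kernel=4, stride=2, pad=1):
    # ConvTranspose2d: (in - 1) * stride - 2 * pad + kernel
    return (size - 1) * stride - 2 * pad + kernel

def upsample_then_conv_out(size, scale=2, kernel=3, stride=1, pad=1):
    up = size * scale  # nearest-neighbor upsample
    # Conv2d: (in + 2 * pad - kernel) // stride + 1
    return (up + 2 * pad - kernel) // stride + 1

for s in (4, 8, 16, 32, 64):
    assert conv_transpose_out(s) == upsample_then_conv_out(s) == 2 * s

print(conv_transpose_out(64))  # 128
```

The two differ in learned behavior (transposed convolutions can produce checkerboard artifacts; upsample-then-conv is often preferred for that reason), but not in output shape.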
-
model = BigGAN.from_pretrained('biggan-deep-256')
Can we just replace 'biggan-deep-256' with the path to a downloaded pre-trained model?
-
@es-clip Hi,
I released a BigGAN + CLIP + CMA-ES notebook in parallel with bigsleep, which to my best knowledge was the first solution combining CLIP and CMA-ES, and at the time allowed me to get res…