-
Thanks for the interesting work.
I want to reproduce the NVAE model on the CelebA-HQ 256 dataset. Could you provide the main NVAE training scripts with the detailed parameters for CelebA-HQ?
…
-
As pointed out in other issues, one cannot judge convergence based on losses. However, I am afraid of overfitting and it is quite difficult to judge from looking at the generated images if the model i…
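A repo-agnostic way to watch for overfitting in a setting like this is to hold out a validation split and track its loss alongside the training loss: when validation loss plateaus or rises while training loss keeps falling, the model is likely overfitting. A minimal sketch of that check — the `EarlyStopper` helper and its thresholds are illustrative assumptions, not part of any repository discussed here:

```python
class EarlyStopper:
    """Stop training when validation loss has not improved for `patience` epochs."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # epochs to wait after the last improvement
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best = float("inf")      # best validation loss seen so far
        self.bad_epochs = 0           # epochs since the last improvement

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience


# Usage: feed per-epoch validation losses; stop on a sustained plateau.
stopper = EarlyStopper(patience=3)
val_losses = [1.0, 0.8, 0.7, 0.71, 0.72, 0.73]  # plateaus after epoch 2
for epoch, loss in enumerate(val_losses):
    if stopper.step(loss):
        print(f"stopping at epoch {epoch}")  # prints "stopping at epoch 5"
        break
```

This complements, rather than replaces, visual inspection of samples: a widening train/validation gap is a quantitative signal even when generated images still look plausible.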
-
Regarding CelebA-HQ - the link you shared has instructions for creating tfrecords for CelebA-HQ, but not the .npy files that your code requires. Can you please provide some guidance on th…
-
Thanks for sharing your code!
I have a question about the parameter 'overlap'. I downloaded your pretrained model and tested it on the CelebA-HQ dataset with a rectangular mask. When I set '--overlap 4', the r…
wj320 updated
3 years ago
-
The pre-trained model you provided does not perform well on the CelebA-HQ dataset, so I have a question: for how many epochs was the pre-trained model trained, and on what dataset?
A…
-
Thank you very much for the impressive work.
It is not clear to me:
1) How to fine-tune the existing models on another dataset?
2) How to train a deep privacy model from scratch?
I…
-
Thank you for your great work! I'm about to try to reproduce the FFHQ encoder. The problem may not be with your code but with the dataset preparation. The Nvidia FFHQ dataset is available for resolutio…
-
If I have a new face photo and an already-trained model (.pth files), how can I get the predicted_labels (.png mask files)?
Can I use your program to generate them automatically?
Thanks very much!
-
I get stuck at the same error whether I try the Docker approach or follow the instructions myself.
```
PackagesNotFoundError: The following packages are not available from current channels:
-…
-
Hello, thank you for this great work.
I am trying to train the retinanet512 model on my own dataset, but there are no usage examples for train.py in version 2.
How do I train on my own dataset in ve…