We are training a generator and an encoder from scratch on our own dataset.
We verified that the generator we trained can generate images very well. However, we ran into a problem when training the encoder: it is not able to reconstruct the input images at all.
We tried training both the generator and the encoder on our own dataset as well as on FFHQ 256x256 images. In both cases, the encoder does not train at all.
These are the results of running train_encoder.py:
We are using the best .pkl file (selected by generating samples and inspecting them), but the encoder still does not train correctly even after a sufficient number of iterations.
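For reference, this is roughly how we select the snapshot. It is only a minimal sketch assuming the standard StyleGAN-style pickle format (G, D, Gs) that this codebase inherits; the snapshot path and sample count are placeholders, not our actual settings.

```python
# Sketch of our snapshot sanity check: load a .pkl and generate a few samples.
# Assumes the StyleGAN-style (G, D, Gs) pickle; path and batch size are placeholders.
import pickle
import numpy as np
import dnnlib.tflib as tflib

tflib.init_tf()
with open('network-snapshot-best.pkl', 'rb') as f:  # hypothetical snapshot path
    _G, _D, Gs = pickle.load(f)

latents = np.random.randn(4, *Gs.input_shape[1:])  # a few random latent codes
images = Gs.run(latents, None, truncation_psi=0.7, randomize_noise=True,
                output_transform=dict(func=tflib.convert_images_to_uint8,
                                      nchw_to_nhwc=True))
# We visually inspect `images`; the snapshot whose samples look best is the
# one we then pass to train_encoder.py.
```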
We checked several closed and open issues and found the solution you suggested in #16: adding a batch normalization layer after each convolution in the encoder network. However, in training/network_encoder.py there already is a batch norm layer after each convolution layer.
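To make sure we understood the suggestion correctly: below is a minimal sketch of the conv + batch norm pattern we believe #16 refers to. This is illustrative only; the layer names and arguments are placeholders, not the actual ops in training/network_encoder.py.

```python
# Illustrative conv -> batch norm -> LeakyReLU block, as suggested in #16.
# Not the repo's actual code; filters, kernel size, and strides are placeholders.
import tensorflow as tf

def conv_bn_block(x, filters, training):
    x = tf.keras.layers.Conv2D(filters, kernel_size=3, strides=2,
                               padding='same', use_bias=False)(x)
    x = tf.keras.layers.BatchNormalization()(x, training=training)
    x = tf.keras.layers.LeakyReLU(alpha=0.2)(x)
    return x
```

Is this the structure the encoder is expected to have, or did #16 mean a different placement of the batch norm layers?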
We didn't change any options except the minibatch size. Could you give us any advice or suggestions to solve this problem?
We have been working with IdInvert for a few weeks, trying to apply it to our own datasets.
Also, could you share the detailed hyperparameter settings for FFHQ 256x256?
In particular, the number of iterations and the elapsed time for train.py / train_encoder.py.
Thank you for your great work!