shaoanlu / faceswap-GAN

A denoising autoencoder + adversarial losses and attention mechanisms for face swapping.

Error while "Building new loss funcitons..." #164

Open Kizunaaa opened 4 years ago

Kizunaaa commented 4 years ago

When gen_iterations reaches 7850/11850/15850/..., the output below appears and training stops. Do I need to set gen_iterations to 7851/11851/15851 myself and then continue training?


In other words: every time training reaches an iteration where the loss parameters have to change, the process crashes. After restarting, I manually set the iteration count and then resume training. Is that okay?
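A rough sanity check on those numbers (my own sketch, not something stated in this thread): assuming the notebook defaults of TOTAL_ITERS = 40000 and display_iters = 300, the reported iterations land exactly on boundaries of the form `f*TOTAL_ITERS - display_iters//2`, i.e. the points where the training loop resets the session and rebuilds the loss functions (the same pattern quoted further down).

```python
# Hedged sketch: TOTAL_ITERS = 40000 and display_iters = 300 are assumed
# notebook defaults, and the fractions below are my reading of the schedule,
# not values confirmed in this thread.
TOTAL_ITERS = 40000
display_iters = 300

# Candidate loss-schedule boundaries of the form f*TOTAL_ITERS - display_iters//2.
boundaries = [
    TOTAL_ITERS // 5 - display_iters // 2,       # 8000  - 150 = 7850
    3 * TOTAL_ITERS // 10 - display_iters // 2,  # 12000 - 150 = 11850
    2 * TOTAL_ITERS // 5 - display_iters // 2,   # 16000 - 150 = 15850
]
print(boundaries)  # [7850, 11850, 15850]
```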

```
Model weights files have been saved to ./models.
WARNING:tensorflow:From /home/user/.local/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:88: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead.

Model weights files are successfully loaded.
Building new loss funcitons...
gan_training = mixup_LSGAN
use_PL = True
PL_before_activ = False
use_mask_hinge_loss = True
m_mask = 0.5
lr_factor = 1.0
use_cyclic_loss = False
WARNING:tensorflow:From /home/user/.local/lib/python3.7/site-packages/tensorflow_core/python/ops/math_grad.py:1424: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.where in 2.0, which has the same broadcast rule as np.where
Complete.
```

Kizunaaa commented 4 years ago

In the snippet below, is `loss_config['m_mask'] = 0.` meant to be 0.0 or 0.2?

```python
elif gen_iterations == (2*TOTAL_ITERS//3 - display_iters//2):
    clear_output()
    loss_config['use_PL'] = True
    loss_config['use_mask_hinge_loss'] = False
    loss_config['m_mask'] = 0.
    loss_config['lr_factor'] = 0.3
    reset_session(models_dir)
    print("Building new loss funcitons...")
    show_loss_config(loss_config)
    model.build_train_functions(loss_weights=loss_weights, **loss_config)
    print("Done.")
```
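For what it's worth, `0.` is just Python's float literal for 0.0; the trailing dot belongs to the number, not the sentence. A quick check:

```python
# A trailing-dot literal is a plain float; `0.` has nothing to do with 0.2.
print(0. == 0.0)   # True
print(type(0.))    # <class 'float'>
```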