ZhenYangIACAS / NMT_GAN

generative adversarial nets for neural machine translation
Apache License 2.0

dis_saveto #26

Open SunXiaoqian1 opened 5 years ago

SunXiaoqian1 commented 5 years ago

Hi, what file corresponds to dis_saveto? Looking forward to your reply.

SunXiaoqian1 commented 5 years ago

I encountered the following error when I trained the discriminator.

DataLossError (see above for traceback): Unable to open table file ./model/discriminator: Failed precondition: model/discriminator: perhaps your file is in a different file format and you need to use a different restore operator?
  [[Node: save/RestoreV2_6 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_arg_save/Const_0_0, save/RestoreV2_6/tensor_names, save/RestoreV2_6/shape_and_slices)]]
  [[Node: save/RestoreV2_34/_247 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/gpu:0", send_device="/job:localhost/replica:0/task:0/cpu:0", send_device_incarnation=1, tensor_name="edge_930_save/RestoreV2_34", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/gpu:0"]()]]

Looking forward to your reply.

ZhenYangIACAS commented 5 years ago

It seems that you made a mistake when restoring the parameters of the discriminator. Have you pre-trained a discriminator?

SunXiaoqian1 commented 5 years ago

I trained the generator first, then generated the negative data, and then the following error occurred when training the discriminator.


ZhenYangIACAS commented 5 years ago

Maybe you need to set reload=False, which ensures that you re-train the discriminator rather than reloading a pre-trained one.
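In code terms, a minimal sketch of the intended control flow (the function name and flag wiring here are assumptions for illustration, not the repo's actual API):

```python
import os

def build_discriminator(saveto, reload_flag):
    """Hypothetical sketch of how a `reload` flag is typically wired up.

    reload=True  -> try to restore parameters from the checkpoint at `saveto`;
    reload=False -> initialize fresh parameters and train from scratch.

    Restoring fails (e.g. TensorFlow's DataLossError above) when `saveto`
    is missing or is not a valid checkpoint, hence reload=False for the
    very first pre-training run.
    """
    if reload_flag:
        if not os.path.exists(saveto):
            raise FileNotFoundError(
                "no pre-trained discriminator at %s" % saveto)
        return "restored from %s" % saveto
    return "initialized from scratch"
```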

SunXiaoqian1 commented 5 years ago

Thank you very much. Should I set reload=False when pre-training the discriminator, and then set reload=True for GAN training? Best wishes.

SunXiaoqian1 commented 5 years ago

I am sorry to bother you again. How should I generate the .pkl file for the dictionary? Looking forward to your reply.

SunXiaoqian1 commented 5 years ago

After changing the reload flag, I pre-trained the discriminator and got the following error.

Traceback (most recent call last):
  File "discriminator_pretrain.py", line 82, in <module>
    gan_train(config)
  File "discriminator_pretrain.py", line 63, in gan_train
    discriminator.train()
  File "/home/sxq/Zhen/cnn_discriminator.py", line 669, in train
    x, y, xs, epoch = next(train_it)
  File "/home/sxq/Zhen/cnn_discriminator.py", line 632, in train_iter
    n_words_source = self.vocab_size_s)
  File "/home/sxq/Zhen/data_iterator.py", line 18, in __init__
    self.dic_target = pkl.load(f_trg)
cPickle.UnpicklingError: invalid load key, '<'.

In the discriminator, I set the dictionary to be the same as in the generator. The dictionary of the discriminator in your project is a .pkl file. I think the format is incorrect, but I don't know what to do next. Looking forward to your reply.
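A quick way to check the file format (a hypothetical debugging helper; it uses Python 3's pickle rather than the repo's cPickle). A load key of '<' usually means the file begins with markup rather than pickle data, e.g. a web page saved in place of the raw .pkl:

```python
import pickle

def sniff_pickle(path):
    """Report why pkl.load might fail on a supposed .pkl file.

    Pickle streams start with b'\\x80' (protocol 2+) or a text opcode
    such as b'(' (protocol 0); a leading b'<' means the file is actually
    text/HTML/XML, not a pickle at all.
    """
    with open(path, "rb") as f:
        head = f.read(1)
    if head == b"<":
        return "looks like HTML/XML, not a pickle"
    try:
        with open(path, "rb") as f:
            pickle.load(f)
        return "valid pickle"
    except pickle.UnpicklingError as e:
        return "unpicklable: %s" % e
```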


ZhenYangIACAS commented 5 years ago

Getting the .pkl file is easy. You just need to dump the vocabs used by the generator to a .pkl file.
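For example (a minimal sketch; the one-token-per-line vocab format and the word-to-index mapping are assumptions about how the generator's vocab file is laid out):

```python
import pickle

def dump_vocab(txt_path, pkl_path):
    """Convert a plain-text vocab file (one token per line) to a .pkl dict.

    Each token is mapped to its line index, i.e. {word: id}. The exact
    mapping the repo expects is an assumption here; adjust if the
    generator's vocab reserves special ids (e.g. for <eos>/<unk>).
    """
    vocab = {}
    with open(txt_path, encoding="utf-8") as f:
        for idx, line in enumerate(f):
            word = line.strip().split()[0]
            vocab[word] = idx
    with open(pkl_path, "wb") as f:
        pickle.dump(vocab, f)
```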

SunXiaoqian1 commented 5 years ago

Thank you very much. The pre-training of the discriminator is running normally now. Best wishes.

SunXiaoqian1 commented 5 years ago

Hello, while training the discriminator the accuracy only ever takes one of four values: 0.25, 0.5, 0.75, or 1, and I don't know why. In addition, the paper compares the effects of discriminators with different accuracies on adversarial training. How is this accuracy measured? I look forward to your reply.

epoch 0, samples 20668, loss 3.634067, accuracy 0.500000, BatchTime 0.372959, for discriminator pretraining
epoch 0, samples 20672, loss 3.883449, accuracy 0.500000, BatchTime 0.419294, for discriminator pretraining
epoch 0, samples 20676, loss 2.348865, accuracy 0.500000, BatchTime 0.420956, for discriminator pretraining
epoch 0, samples 20680, loss 2.402224, accuracy 0.250000, BatchTime 0.405261, for discriminator pretraining
epoch 0, samples 20684, loss 1.328440, accuracy 0.500000, BatchTime 0.467977, for discriminator pretraining
epoch 0, samples 20688, loss 1.028025, accuracy 0.750000, BatchTime 1.012261, for discriminator pretraining
epoch 0, samples 20692, loss 1.549341, accuracy 0.500000, BatchTime 0.634697, for discriminator pretraining
epoch 0, samples 20696, loss 6.031885, accuracy 0.500000, BatchTime 0.405600, for discriminator pretraining
epoch 0, samples 20700, loss 1.932773, accuracy 0.500000, BatchTime 0.455647, for discriminator pretraining
epoch 0, samples 20704, loss 1.371862, accuracy 0.500000, BatchTime 1.361365, for discriminator pretraining
epoch 0, samples 20708, loss 0.425249, accuracy 0.750000, BatchTime 0.371475, for discriminator pretraining
epoch 0, samples 20712, loss 1.613419, accuracy 0.250000, BatchTime 0.744851, for discriminator pretraining
epoch 0, samples 20716, loss 1.059414, accuracy 0.500000, BatchTime 0.959003, for discriminator pretraining
epoch 0, samples 20720, loss 2.398868, accuracy 0.500000, BatchTime 0.575805, for discriminator pretraining
epoch 0, samples 20724, loss 1.453211, accuracy 0.500000, BatchTime 0.410487, for discriminator pretraining
epoch 0, samples 20728, loss 2.586250, accuracy 0.500000, BatchTime 0.915709, for discriminator pretraining
epoch 0, samples 20732, loss 2.207909, accuracy 0.500000, BatchTime 0.688516, for discriminator pretraining
epoch 0, samples 20736, loss 0.855769, accuracy 0.500000, BatchTime 0.499345, for discriminator pretraining
epoch 0, samples 20740, loss 3.027243, accuracy 0.500000, BatchTime 0.410718, for discriminator pretraining
epoch 0, samples 20744, loss 1.883792, accuracy 0.500000, BatchTime 0.696694, for discriminator pretraining
epoch 0, samples 20748, loss 0.590286, accuracy 0.750000, BatchTime 0.330475, for discriminator pretraining
epoch 0, samples 20752, loss 1.134261, accuracy 0.750000, BatchTime 0.478576, for discriminator pretraining
epoch 0, samples 20756, loss 1.763076, accuracy 0.500000, BatchTime 0.424916, for discriminator pretraining
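One plausible reading of the log (an assumption from the output above, not confirmed by the authors): the samples counter advances by 4 per line, so each printed accuracy is computed over a batch of 4 examples and can only be a multiple of 0.25:

```python
from itertools import product

def batch_accuracy(preds, labels):
    """Fraction of correct predictions within one batch."""
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

# With a batch of 4 (as the log's samples step suggests), enumerate every
# possible prediction pattern: accuracy is quantized to k/4.
labels = [1, 1, 0, 0]
possible = sorted({batch_accuracy(p, labels)
                   for p in product([0, 1], repeat=4)})
# possible == [0.0, 0.25, 0.5, 0.75, 1.0]
```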