hexiangnan / attentional_factorization_machine

TensorFlow Implementation of Attentional Factorization Machine

dropout in validation/evaluation #8

Open Atomu2014 opened 6 years ago

Atomu2014 commented 6 years ago

Thank you very much for sharing this code. Could you explain lines 134 and 143: the dropout_keep parameter is passed directly into tf.nn.dropout when the graph is built in init_graph?

Atomu2014 commented 6 years ago

Is dropout also applied during validation? Is that correct behavior? And is dropout applied at test time?

hexiangnan commented 6 years ago

Dropout is used only during training; it must be disabled during validation and testing.
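To see why feeding a keep probability of 1.0 disables dropout, recall how inverted dropout works: kept units are scaled by 1/keep_prob at training time, so at evaluation time the layer is a pure identity. The sketch below is a hypothetical pure-Python illustration of this (it is not the repository's code; `inverted_dropout` is an illustrative name):

```python
import random

def inverted_dropout(x, keep_prob, training):
    """Inverted dropout: at training time, drop each unit with
    probability (1 - keep_prob) and scale survivors by 1/keep_prob
    so that evaluation needs no rescaling at all."""
    if not training or keep_prob >= 1.0:
        # Evaluation path: dropout is a no-op (identity).
        return list(x)
    return [xi / keep_prob if random.random() < keep_prob else 0.0
            for xi in x]

x = [0.5, -1.2, 3.0, 0.7]
# At evaluation the output equals the input exactly.
assert inverted_dropout(x, 0.8, training=False) == x
```

This is also why, in graph-based TF1 code, it suffices to make the keep probability a fed value and feed 1.0 during validation/testing instead of adding an explicit branch.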


Atomu2014 commented 6 years ago

It looks like training and validation share the same computation graph, and dropout is hard-coded into it; I can't find any training-flag branch.

hexiangnan commented 6 years ago

The train_phase flag is used to distinguish training from testing/validation.
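In TF1-style code, one graph typically serves both phases: the "branch" is realized not by an if-statement inside the graph but by what is fed at each session run. A hypothetical pure-Python sketch of that pattern follows (the class and method names `Model`, `train_step`, and `evaluate` are illustrative, not the repository's API):

```python
import random

class Model:
    """Toy model illustrating phase-dependent dropout feeding:
    the configured keep probability is used only in training;
    evaluation always feeds keep_prob = 1.0 (dropout disabled)."""

    def __init__(self, keep_prob):
        self.keep_prob = keep_prob

    def _forward(self, x, keep_prob):
        # Shared "graph": inverted dropout, identity when keep_prob is 1.0.
        if keep_prob >= 1.0:
            return list(x)
        return [xi / keep_prob if random.random() < keep_prob else 0.0
                for xi in x]

    def train_step(self, x):
        # Training phase: feed the configured keep probability.
        return self._forward(x, self.keep_prob)

    def evaluate(self, x):
        # Validation/test phase: feed 1.0 so dropout is a no-op.
        return self._forward(x, 1.0)

m = Model(keep_prob=0.7)
assert m.evaluate([0.3, 0.9]) == [0.3, 0.9]
```

The same idea applies whether the phase is signalled by a boolean flag like train_phase or simply by feeding different keep-probability values per phase.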


Atomu2014 commented 6 years ago

Thanks for the explanation!