Atomu2014 opened this issue 6 years ago
Is dropout also applied during validation? Is that reasonable? And is dropout used at test time?
Dropout is used only during training. It must be disabled during validation and testing.
-- Best Regards, Xiangnan He
It looks like training and validation share the same computation graph, and dropout is hard-coded into that graph; I could not find any branch on a training flag or the like.
The train_phase flag is used to distinguish training from testing/validation.
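To make the point above concrete, here is a minimal NumPy sketch of inverted dropout gated by a training flag, in the spirit of the train_phase flag discussed in this thread. The function name `dropout` and its signature are illustrative assumptions, not the repository's actual code.

```python
import numpy as np

def dropout(x, keep_prob, training, rng=None):
    """Inverted dropout, active only when training is True.

    At validation/test time (training=False) the layer is an
    identity, which is the behavior the train_phase flag selects.
    Illustrative sketch, not the repository's implementation.
    """
    if not training or keep_prob >= 1.0:
        return x  # eval mode: no units dropped, no scaling
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) < keep_prob
    # Scale by 1/keep_prob so the expected activation at
    # training time matches the eval-time activation.
    return x * mask / keep_prob

x = np.ones((4, 3))
train_out = dropout(x, keep_prob=0.5, training=True,
                    rng=np.random.default_rng(0))
eval_out = dropout(x, keep_prob=0.5, training=False)
```

With `training=False` the input passes through unchanged, so validation and test losses are computed on the full network, while `training=True` randomly zeroes units and rescales the survivors.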
Thanks for letting me know!
Thank you very much for sharing this code. Could you explain why, on lines 134 and 143, the dropout_keep parameter is passed directly into tf.nn.dropout when the graph is built in init_graph?