Excuse me, I have a question about the LIFT network implementation.
I found the following in the supplementary description of your LIFT paper:
In the figure, the activation for the DESC part is set to Tanh. However:
I noticed that in your config.py the default value of config.desc_activ is 'relu'.
Then in modules/lift_desc.py, Lines 69 to 75, the activation for the DESC part will be set to 'relu' when I run the Python scripts with the default settings.
Is this a wrong setting in the code? Or does the choice between ReLU and Tanh not matter much for the DESC part?
Or did I miss a setting change somewhere else?
You are correct: that is what we originally did here, with Theano (and before that, here, with luatorch), but when porting to TensorFlow, ReLU performed the same or slightly better, so this repository uses ReLU. You can try it yourself with --desc_activ.
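For readers following along, the dispatch being discussed can be sketched as below. This is a minimal, hypothetical illustration of selecting the DESC activation from a config value, not the repository's actual code; the real modules/lift_desc.py applies TensorFlow ops inside the network graph, and the helper name here is made up:

```python
import math


def get_desc_activation(desc_activ="relu"):
    """Return a scalar activation function named by the config value.

    Hypothetical sketch of the kind of switch the config.desc_activ
    setting drives; the actual repo code operates on tensors.
    """
    if desc_activ == "relu":
        # ReLU: clamps negative inputs to zero, unbounded above
        return lambda x: max(0.0, x)
    elif desc_activ == "tanh":
        # Tanh: squashes outputs into (-1, 1), as in the paper's figure
        return math.tanh
    else:
        raise ValueError("Unknown desc_activ: {!r}".format(desc_activ))


relu = get_desc_activation("relu")
tanh = get_desc_activation("tanh")
print(relu(-2.0))  # 0.0
print(relu(3.0))   # 3.0
```

With the actual scripts, the same switch is exposed on the command line via --desc_activ, which overrides the default from config.py.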