AndreaCossu / Relation-Network-PyTorch

Implementation of Relation Network and Recurrent Relational Network using PyTorch v1.3. Original papers: (RN) https://arxiv.org/abs/1706.01427 (RRN) https://arxiv.org/abs/1711.08028
MIT License

Activation in RRN #2

DennisCraandijk closed this issue 5 years ago

DennisCraandijk commented 5 years ago

Hi Andrea, Thanks for this neat implementation!

I've noticed the MLP consists of linear layers, each followed by a tanh activation (code). However, in the RRN paper the authors mention using ReLU layers followed by a linear output layer. Is this variation intentional?
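For reference, the two MLP shapes under discussion can be sketched as follows. This is a minimal illustration, not the repository's actual code; the function names and dimensions are made up:

```python
import torch
import torch.nn as nn

def mlp_tanh(in_dim, hidden_dim, out_dim):
    # Variant as in the current implementation: every linear layer
    # is followed by a tanh activation, including the last one.
    return nn.Sequential(
        nn.Linear(in_dim, hidden_dim), nn.Tanh(),
        nn.Linear(hidden_dim, out_dim), nn.Tanh(),
    )

def mlp_paper(in_dim, hidden_dim, out_dim):
    # Variant as described in the RRN paper: ReLU on hidden layers,
    # plain linear output layer (no activation at the end).
    return nn.Sequential(
        nn.Linear(in_dim, hidden_dim), nn.ReLU(),
        nn.Linear(hidden_dim, out_dim),
    )

x = torch.randn(4, 8)
y_tanh = mlp_tanh(8, 16, 2)(x)   # outputs squashed into (-1, 1)
y_paper = mlp_paper(8, 16, 2)(x)  # outputs unbounded
```

The practical difference is that the tanh variant bounds the final output in (-1, 1), while the paper's variant leaves the output unconstrained.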

AndreaCossu commented 5 years ago

Hello Dennis,

I hardcoded the activation function as tanh just to experiment with it, but my final aim is to make the activation function a parameter of the MLP constructor. As soon as I have time, I will make this change.
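Making the activation a constructor parameter could look roughly like this. A hypothetical sketch only (the class name, argument names, and the choice to skip the activation on the last layer are assumptions, not the repo's code):

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """MLP with a configurable activation, passed as a class
    (e.g. nn.Tanh or nn.ReLU) and instantiated per hidden layer."""

    def __init__(self, sizes, activation=nn.Tanh):
        super().__init__()
        layers = []
        for i in range(len(sizes) - 1):
            layers.append(nn.Linear(sizes[i], sizes[i + 1]))
            # Apply the activation after every layer except the last,
            # matching the structure described in the RRN paper.
            if i < len(sizes) - 2:
                layers.append(activation())
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# Usage: same class, two different activations.
model_tanh = MLP([8, 16, 16, 2])                     # default tanh
model_relu = MLP([8, 16, 16, 2], activation=nn.ReLU)  # paper-style
out = model_relu(torch.randn(4, 8))
```

Passing the activation as an uninstantiated class keeps the constructor signature small while letting callers pick any `nn.Module` activation.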

DennisCraandijk commented 5 years ago

Ok great, just checking if there was any other reason.