Closed — Kaixhin closed this issue 6 years ago
@Kaixhin Yes. Those are some really promising results. Do you plan on implementing MI for any of the rnn modules?
@nicholas-leonard No - I think this is best left to someone who's already familiar with the codebase (as opposed to a pure user, e.g. myself).
@Kaixhin Sounds good. In any case, thanks for suggesting the addition. I believe @Manojelement wants to work on it.
Hey guys, is there any update on this?
The paper "On Multiplicative Integration with Recurrent Neural Networks" shows a performance boost for RNNs by replacing the elementwise addition of the transformed input and previous hidden state with an elementwise multiplication (a.k.a. the Hadamard product).
Rather than introducing an MI-RNN, MI-LSTM, MI-GRU etc., adding an extra flag to each of the existing units would seem better for code reuse?
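For anyone picking this up, here is a minimal numpy sketch of the idea (not the library's API — function and parameter names are mine). The general MI formulation from the paper gates a Hadamard product of the two transformed states together with the original additive terms, so a single flag could switch between the two paths:

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    # Vanilla RNN cell: additive integration of input and hidden state
    return np.tanh(Wx @ x + Wh @ h + b)

def mi_rnn_step(x, h, Wx, Wh, b, alpha, beta1, beta2):
    # General MI form: the sum Wx@x + Wh@h is replaced by a gated mix of
    # their Hadamard product (alpha term) and the two original terms
    a, u = Wx @ x, Wh @ h
    return np.tanh(alpha * a * u + beta1 * u + beta2 * a + b)
```

Note that with `alpha = 0` and `beta1 = beta2 = 1` the MI cell reduces exactly to the vanilla additive cell, which is why a boolean flag on the existing modules (rather than new module classes) seems natural.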