I'm not really familiar with LSTMs, but I think I spotted something odd:
See neuron.js (line 104)
So basically, when a neuron is activated, it sets the gain of every connection it gates to its own activation value, which seems fine.
But what if two neurons gate the same connection? After activation, the gain of that connection will be set to the activation of just one of the two neurons (whichever was activated last). Does this mean that having multiple neurons gate the same connection is useless?