Find details about the calculation inside a neuron of an LSTM network. The X symbol represents an element-wise multiplication between its inputs, but how does it work?
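As a quick sketch of what element-wise (Hadamard) multiplication means: the two inputs must have the same shape, and each component of one is multiplied by the corresponding component of the other. The vectors below are made-up values, chosen to resemble a gate activation and a cell state:

```python
import numpy as np

# Hypothetical values: a gate activation (between 0 and 1) and a cell state
gate = np.array([0.9, 0.1, 0.5])
state = np.array([2.0, 4.0, -1.0])

# Element-wise (Hadamard) product: each component is multiplied independently
result = gate * state
print(result)  # [ 1.8  0.4 -0.5]
```

This is how a gate scales the cell state component by component: a gate value near 1 lets that component through, a value near 0 suppresses it.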
What is a time step (one element of a sequence, or multiple sequences)?
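A time step is one element of a sequence, not multiple sequences. A minimal shape sketch (the sizes here are arbitrary, chosen only for illustration):

```python
import numpy as np

# Hypothetical batch: 4 sequences, each 10 time steps long,
# each time step being a feature vector of size 3
batch = np.random.randn(4, 10, 3)

# One time step t is one slice along the time axis: the t-th
# element of every sequence, fed to the LSTM at step t
t = 0
x_t = batch[:, t, :]
print(x_t.shape)  # (4, 3)
```

The LSTM is applied once per time step, carrying its hidden state and cell state from step t to step t+1.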
The functionality of the gates is not clear (different gates are listed, but their purpose is not clearly pointed out).
The gates are not neural networks; they are activation functions applied to weighted inputs / hidden states, so using the term "neural network" for the gates seems confusing.
Judging from the first picture, U and W are not the input and hidden state vectors, but rather the weight matrices applied to them.
Is the hidden state the same as the cell output?
From what I understood, there are three operations within an LSTM cell:
Forget operation (forget gate): based on the previous cell output and the current input, information is removed from the cell state.
Input operation (input gate + candidate layer): based on the previous cell output and the current input, new information is added to the cell state.
Output operation (output gate): based on the previous cell output, the current input, and the updated cell state, the output of the cell is calculated.
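The three operations above can be sketched as one forward step in NumPy. This is a minimal illustration under the usual conventions (W applied to the input, U to the previous hidden state, the four gate pre-activations stacked into one vector); all sizes and names are assumptions for the example, not taken from the pictures discussed above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W weights the input x_t, U weights the previous
    hidden state h_prev; gate order in the stack: forget, input,
    candidate, output."""
    n = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b          # stacked pre-activations, shape (4n,)
    f = sigmoid(z[0:n])                   # forget gate
    i = sigmoid(z[n:2*n])                 # input gate
    g = np.tanh(z[2*n:3*n])               # candidate values
    o = sigmoid(z[3*n:4*n])               # output gate
    c = f * c_prev + i * g                # forget old info, add new info
    h = o * np.tanh(c)                    # cell output = new hidden state
    return h, c

# Hypothetical sizes: input dimension 3, hidden dimension 2
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 3))
U = rng.standard_normal((8, 2))
b = np.zeros(8)
h, c = lstm_step(rng.standard_normal(3), np.zeros(2), np.zeros(2), W, U, b)
print(h.shape, c.shape)  # (2,) (2,)
```

Note how `f * c_prev` and `o * np.tanh(c)` are exactly the element-wise multiplications marked with X in the diagrams, and how the hidden state h returned here doubles as the cell output.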