jatinchowdhury18 / RTNeural

Real-time neural network inferencing
BSD 3-Clause "New" or "Revised" License

No 'vanilla' RNN layer support? #147

Open nvssynthesis opened 3 weeks ago

nvssynthesis commented 3 weeks ago

Is it correct that there is no support for 'vanilla' RNN layers, e.g. that of torch.nn.RNN? Is the reason for this something like 'GRU or LSTM is better anyway, just use that'?
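For reference, the "vanilla" layer in question is the Elman-style recurrence that torch.nn.RNN computes with its default tanh nonlinearity. A minimal NumPy sketch (parameter names follow the PyTorch convention; shapes are purely illustrative):

```python
# Elman RNN recurrence, as in torch.nn.RNN with nonlinearity='tanh':
#   h_t = tanh(x_t @ W_ih.T + b_ih + h_{t-1} @ W_hh.T + b_hh)
import numpy as np

def elman_rnn_forward(x, W_ih, W_hh, b_ih, b_hh):
    """Run an Elman RNN over a sequence x of shape (time, in_features)."""
    h = np.zeros(W_hh.shape[0])
    outputs = []
    for x_t in x:
        h = np.tanh(x_t @ W_ih.T + b_ih + h @ W_hh.T + b_hh)
        outputs.append(h)
    return np.stack(outputs)  # (time, hidden_size)
```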

jatinchowdhury18 commented 3 weeks ago

That's correct, at the moment RTNeural does not have support for that layer type. The reasoning is more just that I haven't yet had a need for it, and haven't received any requests to implement it (up to now). We probably should implement that layer, especially since it's simpler than the GRU or LSTM layers.

I'll probably end up naming the layer something like ElmanRNN, since I think that's maybe a more "specific" name than just RNN. Would you happen to know if TensorFlow has an equivalent layer?

I've added this to my to-do list, but it might be a minute before I get around to implementing it.
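To make the "simpler than GRU or LSTM" point concrete, a quick comparison of the parameter sets in PyTorch (layer sizes here are arbitrary): the vanilla RNN carries a single set of input/recurrent weights and biases, while the GRU and LSTM carry three and four gate-sized sets respectively.

```python
# Compare parameter names and counts for the three recurrent layer types.
import torch

for layer in (torch.nn.RNN(8, 4), torch.nn.GRU(8, 4), torch.nn.LSTM(8, 4)):
    n_params = sum(p.numel() for p in layer.parameters())
    names = [name for name, _ in layer.named_parameters()]
    print(type(layer).__name__, names, n_params)
```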

nvssynthesis commented 2 weeks ago

Excellent, thanks for clarifying. I know there is tf.keras.layers.RNN, but I'm not entirely sure it's equivalent.
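For reference, the closer match to torch.nn.RNN in Keras is likely tf.keras.layers.SimpleRNN (a fully connected Elman-style RNN with a tanh default activation), whereas tf.keras.layers.RNN is a generic wrapper that runs a user-supplied cell such as SimpleRNNCell. A minimal shape check, with arbitrary layer sizes:

```python
# Both layers implement the same Elman-style recurrence
#   h_t = tanh(x_t @ W_ih.T + b_ih + h_{t-1} @ W_hh.T + b_hh)
import numpy as np
import torch
import tensorflow as tf

x = np.random.randn(1, 16, 8).astype(np.float32)  # (batch, time, features)

torch_rnn = torch.nn.RNN(input_size=8, hidden_size=4, batch_first=True)
keras_rnn = tf.keras.layers.SimpleRNN(units=4, return_sequences=True)

y_torch, _ = torch_rnn(torch.from_numpy(x))
y_keras = keras_rnn(x)

print(y_torch.shape)  # torch.Size([1, 16, 4])
print(y_keras.shape)  # (1, 16, 4)
```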