keras-team / tf-keras

The TensorFlow-specific implementation of the Keras API, which was the default Keras from 2019 to 2023.
Apache License 2.0

LMU layer, Legendre Memory Units #368

Open flacle opened 1 year ago

flacle commented 1 year ago

Describe the problem

LMUs (originally introduced in 2019) are a distinct class of RNNs. They have been shown to outperform LSTMs on some tasks that require longer time windows, while using fewer parameters (see the NeurIPS 2019 paper). Including LMUs directly in Keras would extend the usefulness and impact of Keras for users for whom parameter count is an important consideration.
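
For reference, here is a rough sketch of what such a layer could look like as a custom tf.keras RNN cell, following the state-space formulation in the NeurIPS 2019 paper (a fixed linear memory driven by Legendre-derived matrices, plus a learned nonlinear hidden state). This is only an illustration under my own naming, not a proposed final API: `SimpleLMUCell` and `lmu_state_space` are placeholders, and the authors' maintained `keras-lmu` package would be the natural starting point for anything official.

```python
import numpy as np
import tensorflow as tf
from scipy.signal import cont2discrete


def lmu_state_space(order, theta):
    """Build the (A, B) matrices of the LMU delay system, discretized with ZOH.

    These follow the Legendre/Pade approximation of a delay of length
    ``theta`` described in the NeurIPS 2019 paper.
    """
    Q = np.arange(order)
    R = (2 * Q + 1)[:, None] / theta
    i, j = np.meshgrid(Q, Q, indexing="ij")
    A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * R
    B = ((-1.0) ** Q)[:, None] * R
    # Discretize with a unit time step (zero-order hold).
    Ad, Bd, *_ = cont2discrete(
        (A, B, np.eye(order), np.zeros((order, 1))), dt=1.0)
    return Ad, Bd


class SimpleLMUCell(tf.keras.layers.Layer):
    """Minimal LMU-style RNN cell: fixed linear memory + nonlinear hidden state."""

    def __init__(self, units, order, theta, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.order = order
        self.theta = theta
        self.state_size = [units, order]
        self.output_size = units

    def build(self, input_shape):
        input_dim = input_shape[-1]
        Ad, Bd = lmu_state_space(self.order, self.theta)
        # The memory dynamics are fixed, not trained.
        self.A = tf.constant(Ad, dtype=self.dtype)
        self.B = tf.constant(Bd, dtype=self.dtype)
        # Encoders project the input and hidden state onto the memory input u_t.
        self.e_x = self.add_weight(shape=(input_dim, 1), name="e_x")
        self.e_h = self.add_weight(shape=(self.units, 1), name="e_h")
        # Kernels map input, hidden state, and memory into the new hidden state.
        self.W_x = self.add_weight(shape=(input_dim, self.units), name="W_x")
        self.W_h = self.add_weight(shape=(self.units, self.units), name="W_h")
        self.W_m = self.add_weight(shape=(self.order, self.units), name="W_m")

    def call(self, inputs, states):
        h, m = states
        # Scalar write signal for the memory.
        u = tf.matmul(inputs, self.e_x) + tf.matmul(h, self.e_h)
        # Linear memory update: m_t = m_{t-1} A^T + u_t B^T.
        m = (tf.matmul(m, self.A, transpose_b=True)
             + tf.matmul(u, self.B, transpose_b=True))
        # Nonlinear hidden-state update from input, previous state, and memory.
        h = tf.tanh(tf.matmul(inputs, self.W_x)
                    + tf.matmul(h, self.W_h)
                    + tf.matmul(m, self.W_m))
        return h, [h, m]
```

The cell can be dropped into the standard wrapper, e.g. `tf.keras.layers.RNN(SimpleLMUCell(units=212, order=256, theta=784))`, with the hyperparameters here only echoing the psMNIST configuration from the paper.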

tilakrayal commented 1 year ago

@flacle, could you please elaborate on this feature request and specify its use cases? Thank you!

mattdangerw commented 1 year ago

We talked this over, and I am not sure we would want to go straight to a new layer at this time. But if you are interested, you could contribute a new Keras example (instructions here) showing how to implement this layer and how it can improve performance over LSTMs on a time series dataset!
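
To make that suggestion concrete, a minimal version of such an example might build two models of comparable size, one wrapping the LMU-style cell sketched above and one using a plain LSTM, then compare their parameter counts and validation accuracy on the same sequence task. The shapes below (784 steps of 1 feature, 10 classes) mirror the psMNIST benchmark from the paper and are placeholders only; `build_model` and the training data names are hypothetical.

```python
import tensorflow as tf


def build_model(recurrent_layer, seq_len, features, num_classes):
    """Wrap a recurrent layer in a small sequence-classification model."""
    return tf.keras.Sequential([
        tf.keras.layers.InputLayer(input_shape=(seq_len, features)),
        recurrent_layer,
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])


# SimpleLMUCell is the illustrative cell sketched earlier in this thread.
lmu_model = build_model(
    tf.keras.layers.RNN(SimpleLMUCell(units=212, order=256, theta=784)),
    seq_len=784, features=1, num_classes=10)
lstm_model = build_model(
    tf.keras.layers.LSTM(128), seq_len=784, features=1, num_classes=10)

for name, model in [("LMU", lmu_model), ("LSTM", lstm_model)]:
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    print(name, "parameters:", model.count_params())
    # model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=10)
```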

github-actions[bot] commented 1 year ago

This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.

github-actions[bot] commented 7 months ago

This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.