nengo / keras-lmu

Keras implementation of Legendre Memory Units
https://www.nengo.ai/keras-lmu/

Add trainable theta and Euler as discretizer #41

Closed · gsmalik closed this 2 years ago

gsmalik commented 3 years ago

What does this PR add?

How is a trainable theta implemented?
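For orientation, here is a minimal sketch of one way a trainable theta can be wired into a Keras cell. The class, argument names, and shapes are illustrative assumptions, not this PR's actual API: theta is registered as a scalar trainable weight, and the Euler-discretized matrices are rebuilt from it on every step, so standard backprop reaches theta.

```python
import numpy as np
import tensorflow as tf


def lmu_matrices(order):
    # Continuous-time (A, B) of the Legendre delay system, with theta factored out.
    q = np.arange(order, dtype=np.float64)
    r = (2 * q + 1)[:, None]
    i, j = np.meshgrid(q, q, indexing="ij")
    A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * r
    B = ((-1.0) ** q)[:, None] * r
    return A, B


class TrainableThetaCell(tf.keras.layers.Layer):
    # Hypothetical toy cell (not the PR's implementation): theta is a scalar
    # weight, and the discretization is recomputed from it inside call().
    def __init__(self, order=4, theta_initializer=10.0, **kwargs):
        super().__init__(**kwargs)
        self.order = order
        A, B = lmu_matrices(order)
        self.A = tf.constant(A, dtype=tf.float32)
        self.B = tf.constant(B, dtype=tf.float32)
        self.theta = self.add_weight(
            name="theta",
            shape=(),
            initializer=tf.constant_initializer(theta_initializer),
            trainable=True,
        )

    def call(self, u, m):
        # Forward-Euler discretization, differentiable with respect to theta.
        A_bar = tf.eye(self.order) + self.A / self.theta
        B_bar = self.B / self.theta
        # u: (batch, 1), m: (batch, order) -> updated memory (batch, order)
        return tf.matmul(m, A_bar, transpose_b=True) + tf.matmul(
            u, B_bar, transpose_b=True
        )


# Quick check that theta actually receives a gradient:
cell = TrainableThetaCell(order=4)
u, m = tf.ones((2, 1)), tf.zeros((2, 4))
with tf.GradientTape() as tape:
    loss = tf.reduce_sum(cell(u, m) ** 2)
print(tape.gradient(loss, cell.theta))  # non-None
```

The key design point is that nothing about the discretization is precomputed and frozen at build time; it stays inside the differentiable graph so theta can be updated like any other weight.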

How does training with Euler work?
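For reference, a minimal NumPy sketch of forward-Euler discretization of the LMU state space, assuming a unit time step; `euler_discretize` is an illustrative name, not a function from this PR.

```python
import numpy as np


def euler_discretize(A, B, theta):
    # Forward Euler with unit time step:
    #   m[t] = m[t-1] + (A m[t-1] + B u[t]) / theta
    #        = (I + A / theta) m[t-1] + (B / theta) u[t]
    order = A.shape[0]
    A_bar = np.eye(order) + A / theta
    B_bar = B / theta
    return A_bar, B_bar


# The theta-derivatives have simple closed forms, which keeps backprop through
# a trainable theta cheap:
#   dA_bar/dtheta = -A / theta**2,   dB_bar/dtheta = -B / theta**2
```

Because the discretized matrices are plain rational functions of theta, training with Euler only adds an elementwise division per step; presumably this is why Euler is offered as an option when theta is trainable.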

How does training with Zero Order Hold (zoh) work?
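For comparison, a sketch of zero-order-hold discretization using `scipy.signal.cont2discrete`, which to my understanding is how the fixed-theta case is discretized; treat the function and variable names here as illustrative assumptions.

```python
import numpy as np
from scipy.signal import cont2discrete


def zoh_discretize(A, B, theta, dt=1.0):
    # Zero-order hold integrates the continuous system exactly over each step
    # (A_bar = expm(A * dt / theta)), assuming the input is held constant
    # within the step.
    order = A.shape[0]
    C = np.ones((1, order))
    D = np.zeros((1, 1))
    A_bar, B_bar, *_ = cont2discrete(
        (A / theta, B / theta, C, D), dt=dt, method="zoh"
    )
    return A_bar, B_bar
```

Under ZOH, A_bar is a matrix exponential of A / theta, so training theta through ZOH means differentiating through that exponential inside the graph (e.g. via `tf.linalg.expm`, which supports gradients); this works but is considerably more expensive per step than the Euler update sketched above.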

Where to start the review?

You can start from the commit "Add trainable theta and discretization options" and then move on to "Update and add new tests"; these are the two main commits. There is one additional commit, but that is just a bones update.

Any other remarks?