murraylab / PsychRNN

https://psychrnn.readthedocs.io
MIT License

Numerical stability is better when array sizes are powers of two #38

Open syncrostone opened 2 years ago

syncrostone commented 2 years ago

Problem: when running the same code multiple times with the same seeds, small numerical differences arise over the course of training. These differences disappear if array sizes are powers of two.

Suggestion: use array sizes that are powers of two for now.

Eventually I would like to implement a workaround (if TensorFlow doesn't have a way to activate a built-in one): when an array size is not a power of two, allocate an array in the background whose dimensions are powers of two and set the unneeded entries to 0. If this is relevant to you and you want to work on that workaround, please do (and drop a comment here so people don't duplicate work).
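A minimal sketch of the padding idea, not tied to the PsychRNN internals: round each dimension up to the next power of two and zero-fill the extra entries. The function names (`next_pow2`, `pad_to_pow2`) are illustrative, not part of the library.

```python
import numpy as np

def next_pow2(n):
    # Smallest power of two >= n (bit trick avoids float log2 rounding).
    return 1 if n <= 1 else 1 << (n - 1).bit_length()

def pad_to_pow2(arr):
    # Pad every dimension of `arr` up to the next power of two,
    # filling the new entries with zeros so they are inert in
    # matrix multiplies.
    pad_width = [(0, next_pow2(d) - d) for d in arr.shape]
    return np.pad(arr, pad_width, mode="constant")

W = np.random.randn(100, 50)   # non-power-of-two weight matrix
W_padded = pad_to_pow2(W)      # shape (128, 64); padded entries are 0
```

A full workaround would also need to keep the padded entries at zero across gradient updates (e.g. by masking) and to slice the padding off before returning results to the user.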