nengo / nengo-loihi

Run Nengo models on Intel's Loihi chip
https://www.nengo.ai/nengo-loihi/

Discretize to use as much of the dynamic range as possible #83

Open tbekolay opened 5 years ago

tbekolay commented 5 years ago

Right now our discretization process uses the maximums and minimums of the actual values of several quantities, including intercepts, which has led to issues such as those solved in #69. We should look into other methods of setting the ranges for these discretization processes: the numerical resolution is not even across the number line, and even if it were, we should not be so sensitive to outliers.
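
As a minimal sketch of the outlier sensitivity (this is not nengo-loihi's actual discretization routine; the function names, the 8-bit target, and the percentile cutoff are assumptions for illustration), compare a scale based on the observed maximum magnitude with one based on a high percentile:

```python
import numpy as np

def discretize_minmax(values, bits=8):
    """Naive scheme: scale so the largest observed magnitude maps to the
    largest representable integer. A single outlier shrinks the resolution
    available to every other value."""
    max_int = 2 ** (bits - 1) - 1
    scale = max_int / np.max(np.abs(values))
    return np.round(values * scale).astype(int), scale

def discretize_percentile(values, bits=8, pct=99.0):
    """Outlier-robust variant: base the scale on a high percentile of the
    magnitudes and clip the few values that fall outside it."""
    max_int = 2 ** (bits - 1) - 1
    scale = max_int / np.percentile(np.abs(values), pct)
    q = np.clip(np.round(values * scale), -max_int, max_int)
    return q.astype(int), scale

# One large intercept outlier wastes most of the dynamic range under the
# min/max scheme, but not under the percentile scheme.
rng = np.random.RandomState(0)
intercepts = np.concatenate([rng.uniform(-1, 1, 100), [50.0]])
q_minmax, _ = discretize_minmax(intercepts)
q_pct, _ = discretize_percentile(intercepts)
print("distinct levels used by the non-outliers:",
      np.unique(q_minmax[:-1]).size, "vs", np.unique(q_pct[:-1]).size)
```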

tbekolay commented 5 years ago

As we determined in #89, discretization ranges are especially important for networks with online learning (PES rule). If the initial function / weights are in the wrong range, then error signals can easily push the weights to underflow or overflow.
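
As a toy illustration of that failure mode (not nengo-loihi code; the near-zero initial weights and the stand-in "post-learning" perturbation are assumptions), suppose the initial function is essentially zero, so a range inferred from the initial weights is tiny and the first learning updates already exceed it:

```python
import numpy as np

bits = 8
max_int = 2 ** (bits - 1) - 1
rng = np.random.RandomState(0)

# Initial decoders are all near zero, so a min/max-based scale maps a tiny
# range onto the full integer range.
initial_weights = rng.normal(0.0, 1e-4, size=100)
scale = max_int / np.max(np.abs(initial_weights))

# Stand-in for the effect of PES updates driven by a nonzero error signal.
learned_weights = initial_weights + rng.normal(0.0, 0.05, size=100)

q = np.round(learned_weights * scale)
print("fraction of weights outside the representable range:",
      np.mean(np.abs(q) > max_int))
```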

This case might be similar enough to the others that it can be handled in the same way, but it also might be the case that we have to do something different here.

One solution that comes to mind is to determine the min/max for discretization based only on factors like the number of neurons in the pre ensemble and their properties, rather than on the connection function, even if that means we do a poor job of computing the initial connection.
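
A rough sketch of that idea (hypothetical names; the helper `weight_range_from_ensemble`, the decoder-magnitude heuristic, and the headroom factor are all assumptions, not nengo-loihi's API):

```python
import numpy as np

def weight_range_from_ensemble(n_neurons, max_rate=200.0, radius=1.0):
    # Crude bound: decoding a signal of magnitude `radius` from `n_neurons`
    # neurons firing at up to `max_rate` Hz needs decoders roughly on the
    # order of radius / (n_neurons * max_rate), independent of the decoded
    # function.
    return radius / (n_neurons * max_rate)

bits = 8
max_int = 2 ** (bits - 1) - 1

# The scale depends only on the pre ensemble, so it stays fixed regardless
# of the initial connection function, leaving headroom for learned changes.
w_max = 10.0 * weight_range_from_ensemble(n_neurons=100)  # arbitrary headroom
scale = max_int / w_max
print("discretization scale:", scale)
```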