fmi-basel / latent-predictive-learning

Code to accompany our paper "The combination of Hebbian and predictive plasticity learns invariant object representations in deep sensory networks" (bioRxiv 2022)

Implementation of double exponential filtering for synaptic traces #3

Closed: yilun-wu closed this issue 10 months ago

yilun-wu commented 1 year ago

Dear authors, in the implementation of $\alpha * \big(\epsilon * S_j(t)\, f^{\prime}(U_i(t))\big)$ (eq. 18):

https://github.com/fmi-basel/latent-predictive-learning/blob/dd486d092b8cea9bfa1f27f42209f9174b82e61c/spiking_simulations/LPLConnection.cpp#L303-L316

It seems the two exponential filterings of $\alpha$ are done differently in the linked code.
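For reference, and assuming normalized exponential kernels with time constants $\tau_\epsilon$ and $\tau_\alpha$ (my notation, not necessarily the paper's), I read this double filtering as two nested low-pass filters:

$$
\alpha * \big(\epsilon * S_j f^{\prime}(U_i)\big)(t) = \int_0^{\infty} \frac{e^{-s/\tau_\alpha}}{\tau_\alpha} \int_0^{\infty} \frac{e^{-s^{\prime}/\tau_\epsilon}}{\tau_\epsilon}\, S_j(t - s - s^{\prime})\, f^{\prime}\big(U_i(t - s - s^{\prime})\big)\, ds^{\prime}\, ds .
$$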

My questions are:

1. Why are the two exponential filterings implemented differently from each other?
2. Why is one of the filterings applied before the other within a time step, rather than updating both traces simultaneously?

fzenke commented 1 year ago

Thanks again for your question. Conventionally, I do the double filtering for synaptic traces differently when the inputs are spikes, so that the jump on spike arrival in the first filter is always one regardless of the time step. However, the alpha trace does not receive spikes as inputs, so this is unnecessary here. I can only assume that I did it out of habit, but frankly, I do not remember off the top of my head. In any case, the only thing it changes is the amplitude of the filter kernel, which can be absorbed into the learning rate.

Concerning your second question, why one filtering is done first: this choice was probably unintentional, and the two orderings are approximately the same for small time steps (or long enough time constants). Strictly speaking, all updates should be done simultaneously anyway. However, in Auryn we deliberately do in-place operations for performance reasons.

I will keep your two tickets open until I have time to look into the code again in more detail, but as I already mentioned, it will take me some time. If we confirm discrepancies between the code and our methods, we will issue an erratum.
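To illustrate the distinction, here is a minimal sketch of the two conventions. This is not the actual Auryn/LPLConnection.cpp code; the time step, the time constants, and the variable names are purely illustrative:

```cpp
// Minimal sketch of the two filtering conventions discussed above; this is
// NOT the Auryn/LPLConnection.cpp code. The time step, the time constants,
// and all variable names are illustrative assumptions.
#include <cstdio>

int main() {
    const double dt      = 1e-4;   // simulation time step (assumed)
    const double tau_in  = 20e-3;  // first-stage time constant (assumed)
    const double tau_out = 100e-3; // second-stage time constant (assumed)

    double a1 = 0.0, a2 = 0.0;     // convention A: spike-style first stage
    double b1 = 0.0, b2 = 0.0;     // convention B: low-pass first stage

    for (int step = 0; step < 5000; ++step) {
        const double input = (step == 0) ? 1.0 : 0.0; // single input event

        // Convention A: the first trace jumps by the raw input (a spike adds
        // exactly 1, independent of dt) and then decays exponentially.
        a1 += input;
        a1 -= dt / tau_in * a1;
        // Second stage: ordinary low-pass of the already-updated first trace.
        a2 += dt / tau_out * (a1 - a2);

        // Convention B: the first stage is itself an ordinary low-pass filter,
        // so the contribution of the same input scales with dt / tau_in.
        b1 += dt / tau_in * (input - b1);
        b2 += dt / tau_out * (b1 - b2);
    }

    // For a fixed time step the two conventions produce kernels of the same
    // shape; only the overall amplitude differs, by a constant factor that
    // can be absorbed into the learning rate.
    std::printf("a2 = %g, b2 = %g, b2/a2 = %g\n", a2, b2, b2 / a2);
    return 0;
}
```

Note that in this sketch both second stages already use the in-place-updated first trace; using the pre-update value instead changes the result only at order dt, which is the ordering point above.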

Thanks heaps again for your careful reading of the code, and let me know if you have any further questions.

yilun-wu commented 1 year ago

Understood. Thank you!