idiap / sparch

PyTorch based toolkit for developing spiking neural networks (SNNs) by training and testing them on speech command recognition tasks

Paper equations not corresponding to the implementation #4

Open narduzzi opened 2 months ago

narduzzi commented 2 months ago

Hello,

I am trying to replicate the experiments from the paper, but I don't understand why the PyTorch implementation is written as follows:

# in adLIF forward function, in loop (line 437)
            # Compute potential (adLIF)
            wt = beta * wt + a * ut + b * st
            ut = alpha * (ut - st) + (1 - alpha) * (Wx[:, t, :] - wt)

While the paper reports the following equations:

            # Compute potential (adLIF)
            prev_w = wt
            prev_u = ut
            prev_s = st

            ut = alpha * prev_u + (1 - alpha) * (Wx[:, t, :] - prev_w) - threshold*prev_s
            wt = beta * prev_w + (1 - beta) * a * prev_u + b * prev_s
            st = ...

Is there a simplification that I am missing? Where did the (1 - beta) term go? And is it normal that the updated wt is reused within the same timestep?
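To illustrate the difference, here is a minimal scalar sketch (not the actual sparch code) that runs both update orders side by side: the repository's order, where the freshly updated adaptation variable w is reused within the same timestep and the reset is folded in as alpha * (u - s), versus the paper's order, where both updates use only previous-step values, with a (1 - beta) factor on the coupling and an explicit -threshold * s reset. All parameter values and the toy input sequence are hypothetical.

```python
# Hypothetical scalar parameters (not taken from the paper or repo).
alpha, beta = 0.9, 0.8   # membrane / adaptation decay factors
a, b = 0.5, 0.3          # adaptation coupling strengths
theta = 1.0              # firing threshold
inputs = [0.6, 1.2, 0.2, 0.9]  # toy input current, stands in for Wx[:, t, :]

def step_repo(u, w, s, x):
    # Repo order: w is updated first, and the NEW w is used for u;
    # the reset appears as alpha * (u - s).
    w = beta * w + a * u + b * s
    u = alpha * (u - s) + (1 - alpha) * (x - w)
    s = 1.0 if u > theta else 0.0
    return u, w, s

def step_paper(u, w, s, x):
    # Paper order: both updates use previous-step values only,
    # with a (1 - beta) factor on the coupling and an explicit
    # -theta * s reset term.
    u_new = alpha * u + (1 - alpha) * (x - w) - theta * s
    w_new = beta * w + (1 - beta) * a * u + b * s
    s_new = 1.0 if u_new > theta else 0.0
    return u_new, w_new, s_new

u1 = w1 = s1 = 0.0
u2 = w2 = s2 = 0.0
for x in inputs:
    u1, w1, s1 = step_repo(u1, w1, s1, x)
    u2, w2, s2 = step_paper(u2, w2, s2, x)
    print(f"repo: u={u1:.4f} w={w1:.4f}   paper: u={u2:.4f} w={w2:.4f}")
```

With these (hypothetical) parameters the two state trajectories diverge after the first step, which is what prompted the question: unless there is some reparameterization that absorbs the (1 - beta) factor and the changed update order, the two formulations are not equivalent.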

Thank you in advance for your answer.