fangwei123456 / spikingjelly

SpikingJelly is an open-source deep learning framework for Spiking Neural Networks (SNNs) based on PyTorch.
https://spikingjelly.readthedocs.io

Method to reduce the number of timesteps #540

Open 1439278026 opened 1 month ago

1439278026 commented 1 month ago

I have data of shape 100*500, where 100 is the number of channels and 500 is the number of data points, which can also be regarded as the number of SNN timesteps. All data is already in 0/1 format. Clearly, 500 timesteps is too long for an SNN. Is there any way to reduce the number of timesteps?

Ym-Shan commented 1 month ago

We can think about the input data from the perspective of rate coding. For an SNN, the usual workflow is to encode the data first and then feed the encoded result into the network. The number of frames (T) you encode the data into determines the temporal length of the input the network receives. So I think you can change the encoding of the network's input data and reduce the number of encoding timesteps, for example as in the sketch below.
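For illustration only (not the original poster's pipeline): a minimal sketch of choosing a small T at encoding time with SpikingJelly's `PoissonEncoder`. The module path `spikingjelly.activation_based.encoding`, the input size of 100, and T = 16 are assumptions for the example.

```python
import torch
from spikingjelly.activation_based import encoding  # older releases expose this under spikingjelly.clock_driven

x = torch.rand(100)   # hypothetical analog input in [0, 1], e.g. per-channel firing rates
T = 16                # chosen number of encoding timesteps, much smaller than 500

encoder = encoding.PoissonEncoder()
# each call draws one Bernoulli spike frame from x; stacking T calls gives shape [T, 100]
spikes = torch.stack([encoder(x) for _ in range(T)])
print(spikes.shape)   # torch.Size([16, 100])
```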

But if the previous work used timesteps=500, then reducing the number of timesteps may well lead to a drop in accuracy.

Are you using an ANN-to-SNN conversion method? Why is the number of timesteps so large?

1439278026 commented 1 month ago

Thank you for your reply. In fact, this is a dataset we collected ourselves. After downsampling, the shape is 80 * 500, but each channel is still very sparse: among the 500 data points, less than 10% are 1. Therefore, I am looking for some way to reduce the number of timesteps, whether through encoding or other methods.
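One possible direction, sketched here under assumptions rather than taken from the thread (the 80*500 shape and ~10% density follow the description above; the target of 50 timesteps and the binning windows are hypothetical), is to compress the temporal axis by binning: OR-ing fixed windows keeps the data binary, while summing within windows preserves spike counts as rates.

```python
import torch

# hypothetical sparse 0/1 recording: 80 channels x 500 timesteps, roughly 10% ones
x = (torch.rand(80, 500) < 0.1).float()

T_new = 50                    # target number of timesteps (assumed)
win = x.shape[1] // T_new     # window length: 500 // 50 = 10

# [80, 500] -> [80, T_new, win]; a bin fires (1) if any spike fell inside its window
binary_binned = x.reshape(80, T_new, win).amax(dim=2)   # shape [80, 50], still 0/1

# alternative: sum within windows to keep spike counts, then treat as a rate input
rate_binned = x.reshape(80, T_new, win).sum(dim=2)      # shape [80, 50], integer counts

print(binary_binned.shape, rate_binned.shape)
```

Binning trades temporal resolution for a shorter sequence; if fine spike timing carries the information, a larger T_new (or a learned temporal downsampling) may be needed.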

Met4physics commented 1 month ago

Are you using rate encoding, or is this a neuromorphic dataset?

1439278026 commented 1 month ago

It is a neuromorphic dataset; after processing, it is in its current form.