fangwei123456 / spikingjelly

SpikingJelly is an open-source deep learning framework for Spiking Neural Network (SNN) based on PyTorch.
https://spikingjelly.readthedocs.io

About LatencyEncoder #286

Closed kumomokumo closed 1 year ago

kumomokumo commented 1 year ago

Hello, I want to train a single-layer fully connected SNN with LatencyEncoder + STDP, based on activation_based/examples/lif_fc_mnist.py. I only changed the encoding: `encoder = encoding.LatencyEncoder(10)`, but I get a tensor size mismatch: RuntimeError: The size of tensor a (64) must match the size of tensor b (16) at non-singleton dimension 0. Is my usage wrong, or can LatencyEncoder not be used on MNIST? Thanks!

fangwei123456 commented 1 year ago

Why not start by printing the shapes of the tensors involved in the computation?

kumomokumo commented 1 year ago

Using a target size (torch.Size([16, 10])) that is different to the input size (torch.Size([64, 10])). I printed it; how should I fix this?

fangwei123456 commented 1 year ago

The information above is too sparse; I can't tell which tensor has which shape. Please provide a minimal example that reproduces the error.

kumomokumo commented 1 year ago

I only modified these 2 lines of the example code: parser.add_argument('-opt', type=str, choices=['sgd', 'adam'], default='sgd', help='use which optimizer. SGD or Adam') # use SGD

encoder = encoding.LatencyEncoder(10)

kumomokumo commented 1 year ago

Sorry... I re-downloaded the source code and the error is gone.

fangwei123456 commented 1 year ago

The problem does exist. The cause is that LatencyEncoder is a stateful encoder, and it should be reset before the next sample is fed in. So a line `encoder.reset()` should be added after every `functional.reset_net(net)`.

https://spikingjelly.readthedocs.io/zh_CN/latest/activation_based/basic_concept.html#id5
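The statefulness described above can be demonstrated with a minimal pure-Python stand-in (a sketch only: `LatencyEncoderSketch` and the toy loop are illustrative assumptions, not SpikingJelly's implementation; the library's `encoding.LatencyEncoder` behaves analogously):

```python
class LatencyEncoderSketch:
    """Toy latency encoder: higher intensity -> earlier spike.

    Like SpikingJelly's LatencyEncoder, it is stateful: the first call
    computes the spike times for the current sample, and each of the T
    calls emits the spikes for one time step.
    """
    def __init__(self, T):
        self.T = T
        self.spike_time = None  # per-sample state
        self.t = 0              # current time step

    def __call__(self, x):
        if self.spike_time is None:
            # linear latency: t_f = round((T - 1) * (1 - x)), x in [0, 1]
            self.spike_time = [round((self.T - 1) * (1 - xi)) for xi in x]
        out = [1 if tf == self.t else 0 for tf in self.spike_time]
        self.t += 1
        return out

    def reset(self):
        # Without this, the spike times of the previous sample leak
        # into the next one -- the bug discussed in this issue.
        self.spike_time = None
        self.t = 0


T = 4
encoder = LatencyEncoderSketch(T)
for sample in ([1.0, 0.0], [0.0, 1.0]):
    spikes = [encoder(sample) for _ in range(T)]
    print(spikes[0])  # spikes emitted at the first time step
    # in the real training loop, functional.reset_net(net) goes here,
    # followed by:
    encoder.reset()
```

Skipping `encoder.reset()` leaves the encoder holding the previous sample's spike times and step counter, so the next sample is encoded incorrectly.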

fangwei123456 commented 1 year ago

A note has now been added to the documentation to keep users from forgetting to reset.