ZK-Zhou / spikformer

ICLR 2023, Spikformer: When Spiking Neural Network Meets Transformer
MIT License

Energy consumption calculation on CIFAR10 dataset #23

Open msjun23 opened 6 months ago

msjun23 commented 6 months ago

Hi, thanks for your impressive work and for publishing the code!

I reproduced your work on the CIFAR10 dataset and calculated the energy consumption with this code (Spikingformer).

I got the following output:

SSA info: {'depth': 4, 'Nheads': 12, 'embSize': 384, 'patchSize': 4, 'Tsteps': 4}
Firing rate of Q/K/V inputs in each block: [[0.14379521053803118, 0.10392173096726212, 0.08082685798783845], [0.09622804452724094, 0.079531397717663, 0.07723161116053787], [0.0788185744534565, 0.07352738761449162, 0.0678107915799829], [0.0876925362250473, 0.08682637812593315, 0.06879291607986522]]
Number of operations: 0.005312266 G MACs, 0.2458512465306023 G ACs
Energy consumption: 0.24570254547754208 mJ
Nops: ['3703662346.0 Ops', '284837097.97817653 Ops', '1073133322.0 Ops']
Nparams: 9330874
Time cost: 0.4905501802762349 min
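For reference, here is a minimal sketch of how I understand the energy estimate, assuming the 45 nm energy model (4.6 pJ per MAC, 0.9 pJ per AC) commonly used in Spikformer-style papers; the function and variable names below are my own, not from the repo:

# Sketch of the assumed energy model, not the repository's actual code.
E_MAC_PJ = 4.6  # assumed energy per multiply-accumulate in 45 nm CMOS, pJ
E_AC_PJ = 0.9   # assumed energy per accumulate in 45 nm CMOS, pJ

def estimate_energy_mj(g_macs: float, g_acs: float) -> float:
    """Total energy in mJ from operation counts given in giga-ops.

    1 G op at 1 pJ/op = 1e9 * 1e-12 J = 1e-3 J = 1 mJ,
    so mJ = (G ops) * (pJ per op).
    """
    return g_macs * E_MAC_PJ + g_acs * E_AC_PJ

print(estimate_energy_mj(0.005312266, 0.2458512465306023))
# ~0.2457025454775421 mJ

Plugging in the reported 0.005312266 G MACs and 0.2458512465 G ACs gives about 0.2457 mJ, which matches the printed energy, so the energy figure at least seems internally consistent with that model.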

However, there is no exact information about energy consumption on CIFAR10 in your paper, so I can't verify whether these values are correct.

Could you confirm whether these are properly reproduced results?

I used the default config settings in cifar10.yaml, i.e., Spikformer-4-384.

Thank you.