jeshraghian / snntorch

Deep and online learning with spiking neural networks in Python
https://snntorch.readthedocs.io/en/latest/
MIT License

Add power profiling capabilities #170

Open jeshraghian opened 1 year ago

jeshraghian commented 1 year ago

This seems to be a super popular feature request. Making accurate estimates seems near impossible, but we can probably produce an order-of-magnitude guess here.

The user would construct a model, pass data in, and the power profiling function would return the number of synaptic operations in the forward pass (this could be averaged across batches).

Each synaptic op would be scaled by the energy cost of each selected device, e.g., various GPUs and neuromorphic chips. The same number would be given for non-spiking networks too, which could be achieved by simply removing the spiking modules.
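A minimal sketch of what such a function might look like, assuming fully connected layers and counting SynOps as (active inputs × fan-out) via forward hooks. All names (`estimate_synops`, `estimate_energy`, `ENERGY_PER_SYNOP`) and the energy figures are hypothetical placeholders, not part of snnTorch:

```python
import torch
import torch.nn as nn

# Illustrative per-synaptic-op energy costs in Joules; real values would
# need to be sourced per device and are only order-of-magnitude guesses.
ENERGY_PER_SYNOP = {
    "gpu_mac": 4.6e-12,       # dense MAC, placeholder figure
    "neuromorphic": 2.0e-14,  # sparse accumulate, placeholder figure
}

def estimate_synops(model: nn.Module, data: torch.Tensor) -> float:
    """Estimate the number of synaptic operations in one forward pass."""
    synops = 0.0
    handles = []

    def hook(module, inputs, output):
        nonlocal synops
        # Count only nonzero (spiking) inputs; each active input drives
        # `out_features` synapses in a fully connected layer. For a
        # non-spiking network, dense activations are mostly nonzero, so
        # the same count approximates MAC operations.
        active = inputs[0].detach().count_nonzero().item()
        synops += active * module.out_features

    for m in model.modules():
        if isinstance(m, nn.Linear):
            handles.append(m.register_forward_hook(hook))

    with torch.no_grad():
        model(data)

    for h in handles:
        h.remove()
    return synops

def estimate_energy(model: nn.Module, data: torch.Tensor, device_key: str) -> float:
    """Scale the SynOp count by a per-device energy cost (Joules per op)."""
    return estimate_synops(model, data) * ENERGY_PER_SYNOP[device_key]
```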

SpikingKeras has a similar function that does this really nicely. However, it overstates the improvement offered by spikes because it does not account for overhead (e.g., moving data to/from memory, or between multiple chips).

Including an argument that factors in overhead would be tricky, but useful. The model would be parsed for the number of neurons/synapses, and if either exceeds the capacity of a single chip, we would need to estimate how frequently data moves between chips and add that to the overall energy consumption.
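A very coarse sketch of that overhead term, again with fully connected layers only. The per-chip capacities, the off-chip transfer cost, and the assumption that a `(1 - 1/n_chips)` fraction of spikes leaves its source chip are all placeholders for illustration, not figures for any specific hardware:

```python
import torch.nn as nn

NEURONS_PER_CHIP = 128_000          # assumed per-chip neuron capacity
SYNAPSES_PER_CHIP = 128_000_000     # assumed per-chip synapse capacity
ENERGY_PER_OFFCHIP_SPIKE = 1.0e-9   # assumed Joules per spike sent between chips

def count_resources(model: nn.Module):
    """Count neurons and synapses, considering fully connected layers only."""
    neurons, synapses = 0, 0
    for m in model.modules():
        if isinstance(m, nn.Linear):
            neurons += m.out_features
            synapses += m.in_features * m.out_features
    return neurons, synapses

def interchip_overhead(model: nn.Module, spikes_per_inference: float) -> float:
    """Estimate extra energy from spikes that must cross chip boundaries."""
    neurons, synapses = count_resources(model)
    n_chips = max(
        -(-neurons // NEURONS_PER_CHIP),    # ceiling division
        -(-synapses // SYNAPSES_PER_CHIP),
    )
    if n_chips <= 1:
        return 0.0
    # Crude assumption: the fraction of spikes that leaves its source chip
    # grows with the number of chips and pays the off-chip transfer cost.
    offchip_fraction = 1.0 - 1.0 / n_chips
    return spikes_per_inference * offchip_fraction * ENERGY_PER_OFFCHIP_SPIKE
```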

A lot of coarse estimates would be made, but I think it could be helpful.

mlewandowski0 commented 1 year ago

Dear Sir,

Are there plans or work being done to implement this? Could I implement it, or help with the implementation?

Best Regards,