IntelLabs / IntelNeuromorphicDNSChallenge

Intel Neuromorphic DNS Challenge
MIT License

Problems with validating model size in Kilobytes #19

Closed Michaeljurado42 closed 1 year ago

Michaeljurado42 commented 1 year ago

Hello and good day. I was trying to benchmark the size of my submission in kilobytes. I wrote a script that loops through the netx layers and sums up the size of the model, and it reports that the baseline SDNN requires about 670 KB. I suspect there is either a mistake in my code or that the published baseline model size is slightly miscalculated. If the mistake is on my side, then my estimate for my own model's size will be off as well.
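One common source of inflation in a per-layer loop like this is charging a fixed byte (or a full float32 word) per parameter, rather than only the bits needed to represent the largest magnitude in each tensor. A toy comparison of the two accountings, with made-up shapes and values (not the actual baseline SDNN weights):

```python
import numpy as np

# Hypothetical layer tensors -- NOT the actual baseline SDNN weights.
w1 = np.full((512, 257), 5, dtype=np.int32)    # max |w| = 5  -> ceil(log2(5))  = 3 bits/entry
w2 = np.full((512, 512), 60, dtype=np.int32)   # max |w| = 60 -> ceil(log2(60)) = 6 bits/entry
layers = [w1, w2]

# Naive accounting: one full byte per parameter
naive_kb = sum(w.size for w in layers) / 1024

# Bit-width accounting: only the bits needed for the largest magnitude per tensor
bits = sum(np.ceil(np.log2(np.abs(w).max())) * w.size for w in layers)
packed_kb = bits / 8 / 1024

print(naive_kb, packed_kb)  # the naive count comes out noticeably larger
```

The same tensors come out to different totals depending on which accounting the script uses, which could explain part of the gap.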

Also, is it possible to submit a metrics board solution on the 21st if our model weights are frozen?

bamsumit commented 1 year ago

Hi @Michaeljurado42, here is how the model size was calculated.

import numpy as np
from lava.lib.dl import netx

# Load the trained network description exported by lava-dl
net = netx.utils.NetDict('network.net')

# Layers 1-3 carry synaptic weights; layers 1-2 also carry delays
weights = [net['layer'][l]['weight'] for l in range(1, 4)]
delays = [net['layer'][l]['delay'] for l in range(1, 3)]

# Each tensor is charged ceil(log2(max |value|)) bits per entry
model_bits = sum([np.ceil(np.log2(np.abs(w).max())) * np.prod(w.shape) for w in weights]) + sum([np.ceil(np.log2(np.abs(d)).max()) * np.prod(d.shape) for d in delays])
print('Model Size (KB):', model_bits / 8 / 1024)

> Also, is it possible to submit a metrics board solution on the 21st if our model weights are frozen?

You need to have at least one metricsboard entry with your model before the Track 1 test set is announced (EOD August 18, 2023) to qualify for Track 1 evaluation.

Michaeljurado42 commented 1 year ago

Thanks, @bamsumit! I have added the metricsboard entry to my fork.