neurosim / DNN_NeuroSim_V1.3

Benchmark framework of compute-in-memory based accelerators for deep neural network (inference engine focused)

about batchnormalization layer #10

Closed · JoyKwan closed this issue 3 years ago

JoyKwan commented 3 years ago

Hello, I've learned a lot from NeuroSim, but I have a quick question about the network architecture: I didn't see a batch normalization layer anywhere in the code related to the network structure definitions, such as modules/quantization_cpu_np_infer.py and cifar/model.py. Is there really no batch normalization layer? If so, why not? Thanks!

neurosim commented 3 years ago

Hi! Batch normalization is replaced by a constant scaling layer in the "WAGE" algorithm for training and inference with integers. You can check the paper https://arxiv.org/pdf/1802.04680.pdf for more details.
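
For readers unfamiliar with that trick, here is a minimal sketch (not the actual NeuroSim code) of what "replacing batch normalization with constant scaling" can look like, assuming a WAGE-style layer whose pre-activations are divided by a fixed, fan-in-dependent constant rounded to a power of two; the module name `ConstantScale` and its parameters `fan_in`, `bits_a`, and `beta` are illustrative, not part of NeuroSim's API:

```python
import math
import torch.nn as nn

def shift(x):
    # Round a positive scalar to the nearest power of two (Shift() in the WAGE paper).
    return 2.0 ** round(math.log2(x))

class ConstantScale(nn.Module):
    """Illustrative replacement for BatchNorm: divide by a fixed, layer-wise constant."""
    def __init__(self, fan_in, bits_a=8, beta=1.5):
        super().__init__()
        limit = math.sqrt(6.0 / fan_in)        # Glorot-style weight limit L for this layer
        sigma = 2.0 ** (1 - bits_a)            # smallest quantization step for bits_a activations
        l_min = beta * sigma
        # Constant scaling factor alpha, fixed at construction time (no batch statistics).
        self.alpha = max(shift(l_min / limit), 1.0)

    def forward(self, x):
        # Scale by the precomputed constant instead of normalizing by batch mean/variance.
        return x / self.alpha
```

Because alpha depends only on the layer's fan-in and bit-width, it is known at compile time and adds no data-dependent normalization to the inference engine, which is why no BatchNorm layers appear in the model definitions.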