NeuromorphicProcessorProject / snn_toolbox

Toolbox for converting analog to spiking neural networks (ANN to SNN), and running them in a spiking neuron simulator.

Error in build_convolution targeting Brian2 #44

Closed wilkieolin closed 5 years ago

wilkieolin commented 5 years ago

I'm building the sample Keras LeNet ANN into an SNN targeting the Brian2 backend. However, there is an error in the build_convolution step of the conversion: conn is referenced outside the loop that defines it.

def build_convolution(self, layer, input_weight=None):
        from snntoolbox.simulation.utils import build_convolution

        delay = self.config.getfloat('cell', 'delay')
        transpose_kernel = \
            self.config.get('simulation', 'keras_backend') == 'tensorflow'
        self._conns, self._biases = build_convolution(layer, delay,
                                                      transpose_kernel)
        self.set_biases()

        print("Connecting layer...")
        for conn in self._conns:
            i = conn[0]
            j = conn[1]
            self.connections[-1].connect(i=i, j=j)
        if input_weight is not None:
            self.connections[-1].w = input_weight.flatten()
        else:
            # --- here conn is referenced without being defined ---
            self.connections[-1].w[i, j] = conn[2]

I believe the following code fixes the issue, but I wanted to confirm.

def build_convolution(self, layer, input_weight=None):
        from snntoolbox.simulation.utils import build_convolution

        delay = self.config.getfloat('cell', 'delay')
        transpose_kernel = \
            self.config.get('simulation', 'keras_backend') == 'tensorflow'
        self._conns, self._biases = build_convolution(layer, delay,
                                                      transpose_kernel)
        self.set_biases()

        print("Connecting layer...")
        # collect the (source, target, weight) tuples into an array so the
        # synapses can be connected in a single vectorized call
        np_conns = np.array(self._conns)

        self.connections[-1].connect(i=np_conns[:,0].astype('int64'), j=np_conns[:,1].astype('int64'))
        if input_weight is None:
            self.connections[-1].w = np_conns[:,2]
        else:
            self.connections[-1].w = input_weight.flatten()
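
For context, Brian2's Synapses.connect accepts arrays for the i and j arguments, which is what makes the single vectorized call above possible. A rough standalone sketch (the neuron groups and connection tuples here are made up for illustration; only the (source index, target index, weight) layout is taken from the snippets above):

import numpy as np
from brian2 import NeuronGroup, Synapses

pre = NeuronGroup(4, 'v : 1')
post = NeuronGroup(4, 'v : 1')
syn = Synapses(pre, post, 'w : 1')

# Rows follow the (source index, target index, weight) layout used above.
conns = np.array([(0, 1, 0.5), (1, 2, -0.25), (2, 3, 0.75)])

# One vectorized connect call instead of looping over each tuple.
syn.connect(i=conns[:, 0].astype('int64'), j=conns[:, 1].astype('int64'))
syn.w = conns[:, 2]
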
rbodo commented 5 years ago

This indentation error was introduced by a recent pull request and I unfortunately did not catch it in review. Thanks for noticing and for proposing the parallelized approach. If you make two cosmetic changes (spaces after the commas, and a line break before the j=... argument), I'd be happy to pull this fix. Thank you!
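
For clarity, applying those two changes to the connect block would give:

        self.connections[-1].connect(i=np_conns[:, 0].astype('int64'),
                                     j=np_conns[:, 1].astype('int64'))
        if input_weight is None:
            self.connections[-1].w = np_conns[:, 2]
        else:
            self.connections[-1].w = input_weight.flatten()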