Dengyu-Wu / spkeras

Conversion from CNNs to SNNs using Tensorflow-Keras
MIT License

List index out of range #7

Closed · sauravtii closed this issue 1 year ago

sauravtii commented 1 year ago

Hello! I am facing the same issue again with the following architecture. Everything else is the same as before. Can you please help me with this?

from tensorflow.keras.layers import Input, Conv2D, Activation, Flatten, Dense
from tensorflow.keras.models import Model

input_shape = (32, 32, 3)
input_layer = Input(input_shape)

layer = Conv2D(filters=4,
               kernel_size=(1, 1),
               strides=(1, 1),
               padding="same")(input_layer)

layer = Activation('relu')(layer)

# 1
layer = Conv2D(filters=64,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(input_layer)

# 2
layer = Conv2D(filters=64,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

layer = Activation('relu')(layer)

# Using Conv layer as a pooling layer
layer = Conv2D(filters=64,
               kernel_size=(3, 3),
               strides=(2, 2),
               padding="same")(layer)

# 3
layer = Conv2D(filters=128,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

layer = Activation('relu')(layer)

# 4
layer = Conv2D(filters=128,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

# Using Conv layer as a pooling layer
layer = Conv2D(filters=128,
               kernel_size=(3, 3),
               strides=(2, 2),
               padding="same")(layer)

layer = Activation('relu')(layer)

# 5
layer = Conv2D(filters=256,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

# 6
layer = Conv2D(filters=256,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

layer = Activation('relu')(layer)

# 7
layer = Conv2D(filters=256,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

# Using Conv layer as a pooling layer
layer = Conv2D(filters=256,
               kernel_size=(3, 3),
               strides=(2, 2),
               padding="same")(layer)

layer = Activation('relu')(layer)

# 8
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

# 9
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

layer = Activation('relu')(layer)

# 10
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

# Using Conv layer as a pooling layer
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(2, 2),
               padding="same")(layer)

layer = Activation('relu')(layer)

# 11
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

# 12
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

layer = Activation('relu')(layer)

# 13
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(1, 1),
               padding="same")(layer)

# Using Conv layer as a pooling layer
layer = Conv2D(filters=512,
               kernel_size=(3, 3),
               strides=(2, 2),
               padding="same")(layer)

layer = Activation('relu')(layer)

layer = Flatten()(layer)

layer = Dense(units=512)(layer)

layer = Dense(units=10)(layer)

layer = Activation('relu')(layer)

model = Model(input_layer, layer)

My model summary:

Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_1 (InputLayer)        [(None, 32, 32, 3)]       0         

 conv2d_1 (Conv2D)           (None, 32, 32, 64)        1792      

 conv2d_2 (Conv2D)           (None, 32, 32, 64)        36928     

 activation_1 (Activation)   (None, 32, 32, 64)        0         

 conv2d_3 (Conv2D)           (None, 16, 16, 64)        36928     

 conv2d_4 (Conv2D)           (None, 16, 16, 128)       73856     

 activation_2 (Activation)   (None, 16, 16, 128)       0         

 conv2d_5 (Conv2D)           (None, 16, 16, 128)       147584    

 conv2d_6 (Conv2D)           (None, 8, 8, 128)         147584    

 activation_3 (Activation)   (None, 8, 8, 128)         0         

 conv2d_7 (Conv2D)           (None, 8, 8, 256)         295168    

 conv2d_8 (Conv2D)           (None, 8, 8, 256)         590080    

 activation_4 (Activation)   (None, 8, 8, 256)         0         

 conv2d_9 (Conv2D)           (None, 8, 8, 256)         590080    

 conv2d_10 (Conv2D)          (None, 4, 4, 256)         590080    

 activation_5 (Activation)   (None, 4, 4, 256)         0         

 conv2d_11 (Conv2D)          (None, 4, 4, 512)         1180160   

 conv2d_12 (Conv2D)          (None, 4, 4, 512)         2359808   

 activation_6 (Activation)   (None, 4, 4, 512)         0         

 conv2d_13 (Conv2D)          (None, 4, 4, 512)         2359808   

 conv2d_14 (Conv2D)          (None, 2, 2, 512)         2359808   

 activation_7 (Activation)   (None, 2, 2, 512)         0         

 conv2d_15 (Conv2D)          (None, 2, 2, 512)         2359808   

 conv2d_16 (Conv2D)          (None, 2, 2, 512)         2359808   

 activation_8 (Activation)   (None, 2, 2, 512)         0         

 conv2d_17 (Conv2D)          (None, 2, 2, 512)         2359808   

 conv2d_18 (Conv2D)          (None, 1, 1, 512)         2359808   

 activation_9 (Activation)   (None, 1, 1, 512)         0         

 flatten (Flatten)           (None, 512)               0         

 dense (Dense)               (None, 512)               262656    

 dense_1 (Dense)             (None, 10)                5130      

 activation_10 (Activation)  (None, 10)                0         

=================================================================
Total params: 20,476,682
Trainable params: 20,476,682
Non-trainable params: 0
_________________________________________________________________

The error:

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
Cell In [13], line 7
      2 from spkeras.spkeras.models import cnn_to_snn
      4 #Current normalisation using cnn_to_snn
      5 ##Default: signed_bit=0, amp_factor=100, method=1, epsilon = 0.001
----> 7 snn_model = cnn_to_snn(signed_bit=0)(cnn_model,x_train)

File /opt/ml_team_data/saurav/new/saurav/vgg16/SpKeras/spkeras/spkeras/models.py:29, in cnn_to_snn.__call__(self, mdl, x_train)
     27 self.use_bias = use_bias        
     28 self.get_config()
---> 29 self.model = self.convert(mdl,x_train,                    
     30                           thresholding = self.thresholding,
     31                           scaling_factor = self.scaling_factor,
     32                           method = self.method,
     33                           timesteps=self.timesteps)
     35 return self

File /opt/ml_team_data/saurav/new/saurav/vgg16/SpKeras/spkeras/spkeras/models.py:101, in cnn_to_snn.convert(self, mdl, x_train, thresholding, scaling_factor, method, timesteps)
     98     _weights[0] = _weights[0].astype(int)   
     99     _weights[0] = _weights[0]/2**bit
--> 101 _bias = kappa*_weights[1]/lmax[num+1]
    102 _bias = _bias/norm
    103 bias.append(_bias.tolist())    

IndexError: list index out of range
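The failure mode can be illustrated without spkeras itself. This is a hypothetical, simplified sketch (not spkeras's actual internals): CNN-to-SNN converters of this style typically record one maximum activation value (`lmax`) per ReLU layer, then index that list once per weighted (Conv/Dense) layer, as in the `lmax[num+1]` line of the traceback. With 20 weighted layers but only 10 activations, the index runs past the end of the list:

```python
# Hypothetical illustration of the IndexError, assuming the converter builds
# one lmax entry per ReLU layer (plus one for the input scale) and indexes it
# once per weighted layer, as `lmax[num + 1]` in models.py suggests.
conv_layers = 18 + 2          # 18 Conv2D + 2 Dense layers in the posted model
relu_layers = 10              # but only 10 Activation('relu') layers

lmax = [1.0] * (relu_layers + 1)   # one entry per ReLU, plus the input scale

def scale_bias(num):
    """Mimics the shape of `_bias = kappa * w / lmax[num + 1]`."""
    return lmax[num + 1]           # fails once num + 1 >= len(lmax)

try:
    for num in range(conv_layers):
        scale_bias(num)
except IndexError as e:
    print("IndexError:", e)        # same 'list index out of range' failure
```

The loop survives the first 10 weighted layers and fails on the 11th, matching the observed crash partway through the conversion.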
Dengyu-Wu commented 1 year ago

You need ReLU activation layer after each Conv2D.
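One way to catch this before calling `cnn_to_snn` is a small structure check. The helper below is hypothetical (not part of spkeras): it walks a list of layer type names and flags every Conv2D/Dense that is not immediately followed by an Activation layer:

```python
# Hypothetical pre-conversion check: confirm every Conv2D/Dense layer is
# immediately followed by an Activation layer, which is the structure the
# maintainer says the converter expects.
def missing_relu_after(layer_types):
    """Return indices of Conv2D/Dense layers not followed by an Activation."""
    bad = []
    for i, name in enumerate(layer_types):
        if name in ("Conv2D", "Dense"):
            nxt = layer_types[i + 1] if i + 1 < len(layer_types) else None
            if nxt != "Activation":
                bad.append(i)
    return bad

# Abridged layer sequence from the posted model: two stacked Conv2D layers
# share one Activation, so the first conv of the pair is flagged.
seq = ["InputLayer", "Conv2D", "Conv2D", "Activation", "Conv2D", "Activation"]
print(missing_relu_after(seq))   # -> [1]
```

For a real Keras model you could build the sequence with `[type(l).__name__ for l in model.layers]` and fix every flagged index before converting.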

sauravtii commented 1 year ago

Is it possible to convert a CNN to an SNN with this architecture? I am working on a project that requires this architecture.

Dengyu-Wu commented 1 year ago

The answer would be yes, but only if you can modify the layers so that every Conv2D (and Dense) layer is followed by a ReLU activation layer.

sauravtii commented 1 year ago

Okay, thanks!