Hi @mdeff,
I'm wondering whether the Chebyshev polynomial computation itself is involved in backpropagation, or only the result of the Chebyshev polynomial.
In https://github.com/mdeff/cnn_graph/blob/master/lib/models.py#L495:
```python
def _inference(self, x, dropout):
    with tf.name_scope('gconv1'):
        N, M = x.get_shape()  # N: number of samples, M: number of features
        M = int(M)
        # Transform to Chebyshev basis
        xc = tf.transpose(x)  # M x N
        def chebyshev(x):
            return graph.chebyshev(self.L, x, self.K)
        xc = tf.py_func(chebyshev, [xc], [tf.float32])[0]
        xc = tf.transpose(xc)  # N x M x K
        xc = tf.reshape(xc, [-1, self.K])  # NM x K
        # Filter
        W = self._weight_variable([self.K, self.F])
        y = tf.matmul(xc, W)  # NM x F
        y = tf.reshape(y, [-1, M, self.F])  # N x M x F
        # Bias and non-linearity
        # b = self._bias_variable([1, 1, self.F])
        b = self._bias_variable([1, M, self.F])
        y += b  # N x M x F
        y = tf.nn.relu(y)
    with tf.name_scope('fc1'):
        W = self._weight_variable([self.F*M, NCLASSES])
        b = self._bias_variable([NCLASSES])
        y = tf.reshape(y, [-1, self.F*M])
        y = tf.matmul(y, W) + b
    return y
```
I would like to know whether only `y = tf.nn.relu(y)` (https://github.com/mdeff/cnn_graph/blob/master/lib/models.py#L514) is involved in backpropagation, or also the `tf.py_func` call at https://github.com/mdeff/cnn_graph/blob/master/lib/models.py#L503?
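For reference, here is a minimal sketch of how one might probe this (assuming TF 1.x-style graph mode; `chebyshev_like` is a hypothetical stand-in for `graph.chebyshev`, not the real function). My understanding is that `tf.py_func` has no registered gradient, so `tf.gradients` returns `None` for anything upstream of it, while the trainable weights downstream still receive gradients:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x-style graph mode
tf.disable_v2_behavior()

def chebyshev_like(x):
    # Hypothetical stand-in for graph.chebyshev: an arbitrary NumPy computation
    return x.astype(np.float32)

x = tf.placeholder(tf.float32, [None, 3])
xc = tf.py_func(chebyshev_like, [x], tf.float32)
xc.set_shape(x.get_shape())  # py_func loses static shape information

W = tf.Variable(tf.ones([3, 2]))
y = tf.matmul(xc, W)

# py_func is not differentiable, so backprop stops there:
grad_x = tf.gradients(y, x)  # [None]: no gradient flows back through py_func
grad_W = tf.gradients(y, W)  # a real gradient tensor: W is still trained
print(grad_x, grad_W)
```

If this is right, the Chebyshev transform itself would not need a gradient here, since it is applied to the input features rather than to trainable parameters.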
Thank you very much for your answer.