Open ksarker1205 opened 8 years ago
Hi @ksarker1205, when the computation graph is compiled into a Theano function, forward and backward propagation both happen each time that function is called, at this line: https://github.com/uoguelph-mlrg/theano_alexnet/blob/master/train_funcs.py#L165
The function takes the input and forward-propagates it through the graph, and the `updates` argument of the function specifies how the gradients are applied in the backward pass.
You can see how the Theano function is constructed here: https://github.com/uoguelph-mlrg/theano_alexnet/blob/master/alex_net.py#L216
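To illustrate what a single call to the compiled train function does, here is a rough sketch in plain Python (not Theano code): a forward pass, a backward pass via the hand-derived gradient, and the in-place parameter update that Theano's `updates` argument describes. The toy model (a single linear unit with squared-error loss) and all names here are illustrative, not taken from the repository.

```python
# Sketch of what one call to a compiled Theano train function does,
# using a single linear unit y = w * x with squared-error loss.

def make_train_fn(w0, lr):
    state = {"w": w0}  # stands in for a Theano shared variable

    def train(x, target):
        # Forward propagation through the "graph".
        y = state["w"] * x
        loss = 0.5 * (y - target) ** 2
        # Backward propagation: dloss/dw, which theano.grad
        # derives symbolically from the graph.
        grad_w = (y - target) * x
        # The SGD update rule that the `updates` argument of
        # theano.function encodes; applied on every call.
        state["w"] -= lr * grad_w
        return loss

    return train

train = make_train_fn(w0=0.0, lr=0.1)
for _ in range(50):
    loss = train(x=1.0, target=2.0)  # loss shrinks as w approaches 2.0
```

In real Theano, the forward graph, the gradient, and the update rule are all baked into the compiled function at construction time, so calling it once performs all three steps, which is why there is no separate "backward" call in the training loop.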
I am interested in seeing where forward and backward propagation happen in the code. Can you point me to that specific portion of the code?