uoguelph-mlrg / theano_alexnet

Theano-based Alexnet
BSD 3-Clause "New" or "Revised" License

Forward and Backward Propagation #28

Open ksarker1205 opened 8 years ago

ksarker1205 commented 8 years ago

I am interested in seeing where forward and backward propagation happen in the code. Can you point me to that specific portion of the code?

hma02 commented 8 years ago

Hi @ksarker1205, once the computation graph is compiled into a Theano function, forward and backward propagation happen each time that function is called, at this line: https://github.com/uoguelph-mlrg/theano_alexnet/blob/master/train_funcs.py#L165

The function takes the input and forward propagates it through the graph, and its `updates` argument specifies how the gradients are applied during backward propagation.

See how the Theano function is constructed here: https://github.com/uoguelph-mlrg/theano_alexnet/blob/master/alex_net.py#L216
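To make the idea concrete, here is a minimal sketch (in plain NumPy rather than Theano, so it runs standalone) of what one call to such a compiled training function effectively does: forward propagate the minibatch, compute the gradient of the cost (backward propagation), then apply the parameter update that the `updates` argument encodes. A single linear layer with a squared-error cost stands in for the full AlexNet graph; all names here are illustrative, not from the repo.

```python
import numpy as np

rng = np.random.RandomState(0)
W = rng.randn(3, 1) * 0.01        # model parameter (a shared variable in Theano)
lr = 0.1                          # learning rate

x = rng.randn(8, 3)                              # a minibatch of inputs
y_true = x @ np.array([[1.0], [-2.0], [0.5]])    # targets

for _ in range(500):
    # Forward propagation: input -> prediction -> scalar cost
    y_pred = x @ W
    cost = np.mean((y_pred - y_true) ** 2)

    # Backward propagation: gradient of the cost w.r.t. W
    # (Theano derives this symbolically via T.grad)
    grad_W = 2.0 * x.T @ (y_pred - y_true) / len(x)

    # The update rule, i.e. what `updates=[(W, W - lr * grad_W)]` expresses
    # in the compiled Theano function
    W = W - lr * grad_W

print(cost)  # final training cost
```

In Theano, the forward expression, the gradient, and the update rule are all baked into the compiled function at construction time, so a single call performs all three steps shown in the loop body above.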