hysic / NNDL_ReadingNotes

Reading notes for Neural Networks and Deep Learning

Chp.2 How the backpropagation algorithm works #2

hysic opened this issue 7 years ago

hysic commented 7 years ago

Backpropagation: computing the partial derivatives of the cost function with respect to the weights w and the biases b.

To apply backpropagation, the cost function needs to satisfy two conditions:

  1. The cost function can be written as an average C=1/n*∑C_x over cost functions C_x for individual training examples, x.
  2. The cost function can be written as a function of the outputs from the neural network.

An example of a cost function satisfying both conditions is the quadratic cost:

C = 1/(2n) ∑_x ||y(x) − a^L(x)||²

where n is the number of training examples, y(x) is the desired output for input x, and a^L(x) is the network's actual output.
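A small sketch (not the book's code, names are my own) checking condition 1 numerically: the quadratic cost over the whole training set equals the average of the per-example costs C_x.

```python
import numpy as np

def quadratic_cost_per_example(a, y):
    """C_x = 1/2 * ||y - a||^2 for a single training example."""
    return 0.5 * np.sum((y - a) ** 2)

rng = np.random.default_rng(0)
n = 4                            # number of training examples (toy value)
outputs = rng.random((n, 3))     # network outputs a^L(x), one row per example
targets = rng.random((n, 3))     # desired outputs y(x)

# Average of the per-example costs C_x ...
c_avg = np.mean([quadratic_cost_per_example(a, y)
                 for a, y in zip(outputs, targets)])
# ... equals the total quadratic cost C = 1/(2n) * sum_x ||y(x) - a^L(x)||^2
c_total = np.sum((targets - outputs) ** 2) / (2 * n)
assert np.isclose(c_avg, c_total)
```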

hysic commented 7 years ago

Definitions: the weighted input to layer l is z^l = w^l a^{l−1} + b^l, so the activation is a^l = σ(z^l); the error of neuron j in layer l is defined as δ^l_j ≡ ∂C/∂z^l_j.

The four backpropagation (BP) equations:

  1. δ^L = ∇_a C ⊙ σ′(z^L) (BP1, error in the output layer)
  2. δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ′(z^l) (BP2, error in terms of the next layer's error)
  3. ∂C/∂b^l_j = δ^l_j (BP3)
  4. ∂C/∂w^l_{jk} = a^{l−1}_k δ^l_j (BP4)

The backpropagation algorithm (for a single training example):

  1. Input x: set the activation a^1 = x for the input layer.
  2. Feedforward: for each l = 2, 3, …, L compute z^l = w^l a^{l−1} + b^l and a^l = σ(z^l).
  3. Output error: compute δ^L = ∇_a C ⊙ σ′(z^L).
  4. Backpropagate the error: for each l = L−1, L−2, …, 2 compute δ^l = ((w^{l+1})^T δ^{l+1}) ⊙ σ′(z^l).
  5. Output: the gradient is ∂C/∂w^l_{jk} = a^{l−1}_k δ^l_j and ∂C/∂b^l_j = δ^l_j.
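These steps can be sketched in NumPy for a fully-connected sigmoid network with quadratic cost. This is an illustrative sketch in the spirit of the book's code, not its exact implementation; `weights[i]`/`biases[i]` hold the parameters connecting layer i+1 to layer i+2 in the notes' 1-based notation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def backprop(x, y, weights, biases):
    """One forward + one backward pass for a single example (quadratic cost).

    Returns (nabla_w, nabla_b): gradients of C_x, layer by layer.
    """
    # Steps 1-2: feedforward, storing every z^l and a^l
    a = x
    activations = [x]
    zs = []
    for w, b in zip(weights, biases):
        z = w @ a + b
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)
    # Step 3 (BP1): delta^L = (a^L - y) * sigma'(z^L) for the quadratic cost
    delta = (activations[-1] - y) * sigmoid_prime(zs[-1])
    nabla_w = [delta @ activations[-2].T]   # (BP4)
    nabla_b = [delta]                       # (BP3)
    # Step 4 (BP2): propagate the error backwards, collecting gradients
    for l in range(2, len(weights) + 1):
        delta = (weights[-l + 1].T @ delta) * sigmoid_prime(zs[-l])
        nabla_w.insert(0, delta @ activations[-l - 1].T)
        nabla_b.insert(0, delta)
    return nabla_w, nabla_b
```

Note that a single call produces the gradients for every layer at once, which is the point made in the next comment about speed.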

BP + mini-batch gradient descent:

  1. Input a mini-batch of m training examples.
  2. For each example x: run the feedforward and backpropagation steps above to obtain the errors δ^{x,l}.
  3. Gradient descent: for each l = L, L−1, …, 2 update w^l → w^l − (η/m) ∑_x δ^{x,l} (a^{x,l−1})^T and b^l → b^l − (η/m) ∑_x δ^{x,l}.

hysic commented 7 years ago

Speed of the backpropagation (BP) algorithm: a single backward pass yields all of the gradients. Time required: one forward pass + one backward pass ≈ two forward passes.

Compare the time needed to compute the gradient from its definition (below): the number of forward passes = the number of weights + 1 (one pass for C(w) itself, plus one pass per weight):

∂C/∂w_j ≈ (C(w + ε e_j) − C(w)) / ε

where ε > 0 is a small number and e_j is the unit vector in the j-th direction.

So backpropagation is far faster than computing the gradient from its definition.
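The pass-counting argument can be made concrete: below, a tiny one-layer sigmoid net (an illustrative stand-in, not the book's code) has its gradient estimated from the definition, and the loop performs exactly one forward pass per weight, plus one for the base cost C(w).

```python
import numpy as np

def cost(w, x, y):
    """Quadratic cost of a tiny one-layer sigmoid net (illustrative)."""
    a = 1.0 / (1.0 + np.exp(-(w @ x)))
    return 0.5 * np.sum((y - a) ** 2)

rng = np.random.default_rng(0)
w = rng.standard_normal((2, 3))
x = rng.standard_normal((3, 1))
y = rng.random((2, 1))

eps = 1e-6
base = cost(w, x, y)              # 1 forward pass for C(w)
grad = np.zeros_like(w)
passes = 1
for idx in np.ndindex(*w.shape):
    w_shift = w.copy()
    w_shift[idx] += eps
    # One extra forward pass per weight: C(w + eps*e_j)
    grad[idx] = (cost(w_shift, x, y) - base) / eps
    passes += 1

assert passes == w.size + 1       # forward passes = number of weights + 1
```

With a million weights this means a million forward passes, versus roughly two for backpropagation.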

hysic commented 7 years ago

Derivation of BP equations (3) and (4):

Both follow from the chain rule and z^l_j = ∑_k w^l_{jk} a^{l−1}_k + b^l_j, where b^l_j and w^l_{jk} each affect C only through z^l_j:

∂C/∂b^l_j = (∂C/∂z^l_j)(∂z^l_j/∂b^l_j) = δ^l_j · 1 = δ^l_j (BP3)

∂C/∂w^l_{jk} = (∂C/∂z^l_j)(∂z^l_j/∂w^l_{jk}) = δ^l_j a^{l−1}_k (BP4)
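The two identities can be verified numerically on a single sigmoid layer with quadratic cost (an illustrative check I wrote, not from the book): compare the δ-based gradients against finite differences.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
w = rng.standard_normal((2, 3))
b = rng.standard_normal((2, 1))
a_prev = rng.standard_normal((3, 1))   # a^{l-1}, a fixed input activation
y = rng.random((2, 1))

def cost(w, b):
    a = sigmoid(w @ a_prev + b)
    return 0.5 * np.sum((y - a) ** 2)

a = sigmoid(w @ a_prev + b)
delta = (a - y) * a * (1 - a)          # delta^l via (BP1), sigma' = a(1-a)

eps = 1e-6
# (BP3): dC/db^l_j should equal delta^l_j
num_db = np.zeros_like(b)
for j in range(b.shape[0]):
    b_shift = b.copy()
    b_shift[j, 0] += eps
    num_db[j, 0] = (cost(w, b_shift) - cost(w, b)) / eps
assert np.allclose(num_db, delta, atol=1e-4)

# (BP4): dC/dw^l_{jk} should equal a^{l-1}_k * delta^l_j
num_dw = np.zeros_like(w)
for j, k in np.ndindex(*w.shape):
    w_shift = w.copy()
    w_shift[j, k] += eps
    num_dw[j, k] = (cost(w_shift, b) - cost(w, b)) / eps
assert np.allclose(num_dw, delta @ a_prev.T, atol=1e-4)
```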