vlfeat / matconvnet

MatConvNet: CNNs for MATLAB

Local Response Normalization backpropagation #10

Open · 4fur4 opened this issue 10 years ago

4fur4 commented 10 years ago

Hi!

I'm checking the C++ implementation of the local response normalization here:

https://github.com/vlfeat/matconvnet/blob/master/matlab/src/bits/normalize.cpp

Based on Hinton's paper http://www.cs.toronto.edu/~fritz/absps/imagenet.pdf we have that the response-normalized activity is given by the formula (notation adapted for readability):

$$y_t = x_t \left( \kappa + \alpha \sum_{q \in G(t)} x_q^2 \right)^{-\beta}$$

where $G(t)$ is the group of feature channels summed over for output channel $t$,

which totally fits with the implementation mentioned above. To get the back-propagation formula we have that if

$$L_t = \kappa + \alpha \sum_{q \in G(t)} x_q^2, \qquad \text{so that} \qquad y_t = x_t\, L_t^{-\beta},$$

then

$$\frac{\partial z}{\partial x_t} = \dot z_t\, L_t^{-\beta} - 2\alpha\beta\, x_t \sum_{q:\, t \in G(q)} \dot z_q\, x_q\, L_q^{-\beta-1},$$

where $z$ is the quantity being back-propagated and $\dot z_q = \partial z / \partial y_q$ is the derivative arriving from the layer above.
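For completeness, the elementwise chain-rule step behind this (in the same notation; it is just the derivative of $y_q = x_q L_q^{-\beta}$ with respect to $x_t$) is

$$\frac{\partial y_q}{\partial x_t} = \delta_{qt}\, L_q^{-\beta} - 2\alpha\beta\, x_q x_t\, L_q^{-\beta-1}\,\mathbf{1}[t \in G(q)], \qquad \frac{\partial z}{\partial x_t} = \sum_q \dot z_q\, \frac{\partial y_q}{\partial x_t},$$

which, summed over $q$, gives the expression above.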

If I'm not wrong, this maps to the C++ implementation as

- xat(t) ↔ $x_t$ (the input data)
- zat(t) ↔ $\partial z / \partial y_t$ (the derivative coming in)
- yat(t) ↔ $\partial z / \partial x_t$ (the derivative being computed)
- ab2 ↔ $2\alpha\beta$
- Lbeta1 ↔ $L^{-\beta-1}$

which should lead to the formula

yat(t) -= zat(q) * xat(t) * xat(q) * ab2 * Lbeta1 ;

(accumulated over the neighborhood indices q)

however, the implementation is (lines 276-280)

yat(q) -= zat(t) * xat(t) * xat(q) * ab2 * Lbeta1 ;

Note the change zat(q) -> zat(t). Is there anything wrong there that I didn't notice?

Thank you!

Urko

johnny5550822 commented 9 years ago

I am interested in the answer to this question too.

jroose commented 9 years ago

I'm not convinced that the negative term really makes a difference. In Hinton's paper he sets the values k=2, beta=0.75, and alpha=1e-4. Using those values, I believe the negative term is almost certainly negligible because the positive term is likely to be so much larger.
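For a rough sense of scale, here is a tiny standalone sketch with those constants; the unit-magnitude activations and derivatives are illustrative assumptions, and the gap narrows as the activations or alpha grow:

```cpp
// Rough magnitude check of the two gradient terms with the paper's constants
// (k = 2, alpha = 1e-4, beta = 0.75, window size n = 5). The unit activation
// and derivative magnitudes are illustrative assumptions.
#include <cmath>
#include <cstdio>

int main() {
  const double k = 2.0, alpha = 1e-4, beta = 0.75;
  const int n = 5;
  const double x = 1.0, zdot = 1.0;        // representative magnitudes
  const double L = k + alpha * n * x * x;  // ~2.0005
  const double pos = zdot * std::pow(L, -beta);
  const double neg = 2.0 * alpha * beta * x * (n * zdot * x) * std::pow(L, -beta - 1.0);
  // Prints roughly: pos=0.594460 neg=0.000223 ratio=2667
  std::printf("pos=%.6f neg=%.6f ratio=%.0f\n", pos, neg, pos / neg);
  return 0;
}
```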

Math isn't my specialty though, so I'd be interested in your opinion, Urko. Did you happen to come to a similar conclusion?

easten20 commented 9 years ago

I think the result is the same; it is just assigned to a different variable. Instead of assigning it to yat(t), they assign it to yat(q).
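One way to make this precise, in the notation from the original post: the implementation's scatter loop computes, for each output index $q$,

$$y_q \mathrel{-}= 2\alpha\beta \sum_{t:\, q \in G(t)} \dot z_t\, x_t\, x_q\, L_t^{-\beta-1},$$

and relabeling the free index $q \to t$ and the bound index $t \to s$ gives exactly

$$y_t \mathrel{-}= 2\alpha\beta\, x_t \sum_{s:\, t \in G(s)} \dot z_s\, x_s\, L_s^{-\beta-1},$$

which is the per-output formula derived in the original post. Both loops accumulate the same totals; they just visit the terms in a different order.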

https://github.com/vlfeat/matconvnet/blob/b7dd9c963541582faa04572f510e0cc20545e086/matlab/src/bits/impl/normalize_cpu.cpp

lines 271-275

yat(q) -= zat(t) * xat(t) * xat(q) * ab2 * Lbeta1 ;

Is that the line of code you meant?
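If it helps, here is a minimal standalone sketch (not the MatConvNet code itself) that evaluates both accumulation orders on a small 1-D example; the names and constants (x, zdot, kappa, alpha, beta, half) and the symmetric window are illustrative assumptions. The two results agree to rounding error:

```cpp
// Compare the "gather" form derived in the original post with the "scatter"
// form used in normalize.cpp, on a 1-D channel vector.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
  const int N = 8;     // number of channels
  const int half = 2;  // G(t) = [t-half, t+half], clipped to [0, N-1]
  const double kappa = 2.0, alpha = 1e-4, beta = 0.75;
  const std::vector<double> x    = {0.5, -1.2, 0.8, 2.0, -0.3, 1.5, -0.7, 0.9};
  const std::vector<double> zdot = {1.0, -0.5, 0.2, 0.7, -1.1, 0.4, 0.9, -0.2};

  // L(t) = kappa + alpha * sum_{q in G(t)} x(q)^2
  auto L = [&](int t) {
    double acc = kappa;
    for (int q = std::max(0, t - half); q <= std::min(N - 1, t + half); ++q)
      acc += alpha * x[q] * x[q];
    return acc;
  };

  const double ab2 = 2.0 * alpha * beta;
  std::vector<double> gather(N, 0.0), scatter(N, 0.0);

  // (A) Gather, as in the derived formula: each output t reads zdot(q), L(q).
  // With a symmetric window, {q : t in G(q)} = G(t).
  for (int t = 0; t < N; ++t) {
    gather[t] = zdot[t] * std::pow(L(t), -beta);
    for (int q = std::max(0, t - half); q <= std::min(N - 1, t + half); ++q)
      gather[t] -= zdot[q] * x[t] * x[q] * ab2 * std::pow(L(q), -beta - 1.0);
  }

  // (B) Scatter, as in normalize.cpp: each t writes into y(q) for q in G(t),
  // reading zdot(t) and L(t).
  for (int t = 0; t < N; ++t) {
    scatter[t] += zdot[t] * std::pow(L(t), -beta);
    for (int q = std::max(0, t - half); q <= std::min(N - 1, t + half); ++q)
      scatter[q] -= zdot[t] * x[t] * x[q] * ab2 * std::pow(L(t), -beta - 1.0);
  }

  for (int t = 0; t < N; ++t)
    std::printf("t=%d  gather=%+.12f  scatter=%+.12f\n", t, gather[t], scatter[t]);
  return 0;
}
```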