rasmusbergpalm / DeepLearnToolbox

Matlab/Octave toolbox for deep learning. Includes Deep Belief Nets, Stacked Autoencoders, Convolutional Neural Nets, Convolutional Autoencoders and vanilla Neural Nets. Each method has examples to get you started.
BSD 2-Clause "Simplified" License

Function caesdlm in caetrain.m of CAE (convolutional autoencoder) project. #39

Open rt77789 opened 11 years ago

rt77789 commented 11 years ago

In the caetrain.m file, I'm confused about the purpose of this statement:

 cae = caesdlm(cae, opts, m);

In my opinion, this statement is useless. Am I right? Thanks.

rasmusbergpalm commented 11 years ago

Nope. It modifies cae. Why would you think it's useless?

rt77789 commented 11 years ago

The current gradient update doesn't involve ddok and ddik, but I found some commented-out statements containing ddok and ddik in the caeapplygrads.m file:

%             cae.vik{i}{j} = cae.momentum * cae.vik{i}{j} + cae.alpha ./ (cae.sigma + cae.ddik{i}{j}) .* cae.dik{i}{j};
%             cae.vok{i}{j} = cae.momentum * cae.vok{i}{j} + cae.alpha ./ (cae.sigma + cae.ddok{i}{j}) .* cae.dok{i}{j};
            cae.vik{i}{j} = cae.alpha * cae.dik{i}{j};
            cae.vok{i}{j} = cae.alpha * cae.dok{i}{j};

I'm not familiar with "stochastic diagonal Levenberg-Marquardt", but I think it's meant to make convergence faster.
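For context: in the commented-out lines, ddik and ddok act as per-weight curvature (diagonal second-derivative) estimates, so each weight gets an effective learning rate of alpha / (sigma + curvature) instead of a single global alpha. A minimal sketch of the two update rules, translated to Python for illustration (the scalar values and variable names here are made up, not taken from the toolbox):

```python
# Sketch of the two update rules from caeapplygrads.m, for one weight.
# d     - gradient of the loss w.r.t. this weight
# dd    - estimated second derivative (what ddik/ddok hold), assumed value
# All numeric values below are illustrative only.
alpha = 0.1       # global learning rate
sigma = 0.01      # damping term; keeps the divisor away from zero
momentum = 0.9

d, dd, v_prev = 0.5, 8.0, 0.0   # gradient, curvature estimate, old velocity

# Plain SGD step (the active lines in caeapplygrads.m):
v_sgd = alpha * d

# SDLM step (the commented-out lines): the effective learning rate is
# alpha / (sigma + dd), so high-curvature weights take smaller steps.
v_sdlm = momentum * v_prev + alpha / (sigma + dd) * d
```

With these made-up numbers the SDLM step is much smaller than the plain SGD step, because the curvature estimate dd = 8.0 shrinks the effective learning rate for that weight.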

rasmusbergpalm commented 11 years ago

Aha, nicely spotted! I can't remember why that is commented out. I'm pretty sure it's safe to bring it back in (and comment out the next two lines, obviously). You can do a simple test: plot MSE vs. epochs and see whether it decreases faster with SDLM activated or not. If it does, I'll happily accept a pull request that changes it.
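The suggested MSE-vs-epochs comparison can be sketched on a toy problem without the toolbox at all. This sketch (Python, all values assumed for illustration) minimizes a quadratic bowl with very different curvatures per dimension; the stiff dimension makes plain SGD with a single global alpha diverge, while the SDLM-style per-weight scaling still converges:

```python
# Toy stand-in for the suggested test: record per-epoch MSE with a plain
# SGD step vs. an SDLM-style step. Loss per dimension is 0.5 * c * w^2,
# so the gradient is c * w and the curvature (second derivative) is c.
# This does not use DeepLearnToolbox; alpha, sigma, and curvatures are
# made-up values chosen to show the effect.
def run(use_sdlm, epochs=50):
    alpha, sigma = 0.1, 0.01
    w = [5.0, 5.0]
    curv = [1.0, 25.0]            # per-dimension curvature; 25 is "stiff"
    mse_per_epoch = []
    for _ in range(epochs):
        for i in range(len(w)):
            grad = curv[i] * w[i]
            lr = alpha / (sigma + curv[i]) if use_sdlm else alpha
            w[i] -= lr * grad
        mse_per_epoch.append(sum(x * x for x in w) / len(w))
    return mse_per_epoch

sgd = run(False)    # alpha * 25 > 2 on the stiff dimension: diverges
sdlm = run(True)    # per-weight scaling keeps every step stable
```

Plotting the two returned curves (or just comparing their last entries) is the kind of check the comment above asks for; on a real CAE the curvature estimates come from caesdlm rather than being known constants.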