vlfeat / matconvnet

MatConvNet: CNNs for MATLAB

How to add a regularization term to the MatConvNet loss function? #887

eroszip opened this issue 7 years ago (status: Open)

eroszip commented 7 years ago

As we know, a regularization term is a good way to prevent overfitting. In Caffe it is easy to set via the weight_decay parameter, but how do we set it in MatConvNet?

AruniRC commented 7 years ago

Hi, MatConvNet also has a weight-decay term.

Each layer has a weight decay field (in the DagNN wrapper this is net.params(i).weightDecay). It is usually set to 1 for all layers/parameters.

During training with cnn_train, a global weight decay is also set (0.0005 by default, hardcoded in the cnn_train code). The effective decay for the i-th parameter is therefore net.params(i).weightDecay * 0.0005.

HTH
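
For reference, a minimal sketch of how the per-parameter field can be adjusted for a DagNN model before training. The model file name ('net.mat') and the filter-naming convention used in the loop are only assumptions; the field names follow the DagNN wrapper described above.

    % Sketch: tune per-parameter weight decay on a DagNN network.
    % 'net.mat' and the 'f'-suffix filter naming are assumptions.
    net = dagnn.DagNN.loadobj(load('net.mat')) ;

    for i = 1:numel(net.params)
      if ~isempty(regexp(net.params(i).name, 'f$', 'once'))
        net.params(i).weightDecay = 1 ;   % keep the L2 penalty on filters
      else
        net.params(i).weightDecay = 0 ;   % disable it on biases
      end
    end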

layumi commented 7 years ago

Hi @eroszip, I agree with @AruniRC. You can change the global value opts.weightDecay = 0.0005; (line 22 of /matlab/cnn_train_dagnn.m) without changing weightDecay for every parameter. This option is multiplied by each parameter's own weightDecay to give the final weight decay used when updating the network (line 295 of /matlab/cnn_train_dagnn.m).
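
If you prefer not to edit the file, the global value can usually also be passed as a training option. A hedged sketch follows; the function name and option handling assume the interface of the bundled example trainer (cnn_train_dag), and net, imdb, and getBatch are placeholders for your own model and data pipeline.

    % Sketch: override the global weight decay via options instead of
    % editing the hardcoded default. Assumes the examples' cnn_train_dag
    % interface; imdb and getBatch are placeholders.
    [net, info] = cnn_train_dag(net, imdb, @getBatch, ...
                                'weightDecay', 1e-3, ...
                                'learningRate', 0.001) ;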

simsonthebest commented 6 years ago

Hi! I was wondering if there is a way to use L1 weight regularization instead of the L2 penalty in MatConvNet.