polynomial fit: minimize sum(loss(poly(xi, theta, degree = 1..16), yi), i = 1..n), where {xi, yi} are sampled from a ground-truth yi = poly(xi, theta) model; the loss can be square loss, Cauchy loss, ...
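A minimal sketch of this benchmark family, assuming NumPy and a noisy ground-truth polynomial; the degree, noise level, and Cauchy scale `c` are illustrative choices, not fixed by the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

def poly(x, theta):
    # evaluate sum_k theta[k] * x**k (theta stored lowest degree first)
    return np.polyval(theta[::-1], x)

# generate {xi, yi} from a ground-truth polynomial plus small noise
degree = 3                                  # any degree in 1..16
theta_true = rng.normal(size=degree + 1)
x = rng.uniform(-1.0, 1.0, size=200)
y = poly(x, theta_true) + 0.05 * rng.normal(size=x.size)

def square_loss(r):
    return 0.5 * r**2

def cauchy_loss(r, c=1.0):
    # robust loss: grows logarithmically, downweights outliers
    return 0.5 * c**2 * np.log1p((r / c) ** 2)

def objective(theta, loss):
    return loss(poly(x, theta) - y).sum()

# least-squares fit is the exact minimizer under square loss, so it
# scores at least as well as the ground-truth parameters
theta_fit = np.polyfit(x, y, degree)[::-1]
assert objective(theta_fit, square_loss) <= objective(theta_true, square_loss)
```

Swapping `square_loss` for `cauchy_loss` in `objective` gives the robust variant of the same problem.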
logistic regression with L2 regularization: random {xi, yi} samples with a random regularization factor in [0, 1]
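A sketch of this problem with a plain batch gradient-descent baseline; the sample sizes, noise level, step size, and iteration count are arbitrary illustration values:

```python
import numpy as np

rng = np.random.default_rng(1)

# random binary-classification sample {xi, yi}, yi in {0, 1}
n, d = 300, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)

lam = rng.uniform(0.0, 1.0)                 # random regularization factor in [0, 1]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(w):
    p = sigmoid(X @ w)
    eps = 1e-12                             # guard log(0)
    nll = -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)).sum()
    f = nll + 0.5 * lam * (w @ w)           # negative log-likelihood + L2 penalty
    g = X.T @ (p - y) + lam * w
    return f, g

# plain full-batch gradient descent as the batch baseline
w = np.zeros(d)
f0, _ = loss_and_grad(w)
for _ in range(500):
    f, g = loss_and_grad(w)
    w -= 0.01 * g
```

Because the objective is strongly convex (any lam > 0), a stochastic optimizer run on the same `loss_and_grad` decomposed per sample has a well-defined optimum to compare against.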
mlps with various numbers of hidden layers and activation functions: mlp[0,1,2,3]_act[tanh,logit,sin,pwave]
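A forward-pass sketch of the mlp[0..3]_act[...] family, assuming "logit" means the logistic sigmoid; `pwave` is not defined in the notes, so it is omitted here, and the layer widths are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

ACTS = {
    "tanh": np.tanh,
    "logit": lambda z: 1.0 / (1.0 + np.exp(-z)),   # logistic sigmoid
    "sin": np.sin,
}

def init_mlp(sizes):
    # sizes = [in, h1, ..., out]; len(sizes) - 2 hidden layers (0..3)
    return [(rng.normal(size=(m, k)) / np.sqrt(m), np.zeros(k))
            for m, k in zip(sizes[:-1], sizes[1:])]

def forward(params, x, act):
    f = ACTS[act]
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:             # linear output layer
            x = f(x)
    return x

# e.g. mlp2_tanh: two hidden layers of width 8, tanh activation
params = init_mlp([4, 8, 8, 1])
out = forward(params, rng.normal(size=(10, 4)), "tanh")
```

`init_mlp([4, 1])` gives the mlp0 case: a single linear layer with no activation applied.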
support for both L1 and L2 regularization
have the stochastic benchmark and unit test use only these functions
compare the stochastic optimizers against the batch ones on these functions
benchmark functions from the Nesterov book: geometric optimization and norm ...
also, there is no need to specify the number of epochs: it can be derived from stoch_ratio
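One way the epoch count could be derived, assuming `stoch_ratio` means the budget of single-sample gradient evaluations relative to the dataset size (this interpretation, and the helper name, are assumptions, not defined in the notes):

```python
def epochs_from_ratio(stoch_ratio, n_samples, batch_size):
    # hypothetical helper: stoch_ratio is taken to be the number of
    # single-sample gradient evaluations divided by the dataset size,
    # so one full pass over the data costs n_samples evaluations
    total_evals = stoch_ratio * n_samples   # evaluation budget
    steps = total_evals // batch_size       # mini-batch steps that fit the budget
    return max(1, int(steps * batch_size // n_samples))
```

Under this reading the user picks a compute budget once, and the same `stoch_ratio` yields comparable work for any batch size.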