joncox123 / Cortexsys

Matlab GPU Accelerated Deep Learning Toolbox

Error when training a mapping from a 100-neuron input to a 4-neuron output #13

Open · RyanCV opened 7 years ago

RyanCV commented 7 years ago

The network is defined as follows: input size = 100, output size = 4.

layers.af{1} = [];
layers.sz{1} = [input_size 1 1];
layers.typ{1} = defs.TYPES.INPUT;               % input layer

layers.af{end+1} = ReLU(defs, []);
layers.sz{end+1} = [input_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED; % hidden fully connected layer

layers.af{end+1} = ReLU(defs, []);
layers.sz{end+1} = [output_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED; % output fully connected layer

if defs.plotOn
    nnShow(23, layers, defs);
end

Error using  -
Matrix dimensions must agree.

Error in squaredErrorCostFun (line 2)
    J = (Y.v(:,:,t)-A.v(:,:,t)).^2;

Error in ReLU/cost (line 65)
            J = squaredErrorCostFun(Y, A, m, t);

Error in nnCostFunctionCNN (line 29)
J = nn.l.af{nn.N_l}.cost(Y, nn.A{nn.N_l}, m, 1) + J_s;

Error in Train_proposal>@(nn,r,newRandGen)nnCostFunctionCNN(nn,r,newRandGen)

Error in gradientDescentAdaDelta (line 69)
    [J, dJdW, dJdB] = feval(f, nn, r, true);

Error in Train_proposal (line 158)
nn = gradientDescentAdaDelta(costFunc, nn, defs, [], [], [], [], 'Training Entire Network');
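
The trace shows the elementwise subtraction inside squaredErrorCostFun failing because the label array Y and the activation array A have different shapes. A minimal illustration of the failure (hypothetical sizes, assuming a batch of m = 10 examples):

A = rand(4, 10);   % final-layer activations: [output_size x m]
Y = rand(10, 4);   % labels loaded the other way round: [m x output_size]
J = (Y - A).^2;    % Error using - : Matrix dimensions must agree.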
joncox123 commented 7 years ago

It's hard to say exactly without seeing your exact code and data formats. However, it seems the dimensionality of your label matrix (Y) could be incorrect. Check the shape of Y and try permuting it so that it matches that of the final output layer.
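
A minimal sketch of that check (hypothetical, assuming Y was loaded as [m x output_size] while the final layer produces [output_size x m], and that the permute happens before Y is wrapped in varObj):

size(Y)                        % inspect the current label shape
if size(Y, 1) ~= output_size
    Y = permute(Y, [2 1 3]);   % swap the first two dimensions
end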

RyanCV commented 7 years ago

Thanks for your help, it solved the problem.

RyanCV commented 7 years ago

@joncox123 by the way, where can I find '~/mnist_full.mat' in order to run your example?

RyanCV commented 7 years ago

@joncox123, if I want to design a network that has only one hidden layer (one fully connected layer + ReLU), and an output layer that is only a fully connected layer, is the following design right?

layers.af{1} = [];
layers.sz{1} = [input_size 1 1];
layers.typ{1} = defs.TYPES.INPUT;               % input layer

layers.af{end+1} = ReLU(defs, defs.COSTS.SQUARED_ERROR);
layers.sz{end+1} = [input_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED; % hidden layer 1

layers.af{end+1} = ReLU(defs, defs.COSTS.SQUARED_ERROR); % if I use [] here, will there be an error?
layers.sz{end+1} = [output_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED; % output layer: how do I make it fully connected only?

X = varObj(X,defs, defs.TYPES.INPUT);
Y = varObj(Y,defs, defs.TYPES.OUTPUT);

Thanks.
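
For what it's worth, the pattern elsewhere in this thread suggests that hidden layers pass [] as the second ReLU argument and only the final layer's activation object carries the cost constant; whether a purely linear (activation-free) output layer is supported is not shown here. A sketch under that assumption:

layers.af{end+1} = ReLU(defs, []);                        % hidden layer: no cost attached
layers.sz{end+1} = [input_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED;

layers.af{end+1} = ReLU(defs, defs.COSTS.SQUARED_ERROR);  % output layer: cost attached here
layers.sz{end+1} = [output_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED;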