cbovar / ConvNetSharp

Deep Learning in C#
MIT License

How to concat multiple layers? #138

Open BackT0TheFuture opened 5 years ago

BackT0TheFuture commented 5 years ago

Hi, it's great work you've made. I came across one problem; the Python (MXNet) code looks like this:

flatten = mx.symbol.Flatten(data=relu3)
fc1 = mx.symbol.FullyConnected(data=flatten, num_hidden=512)
fc21 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
fc22 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
fc23 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
fc24 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
fc2 = mx.symbol.Concat(*[fc21, fc22, fc23, fc24], dim=0)
label = mx.symbol.transpose(data=label)
label = mx.symbol.Reshape(data=label, target_shape=(0, ))
mx.symbol.SoftmaxOutput(data=fc2, label=label, name="softmax")  

Is it possible to implement this using ConvNetSharp? Thanks!

cbovar commented 5 years ago

Hi,

It should be possible using the 'Flow' part of ConvNetSharp (by creating a computation graph). I will try to implement your example in ConvNetSharp soon and will post it here.
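
For reference, the general Flow pattern is: declare placeholders, chain operations into a graph, then evaluate nodes through a Session. A minimal sketch, assuming the Session/Run and volume-building usage shown in the ConvNetSharp README (the shapes and names here are just illustrative):

var cns = new ConvNetSharp<double>();

// Build the graph: placeholder -> dense layer -> softmax
var x = cns.PlaceHolder("x");
var fc = cns.Dense(x, 10);
var output = cns.Softmax(fc);

// Illustrative input: a single flat 784-element volume
var inputVolume = BuilderInstance<double>.Volume.From(new double[784], new Shape(1, 1, 784, 1));

using (var session = new Session<double>())
{
    // Evaluate the 'output' node by feeding the placeholder
    var result = session.Run(output, new Dictionary<string, Volume<double>> { { "x", inputVolume } });
}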

cbovar commented 5 years ago

It would look something like:

var cns = new ConvNetSharp<double>();

var input = cns.PlaceHolder("input");
var flatten = cns.Flatten(input);
var fc1 = cns.Dense(flatten, 512);
// Four parallel heads with 11 outputs each, mirroring fc21..fc24 in the MXNet code
var fc21 = cns.Dense(fc1, 11);
var fc22 = cns.Dense(fc1, 11);
var fc23 = cns.Dense(fc1, 11);
var fc24 = cns.Dense(fc1, 11);
// Concat takes two inputs here, so the four heads are joined pairwise
var fc2 = cns.Concat(cns.Concat(fc21, fc22), cns.Concat(fc23, fc24));
var model = cns.Softmax(fc2);

var label = cns.PlaceHolder("label");
var cost = cns.CrossEntropyLoss(model, label);
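
To train it, the graph could then be run through a Session with an optimizer. A rough sketch, assuming the Session / GradientDescentOptimizer usage from the ConvNetSharp.Flow README (inputBatch and labelBatch are hypothetical Volume<double> instances that must match the placeholder shapes):

var optimizer = new GradientDescentOptimizer<double>(cns, learningRate: 0.01);

using (var session = new Session<double>())
{
    // Build the derivative graph of the cost once
    session.Differentiate(cost);

    for (var epoch = 0; epoch < 100; epoch++)
    {
        var dico = new Dictionary<string, Volume<double>>
        {
            { "input", inputBatch },
            { "label", labelBatch }
        };

        var currentCost = session.Run(cost, dico); // evaluate the cost (forward pass)
        session.Step(optimizer);                   // update the trainable variables
    }
}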

BackT0TheFuture commented 5 years ago

Cool, thank you so much. I'll give it a try and come back with some feedback.