hpe-cct / cct-core

CCT compiler, runtime, graphical debugger, and standard library
Apache License 2.0

Is padding implemented in the layers? #1

Open manjunaths opened 8 years ago

manjunaths commented 8 years ago

Hello, is padding implemented in ConvolutionLayer and the other layer calls?

Is there an example that demonstrates this?

Thanks.

bchandle commented 8 years ago

I just added an example here:

https://github.com/hpe-cct/cct-nn/blob/master/src/test/scala/toolkit/neuralnetwork/examples/AlexNet.scala

The border argument on ConvolutionLayer specifies how padding should be handled. The convolution sizes in this example are exactly consistent with the Caffe reference implementation:

https://github.com/BVLC/caffe/blob/master/models/bvlc_alexnet/deploy.prototxt

A BorderValid convolution produces an output whose size in each dimension equals the input size minus (kernel size - 1). A BorderZero convolution zero-pads the input so that the output has the same dimensions as the input.
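The size arithmetic above can be sketched directly (stride 1; `validOut` and `zeroOut` are hypothetical helper names for illustration, not part of the CCT API):

```scala
// Output size per dimension for the two border modes described above.
object BorderSizes {
  // BorderValid: the kernel must fit entirely inside the input.
  def validOut(in: Int, k: Int): Int = in - (k - 1)

  // BorderZero: zero-padding keeps the output the same size as the input.
  def zeroOut(in: Int, k: Int): Int = in
}

// e.g. a 227-wide input with an 11-wide kernel:
// validOut(227, 11) == 217, zeroOut(227, 11) == 227
```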

If you want a different padding configuration, use the ZeroPad function. For example, this combination of manual padding and BorderValid convolution is equivalent to a BorderZero convolution:

val c1 = ConvolutionLayer(ZeroPad(data, 5), Shape(11, 11), 96, BorderValid, lr, stride = 4, impl = Space)
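When a stride is involved, the output size follows the usual convolution formula with integer (floor) division; `convOut` below is an illustrative helper under that standard formula, not part of the CCT API:

```scala
object ConvSize {
  // Output size per dimension: floor((in + 2*pad - k) / stride) + 1
  def convOut(in: Int, k: Int, pad: Int, stride: Int): Int =
    (in + 2 * pad - k) / stride + 1
}

// AlexNet conv1 from the Caffe prototxt: 227x227 input, 11x11 kernel,
// no padding, stride 4 -> 55x55 output.
// convOut(227, 11, 0, 4) == 55
```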
manjunaths commented 8 years ago

Thank you for this. I will try to run this and check.

manjunaths commented 8 years ago

What is the best way to benchmark this AlexNet network?