NervanaSystems / neon

Intel® Nervana™ reference deep learning framework committed to best performance on all hardware
http://neon.nervanasys.com/docs/latest
Apache License 2.0

MergeSum gives an error for custom layer #433

Open sangameshnr opened 6 years ago

sangameshnr commented 6 years ago

Hi, I am trying to use MergeSum in my network to implement a ResNet-style block. In short, consider a layer with two paths:

path1: [Conv, Conv]
path2: [Crop()]   (a custom layer)
layer = [MergeSum([path1, path2])]

This works fine for existing layers but fails for a custom layer. For instance, I took the custom layer Normalize() from the SSD example (neon/examples/SSD/layer.py) and used it as:

path1: [Normalize()]
path2: [Normalize()]

This too fails with MergeSum, with the error:

[src/concat.c:269] err (-1)

I tracked it down a bit; it fails in the sum_tensor() function in nervanamkl.py.
Could anyone help me understand this issue?
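The expected semantics of the two-path setup above can be sketched in plain Python (no neon required; the layer functions here are stand-ins for the neon layers, not the real classes): each path is a pipeline applied in sequence, and MergeSum adds the paths' final outputs element-wise.

```python
# Plain-Python sketch of the semantics the reporter expects from
# MergeSum. `double` stands in for a conv layer and `identity` for a
# custom crop/Normalize layer; these are illustrative, not neon APIs.

def run_path(layers, x):
    # apply each layer in the path in sequence
    for layer in layers:
        x = layer(x)
    return x

def merge_sum(paths, x):
    # run every path on the same input, then sum outputs element-wise
    outputs = [run_path(p, x) for p in paths]
    return [sum(vals) for vals in zip(*outputs)]

double = lambda v: [2.0 * e for e in v]   # stand-in for a Conv layer
identity = lambda v: list(v)              # stand-in for a custom layer

path1 = [double, double]   # like [Conv, Conv]
path2 = [identity]         # like [Crop()] / [Normalize()]
print(merge_sum([path1, path2], [1.0, 2.0]))  # → [5.0, 10.0]
```

The bug report is that the real MergeSum errors out on the MKL backend (and, per a later comment, silently returns only the second path on the CPU backend) when the paths contain custom layers.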

baojun-nervana commented 6 years ago

@sangameshnr Do you see the same issue with the CPU backend (-b cpu)? I am wondering if this is MKL-specific.

sangameshnr commented 6 years ago

To confirm: I get this error with the 'mkl' backend but not with the 'cpu' backend.

sangameshnr commented 6 years ago

One more observation: with the CPU backend there is no error, but the fprop output of the MergeSum layer equals the output of the second path only; it does not return the sum of the two paths. Am I missing something here?

baojun-nervana commented 6 years ago

@sangameshnr Thank you for the info. We are looking into it.

airofjune commented 6 years ago

@sangameshnr MergeSum on the CPU backend accumulates branch outputs by relying on (1) a shared output buffer and (2) the beta argument to fprop. A custom layer therefore needs to implement allocate() and fprop() so that they respect the shared output and the beta blend factor; otherwise a later branch overwrites the buffer instead of adding to it.
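The failure mode described above (CPU output equal to the second path only) can be illustrated with a small self-contained sketch of beta-blended accumulation. This is plain Python modeling the mechanism, not neon code; `branch_ok`/`branch_bad`/`merge_sum` are hypothetical names for illustration:

```python
# A MergeSum-style container can sum branches in place by calling each
# branch's fprop with a shared output buffer and a blend factor beta,
# so that each branch computes: out = beta * out + f(x).
# A branch that ignores beta clobbers earlier branches' contributions.

def branch_ok(x, out, beta):
    # honors beta: accumulates into the shared output buffer
    for i in range(len(out)):
        out[i] = beta * out[i] + 2.0 * x[i]   # layer computes 2*x (arbitrary)

def branch_bad(x, out, beta):
    # ignores beta: overwrites whatever earlier branches wrote
    for i in range(len(out)):
        out[i] = 2.0 * x[i]

def merge_sum(branches, x, out):
    # first branch writes (beta=0), later branches accumulate (beta=1)
    for i, fprop in enumerate(branches):
        fprop(x, out, beta=0.0 if i == 0 else 1.0)
    return out

print(merge_sum([branch_ok, branch_ok], [1.0] * 4, [0.0] * 4))   # → [4.0, 4.0, 4.0, 4.0]
print(merge_sum([branch_ok, branch_bad], [1.0] * 4, [0.0] * 4))  # → [2.0, 2.0, 2.0, 2.0]
```

The second call reproduces the reported symptom: when the last branch's fprop ignores beta, the "merged" result equals that branch alone rather than the sum.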

baojun-nervana commented 6 years ago

@sangameshnr We will include the fix in the next release.

sangameshnr commented 6 years ago

@airofjune Thanks for the explanation of the arguments; I will keep that in mind. @baojun-nervana Thanks for taking care of the issue. I look forward to the release.