GauravBh1010tt closed this issue 7 years ago.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.
Have you solved that problem yet?
@JustinhoCHN - Yes, the issue is resolved. Although I haven't uploaded the code for this exact problem, this implementation of the Neural Tensor Network is built on the general solution for varied batch sizes - NTN.
Hope it helps. Gaurav.
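For later readers: the gist of the batch-size-independent approach is to build every weight from the feature dimension only and to reduce along the feature axis, never the batch axis, so the batch dimension stays symbolic. Below is a minimal sketch of such a bilinear tensor layer; it is not the code from the linked repo, and the class, variable names, and sizes are illustrative (assuming Keras 2.x):

```python
from keras import backend as K
from keras.engine.topology import Layer

class NeuralTensorLayer(Layer):
    """Bilinear tensor layer: for each of k slices, computes e1^T W[i] e2 per sample.

    Illustrative sketch only; nothing here hard-codes the batch size.
    """
    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim  # number of tensor slices k
        super(NeuralTensorLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        d = input_shape[0][1]  # feature dimension of each input
        self.W = self.add_weight(name='W',
                                 shape=(self.output_dim, d, d),
                                 initializer='glorot_uniform',
                                 trainable=True)
        super(NeuralTensorLayer, self).build(input_shape)

    def call(self, inputs):
        e1, e2 = inputs  # each has shape (batch, d); batch stays symbolic
        slices = []
        for i in range(self.output_dim):
            # (batch, d) . (d, d) -> (batch, d), then a row-wise dot with e2 -> (batch, 1)
            slices.append(K.sum(K.dot(e1, self.W[i]) * e2, axis=1, keepdims=True))
        return K.concatenate(slices, axis=1)  # (batch, output_dim)

    def compute_output_shape(self, input_shape):
        return (input_shape[0][0], self.output_dim)
```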
Hi, I am implementing a Siamese LSTM as described in Siamese Recurrent Architectures. My custom layer function works for batch size 1, but it gives a dimension mismatch error when the batch size is > 1. My code is:
The error that I am getting is:
ValueError: GpuElemwise. Input dimension mis-match. Input 1 (indices start at 0) has shape[1] == 1, but the output's size on that axis is 4.
This is the way Keras core layers are defined, and they work fine for varied batch sizes. Any suggestions?
Thank you!
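For illustration (this is not the code from the issue, which is omitted above): a minimal batch-size-agnostic version of the Manhattan similarity layer described in the paper might look like the sketch below. The sizes and names are assumptions (Keras 2.x); the key point is that every reduction runs over the feature axis with `keepdims=True`, so the batch axis is never collapsed or hard-coded.

```python
from keras import backend as K
from keras.layers import Input, LSTM, Lambda
from keras.models import Model

def manhattan_similarity(tensors):
    # exp(-||h_left - h_right||_1), computed per sample; the batch
    # dimension is never referenced explicitly
    h_left, h_right = tensors
    return K.exp(-K.sum(K.abs(h_left - h_right), axis=1, keepdims=True))

# Illustrative sizes, not taken from the issue
max_len, feat_dim, hidden = 20, 300, 50

left_in = Input(shape=(max_len, feat_dim))
right_in = Input(shape=(max_len, feat_dim))

shared_lstm = LSTM(hidden)  # one LSTM instance shared by both branches (Siamese)
similarity = Lambda(manhattan_similarity, output_shape=(1,))(
    [shared_lstm(left_in), shared_lstm(right_in)])

model = Model([left_in, right_in], similarity)
model.compile(loss='mean_squared_error', optimizer='adam')
```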