-
Besides #3, I have a question on matrix multiply. How does its performance compare to the hardcoded library functions? If it is something like a third of their speed, then it might be possible to speed it up by…
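One place hand-rolled matrix multiplies commonly lose to library routines is memory access order. As a rough illustration (pure Python, my own function names, not code from this project), here is a naive i-j-k multiply next to an i-k-j reordering that walks both operands row-wise, which is the kind of cache-friendly restructuring that typically closes part of the gap to tuned BLAS kernels:

```python
def matmul_naive(A, B):
    # textbook triple loop: the inner loop strides down a column of B,
    # which is cache-unfriendly for row-major storage
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            s = 0.0
            for k in range(m):
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

def matmul_ikj(A, B):
    # same arithmetic, reordered so the inner loop scans rows of B and C
    # sequentially; results are identical, access pattern is friendlier
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        Ci = C[i]
        for k in range(m):
            a = A[i][k]
            Bk = B[k]
            for j in range(p):
                Ci[j] += a * Bk[j]
    return C
```

Both produce the same product; only the traversal order differs, which is usually where the first easy speedup lives before resorting to blocking or vectorization.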
-
@SeanNaren I would like to know what exactly the function [calculateInputSizes](https://github.com/SeanNaren/deepspeech.torch/blob/master/DeepSpeechModel.lua#L54) is used for. I am using my own imag…
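I have not traced that function's internals, but helpers like this in conv-based speech models typically just propagate the standard convolution output-size formula through each layer so the later sequence layers know how many time steps survive. A hedged Python sketch of that formula (my own function name, illustrative parameters):

```python
def conv_out_size(size, kernel, stride=1, pad=0):
    # standard output length along one dimension for a conv/pool layer:
    # floor((size + 2*pad - kernel) / stride) + 1
    return (size + 2 * pad - kernel) // stride + 1
```

For example, an input of width 200 through a kernel of 11 with stride 2 and no padding yields `(200 - 11) // 2 + 1 = 95` positions; chaining the function once per layer gives the sizes for an arbitrary input, which is likely all you need to adapt it to your own images.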
-
We can recommend some papers for further discussion under this issue. Include a link to the paper + the conference name and other related information (like the abstract, some basic descriptions, links…
-
I'm doing video classification with video data of different lengths, varying from 20 to 500 frames. I can feed a single video with any number of frames into the network without any problem, since my underlying…
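When variable-length inputs have to be batched, the usual workaround is to pad every sequence in the batch to the batch maximum and keep the true lengths around so padded frames can be masked out. A minimal sketch of that idea (plain Python, my own names, not tied to any particular framework):

```python
def pad_batch(videos, pad_value=0.0):
    # videos: list of per-video frame sequences with differing lengths;
    # returns sequences padded to the longest one, plus the true lengths
    max_len = max(len(v) for v in videos)
    padded = [v + [pad_value] * (max_len - len(v)) for v in videos]
    lengths = [len(v) for v in videos]
    return padded, lengths
```

Bucketing videos of similar length into the same batch keeps the wasted padding small when lengths range as widely as 20 to 500 frames.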
-
I had been trying to figure out how to implement sentiment analysis by following the [sequence to one](https://github.com/Element-Research/rnn/blob/master/examples/sequence-to-one.lua) example, but found i…
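The core of the sequence-to-one pattern is that the whole input sequence is consumed step by step, but only the final hidden state feeds the classifier. A toy single-unit rendering in Python (scalar weights, my own names; the real example uses Torch's `rnn` modules) just to pin down that shape:

```python
import math

def rnn_sequence_to_one(inputs, w_in, w_rec, w_out):
    # toy one-unit RNN: fold the whole sequence into a single hidden
    # state, then read out ONLY from the last state -- the essence of
    # sequence-to-one classification such as sentiment analysis
    h = 0.0
    for x in inputs:
        h = math.tanh(w_in * x + w_rec * h)
    return w_out * h
```

Everything before the last step only matters through the recurrence; no per-step outputs are produced, which is what distinguishes this from sequence-to-sequence setups.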
-
How can I build an RNN model with three unrolled states (t-3, t-2, and t-1) that accepts three images of dimensions 200x200 and predicts a fourth image? What would I need to change in this [RNN example]…
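Independent of the framework details, a three-step unrolled predictor is trained on sliding windows: each window of three consecutive frames is paired with the next frame as the target. A small sketch of building those pairs (plain Python, my own function name, frames stand in for the 200x200 images):

```python
def make_unrolled_pairs(frames, rho=3):
    # pair each window of `rho` consecutive frames with the frame that
    # follows it, e.g. (t-3, t-2, t-1) -> t
    pairs = []
    for t in range(rho, len(frames)):
        pairs.append((frames[t - rho:t], frames[t]))
    return pairs
```

In the linked example this window length corresponds to the unroll length (often called `rho` in the Element-Research `rnn` library), with the target changed from a class label to the next image.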
-
Could I receive some guidance on adjusting the net to correctly predict new targets from train.csv?
https://s3-sa-east-1.amazonaws.com/nu-static/workable-data-science/data-science-puzzle.zip
hpoit, updated 8 years ago
-
Dear zehao,
Firstly, thanks very much for the great test code for the paper. Since you implemented the training code in Caffe, would you please share it on GitHub? If it contains some bugs, would you pleas…
-
I'm implementing convolutional LSTMs with autograd; the pseudocode is here: https://github.com/ankurhanda/convlstm.autograd/blob/master/RecurrentConvLSTMNetwork.lua. The code is adapted from RecurrentL…
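A convolutional LSTM keeps the standard LSTM gate equations and only swaps the matrix-vector products for convolutions over feature maps. A scalar toy version of those gate equations in Python (my own names; not the linked autograd code, just the update rule it implements):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    # W maps gate name -> (w_x, w_h, bias); in a convLSTM these scalar
    # products become convolutions and x, h, c become feature maps
    i = sigmoid(W['i'][0] * x + W['i'][1] * h + W['i'][2])  # input gate
    f = sigmoid(W['f'][0] * x + W['f'][1] * h + W['f'][2])  # forget gate
    o = sigmoid(W['o'][0] * x + W['o'][1] * h + W['o'][2])  # output gate
    g = math.tanh(W['g'][0] * x + W['g'][1] * h + W['g'][2])  # candidate
    c_new = f * c + i * g
    h_new = o * math.tanh(c_new)
    return h_new, c_new
```

Having the scalar version written out makes it easier to check a convolutional implementation gate by gate, since the two must agree when all maps are 1x1.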
-
layers.py, on which most of the nntools code is based, has always been geared towards feed-forward neural networks. We should look into recurrent neural networks as well. Personally I don't have a lot…
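At its core, what a recurrent layer adds on top of a feed-forward library is just a scan: apply a feed-forward step function across time while threading a state through. A minimal sketch of that abstraction (plain Python, my own names, not a proposal for the actual nntools API):

```python
def scan(step_fn, inputs, init_state):
    # apply a feed-forward step function at each time step, threading
    # the state forward; this loop is the minimal piece a recurrent
    # layer needs beyond what a feed-forward library already provides
    state = init_state
    outputs = []
    for x in inputs:
        out, state = step_fn(x, state)
        outputs.append(out)
    return outputs, state
```

Framing recurrence this way keeps the per-step computation expressible with existing feed-forward layers, with only the time loop (and its gradient) as new machinery.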