hugman opened this issue 9 years ago
For now, forget about everything but Bidirectional. Everything else is WIP and may just be deleted.
You can use it by defining two RNNs and passing them as inputs:
gru1 = GRU(inp, out)               # one GRU per direction (inp = input dim, out = output dim)
gru2 = GRU(inp, out)
brnn = Bidirectional(gru1, gru2)   # wrap the two RNNs in the Bidirectional layer
model = Sequential()
model.add(brnn)
model.add ...
...
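For a fuller picture, here is a sketch of a complete model along the same lines. It is only an illustration: it assumes the old Keras 0.x positional API (input_dim, output_dim) used above, that Bidirectional is imported from seya.layers.recurrent, and that the wrapper concatenates the forward and backward outputs; the vocabulary and layer sizes are made up.

from keras.models import Sequential
from keras.layers.embeddings import Embedding
from keras.layers.core import Dense, Activation
from keras.layers.recurrent import GRU
from seya.layers.recurrent import Bidirectional

max_features = 20000                                  # made-up vocabulary size
model = Sequential()
model.add(Embedding(max_features, 128))               # word indices -> 128-d vectors
model.add(Bidirectional(GRU(128, 64), GRU(128, 64)))  # forward and backward GRUs
model.add(Dense(128, 1))                              # 2 * 64 = 128 if the two outputs are concatenated
model.add(Activation('sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='rmsprop')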
I hope that helps.
Ok. I see. Thank you.
I'll go ahead and close this. Reopen if you have more questions on how to use Bidirectional.
Have you tested Bidirectional? I get errors when I try using it on the imdb example data.
For example, it complains about return_sequences not being defined.
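Just a guess at what that complaint means, as a sketch only: if the wrapper looks for a return_sequences attribute on the RNNs it wraps, setting it explicitly on both might work around the error (same made-up inp/out dimensions as in the snippet above).

gru1 = GRU(inp, out, return_sequences=False)   # set explicitly on both wrapped RNNs
gru2 = GRU(inp, out, return_sequences=False)
brnn = Bidirectional(gru1, gru2)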
@daquang tbh, I've tested with an older Keras API and haven't written tests. But thanks for the report; I'll check it out right now.
done! check this out https://github.com/EderSantana/seya/blob/master/examples/imdb_brnn.py
Still getting errors. What versions of Theano and Keras are you using?
Keras is the latest one on GitHub master; Theano is this:
git+https://github.com/Theano/Theano.git@15c90dd3#egg=Theano==0.8.git
Did you update seya? What is the error message?
I'm not using the latest GitHub versions, just the latest stable releases: Theano 0.7 and Keras 0.1.3. Here is my error:
Traceback (most recent call last):
File "imdb_brnn.py", line 67, in
In my experience, old Theano had problems with concatenation, which is what seems to be happening here. The Theano that comes with Anaconda is not good. Give the version I just mentioned a try; you can always do that in a virtualenv:
pip install git+https://github.com/Theano/Theano.git@15c90dd3#egg=Theano==0.8.git
@EderSantana I want to use the brnn layer as the first layer, but I get this error:
The first layer in a Sequential model must get an input_shape or batch_input_shape argument.
I've set the input_shape on both the forward and backward LSTMs. Can you tell me how to solve this problem?
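One possible workaround, as a sketch only: if the seya wrapper does not forward the input_shape you set on the wrapped RNNs, you can avoid making it the first layer at all. Put a layer that carries the input shape itself first, e.g. an Embedding, and add the Bidirectional wrapper after it. This assumes Keras 1.x-style layers; the sizes below are made up.

from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense
from seya.layers.recurrent import Bidirectional

vocab_size, embed_dim, maxlen = 20000, 128, 100   # made-up sizes

forward = LSTM(64)    # forward-direction LSTM
backward = LSTM(64)   # backward-direction LSTM

model = Sequential()
model.add(Embedding(vocab_size, embed_dim, input_length=maxlen))  # this layer carries the input shape
model.add(Bidirectional(forward, backward))
model.add(Dense(1, activation='sigmoid'))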
First of all, thank you for the great code. From your code, I got a good understanding of how to extend the Keras API for my purposes.
I think your documentation link is broken (or under construction?): https://seya.readthedocs.org
Could you give a brief explanation of each class in seya.layers.recurrent.py? Unfortunately, there are almost no comments in your code :)
Bidirectional
GRUM
ExoGRUM
GRUM2
SingleCell
What I'm planning to do is implement gru_cond_layer, the conditional GRU RNN in https://github.com/kyunghyuncho/dl4mt-material/blob/master/session2/nmt.py, which is the Theano code for http://arxiv.org/pdf/1409.0473.pdf. I think I can get a good starting point from your code.
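For reference, here is a minimal NumPy sketch of one conditional GRU decoder step in the spirit of that paper: a GRU whose gates and candidate state also receive a context vector computed by soft attention over the encoder annotations. It is an illustration only, not the dl4mt gru_cond_layer code; all parameter names and shapes are made up.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cond_gru_step(y_prev, s_prev, H, p):
    # y_prev: (emb,)       embedding of the previous target word
    # s_prev: (dim,)       previous decoder state
    # H:      (T, dim_enc) encoder annotations, one row per source position
    # p:      dict of parameter arrays
    # attention: score every annotation against the previous decoder state
    e = np.tanh(s_prev @ p['Wa'] + H @ p['Ua']) @ p['va']   # (T,)
    alpha = softmax(e)                                       # attention weights
    c = alpha @ H                                            # context vector, (dim_enc,)
    # GRU update conditioned on the previous word and the context
    z = sigmoid(y_prev @ p['Wz'] + s_prev @ p['Uz'] + c @ p['Cz'])
    r = sigmoid(y_prev @ p['Wr'] + s_prev @ p['Ur'] + c @ p['Cr'])
    s_tilde = np.tanh(y_prev @ p['W'] + (r * s_prev) @ p['U'] + c @ p['C'])
    s = (1.0 - z) * s_prev + z * s_tilde
    return s, alpha

# toy dimensions and random parameters, just to show the shapes
emb, dim, dim_enc, T = 4, 5, 6, 7
rng = np.random.RandomState(0)
p = {name: rng.randn(*shape) for name, shape in {
    'Wa': (dim, dim), 'Ua': (dim_enc, dim), 'va': (dim,),
    'Wz': (emb, dim), 'Uz': (dim, dim), 'Cz': (dim_enc, dim),
    'Wr': (emb, dim), 'Ur': (dim, dim), 'Cr': (dim_enc, dim),
    'W':  (emb, dim), 'U':  (dim, dim), 'C':  (dim_enc, dim)}.items()}
s, alpha = cond_gru_step(rng.randn(emb), rng.randn(dim), rng.randn(T, dim_enc), p)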