ningshixian / LSTM_Attention

attention-based LSTM/Dense implemented in Keras
https://github.com/ningshixian/LSTM_Attention

About Attention #5

Open qinya opened 6 years ago

qinya commented 6 years ago

How should I add a CRF layer after the attention layer?

ningshixian commented 6 years ago

What exactly do you mean?

qinya commented 6 years ago

I want to put a CRF layer after LSTM+Attention to do NER. I tried but failed; I used the CRF's loss function when computing the loss.

ningshixian commented 6 years ago

you can do it like this; the LSTM has to return the full sequence so that every token still has a feature vector when it reaches the CRF:

output = Bidirectional(LSTM(units, return_sequences=True))(embeddings)
output = AttentionDecoder(units, num_class)(output)
output = TimeDistributed(Dense(num_class))(output)
output = crf(output)
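
For reference, here is how the whole pipeline might fit together as a compilable model. This is a minimal sketch, not the repo's official recipe: it assumes keras-contrib's CRF layer and the repo's AttentionDecoder (the import path `attention_decoder` and all hyperparameter values below are placeholders, not taken from the thread):

```python
from keras.models import Model
from keras.layers import Input, Embedding, LSTM, Bidirectional, Dense, TimeDistributed
from keras_contrib.layers import CRF
from attention_decoder import AttentionDecoder  # assumed import path for this repo's layer

max_len, vocab_size = 80, 20000           # placeholder data dimensions
embed_dim, units, num_class = 128, 64, 7  # placeholder hyperparameters

inputs = Input(shape=(max_len,))
x = Embedding(vocab_size, embed_dim)(inputs)
# return_sequences=True keeps one feature vector per token,
# which both the attention decoder and the CRF need
x = Bidirectional(LSTM(units, return_sequences=True))(x)
x = AttentionDecoder(units, num_class)(x)
x = TimeDistributed(Dense(num_class))(x)
crf = CRF(num_class)  # keras-contrib CRF over per-token class scores
outputs = crf(x)

model = Model(inputs, outputs)
# the CRF layer supplies its own loss and accuracy metric
model.compile(optimizer='adam', loss=crf.loss_function, metrics=[crf.accuracy])
model.summary()
```

The two points that usually make this fail are forgetting `return_sequences=True` on the LSTM (the CRF needs a score per token, not a single sentence vector) and compiling with an ordinary loss instead of the CRF layer's own `crf.loss_function`.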