applenob / RNN-for-Joint-NLU

Tensorflow implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)

Hi, I have some questions about the intent model and hope you can reply. #3

Open iamzww opened 6 years ago

iamzww commented 6 years ago

1. Is there attention on the intent detection task in your code? You seem to just apply a fully-connected layer after the Bi-LSTM, which confused me.
2. Why is the final intent vector `encoder_final_state_h` rather than a summation over all `encoder_state_h_i`? (The two options are sketched below.)
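
For context, a minimal TF 1.x sketch of the two pooling choices being contrasted in question 2; the tensor names (`encoder_outputs`, `encoder_final_state_h`), shapes, and `num_intents` are illustrative assumptions, not the repo's actual variables:

```python
import tensorflow as tf

batch, time_steps, dim, num_intents = 32, 50, 128, 20  # illustrative sizes
# encoder_outputs: all Bi-LSTM hidden states h_1..h_T for each example
encoder_outputs = tf.placeholder(tf.float32, [batch, time_steps, dim])
# encoder_final_state_h: only the final hidden state h_T
encoder_final_state_h = tf.placeholder(tf.float32, [batch, dim])

# What the repo appears to do: intent logits from the final state alone
intent_logits_final = tf.layers.dense(encoder_final_state_h, num_intents)

# What question 2 suggests: pool (sum) over all hidden states first
summed_states = tf.reduce_sum(encoder_outputs, axis=1)
intent_logits_sum = tf.layers.dense(summed_states, num_intents)
```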

applenob commented 6 years ago

Yes, you are right. I didn't use attention in the intent classification model. I will add this part, or you can contribute your code if you like. Thank you for pointing this out.
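
For anyone following along, here is a hedged sketch of what an attention-pooled intent head could look like; in the paper, the intent is predicted from an attention context over the encoder states together with the final encoder state. This is not the repo's code: the dot-product scorer (the paper uses a learned feed-forward scorer) and all names are assumptions.

```python
import tensorflow as tf

def intent_attention_head(encoder_outputs, encoder_final_state_h, num_intents):
    """Attention-pooled intent classifier (illustrative, not the repo's code).

    encoder_outputs:       [batch, time, dim], all Bi-LSTM states h_i
    encoder_final_state_h: [batch, dim], the final hidden state
    """
    # Score each time step against the final state (dot-product attention,
    # assumed here for brevity)
    scores = tf.reduce_sum(
        encoder_outputs * tf.expand_dims(encoder_final_state_h, 1), axis=2)
    weights = tf.nn.softmax(scores)  # [batch, time]
    # Attention-weighted sum of all hidden states: the intent context vector
    context = tf.reduce_sum(
        encoder_outputs * tf.expand_dims(weights, -1), axis=1)
    # Combine the context with the final state, then classify
    features = tf.concat([context, encoder_final_state_h], axis=-1)
    return tf.layers.dense(features, num_intents)
```

The concatenation at the end mirrors the paper's use of both the attention context and the final encoder state for intent prediction.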