Closed · SRL94 closed this issue 2 years ago
The Conversation History Encoder embeds the questions by feeding GloVe word embeddings to a Bi-LSTM. How do you encode the answers?
Hi,
The answers are encoded the same way as the questions. You can refer to `policy.ranker.forward_answer()` in the code for more details.
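For illustration, here is a minimal PyTorch sketch of that scheme (GloVe embeddings fed to a Bi-LSTM). The class name `AnswerEncoder`, the hidden size, and the choice of final hidden states as the answer representation are my own assumptions, not the repository's actual `forward_answer()` implementation:

```python
import torch
import torch.nn as nn

class AnswerEncoder(nn.Module):
    """Sketch: encode an answer like a question, GloVe embeddings -> Bi-LSTM."""

    def __init__(self, glove_weights, hidden_size=128):
        super().__init__()
        # glove_weights: (vocab_size, embedding_dim) tensor of pretrained GloVe vectors
        self.embedding = nn.Embedding.from_pretrained(glove_weights, freeze=True)
        self.bilstm = nn.LSTM(
            input_size=glove_weights.size(1),
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )

    def forward(self, answer_token_ids):
        # answer_token_ids: (batch, seq_len) indices into the GloVe vocabulary
        embedded = self.embedding(answer_token_ids)   # (batch, seq_len, embedding_dim)
        outputs, (h_n, _) = self.bilstm(embedded)     # h_n: (2, batch, hidden_size)
        # Concatenate the final forward and backward hidden states
        # to get a fixed-size answer representation (assumed pooling choice).
        return torch.cat([h_n[0], h_n[1]], dim=-1)    # (batch, 2 * hidden_size)
```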