facebookresearch / EmpatheticDialogues

Dialogue model that produces empathetic responses when trained on the EmpatheticDialogues dataset.

ED architecture? #30

Closed 17521121 closed 4 years ago

17521121 commented 4 years ago

As I understand, ED uses a BERT encoder for the retrieval phase, and the other phase is generation-based. I marked it as a "BERT decoder", but because BERT doesn't include a decoder, do we train a Transformer as the decoder to produce a sentence from the BERT encoder's output?

I also mentioned before that there are many Transformer architectures available now (e.g. on Hugging Face), so it can be confusing for anyone coming to this method. I hope you can answer these questions.

EricMichaelSmith commented 4 years ago

Hi there! Yes, that's how the BERT encoder works for retrieval. We didn't do a BERT-based generative model, only a full Transformer generative model.
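For anyone wondering what the two setups look like in code, here is a minimal, hypothetical sketch (not this repository's actual implementation) assuming PyTorch and the Hugging Face `transformers` library: the retrieval side scores candidate responses with a BERT encoder, while the generative side is a standalone Transformer encoder-decoder rather than a decoder attached to a BERT encoder.

```python
# Sketch only: contrasts BERT-based retrieval with a full Transformer
# generative model. Model names and pooling choices are illustrative.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

def embed(texts):
    # Mean-pool BERT's last hidden states into one vector per text.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = bert(**batch).last_hidden_state       # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)   # (batch, seq_len, 1)
    return (hidden * mask).sum(1) / mask.sum(1)    # (batch, dim)

# Retrieval: pick the candidate whose embedding best matches the context.
context = ["I just lost my job and I feel terrible."]
candidates = ["That sounds really hard, I'm so sorry.", "Nice weather today!"]
scores = embed(context) @ embed(candidates).T      # dot-product scores
best = candidates[scores.argmax().item()]

# Generation: a full Transformer encoder-decoder trained end to end
# (e.g. torch.nn.Transformer), not a decoder bolted onto a frozen BERT.
generator = torch.nn.Transformer(
    d_model=512, nhead=8, num_encoder_layers=4, num_decoder_layers=4
)
```

The design point the answer makes is that the BERT weights are only used on the retrieval side; the generative model is its own Transformer trained from scratch, so there is no "BERT decoder" to speak of.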

17521121 commented 4 years ago

Alright, I got it. Thank you very much!

EricMichaelSmith commented 4 years ago

Sure thing!