ChunchuanLv / AMR_AS_GRAPH_PREDICTION


Contextual embeddings #17

Open ghost opened 5 years ago

ghost commented 5 years ago

Hi,

There is a recent ACL paper, https://arxiv.org/abs/1905.08704, which reports state-of-the-art AMR parsing results (76 F1 Smatch). However, it uses contextual embeddings and some other powerful mechanisms (copying, attention, etc.), so I believe your model (74 F1 Smatch) has more potential. Could you (or anybody else) give me some tips on how to incorporate BERT embeddings into your model? I would like to try your model with contextual embeddings and expect a good performance boost.

Thanks in advance for your patience

ChunchuanLv commented 5 years ago

Hi Hamlet,

I feel like you could just change the word embedding/lemma embedding to BERT. AllenNLP seems to have some modules to handle the tokenization, but I am not very sure.
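For illustration only, here is a minimal sketch (not from this repo) of how word-level BERT vectors could be produced and fed in place of the existing word/lemma embeddings. It uses the HuggingFace `transformers` library; the model name, mean-pooling of wordpieces, and the helper function name are my own assumptions.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative model choice; any BERT variant could be substituted.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
bert = AutoModel.from_pretrained("bert-base-cased")
bert.eval()

def word_level_bert_embeddings(words):
    """Return one vector per input word by mean-pooling its wordpieces."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**enc).last_hidden_state.squeeze(0)  # (num_wordpieces, 768)
    word_ids = enc.word_ids()  # maps each wordpiece to its source word index (or None)
    vectors = []
    for i in range(len(words)):
        piece_idx = [j for j, w in enumerate(word_ids) if w == i]
        vectors.append(hidden[piece_idx].mean(dim=0))
    # Shape (len(words), 768); these would replace the word/lemma embedding lookup.
    return torch.stack(vectors)

print(word_level_bert_embeddings(["The", "boy", "wants", "to", "go"]).shape)
```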

Another major change that this ACL paper made is using maximum spanning tree decoding for relations. I feel this might give a more accurate model (although it is not the right model, because of re-entrancies); see the sketch below.
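As a rough sketch of what maximum spanning tree decoding over relation scores could look like (not the paper's actual code): the score matrix here is random and purely illustrative, and Chu-Liu-Edmonds via networkx picks exactly one head per node, which is why re-entrancies cannot be represented.

```python
import numpy as np
import networkx as nx

def mst_decode(scores):
    """scores[h, d]: score of an edge from head h to dependent d (h != d)."""
    n = scores.shape[0]
    g = nx.DiGraph()
    for h in range(n):
        for d in range(n):
            if h != d:
                g.add_edge(h, d, weight=float(scores[h, d]))
    # Maximum spanning arborescence = highest-scoring tree with one head per node.
    tree = nx.maximum_spanning_arborescence(g, attr="weight")
    return sorted(tree.edges())  # list of (head, dependent) pairs

scores = np.random.rand(5, 5)  # placeholder for model-predicted relation scores
print(mst_decode(scores))
```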

Chunchuan
