INK-USC / TriggerNER

TriggerNER: Learning with Entity Triggers as Explanations for Named Entity Recognition (ACL 2020)
https://arxiv.org/abs/2004.07493

Need help about paper and code #1

Closed: dalinvip closed this issue 4 years ago

dalinvip commented 4 years ago

Hi,

I have just read the paper, and it's nice work.

I have some questions:

1. To enable more efficient batch-based training, does each sentence contain only one entity and one trigger? How much less efficient would training be without this restriction?

2. g_s and g_t are obtained as a weighted sum of the token vectors, but the sum doesn't appear explicitly in the formula?

3. The base model is CNN-BLSTM-CRF, but there is no entry with that name in the results table. Is "BLSTM-CRF" short for it?

4. What would happen if triggers were collected for the full amount of training data?

Best Wishes

danny911kr commented 4 years ago

Hi, thank you for your good questions! Here are my answers.

  1. For preprocessing, sentences that contain more than one entity are split into copies with only one entity each. I think this is not really about "efficient" training; it is more about making the data machine-understandable. Even from a human perspective, if a sentence has more than one entity, we can't tell which entity a trigger refers to, which causes confusion. (See the first sketch after this list.)

  2. The formula does represent a weighted sum of the token vectors; more specifically, it is a weighted sum of the LSTM hidden states. You can find the full formulation in this paper: https://openreview.net/pdf?id=BJC_jUqxe (see the second sketch after this list).

  3. Yes, it's BLSTM-CRF for short.

  4. To clarify, does "the full amount of trigger data" mean triggers for 100% of the training dataset? If so, we couldn't conduct experiments with more triggers (beyond triggers for 20% of the training dataset). We are planning to collect more triggers via crowdsourcing on Mechanical Turk and expect good results; we will ping you when we have them.
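
As a concrete illustration of the splitting in answer 1, here is a minimal sketch assuming standard BIO tags; the helper names (`entity_spans`, `split_by_entity`) are hypothetical, not the repo's actual preprocessing code:

```python
def entity_spans(tags):
    """Yield half-open (start, end) spans of BIO-tagged entities."""
    start = None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:       # a new entity closes the previous one
                yield start, i
            start = i
        elif not tag.startswith("I-"):  # an "O" tag closes any open entity
            if start is not None:
                yield start, i
            start = None
    if start is not None:               # entity running to the end of the sentence
        yield start, len(tags)

def split_by_entity(tokens, tags):
    """Return one (tokens, tags) copy per entity, masking all other entities to O."""
    copies = []
    for s, e in entity_spans(tags):
        masked = ["O"] * len(tags)
        masked[s:e] = tags[s:e]         # keep only this entity's B-/I- labels
        copies.append((tokens, masked))
    return copies

# A sentence with three entities becomes three single-entity sentences.
tokens = ["Alice", "met", "Bob", "in", "Paris"]
tags = ["B-PER", "O", "B-PER", "O", "B-LOC"]
for _, masked in split_by_entity(tokens, tags):
    print(masked)
# ['B-PER', 'O', 'O', 'O', 'O']
# ['O', 'O', 'B-PER', 'O', 'O']
# ['O', 'O', 'O', 'O', 'B-LOC']
```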
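
And for the weighted sum in answer 2, a single-head PyTorch sketch in the spirit of the structured self-attention paper linked above; the class name, dimensions, and the masking note are illustrative assumptions, not the repo's exact implementation:

```python
import torch
import torch.nn as nn

class AttentiveSum(nn.Module):
    """Weighted sum of LSTM hidden states with learned attention weights."""
    def __init__(self, hidden_dim, attn_dim=64):
        super().__init__()
        self.w1 = nn.Linear(hidden_dim, attn_dim, bias=False)
        self.w2 = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, h):                          # h: (batch, seq_len, hidden_dim)
        scores = self.w2(torch.tanh(self.w1(h)))   # (batch, seq_len, 1)
        alpha = torch.softmax(scores, dim=1)       # weights over tokens, sum to 1
        return (alpha * h).sum(dim=1)              # g: (batch, hidden_dim)

# g_s attends over all tokens; g_t would attend over the trigger tokens only
# (e.g., by masking non-trigger positions before the softmax).
h = torch.randn(2, 10, 200)   # fake BLSTM output: batch of 2, 10 tokens each
g = AttentiveSum(200)(h)
print(g.shape)                # torch.Size([2, 200])
```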

If you need more answers, feel free to ask in this thread! Thank you!

dalinvip commented 4 years ago

@danny911kr OK, thank you for your answers.