End-to-end Neural Coreference Resolution (EMNLP17)
Contribution summary
[Author] did [Method] to solve [Motivation] and found [Insight]
Lee et al. built the first end-to-end coreference resolution model, which considers all spans as candidate mentions and learns to rank antecedents directly from span embeddings, to remove the reliance on syntactic parsers and hand-engineered mention proposal algorithms, and found that it outperforms all previous pipelined systems while implicitly learning task-specific head-word information.
Authors
Kenton Lee†, Luheng He†, Mike Lewis‡, and Luke Zettlemoyer†∗ (†Univ. of Washington, ∗AI2, and ‡FAIR)
Motivation
Existing method is/has/uses ...
All recent coreference models, including neural approaches that achieved impressive performance gains (Wiseman et al., 2016; Clark and Manning, 2016b,a), rely on syntactic parsers, both for head-word features and as the input to carefully hand-engineered mention proposal algorithms.
Method
Proposed method is/has/uses ...
The key idea is to directly consider all spans in a document as potential mentions and learn distributions over possible antecedents for each.
Scoring all span pairs in an end-to-end model is impractical, so the model is factored into unary mention scores and pairwise antecedent scores, both of which are simple functions of the learned span embeddings; low-scoring spans can then be pruned before pairwise scoring.
Results / Insight
Proposed method achieves/shows ...
The model improves over the previous state of the art on the OntoNotes (CoNLL-2012) benchmark by 1.5 F1 as a single model and by 3.1 F1 with a 5-model ensemble, and its learned span-internal attention recovers head-word information without a parser.
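The factored scoring in the Method section can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the span embedding (token mean), the linear scorers `w_m`/`w_a`, and the max span width are stand-ins for the learned LSTM-based representation and feed-forward networks. It shows the key structure: every span gets a unary mention score s_m, and each span's antecedent distribution is a softmax over a dummy antecedent ε (fixed score 0, meaning "not anaphoric") plus all preceding spans, scored as s(i, j) = s_m(i) + s_m(j) + s_a(i, j).

```python
import numpy as np

rng = np.random.default_rng(0)

def enumerate_spans(n_tokens, max_width=3):
    """All (start, end) token spans up to max_width, in textual order."""
    return [(i, j) for i in range(n_tokens)
            for j in range(i, min(i + max_width, n_tokens))]

def span_embedding(tokens, start, end):
    """Stand-in for the learned span representation (mean of token vectors)."""
    return tokens[start:end + 1].mean(axis=0)

def mention_score(g, w_m):
    """Unary score s_m(i): how likely span i is to be a mention."""
    return float(w_m @ g)

def antecedent_score(g_i, g_j, w_a):
    """Pairwise score s_a(i, j) for span i and candidate antecedent j."""
    return float(w_a @ np.concatenate([g_i, g_j, g_i * g_j]))

def antecedent_distribution(tokens, w_m, w_a, max_width=3):
    """For each span i, softmax over {dummy ε} ∪ preceding spans,
    using the factored score s(i, j) = s_m(i) + s_m(j) + s_a(i, j)."""
    spans = enumerate_spans(len(tokens), max_width)
    g = [span_embedding(tokens, s, e) for s, e in spans]
    s_m = [mention_score(gi, w_m) for gi in g]
    dists = []
    for i in range(len(spans)):
        scores = [0.0]  # dummy antecedent ε: span i is not anaphoric
        for j in range(i):  # only preceding spans are candidate antecedents
            scores.append(s_m[i] + s_m[j] + antecedent_score(g[i], g[j], w_a))
        e = np.exp(np.array(scores) - max(scores))  # numerically stable softmax
        dists.append(e / e.sum())
    return spans, dists

# Toy usage: 5 tokens with 8-dim embeddings and random scorer weights.
tokens = rng.normal(size=(5, 8))
w_m = rng.normal(size=8)
w_a = rng.normal(size=24)  # concat of g_i, g_j, g_i * g_j
spans, dists = antecedent_distribution(tokens, w_m, w_a)
```

In the paper the unary scores also drive pruning: only the top-scoring spans are kept as mention candidates, which is what makes considering all spans tractable; the sketch above skips that step for brevity.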