jwkanggist / SSL-narratives-NLP-1

Reading self-supervised learning in NLP backwards

[1주차] Dense Passage Retrieval for Open-Domain Question Answering #2

Gangsss opened 2 years ago

Gangsss commented 2 years ago

Keywords

In-batch negative training

TL;DR

Train a better dense embedding model using only pairs of questions and passages, without additional pretraining.
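A minimal sketch of the in-batch negative training objective named in the keywords: for each question in a batch, its paired passage is the positive, and the passages paired with the other questions serve as negatives. The function name and NumPy implementation below are illustrative, not from the paper (which trains BERT dual encoders with this loss in PyTorch).

```python
import numpy as np

def in_batch_negative_loss(q_emb, p_emb):
    """Negative log-likelihood of the gold passage under a softmax over
    all passages in the batch (in-batch negative training).

    q_emb: (B, d) question embeddings.
    p_emb: (B, d) passage embeddings; row i is the gold passage for question i.
    """
    # (B, B) similarity matrix: sim[i, j] = <q_i, p_j>
    sim = q_emb @ p_emb.T
    # Numerically stable row-wise log-softmax; diagonal entries are positives.
    sim_max = sim.max(axis=1, keepdims=True)
    log_probs = sim - (sim_max + np.log(np.exp(sim - sim_max).sum(axis=1, keepdims=True)))
    return -np.mean(np.diag(log_probs))
```

With a batch of size B this yields B training instances and B - 1 negatives per question at no extra encoding cost, which is why the technique is attractive for dual-encoder training.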

Abstract

Open-domain question answering relies on efficient passage retrieval to select candidate contexts, where traditional sparse vector space models, such as TF-IDF or BM25, are the de facto method. In this work, we show that retrieval can be practically implemented using dense representations alone, where embeddings are learned from a small number of questions and passages by a simple dual-encoder framework. When evaluated on a wide range of open-domain QA datasets, our dense retriever outperforms a strong Lucene-BM25 system largely by 9%-19% absolute in terms of top-20 passage retrieval accuracy, and helps our end-to-end QA system establish new state-of-the-art on multiple open-domain QA benchmarks.
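At inference time, retrieval with the learned dual encoder reduces to maximum inner product search between the question embedding and pre-computed passage embeddings. The paper uses FAISS for this at scale; the brute-force sketch below (names are illustrative) shows the same operation for a small index.

```python
import numpy as np

def retrieve_top_k(question_emb, passage_embs, k=20):
    """Brute-force maximum inner product search.

    question_emb: (d,) embedding of one question.
    passage_embs: (N, d) pre-computed passage embeddings.
    Returns indices of the k passages with the highest dot-product score.
    """
    scores = passage_embs @ question_emb   # (N,) inner products
    return np.argsort(-scores)[:k]         # highest score first
```

The top-20 retrieval accuracy reported in the abstract is the fraction of questions for which an answer-containing passage appears among these k = 20 results.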

Paper link

https://arxiv.org/abs/2004.04906

Presentation link

Slide

Video link

https://youtu.be/GE2Qzq1Xj6c

Gangsss commented 2 years ago

References

NAACL Tutorial : Contrastive Data and Learning for Natural Language Processing

EMNLP 2020 : link