
ConTextual Mask Auto-Encoder for Dense Passage Retrieval
Apache License 2.0

Information Retrieval Research

Code and models for our information retrieval research papers.

Knowledge Computing and Service Group, Institute of Information Engineering, Chinese Academy of Sciences.

Releases

CoT-MAE-qc: Query-as-context Pre-training for Dense Passage Retrieval. A simple yet effective pre-training scheme for single-vector dense passage retrieval. (Accepted to the EMNLP 2023 main conference)

CoT-MAE: ConTextual Mask Auto-Encoder for Dense Passage Retrieval. CoT-MAE is a Transformer-based masked auto-encoder pre-training architecture designed for dense passage retrieval. (Accepted to AAAI 2023)
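Both releases target single-vector dense passage retrieval: a query and each passage are encoded into one vector apiece, and passages are ranked by vector similarity. The sketch below illustrates that retrieval step only; the hashing "encoder" is a toy stand-in for a learned Transformer encoder such as a fine-tuned CoT-MAE model, and the function names are illustrative, not part of this repository's API.

```python
import numpy as np

def embed(text, dim=64):
    # Toy stand-in for a single-vector encoder: hash each token into a
    # fixed-size bag-of-words vector and L2-normalize it. A real dense
    # retriever would produce this vector with a Transformer encoder.
    vec = np.zeros(dim)
    for tok in text.lower().split():
        vec[hash(tok) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def rank_passages(query, passages):
    # Score every passage by the dot product of its vector with the
    # query vector, and return passages sorted from best to worst.
    q = embed(query)
    scores = [(float(q @ embed(p)), p) for p in passages]
    return sorted(scores, reverse=True)

passages = [
    "Dense passage retrieval encodes queries and passages into vectors.",
    "The weather today is sunny with a light breeze.",
]
for score, passage in rank_passages("dense passage retrieval", passages):
    print(f"{score:.3f}  {passage}")
```

In a production system the passage vectors would be pre-computed offline and searched with an approximate nearest-neighbor index; only the query is encoded at search time, which is what makes the single-vector setup fast.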