AdeDZY / SIGIR19-BERT-IR

Repo of code and data for the SIGIR 2019 short paper "Deeper Text Understanding for IR with Contextual Neural Language Modeling"
BSD 3-Clause "New" or "Revised" License

Training document level BERT model #7

Closed zh-zheng closed 4 years ago

zh-zheng commented 4 years ago

Hi,

Is the document-level BERT model just the BERT-FirstP mode of the passage-level model?

Thanks.

AdeDZY commented 4 years ago

Yes, that is correct. It is BERT-FirstP, with (qid, docid) as labels.

On Tue, Dec 24, 2019 at 7:09 AM NeoZzh notifications@github.com wrote:

Hi,

When training the document-level model, did you only use BERT-FirstP (i.e., the concatenation of the title and the first 200 tokens of the document)? And the label is the corresponding (qid, docid) label, right?

Thanks.
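Putting the question and the answer together, here is a minimal sketch (not the repo's actual preprocessing code) of how a BERT-FirstP training example could be assembled: the document is represented by its title plus its first passage (first ~200 tokens), and the example is keyed by its (qid, docid) pair. The function names, the whitespace tokenization, and the (text_a, text_b, label) field layout are illustrative assumptions, not taken from this repository.

```python
def first_passage(doc_text, max_tokens=200):
    """Keep only the first `max_tokens` whitespace-separated tokens of the body."""
    return " ".join(doc_text.split()[:max_tokens])


def make_firstp_example(qid, query, docid, title, doc_text, label):
    """Build one query/document pair in a generic two-sentence BERT input format."""
    doc_repr = f"{title}. {first_passage(doc_text)}"  # title + first passage
    return {
        "guid": f"{qid}-{docid}",  # the (qid, docid) pair identifies the example
        "text_a": query,           # sentence A: the query
        "text_b": doc_repr,        # sentence B: title + first passage of the document
        "label": label,            # relevance label for this (qid, docid) pair
    }


# Example usage with made-up ids and text:
ex = make_firstp_example(
    qid="101",
    query="deep learning for ad hoc retrieval",
    docid="D42",
    title="Neural ranking models",
    doc_text="Neural ranking models score query-document pairs ...",
    label=1,
)
```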
