psnonis / FinBERT

BERT for Finance: UC Berkeley MIDS w266 Final Project
197 stars · 62 forks

Precise training data of Combo #4

Open djstrong opened 4 years ago

djstrong commented 4 years ago

In the readme:

> Combo — Pre-trained continued from original BERT on 2017, 2018, 2019 SEC 10-K dataset

but in the paper:

> train a Combo Model on top of the last checkpoint of BERT-Base Uncased. This training was done in parallel with FinBERT Prime, using SEC2019 for the first 250,000 and using SEC1999 for the last 250,000.

Which is correct?