CPSSD / LUCAS

The repository for the LUCAS/Lucify project
MIT License

Added the two BERT training notebooks. #202

Closed. Deniall closed this 5 years ago.

Deniall commented 5 years ago

The OpSpam notebook achieved the first SOTA+ results in a while, so verifying those results is now the No. 1 priority. The YelpZIP one did quite poorly, but that could be due to the sequence-length truncation.
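A quick way to test the truncation hypothesis is to measure what fraction of reviews exceed BERT's 512-token limit. A rough sketch (the whitespace split only approximates WordPiece counts, which are usually higher, and the sample reviews stand in for the real YelpZIP data):

```python
# Rough check of how many reviews would be truncated at BERT's
# 512-token limit. A whitespace split underestimates WordPiece
# token counts, so the real truncated fraction is likely higher.
MAX_LEN = 512

# Placeholder reviews; in practice, load the YelpZIP texts here.
reviews = [
    "short review " * 10,       # ~20 tokens, well under the limit
    "very long review " * 300,  # ~900 tokens, well over the limit
]

def truncated_fraction(texts, max_len=MAX_LEN):
    """Fraction of texts whose approximate token count exceeds max_len."""
    over = sum(1 for t in texts if len(t.split()) > max_len)
    return over / len(texts)

print(truncated_fraction(reviews))  # → 0.5 for this toy sample
```

If a large fraction of YelpZIP reviews are truncated, that would support the explanation; OpSpam reviews are shorter, so they would be less affected.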

StefanKennedy commented 5 years ago

16_BERT_Finetuning_on_OpSpam.ipynb should not be in the yelp directory

StefanKennedy commented 5 years ago

Could you upload the code you used for cross validation?

Edit: If we're doing 10 fold cross validation, wouldn't the test split be 0.1?

Deniall commented 5 years ago

> Could you upload the code you used for cross validation?
>
> Edit: If we're doing 10 fold cross validation, wouldn't the test split be 0.1?

Correct. If I run the experiments again I'll do 10-fold. How would you recommend doing that without having to rerun the experiment manually 10 times with a different fold number each time?

StefanKennedy commented 5 years ago

Even code that lets you change the fold number manually is okay. I'm more interested in seeing how the cross validation is being done.
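For reference, scikit-learn's `StratifiedKFold` iterates over all 10 folds in a single loop, so nothing needs to be changed by hand between runs. A minimal sketch, assuming plain X/y arrays rather than the project's actual data loading, with the per-fold training step left as a placeholder:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Placeholder data: 20 examples, 2 balanced classes. In practice X
# would be the review texts/features and y the deceptive/truthful labels.
X = np.arange(40).reshape(20, 2)
y = np.array([0, 1] * 10)

# 10 splits means each fold holds out 10% of the data as the test
# split, matching the 0.1 test fraction discussed above.
skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)

test_fractions = []
for fold, (train_idx, test_idx) in enumerate(skf.split(X, y)):
    X_train, X_test = X[train_idx], X[test_idx]
    y_train, y_test = y[train_idx], y[test_idx]
    # The BERT fine-tuning/evaluation call would go here; we record
    # the test-split size just to show the loop structure.
    test_fractions.append(len(test_idx) / len(X))

print(len(test_fractions))  # 10 folds, each with a 0.1 test split
```

`StratifiedKFold` keeps the class balance in every fold; plain `KFold` would work too if the labels are already balanced.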