uclnlp / jack

Jack the Reader
MIT License

Pretrained NLI models #350

Closed dirkweissenborn closed 6 years ago

dirkweissenborn commented 6 years ago

We need pre-trained models for MultiNLI and SNLI.

... and create some documentation for this task like for extractive QA

pminervini commented 6 years ago

I tried it once but nothing was working: I was getting random-chance test accuracy, while the very same code achieves near-SOTA (~87% test accuracy) in another codebase. Let me debug this.

dirkweissenborn commented 6 years ago

When did you try? The code has changed; maybe it will work better now.

pminervini commented 6 years ago

Let's see!

pminervini commented 6 years ago

The code is now very different from what I have. I need to find some time to look into this new implementation of ESIM (and eventually DAM), and think about what that means in terms of time and papers.

pminervini commented 6 years ago

I'm not sure I have the time to work on two very distinct implementations of the same thing :( sorry

dirkweissenborn commented 6 years ago

It is fine to simply run the current implementations as they are, host the trained models, and report the results in a dedicated NLI document. No need to look into the code at the moment; we just need some models and numbers.

pminervini commented 6 years ago

ESIM's accuracy seems decent (86.6, 86.0), slightly lower than my implementation's (no character inputs, above 87). DAM's seems sub-par (48.0, 48.4); it's probably just a matter of hyperparameters.

I'm uploading the models here: http://jack.neuralnoise.com/jack/natural_language_inference/

pminervini commented 6 years ago

Done. Happy Holidays!