mrqa / MRQA-Shared-Task-2019

Resources for the MRQA 2019 Shared Task
https://mrqa.github.io
MIT License

Due to the requirement of submitting interactive servers, is ensemble model still available in this competition? #12

Closed yangapku closed 5 years ago

yangapku commented 5 years ago

Hi,

We noticed that this shared task requires submissions to be organized as an interactive server. A script, predict_server.py, is officially provided to post queries and fetch predictions in turn. As far as we know, there are timeout settings for establishing the connection and for processing HTTP requests. Does this mean there are implicit time limits for loading the model and running inference? Classically, we would submit an ensemble inference pipeline on CodaLab by running each sub-model over the test set in sequence. However, in the interactive server setting, for a single sample we won't have enough time to load each sub-model and run inference one by one. Is that the case? Thank you very much!
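For context, the interactive setting described above amounts to an HTTP server that answers one question per request. The sketch below illustrates the shape of such a server using only the standard library; the endpoint, port handling, and JSON fields (`context`, `question`, `answer`) are assumptions for illustration, not the official predict_server.py protocol.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class PredictHandler(BaseHTTPRequestHandler):
    """Answers POSTed questions one at a time, mimicking the interactive setting."""

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        example = json.loads(self.rfile.read(length))
        # Placeholder "model": echo the first word of the context as the answer.
        # A real submission would run its (single, already-loaded) model here.
        answer = example["context"].split()[0]
        body = json.dumps({"answer": answer}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet


def make_server(port=0):
    """Bind the server; port 0 lets the OS pick a free port.

    Model weights should be loaded once, before serving, so that
    per-request latency covers inference only -- which is exactly why
    loading several sub-models per request, as an ensemble would need
    to, does not fit this setting.
    """
    return HTTPServer(("localhost", port), PredictHandler)
```

The key design point is that all model state lives outside the request handler: anything loaded per request counts against the per-question time limit.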

robinjia commented 5 years ago

Hi Yang,

We discourage ensemble systems for this reason. We ask that your model average under 1 second per question across the out-of-domain examples (9,633 in total).

robinjia commented 5 years ago

Sorry, forgot to mention this is on our evaluation server, which will have one Titan Xp GPU.

yangapku commented 5 years ago

Thanks! If we test our interactive server on the development set on CodaLab before the official evaluation, will we have access to this Titan GPU? Or should we still use the default K80 GPU on CodaLab?

robinjia commented 5 years ago

Unfortunately the Titan GPU is private (currently there is no way for me to expose my CodaLab workers to other accounts). We really don't want to have to penalize submissions for being too slow, and we expect this time limit to be easy to meet (our BERT-large baseline processes about 10 examples per second).
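The numbers in this thread pin down the time budget, so a quick back-of-the-envelope check is possible. The helper function below is illustrative, not part of the task tooling, and it counts inference time only:

```python
# Figures quoted in this thread:
NUM_OOD_EXAMPLES = 9633       # out-of-domain evaluation questions
BUDGET_PER_QUESTION = 1.0     # required average, in seconds
BASELINE_THROUGHPUT = 10.0    # BERT-large baseline, examples/second on a Titan Xp

# Total wall-clock budget for the out-of-domain set: ~2.7 hours.
total_budget_seconds = NUM_OOD_EXAMPLES * BUDGET_PER_QUESTION

# The baseline uses roughly a tenth of the budget.
baseline_seconds = NUM_OOD_EXAMPLES / BASELINE_THROUGHPUT


def max_ensemble_size(per_model_seconds):
    """Largest k such that running k sub-models per question in series
    still averages under the per-question budget. Ignores model loading,
    which is the real bottleneck in the interactive setting if weights
    are not kept resident in memory."""
    return int(BUDGET_PER_QUESTION / per_model_seconds)
```

So a small ensemble of baseline-speed models fits the inference budget arithmetically; the practical obstacle raised above is loading each sub-model per request rather than keeping them all resident.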

If you have access to other GPUs, you can set up your own CodaLab worker by following these instructions: https://github.com/codalab/codalab-worksheets/wiki/Execution