thunlp / PLMpapers

Must-read Papers on pre-trained language models.
MIT License

Move "Thieves on Sesame Street" to Compression or Analysis #15

Closed. martiansideofthemoon closed this issue 4 years ago.

martiansideofthemoon commented 4 years ago

Hi, I'm the first author of a paper mentioned in this list, "Thieves on Sesame Street! Model Extraction of BERT-based APIs". First of all, thanks so much for the mention!

I feel the paper might be more appropriate under the Knowledge Distillation & Model Compression section or the Analysis section, since it's not really a new model.

zzy14 commented 4 years ago

Fixed. Thanks for your suggestion!