dair-ai / ml-nlp-paper-discussions

📄 A repo containing notes and discussions for our weekly NLP/ML paper discussions.
150 stars 12 forks

Let's select a paper for our paper reading/discussion session on June 6, 2020 #2

Closed tshrjn closed 4 years ago

tshrjn commented 4 years ago

I suggest we start the voting process, and Elvis Saravia can choose the most-voted paper in a few (2-3) days and make it official with an announcement.

Comment a paper you would like us to discuss during our weekly paper reading discussion.

You can vote on a suggested paper by using the 👍 emoji. Thanks.

tshrjn commented 4 years ago

Title: Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks
Link: https://arxiv.org/abs/2005.11401

manisnesan commented 4 years ago

Title: A brief introduction to weakly supervised learning
Paper: https://api.semanticscholar.org/CorpusID:44192968
Taxonomy: Modeling -> Training -> Weakly Supervised Learning

Why: It covers a modeling topic, is a literature review, and is practical in nature. The taxonomy is taken from https://nlusense.com/v/32:18

Note: I am re-proposing the paper I suggested earlier (6 votes).

sksq96 commented 4 years ago

Title: Language Models are Few-Shot Learners (GPT-3)
Paper: https://arxiv.org/abs/2005.14165

Why: Relevant in the context of WS2, which we discussed in the last reading group. What have we learned from scale? Really impressive zero-shot performance on a number of NLP tasks.

msg4naresh commented 4 years ago

Title: Attention Is All You Need
Paper: https://arxiv.org/pdf/1706.03762.pdf

init27 commented 4 years ago

Title: A critical analysis of self-supervision, or what we can learn from a single image
Paper: https://openreview.net/pdf?id=B1esx6EYvr

itamarsalazar commented 4 years ago

Title: Universal Adversarial Perturbations
Paper: https://arxiv.org/pdf/1610.08401.pdf

eyebies commented 4 years ago

Title: End-to-End Object Detection with Transformers

omarsar commented 4 years ago

Thank you all for suggesting papers and voting. It seems GPT-3 is the winner here. Feel free to suggest your papers again next weekend, or bring new papers that you find exciting for the group. :)