We are running a paper-implementation study of foundational NLP deep learning models, starting from PyTorch basics.
Members: 이인서, 임수정, 한나연, 허치영, 김소연
Date | Paper | Year | Presenter | Source |
---|---|---|---|---|
7/13 | Deep Residual Learning for Image Recognition | 2015 | 인서, 나연 | ResNet |
7/20 | RNN, LSTM | - | 수정, 치영 | RNN |
7/27 | Sequence to Sequence Learning with Neural Networks | 2014 | 인서, 나연 | Seq2Seq |
8/3 | Neural Machine Translation by Jointly Learning to Align and Translate | 2015 | 수정, 치영 | Seq2Seq with Attention |
8/10 | Attention Is All You Need | 2017 | All | [Transformer review]() |
8/17 | Attention Is All You Need | 2017 | 인서, 나연 | [Transformer code]() |
8/24 | Attention Is All You Need | 2017 | 수정, 소연 | [Transformer code]() |
8/31 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2018 | TBD | [BERT review]() |
9/9 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2018 | TBD | [BERT code]() |
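As a rough preview of what the Transformer code sessions will cover, below is a minimal PyTorch sketch of scaled dot-product attention, the core operation of "Attention Is All You Need". The function name and tensor shapes are illustrative only and are not taken from the study repository.

```python
# Illustrative sketch (not part of the study materials): scaled dot-product attention
# softmax(QK^T / sqrt(d_k)) V, as defined in "Attention Is All You Need".
import math
import torch


def scaled_dot_product_attention(query, key, value, mask=None):
    """query/key/value: (batch, heads, seq_len, d_k); mask: 0 where attention is disallowed."""
    d_k = query.size(-1)
    # (batch, heads, q_len, d_k) x (batch, heads, d_k, k_len) -> (batch, heads, q_len, k_len)
    scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # Masked positions get -inf so they receive zero weight after softmax.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return torch.matmul(weights, value), weights


if __name__ == "__main__":
    q = k = v = torch.randn(2, 4, 10, 64)  # batch=2, heads=4, seq_len=10, d_k=64
    out, attn = scaled_dot_product_attention(q, k, v)
    print(out.shape, attn.shape)  # (2, 4, 10, 64) and (2, 4, 10, 10)
```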