IFNLP
Logistics
- Course website (Winter 2019): https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/
- YouTube playlist: https://www.youtube.com/playlist?list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z
- Group roster:
G1: 潔熹、宗聖、名傑、宜昌
G2: 子珊、志中、珮玲、克成、育銓
G3: 忠毅、宛誼、鈺瑋、佩璇
- Time and place: March to July 2020, every Thursday 17:30-18:30 @ 12B multi-purpose conference room
- Each week covers one lecture. Please watch the YouTube video before that week's session; meeting time is reserved for discussing questions and the assignments.
- G1-G3 take turns serving as the duty group. The duty group collects the week's discussion agenda by end of business Wednesday (each group must submit at least two questions on the video or assignment per week) and uploads the meeting notes after Thursday's session.
- We plan to choose the final side-project topics around Lecture 10 and hold a demo day after the course ends in July.
3/5 Lecture 1 (G1)
Discussion topics:
- Human language and word meaning
- Word2vec introduction (core concepts, network structure, objective function, gradient descent)
- Semantics in word vectors
- Illustration of hierarchical softmax (ref: https://www.quora.com/What-is-hierarchical-softmax ; thanks to 昌哥 for sharing)
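The naive-softmax skip-gram objective from the lecture can be sketched on a toy example. This is a minimal illustration, not the full word2vec pipeline: the vocabulary size, embedding dimension, and the (center, context) pair are made up, and it uses the naive softmax rather than the hierarchical variant, optimizing only the center vector.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 5, 4                        # toy vocabulary size and embedding dim (made up)
U = rng.normal(0, 0.1, (V, d))     # "outside"/context word vectors u_w
W = rng.normal(0, 0.1, (V, d))     # center word vectors v_w

def loss_and_grad(center, context):
    """Naive-softmax skip-gram: J = -log p(o|c), plus dJ/dv_c."""
    v_c = W[center]
    scores = U @ v_c                        # u_w^T v_c for every word w
    p = np.exp(scores - scores.max())
    p /= p.sum()                            # softmax over the vocabulary
    loss = -np.log(p[context])
    grad_v_c = -U[context] + p @ U          # -u_o + sum_w p(w|c) u_w
    return loss, grad_v_c

initial, _ = loss_and_grad(2, 3)
for _ in range(100):                        # plain gradient descent on v_c only
    _, g = loss_and_grad(2, 3)
    W[2] -= 0.5 * g
final, _ = loss_and_grad(2, 3)
print(final < initial)                      # loss decreases as v_c moves toward u_o
```

Note how the gradient is exactly the "observed minus expected" form derived in the lecture: the true outside vector minus the softmax-weighted average of all outside vectors.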
3/12 Lecture 2 & HW 1(G2)
Discussion topics:
- Review of word2vec and optimization methods (GD/SGD)
- Construction of the co-occurrence matrix and its implications
- Building word vectors from the co-occurrence matrix: (i) SVD, (ii) GloVe
- GloVe introduction (rationale, differences from word2vec, objective function setup)
- Evaluation methods (intrinsic vs. extrinsic)
- How to handle word sense ambiguity?
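The count-based route in the bullets above (co-occurrence matrix, then SVD) can be sketched on a toy corpus. The corpus and the window size of 1 are made up for illustration; real pipelines use large corpora, wider windows, and count reweighting.

```python
import numpy as np

# Toy corpus (made up); window size 1
corpus = ["i like deep learning", "i like nlp", "i enjoy flying"]
tokens = sorted({w for s in corpus for w in s.split()})
idx = {w: i for i, w in enumerate(tokens)}

# Symmetric co-occurrence counts: X[a, b] = times b appears next to a
X = np.zeros((len(tokens), len(tokens)))
for s in corpus:
    ws = s.split()
    for i, w in enumerate(ws):
        for j in (i - 1, i + 1):
            if 0 <= j < len(ws):
                X[idx[w], idx[ws[j]]] += 1

# Truncated SVD: keep the top-k singular directions as word vectors
U, S, Vt = np.linalg.svd(X)
k = 2
word_vecs = U[:, :k] * S[:k]       # scale columns by singular values
print(word_vecs.shape)             # one k-dim vector per vocabulary word
```

Truncating to the top-k singular values gives the best rank-k approximation of X, which is why a few dimensions can capture most of the co-occurrence structure; GloVe instead fits log-counts with a weighted least-squares objective.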
3/19 Lecture 3(G3)
3/26 Lecture 4 & HW 2(G1)
4/2 Lecture 5(G2)
4/9 Lecture 6 & HW 3(G3)
5/7 Lecture 7
Supplementary material:
- Illustrated LSTM guide (Chinese) https://zhuanlan.zhihu.com/p/32085405
- Illustrated GRU guide (Chinese) https://zhuanlan.zhihu.com/p/32481747
- Animated step-by-step guide to RNN/LSTM/GRU https://towardsdatascience.com/illustrated-guide-to-lstms-and-gru-s-a-step-by-step-explanation-44e9eb85bf21
- ResNet, HighwayNets, DenseNet https://chatbotslife.com/resnets-highwaynets-and-densenets-oh-my-9bb15918ee32
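As a companion to the GRU links above, here is a minimal numpy GRU cell. The dimensions are toy values, biases are omitted for brevity, and note that gate conventions vary across sources; this sketch uses the form h' = (1-z)*h + z*h_tilde.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU step, biases omitted; convention h' = (1-z)*h + z*h_tilde."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        def m(rows, cols):
            return rng.normal(0, 0.1, (rows, cols))
        self.Wz, self.Uz = m(hidden_dim, input_dim), m(hidden_dim, hidden_dim)  # update gate
        self.Wr, self.Ur = m(hidden_dim, input_dim), m(hidden_dim, hidden_dim)  # reset gate
        self.Wh, self.Uh = m(hidden_dim, input_dim), m(hidden_dim, hidden_dim)  # candidate

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)            # how much to update
        r = sigmoid(self.Wr @ x + self.Ur @ h)            # how much past to expose
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate new state
        return (1 - z) * h + z * h_tilde                  # interpolate old and new

cell = GRUCell(input_dim=3, hidden_dim=4)
h = np.zeros(4)
for x in np.random.default_rng(1).normal(size=(6, 3)):    # a length-6 toy sequence
    h = cell.step(x, h)
print(h.shape)
```

The interpolation in the last line of `step` is the key idea: when z is near 0 the old state passes through almost unchanged, which is what lets gradients survive over long sequences.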
Further reading on QA models
DCN https://zhuanlan.zhihu.com/p/27151397
BiDAF, DCN https://zhuanlan.zhihu.com/p/63502183
Further reading on the Transformer
Google Blog: https://ai.googleblog.com/2017/08/transformer-novel-neural-network.html
Step-by-step illustrated walkthrough (English) http://jalammar.github.io/illustrated-transformer/
Hung-yi Lee's (李宏毅) lecture video and notes: https://www.youtube.com/watch?v=ugWDIIOHtPA / https://hackmd.io/@abliu/BkXmzDBmr
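The core operation explained in the Transformer posts above is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal numpy sketch with made-up toy shapes and no masking or multi-head splitting:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                         # scale keeps softmax soft
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)          # row-wise softmax
    return weights @ V, weights                             # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))   # 3 query positions, d_k = 8 (toy values)
K = rng.normal(size=(5, 8))   # 5 key positions
V = rng.normal(size=(5, 8))   # one value per key
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)              # one output vector per query position
```

Each output row is a convex combination of the value rows, with weights given by query-key similarity; dividing by sqrt(d_k) prevents large dot products from saturating the softmax.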
Assignment reference implementations
Well-regarded GitHub repo for reference: https://github.com/Luvata/CS224N-2019/tree/master/Assignment