Timestamps:
00:00 Welcome!
00:08 Introduction
01:09 Word Embeddings - definition
01:38 Technical issue
02:37 Overview
02:51 Word Embeddings - examples
03:17 Why Word Embeddings?
04:35 Word Analogies
05:12 Distributional Hypothesis
05:55 Neural Language Models
08:45 Word2Vec
09:58 Word2Vec - CBOW (Continuous Bag Of Words)
10:53 Word2Vec - SkipGram
11:24 GloVe - slide 1
12:26 GloVe - slide 2
14:57 fastText
16:24 StarSpace
19:14 RAND-WALK
23:03 Features/Sentence Representation
25:35 Implementation links (Python)
27:07 Applications in NLP
29:25 The end?
29:35 Application to RNN
31:42 Question - Preference about when to use which model?
33:19 Question - How do LSTM models perform with embeddings?
34:12 Question - How do you initialize RAND-WALK?
35:16 Question - Size of embedding for sentences and articles using GloVe?
36:42 Question - How do document embeddings compare to other methods (e.g. topic modeling with LDA)?
38:30 Final
Timestamps for the video "Beyond word2vec: GloVe, fastText, StarSpace" - Konstantinos Perifanos