LeCun updated his cake recipe at the 2019 International Solid-State Circuits Conference (ISSCC) in San Francisco, replacing “unsupervised learning” with “self-supervised learning,” a variant of unsupervised learning in which the data itself provides the supervision.
In the semi-supervised learning setting, the goal is to learn a better model by combining a small labeled training set with a much larger unlabeled data set.
Source: http://ai.stanford.edu/blog/weak-supervision/
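To make this concrete, below is a minimal pseudo-labeling sketch, one common semi-supervised technique (the source above surveys others, such as weak supervision). The data shapes, the model, and the 0.9 confidence threshold are illustrative assumptions, not a reference implementation.

```python
# Minimal pseudo-labeling sketch (all shapes and thresholds are assumptions).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical data: 100 labeled points, 1,000 unlabeled points, 2 classes.
x_lab = torch.randn(100, 20)
y_lab = torch.randint(0, 2, (100,))
x_unlab = torch.randn(1000, 20)

model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Step 1: ordinary supervised training on the small labeled set.
for _ in range(200):
    opt.zero_grad()
    loss_fn(model(x_lab), y_lab).backward()
    opt.step()

# Step 2: pseudo-label the unlabeled points the model is confident about.
with torch.no_grad():
    probs = model(x_unlab).softmax(dim=1)
    conf, pseudo_y = probs.max(dim=1)
    keep = conf > 0.9  # confidence threshold is an assumption

# Step 3: keep training on labeled + confidently pseudo-labeled data.
x_all = torch.cat([x_lab, x_unlab[keep]])
y_all = torch.cat([y_lab, pseudo_y[keep]])
for _ in range(200):
    opt.zero_grad()
    loss_fn(model(x_all), y_all).backward()
    opt.step()
```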
Self-supervised learning is a form of unsupervised learning: the supervision signal is derived from the data itself rather than from human annotation.
Transfer learning: a pre-trained model is adapted to a new task, either by freezing the pre-trained weights or by fine-tuning them.
The pre-trained model does not always come from unsupervised learning; in image transfer learning, for example, the backbone is typically pre-trained with supervised labels (e.g., ImageNet classification).
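A minimal transfer-learning sketch with PyTorch/torchvision, assuming an ImageNet-pretrained ResNet-18 as the supervised pre-trained model and a hypothetical 10-class target task; both the freezing and the fine-tuning variant mentioned above are shown.

```python
# Transfer-learning sketch: freeze a pre-trained backbone, train a new head.
import torch
import torch.nn as nn
from torchvision.models import resnet18

# Supervised ImageNet pre-training (downloads weights on first use).
model = resnet18(weights="IMAGENET1K_V1")

# Freezing variant: stop gradients through every pre-trained parameter.
for p in model.parameters():
    p.requires_grad = False

# Replace the classification head for a hypothetical 10-class task;
# the new layer's parameters are trainable by default.
model.fc = nn.Linear(model.fc.in_features, 10)

# Only the new head is optimized here.
opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

# Fine-tuning variant: skip the freezing loop above and optimize all
# parameters with a small learning rate instead, e.g.:
# opt = torch.optim.Adam(model.parameters(), lr=1e-5)
```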
Semi-supervised Learning
Source: http://jalammar.github.io/illustrated-bert/
Self-supervised Learning
Source: https://www.kakaobrain.com/blog/118
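As a concrete illustration, here is a minimal self-supervised sketch using a rotation-prediction pretext task (in the spirit of the sources above, though not taken from them): the model predicts how each image was rotated, so the "label" comes for free from the data. Shapes and architecture are illustrative assumptions.

```python
# Self-supervised pretext task: predict the rotation applied to an image.
import torch
import torch.nn as nn

torch.manual_seed(0)
images = torch.randn(32, 1, 28, 28)  # hypothetical *unlabeled* images

# Create four rotated copies; the rotation index is the free "label".
rotated = torch.cat([torch.rot90(images, k, dims=(2, 3)) for k in range(4)])
targets = torch.arange(4).repeat_interleave(images.size(0))

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 64),
    nn.ReLU(),
    nn.Linear(64, 4),  # classify among the four rotations
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):
    opt.zero_grad()
    loss_fn(model(rotated), targets).backward()
    opt.step()
```

After this pretext training, the trunk of the network can be reused as a pre-trained feature extractor for a downstream supervised task.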
Source: https://syncedreview.com/2019/02/22/yann-lecun-cake-analogy-2-0/
Source: https://www.slideshare.net/rouyunpan/deep-learning-hardware-past-present-future
Source: https://www.slideshare.net/xavigiro/selfsupervised-learning-from-video-sequences-xavier-giro-upc-barcelona-2019
Labeled Data
Source: https://medium.com/@behnamsabeti/various-types-of-supervision-in-machine-learning-c7f32c190fbe