jwkanggist / self-supervised-learning-narratives-1

Reading self-supervised learning backwards, part 1

[Week 5] Representation Learning with Contrastive Predictive Coding #14

Open wavy-jung opened 2 years ago

wavy-jung commented 2 years ago

Keywords

CPC, InfoNCE, Predictive Coding, Negative Sampling

TL;DR

Abstract

While supervised learning has enabled great progress in many applications, unsupervised learning has not seen such widespread adoption, and remains an important and challenging endeavor for artificial intelligence. In this work, we propose a universal unsupervised learning approach to extract useful representations from high-dimensional data, which we call Contrastive Predictive Coding. The key insight of our model is to learn such representations by predicting the future in latent space by using powerful autoregressive models. We use a probabilistic contrastive loss which induces the latent space to capture information that is maximally useful to predict future samples. It also makes the model tractable by using negative sampling. While most prior work has focused on evaluating representations for a particular modality, we demonstrate that our approach is able to learn useful representations achieving strong performance on four distinct domains: speech, images, text and reinforcement learning in 3D environments.
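To make the contrastive loss in the abstract concrete, below is a minimal sketch of an InfoNCE-style objective: a context vector c_t predicts a future latent z_{t+k} through a linear projection, the matching pair in the batch is the positive, and the other samples in the batch act as negatives. The tensor names, dimensions, and batch-negative setup are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F


def info_nce_loss(context: torch.Tensor,
                  future: torch.Tensor,
                  W_k: torch.nn.Linear) -> torch.Tensor:
    """context: (batch, c_dim) autoregressive summaries c_t
    future:  (batch, z_dim) encoded future latents z_{t+k}
    W_k:     log-bilinear projection for prediction step k
    Each sample's own future latent is the positive; the other
    samples in the batch serve as negatives."""
    pred = W_k(context)                      # (batch, z_dim) predictions of z_{t+k}
    logits = pred @ future.t()               # (batch, batch) bilinear scores
    labels = torch.arange(logits.size(0), device=logits.device)
    # Cross-entropy over the batch: the positive pair sits on the diagonal.
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    # Example with random tensors; dimensions are assumptions.
    batch, c_dim, z_dim = 8, 256, 128
    W_k = torch.nn.Linear(c_dim, z_dim, bias=False)
    c_t = torch.randn(batch, c_dim)
    z_tk = torch.randn(batch, z_dim)
    print(info_nce_loss(c_t, z_tk, W_k).item())
```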

Paper link

https://arxiv.org/abs/1807.03748

Presentation link

https://docs.google.com/presentation/d/1QDXmJL5YvycXf8vL-OYP61FKVTUecNqk/edit?usp=sharing&ouid=114847754426815005538&rtpof=true&sd=true

Video link

https://youtu.be/vgzDpgxDVGQ

jwkanggist commented 2 years ago

Thank you, @Doohae.

By any chance, were the slides after page 13 not updated?

wavy-jung commented 2 years ago

> Thank you, @Doohae.
>
> By any chance, were the slides after page 13 not updated?

It looks like I had posted the link to the pre-revision version 🥲 I've fixed the link and updated it!