nishnik / Paper-Leaf

Contains the description of various papers I have read or reading

CTRL: A Conditional Transformer Language Model for Controllable Generation #11

Open nishnik opened 5 years ago

nishnik commented 5 years ago

Paper link (pdf). Authors: Nitish Shirish Keskar∗, Bryan McCann∗, Lav R. Varshney, Caiming Xiong, Richard Socher

Introduction

Language Modeling with CTRL


Basically a (huge) Transformer model with multi-head attention, trained so that generation can be conditioned on control codes prepended to the input.
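To make the core building block concrete, here is a minimal numpy sketch of causal (masked) multi-head self-attention, as used in each Transformer layer. Shapes, variable names, and the toy dimensions are illustrative, not from the paper; real implementations add layer norm, residuals, and feed-forward blocks.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    """Causal scaled dot-product attention over n_heads heads.
    x: (seq_len, d_model); each W*: (d_model, d_model)."""
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # project and split into heads: (seq_len, n_heads, d_head)
    q = (x @ Wq).reshape(seq_len, n_heads, d_head)
    k = (x @ Wk).reshape(seq_len, n_heads, d_head)
    v = (x @ Wv).reshape(seq_len, n_heads, d_head)
    # per-head attention scores: (n_heads, seq_len, seq_len)
    scores = np.einsum('qhd,khd->hqk', q, k) / np.sqrt(d_head)
    # causal mask: a position may attend only to itself and the past
    mask = np.triu(np.ones((seq_len, seq_len)), k=1).astype(bool)
    scores = np.where(mask, -1e9, scores)
    weights = softmax(scores, axis=-1)
    # weighted sum of values, heads concatenated back to d_model
    out = np.einsum('hqk,khd->qhd', weights, v).reshape(seq_len, d_model)
    return out @ Wo

rng = np.random.default_rng(0)
d_model, n_heads, seq_len = 8, 2, 5
Wq, Wk, Wv, Wo = [rng.normal(size=(d_model, d_model)) for _ in range(4)]
x = rng.normal(size=(seq_len, d_model))
y = multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads)
print(y.shape)  # same shape as the input: (5, 8)
```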

3.2 Experimental Settings

BPE (quoted from the subword-units paper the experiments build on, Sennrich et al. 2016): "Previous work addresses the translation of out-of-vocabulary words by backing off to a dictionary. In this paper, we introduce a simpler and more effective approach, making the NMT model capable of open-vocabulary translation by encoding rare and unknown words as sequences of subword units. This is based on the intuition that various word classes are translatable via smaller units than words, for instance names (via character copying or transliteration), compounds (via compositional translation), and cognates and loanwords (via phonological and morphological transformations)."
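The BPE idea above can be sketched in a few lines: repeatedly count adjacent symbol pairs over a word-frequency table and merge the most frequent pair into a new symbol. The toy corpus below is the classic `low/lower/newest/widest` example from the BPE paper; function names are illustrative.

```python
from collections import Counter

def bpe_merges(words, n_merges):
    """Learn byte-pair merges from a word-frequency dict.
    words: {word: count}; returns the ordered list of merged pairs."""
    # each word becomes a tuple of characters plus an end-of-word marker
    vocab = {tuple(w) + ('</w>',): c for w, c in words.items()}
    merges = []
    for _ in range(n_merges):
        pairs = Counter()
        for symbols, count in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += count
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        merges.append(best)
        # rewrite every word, fusing occurrences of the best pair
        new_vocab = {}
        for symbols, count in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = count
        vocab = new_vocab
    return merges

merges = bpe_merges({'low': 5, 'lower': 2, 'newest': 6, 'widest': 3}, 4)
print(merges)  # first merges fuse 'e'+'s', then 'es'+'t', etc.
```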

Sampling

[image] (Gumbel Softmax). The paper itself samples with a temperature and a repetition penalty that divides the scores of already-generated tokens.
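A minimal sketch of that penalized temperature sampling: logits of tokens already in the generated sequence are divided by an extra factor theta > 1 before the softmax, discouraging repetition. The function name, theta value, and toy logits are illustrative assumptions, not the paper's code.

```python
import numpy as np

def penalized_softmax(logits, generated, temperature=1.0, theta=1.2):
    """Sampling distribution with temperature and a repetition penalty:
    scores of tokens in `generated` get an extra divisor theta (> 1)."""
    logits = np.asarray(logits, dtype=float)
    penalty = np.ones_like(logits)
    penalty[list(generated)] = theta  # penalize already-seen token ids
    scaled = logits / (temperature * penalty)
    e = np.exp(scaled - scaled.max())  # stable softmax
    return e / e.sum()

logits = [2.0, 1.0, 0.5]
p_plain = penalized_softmax(logits, generated=set())
p_pen = penalized_softmax(logits, generated={0})  # token 0 was just emitted
```

With positive logits, dividing by theta shrinks the penalized token's score, so its probability drops relative to the unpenalized distribution.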