
Reading: Exponential Family Embeddings #127


a1da4 commented 4 years ago

0. Paper

```bibtex
@incollection{NIPS2016_6571,
  title     = {Exponential Family Embeddings},
  author    = {Rudolph, Maja and Ruiz, Francisco and Mandt, Stephan and Blei, David},
  booktitle = {Advances in Neural Information Processing Systems 29},
  editor    = {D. D. Lee and M. Sugiyama and U. V. Luxburg and I. Guyon and R. Garnett},
  pages     = {478--486},
  year      = {2016},
  publisher = {Curran Associates, Inc.},
  url       = {http://papers.nips.cc/paper/6571-exponential-family-embeddings.pdf}
}
```

1. What is it?

They proposed exponential family embeddings (EF-EMB), which generalize word embeddings to other data types through exponential-family distributions.

2. What is amazing compared to previous works?

Previous embedding methods target text. This work applies the same embedding idea to other data types, such as neural activity, market basket counts, and movie ratings.

3. Where is the key to technologies and techniques?

Exponential family embeddings have three ingredients:

3.1 context function

Each observation is conditioned on a set of related observations, its context. The context function is chosen per application: for text, the context of a word is the surrounding words; for neural data, the activity of nearby neurons; for shopping data, the other items in the basket. A sketch for the text case is given below.
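A minimal sketch of a sliding-window context function for text data; `build_contexts` and `window` are hypothetical names for illustration, since the paper leaves the context definition to the modeler:

```python
def build_contexts(tokens, window=2):
    """Return, for each position i, the indices of the surrounding
    tokens within `window` positions: the context c_i."""
    contexts = []
    for i in range(len(tokens)):
        left = max(0, i - window)
        right = min(len(tokens), i + window + 1)
        contexts.append([j for j in range(left, right) if j != i])
    return contexts

tokens = ["the", "cat", "sat", "on", "the", "mat"]
print(build_contexts(tokens)[2])  # context of "sat": [0, 1, 3, 4]
```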

3.2 conditional exponential family

They defined each conditional probability as an exponential family:

$$ x_i \mid x_{c_i} \sim \mathrm{ExpFam}\big(\eta_i(x_{c_i}),\, t(x_i)\big) $$

The natural parameter $\eta_i$ is computed by a linear model of the embeddings (target embedding $\rho$, context embedding $\alpha$) passed through a link function $f(\cdot)$:

$$ \eta_i(x_{c_i}) = f_i\Big(\rho[i]^\top \sum_{j \in c_i} \alpha[j]\, x_j\Big) $$
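A minimal NumPy sketch of this computation for a Gaussian embedding, where the link function $f$ is the identity; the array shapes, values, and function names are illustrative assumptions, not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)
I, K = 100, 10                             # number of items, embedding dimension
rho = 0.1 * rng.standard_normal((I, K))    # target embeddings rho[i]
alpha = 0.1 * rng.standard_normal((I, K))  # context embeddings alpha[j]

def natural_parameter(i, context, x):
    """eta_i = f(rho[i]^T sum_{j in c_i} alpha[j] * x_j).
    For a Gaussian conditional, the link f is the identity."""
    context_sum = sum(alpha[j] * x[j] for j in context)
    return rho[i] @ context_sum

x = rng.standard_normal(I)                 # observed data, e.g. neural activity
eta = natural_parameter(5, [3, 4, 6, 7], x)
```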

3.3 embedding structure

The target and context embeddings share parameters at the same point. To learn the target embeddings $\rho$ and the context embeddings $\alpha$, they defined the objective function

$$ \mathcal{L}(\rho, \alpha) = \sum_i \Big( \eta_i(x_{c_i})^\top t(x_i) - a\big(\eta_i(x_{c_i})\big) \Big) + \log p(\rho) + \log p(\alpha), $$

where $a(\cdot)$ is the log-normalizer of the exponential family. The priors $\log p(\rho)$ and $\log p(\alpha)$ act as regularizers (e.g., a Gaussian prior leads to $\ell_2$ regularization).
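A minimal sketch of this objective for the Gaussian case; the unit variance, the regularization weight `lam`, and all names are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def objective(rho, alpha, x, contexts, lam=1.0):
    """L(rho, alpha) = sum_i log p(x_i | x_{c_i}) + log p(rho) + log p(alpha).
    For Gaussian conditionals with unit variance, each log-likelihood
    term is -(x_i - eta_i)^2 / 2 up to a constant, and Gaussian priors
    on the embeddings give the l2 penalties -lam * (||rho||^2 + ||alpha||^2)."""
    log_lik = 0.0
    for i, context in enumerate(contexts):
        eta = rho[i] @ sum(alpha[j] * x[j] for j in context)  # identity link
        log_lik += -0.5 * (x[i] - eta) ** 2
    log_prior = -lam * (np.sum(rho ** 2) + np.sum(alpha ** 2))
    return log_lik + log_prior
```

The paper maximizes this kind of objective with stochastic gradients.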

4. How did they evaluate it?

4.1 neural data analysis

Gaussian embeddings applied to recordings of zebrafish neural activity.

4.2 market basket analysis and movie ratings

Poisson embeddings applied to supermarket purchase counts and to movie ratings data.

5. Is there a discussion?

6. Which paper should I read next?

a1da4 commented 4 years ago

#125 Dynamic Embeddings for Language Evolution

They use Bernoulli embeddings to model diachronic semantic change.