@incollection{NIPS2016_6571,
title = {Exponential Family Embeddings},
author = {Rudolph, Maja and Ruiz, Francisco and Mandt, Stephan and Blei, David},
booktitle = {Advances in Neural Information Processing Systems 29},
editor = {D. D. Lee and M. Sugiyama and U. V. Luxburg and I. Guyon and R. Garnett},
pages = {478--486},
year = {2016},
publisher = {Curran Associates, Inc.},
url = {http://papers.nips.cc/paper/6571-exponential-family-embeddings.pdf}
}
1. What is it?
They proposed exponential family embeddings (EF-EMB), a class of models that generalizes word embeddings to other types of high-dimensional data.
2. What is amazing compared to previous works?
They extend word-embedding ideas beyond text, embedding neural data, count data (shopping carts), and rating data in the same framework.
3. Where is the key to technologies and techniques?
Exponential family embeddings have three ingredients:
context function
conditional exponential family
embedding structure
3.1 context function
Each observation is modeled conditioned on a set of related observations, its context. For example, the context is defined as:
neuroscience: the activity of other nearby neurons
shopping: the other items in the shopping cart
3.2 conditional exponential family
They model each conditional probability as an exponential family. Its natural parameter η is a linear function of the embeddings, the inner product of the target embedding ρ with the summed context embeddings α, passed through a link function f(·).
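The natural-parameter computation above can be sketched in a few lines. This is a minimal illustration, assuming hypothetical sizes, random embeddings, and an identity link (as in a Gaussian embedding); it is not the paper's implementation.

```python
import numpy as np

# Hypothetical sizes and random embeddings, for illustration only.
rng = np.random.default_rng(0)
n_items, dim = 5, 3
rho = rng.normal(size=(n_items, dim))    # target embeddings ρ
alpha = rng.normal(size=(n_items, dim))  # context embeddings α

def natural_parameter(i, context, f=lambda x: x):
    """η_i = f(ρ_i · Σ_{j in context(i)} α_j).
    f is the link function (identity here, as in a Gaussian embedding)."""
    return f(rho[i] @ alpha[context].sum(axis=0))

eta = natural_parameter(0, [1, 2, 3])
```

The key design point is that η depends on the context only through the sum of context embeddings, which keeps the model linear in the embeddings.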
3.3 embedding structure
The target and context embeddings share parameters across all observations of the same item. To learn the target embeddings ρ and context embeddings α, they define an objective: the sum of the conditional log-likelihoods plus log p(ρ) and log p(α) as regularizers (e.g., a Gaussian prior leads to l2 regularization).
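The objective described above can be sketched for the Gaussian case. This is an assumption-laden sketch (identity link, unit variance, a made-up prior precision `lam`, constants dropped), not the paper's exact objective.

```python
import numpy as np

def ef_emb_objective(rho, alpha, x, contexts, lam=1.0):
    """Sketch of the EF-EMB objective for a Gaussian conditional:
      L = Σ_i log p(x_i | x_{context(i)}) + log p(ρ) + log p(α),
    where Gaussian priors on ρ and α reduce to l2 penalties
    (additive constants dropped). `lam` is an assumed prior precision."""
    ll = 0.0
    for i, ctx in enumerate(contexts):
        eta = rho[i] @ alpha[ctx].sum(axis=0)  # natural parameter = mean
        ll += -0.5 * (x[i] - eta) ** 2
    reg = -0.5 * lam * (np.sum(rho**2) + np.sum(alpha**2))
    return ll + reg
```

In practice the objective is maximized with stochastic gradients; the sketch only shows how the log-likelihood and l2 terms combine.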
4. How did they evaluate it?
4.1 neural data analysis
Data: the neural activity of a larval zebrafish, recorded at single-cell resolution over 3,000 time frames.
Model: a Gaussian embedding (G-EMB) and a nonnegative Gaussian embedding (NG-EMB).
Evaluation: (1) leave-one-out, (2) leave-25%-out
4.2 market basket analysis and movie review
Data (market): the IRI dataset, which contains market-basket purchases.
Data (movie): the MovieLens-100K dataset.
Model: a Poisson embedding (P-EMB) and an additive Poisson embedding (AP-EMB)
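For count data like market baskets, the conditional is Poisson. A minimal sketch of one conditional log-likelihood, assuming a log link (rate λ = exp(η)); the paper's exact P-EMB/AP-EMB parameterizations may differ:

```python
import numpy as np
from math import lgamma

def poisson_loglik(x_i, rho_i, alpha, context):
    """log p(x_i | context) for a Poisson conditional whose natural
    parameter is η = ρ_i · Σ_{j in context} α_j, so rate λ = exp(η).
      log p(x) = x·η − exp(η) − log(x!)"""
    eta = float(rho_i @ alpha[context].sum(axis=0))
    return x_i * eta - np.exp(eta) - lgamma(x_i + 1)
```

Held-out log-likelihoods of this form are what leave-one-out style evaluations compare across models.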
0. Paper
@incollection{NIPS2016_6571, title = {Exponential Family Embeddings}, author = {Rudolph, Maja and Ruiz, Francisco and Mandt, Stephan and Blei, David}, booktitle = {Advances in Neural Information Processing Systems 29}, editor = {D. D. Lee and M. Sugiyama and U. V. Luxburg and I. Guyon and R. Garnett}, pages = {478--486}, year = {2016}, publisher = {Curran Associates, Inc.}, url = {http://papers.nips.cc/paper/6571-exponential-family-embeddings.pdf} }
1. What is it?
They proposed exponential family-based embeddings.
2. What is amazing compared to previous works?
They embed neural data, count data, and rating data as word embeddings.
3. Where is the key to technologies and techniques?
Exponential Family-based Embeddings have three ingredients:
3.1 context function
They defined that each observation conditioned on a set of other observations (context). For example, the context is defined:
3.2 conditional exponential family
They defined these conditional probabilities. A natural parameter η is computed by linear models (target embedding ρ, context embedding α) and a link function f(・).
3.3 embedding structure
The target and context embeddings shear parameters at the same point. To study a target embedding ρ and a context embedding α, they defined the objective function. They use logp(ρ) and logp(α) as regularizers. (e.g. a Gaussian probability leads to l2 regularization)
4. How did evaluate it?
4.1 neural data analysis
4.2 market basket analysis and movie review
5. Is there a discussion?
6. Which paper should I read next?