ajbc / lda-svi

an implementation of latent Dirichlet allocation (LDA) with stochastic variational inference
GNU General Public License v3.0

ONLINE VARIATIONAL BAYES FOR LATENT DIRICHLET ALLOCATION

Allison J.B. Chaney achaney@cs.princeton.edu

This code is branched from source originally created by Matthew D. Hoffman mdhoffma@cs.princeton.edu

(C) Copyright 2010, Matthew D. Hoffman Modifications (C) Copyright 2014, Allison J. B. Chaney

This is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License.

The GNU General Public License does not permit this software to be redistributed in proprietary programs.

This software is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA


This Python code implements the online Variational Bayes (VB) algorithm presented in the paper "Online Learning for Latent Dirichlet Allocation" by Matthew D. Hoffman, David M. Blei, and Francis Bach, presented at NIPS 2010.

The algorithm uses stochastic optimization to maximize the variational objective function for the Latent Dirichlet Allocation (LDA) topic model. Each iteration it examines only a subset of the corpus, and is therefore able to find a locally optimal setting of the variational posterior over the topics more quickly than a batch VB algorithm can on large corpora.
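The stochastic update described above can be sketched as follows. This is a minimal illustration, not the repository's code: the function name and parameter defaults are hypothetical, and the local (per-document) E-step that would produce the minibatch sufficient statistics is left out.

```python
import numpy as np

def svi_lambda_update(lam, minibatch_suff_stats, D, batch_size, t,
                      tau0=1.0, kappa=0.7, eta=0.01):
    """One stochastic update of the K x W variational topic parameters lambda.

    minibatch_suff_stats: expected word counts per topic, accumulated from
    the local variational (E) step on the current minibatch (not shown here).
    D: total number of documents in the corpus; t: iteration counter.
    """
    # Step size decaying as (tau0 + t)^-kappa, so the updates converge.
    rho_t = (tau0 + t) ** -kappa
    # Noisy estimate of the batch-VB update: the minibatch statistics are
    # rescaled as if the whole corpus of D documents looked like this batch.
    lam_hat = eta + (D / batch_size) * minibatch_suff_stats
    # Blend the old parameters with the noisy estimate.
    return (1.0 - rho_t) * lam + rho_t * lam_hat
```

With `t = 0` and `tau0 = 1`, the step size is 1 and the update discards the old lambda entirely; as `t` grows, each minibatch moves lambda less.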

Files provided:

You will need to have the numpy and scipy packages installed somewhere that Python can find them to use these scripts.

Example:

python onlinewikipedia.py 101
python printtopics.py dictnostops.txt lambda-100.dat 10

This would run the algorithm for 101 iterations, then display the (expected value under the variational posterior of the) topics fit by the algorithm. (Note that the algorithm will not have fully converged after 101 iterations---this is just to give an idea of how to use the code.) The last argument to printtopics.py is the number of terms to print per topic.
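The expectation that printtopics.py reports can be sketched as below. This is a hypothetical helper for illustration, not the script itself: for a Dirichlet posterior q(beta_k) with parameters lambda_k, the expected topic-word distribution is lambda_k normalized over the vocabulary.

```python
import numpy as np

def top_terms(lam, vocab, n_terms=10):
    """Return the n_terms most probable words per topic.

    lam: K x W array of variational Dirichlet parameters (one row per topic).
    vocab: list of W vocabulary terms.
    """
    # E[beta_k] under q(beta_k) = Dirichlet(lambda_k): normalize each row.
    beta = lam / lam.sum(axis=1, keepdims=True)
    topics = []
    for k in range(beta.shape[0]):
        # Indices of the n_terms largest expected probabilities.
        order = np.argsort(-beta[k])[:n_terms]
        topics.append([(vocab[w], float(beta[k, w])) for w in order])
    return topics
```

For instance, a row of lambda equal to (3, 1, 1) over a three-word vocabulary yields expected probabilities (0.6, 0.2, 0.2), so the first word heads that topic's list.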