Discriminative learning for dimensionality reduction and classification

Title: Discriminative learning for dimensionality reduction and classification
Publication Type: Conference Proceedings
Year of Conference: 2008
Authors: Lacoste-Julien, Simon, Fei Sha, & Michael I. Jordan
Conference Name: Neural Information Processing Systems (NIPS'08)
Date Published: 12/2008
Conference Location: Vancouver, Canada
Abstract

Probabilistic topic models (and their extensions) have become popular as models of latent structures in collections of text documents or images. These models are usually treated as generative models and trained using maximum likelihood estimation, an approach which may be suboptimal in the context of an overall classification problem. In this paper, we describe DiscLDA, a discriminative learning framework for such models as Latent Dirichlet Allocation (LDA) in the setting of dimensionality reduction with supervised side information. In DiscLDA, a class-dependent linear transformation is introduced on the topic mixture proportions. This parameter is estimated by maximizing the conditional likelihood using Monte Carlo EM. By using the transformed topic mixture proportions as a new representation of documents, we obtain a supervised dimensionality reduction algorithm that uncovers the latent structure in a document collection while preserving predictive power for the task of classification. We compare the predictive power of the latent structure of DiscLDA with unsupervised LDA on the 20 Newsgroups document classification task.
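The core representation step described in the abstract, applying a class-dependent linear transformation to a document's topic mixture proportions, can be pictured with a minimal sketch. The dimensions, the randomly drawn transformation matrices `T`, and the helper `transform` below are illustrative assumptions only; in the paper these transformations are estimated by maximizing the conditional likelihood with Monte Carlo EM, not sampled at random.

```python
# Minimal sketch of the DiscLDA representation step: a class-dependent
# linear transformation T^y is applied to a document's topic mixture
# proportions theta. All shapes and values here are illustrative
# assumptions, not the paper's fitted parameters.
import numpy as np

rng = np.random.default_rng(0)

n_classes = 2          # e.g. two newsgroup labels
n_topics = 10          # topics of the underlying LDA model
n_transformed = 20     # dimension of the transformed topic space

# One column-stochastic transformation matrix per class (assumed shapes):
# T has shape (n_classes, n_transformed, n_topics).
T = rng.dirichlet(np.ones(n_transformed), size=(n_classes, n_topics))
T = T.transpose(0, 2, 1)

def transform(theta, y):
    """Map topic proportions theta to the class-dependent representation
    T^y theta, used as the new document representation for classification."""
    return T[y] @ theta

# Toy usage: topic proportions for one document (would come from LDA inference).
theta = rng.dirichlet(np.ones(n_topics))
for y in range(n_classes):
    print(y, transform(theta, y).round(3))
```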

URL: http://books.nips.cc/papers/files/nips21/NIPS2008_0993.pdf