Inter-battery topic representation learning

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Zhang, C 
Kjellström, H 
Ek, CH 

Abstract

In this paper, we present the Inter-Battery Topic Model (IBTM). Our approach extends traditional topic models by learning a factorized latent variable representation. The structured representation leads to a model that marries benefits traditionally associated with a discriminative approach, such as feature selection, with those of a generative model, such as principled regularization and the ability to handle missing data. The factorization is obtained by representing data in terms of aligned pairs of observations as different views. This provides a means of selecting a representation that separately models topics shared by both views and topics unique to a single view. This structured consolidation allows for efficient and robust inference and yields a compact and efficient representation. Learning is performed in a Bayesian fashion by maximizing a rigorous bound on the log-likelihood. We first illustrate the benefits of the model on a synthetic dataset. The model is then evaluated in both uni- and multi-modality settings on two different classification tasks using off-the-shelf convolutional neural network (CNN) features, achieving state-of-the-art results with extremely compact representations.
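The abstract outlines, but does not fully specify, the factorized generative structure. Purely as an illustration, the sketch below assumes a simple two-view, LDA-style process in which each aligned pair of documents draws words from shared topics (common to both views) and from view-private topics. All names, topic counts, and the fixed mixing weight pi are hypothetical and not taken from the paper; in the actual model such quantities would be inferred, e.g. by maximizing the standard variational lower bound log p(x) >= E_q[log p(x, z)] - E_q[log q(z)] mentioned in the abstract, rather than fixed as here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- not taken from the paper.
K_shared, K_a, K_b = 5, 3, 3   # shared and view-private topic counts
V_a, V_b = 100, 80             # vocabulary sizes of the two views
N_words = 50                   # words per view in one document pair

# Topic-word distributions per view: the first K_shared rows play the
# role of shared topics, the remaining rows are private to that view.
phi_a = rng.dirichlet(np.ones(V_a), size=K_shared + K_a)
phi_b = rng.dirichlet(np.ones(V_b), size=K_shared + K_b)

def generate_pair():
    """Generate one aligned document pair (view a, view b)."""
    # Shared topic proportions couple the two views; private
    # proportions capture structure unique to a single view.
    theta_s = rng.dirichlet(np.ones(K_shared))
    theta_a = rng.dirichlet(np.ones(K_a))
    theta_b = rng.dirichlet(np.ones(K_b))

    # Fixed mixing weight between shared and private parts
    # (an assumption of this sketch; the model would infer it).
    pi = 0.5
    mix_a = np.concatenate([pi * theta_s, (1 - pi) * theta_a])
    mix_b = np.concatenate([pi * theta_s, (1 - pi) * theta_b])

    # For each word: pick a topic from the mixed proportions,
    # then draw the word from that topic's word distribution.
    doc_a = [rng.choice(V_a, p=phi_a[rng.choice(len(mix_a), p=mix_a)])
             for _ in range(N_words)]
    doc_b = [rng.choice(V_b, p=phi_b[rng.choice(len(mix_b), p=mix_b)])
             for _ in range(N_words)]
    return doc_a, doc_b

doc_a, doc_b = generate_pair()
print(len(doc_a), len(doc_b))
```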

Keywords

Factorized representation, Topic model, Multi-view model, CNN feature, Image classification

Journal Title

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Conference Name

14th European Conference on Computer Vision (ECCV 2016), Amsterdam, The Netherlands

Journal ISSN

0302-9743
1611-3349

Volume Title

9912 LNCS

Publisher

Springer International Publishing

Rights

All rights reserved