Decision Forests, Convolutional Networks and the Models in-Between

Published version
Peer-reviewed

Type

Report

Authors

Robertson, Duncan 
Zikic, Darko 
Kontschieder, Peter 
Shotton, Jamie 

Abstract

This paper investigates the connections between two state-of-the-art classifiers: decision forests (DFs, including decision jungles) and convolutional neural networks (CNNs). Decision forests are computationally efficient thanks to their conditional computation property: computation is confined to a small region of the tree, the nodes along a single root-to-leaf branch. CNNs achieve state-of-the-art accuracy thanks to their representation learning capabilities. We present a systematic analysis of how to fuse conditional computation with representation learning, yielding a continuum of hybrid models with different trade-offs between accuracy and efficiency. We call this new family of hybrid models conditional networks. Conditional networks can be thought of as: i) decision trees augmented with data transformation operators, or ii) CNNs with block-diagonal sparse weight matrices and explicit data routing functions. Experimental validation is performed on the common task of image classification, on both the CIFAR and ImageNet datasets. Compared to state-of-the-art CNNs, our hybrid models achieve the same accuracy at a fraction of the compute cost and with far fewer parameters.
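The second view in the abstract, a CNN layer with a block-diagonal sparse weight matrix plus an explicit routing function, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the sizes (two branches, 4-dimensional blocks in place of an 8x8 dense layer) and the sign-based `router` are toy assumptions chosen only to show why conditional computation is cheaper than the dense equivalent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two branches, each with its own small weight block. Together they
# form the non-zero blocks of an 8x8 block-diagonal matrix.
blocks = [rng.standard_normal((4, 4)) for _ in range(2)]

def router(x):
    # Toy routing function (a stand-in for a learned split): send the
    # input down branch 0 or 1 based on the sign of its mean.
    return 0 if x.mean() >= 0.0 else 1

def conditional_layer(x):
    # Conditional computation: only the selected block is evaluated,
    # and only the matching slice of the input is transformed.
    b = router(x)
    sub = x.reshape(2, 4)[b]       # route the relevant sub-vector
    return blocks[b] @ sub         # cost: one 4x4 product, not 8x8

def dense_equivalent(x):
    # The same computation as a full block-diagonal matrix multiply:
    # every block is applied, then routing discards all but one output.
    # This is the cost the conditional form avoids paying.
    W = np.zeros((8, 8))
    W[:4, :4] = blocks[0]
    W[4:, 4:] = blocks[1]
    y = W @ x
    return y.reshape(2, 4)[router(x)]

x = rng.standard_normal(8)
assert np.allclose(conditional_layer(x), dense_equivalent(x))
```

The equivalence check at the end makes the abstract's point concrete: routing plus a block-diagonal weight structure computes the same result as the dense formulation while touching only one branch's weights per input.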

Keywords

cs.CV, cs.AI
