On Graph Classification Networks, Datasets and Baselines
Accepted version
Peer-reviewed
Abstract
Graph classification receives a great deal of attention from the non-Euclidean machine learning community. Recent advances in graph coarsening have enabled the training of deeper networks and produced new state-of-the-art results on many benchmark tasks. We examine how these architectures train and find that performance is highly sensitive to initialisation and depends strongly on jumping-knowledge structures. We then show that, despite the considerable complexity of these models, competitive performance is achieved by the simplest of models: a structure-blind MLP, a single-layer GCN and a fixed-weight GCN. We propose that these be included as baselines in future work.
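To make the simplest baselines concrete, the following is a minimal sketch of one GCN propagation step with a fixed (identity) weight matrix, followed by a mean-pooled graph-level readout. This is an illustration in plain Python, not the paper's implementation; the toy graph, features and all names here are assumptions for demonstration only.

```python
import math

def gcn_layer(A, X, W):
    """One GCN step: relu(D^-1/2 (A + I) D^-1/2 . X . W)."""
    n = len(A)
    # Add self-loops: A_hat = A + I
    A_hat = [[A[i][j] + (1.0 if i == j else 0.0) for j in range(n)]
             for i in range(n)]
    deg = [sum(row) for row in A_hat]
    # Symmetric degree normalisation
    A_norm = [[A_hat[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
              for i in range(n)]
    # Propagate features over the graph: A_norm @ X
    f_in = len(X[0])
    AX = [[sum(A_norm[i][k] * X[k][j] for k in range(n)) for j in range(f_in)]
          for i in range(n)]
    # Linear transform and ReLU: max(AX @ W, 0)
    f_out = len(W[0])
    return [[max(sum(AX[i][k] * W[k][j] for k in range(f_in)), 0.0)
             for j in range(f_out)] for i in range(n)]

# Toy 3-node path graph with 2-dimensional node features (illustrative only)
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
X = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
# Fixed-weight variant: the weight matrix is frozen (identity here), so the
# layer only smooths features over the graph and nothing is learned.
W_fixed = [[1.0, 0.0],
           [0.0, 1.0]]
H = gcn_layer(A, X, W_fixed)
# Graph-level readout for classification: mean-pool the node embeddings
graph_embedding = [sum(col) / len(H) for col in zip(*H)]
```

In the single-layer GCN baseline, `W_fixed` would instead be a trained weight matrix; the structure-blind MLP baseline drops the `A_norm` propagation entirely and applies the transform to `X` alone.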
Keywords
cs.LG, stat.ML
Rights
All rights reserved