ADAPTATION IN LOG-CONCAVE DENSITY ESTIMATION

Accepted version
Peer-reviewed

Authors

Kim, Arlene KH 
Guntuboyina, Adityanand 
Samworth, Richard J 

Abstract

The log-concave maximum likelihood estimator of a density on the real line based on a sample of size n is known to attain the minimax optimal rate of convergence of O(n^{-4/5}) with respect to, e.g., squared Hellinger distance. In this paper, we show that it also enjoys attractive adaptation properties, in the sense that it achieves a faster rate of convergence when the logarithm of the true density is k-affine (i.e. made up of k affine pieces), provided k is not too large. Our results use two different techniques: the first relies on a new Marshall's inequality for log-concave density estimation, and reveals that when the true density is close to log-linear on its support, the log-concave maximum likelihood estimator can achieve the parametric rate of convergence in total variation distance. Our second approach depends on local bracketing entropy methods, and allows us to prove a sharp oracle inequality, which implies in particular that the rate of convergence with respect to various global loss functions, including Kullback–Leibler divergence, is O((k/n) log^{5/4} n) when the true density is log-concave and its logarithm is close to k-affine.
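
To illustrate the notion of a k-affine log-density used in the abstract, here is a minimal sketch (not taken from the paper; the function name and integration set-up are my own) built around the standard Laplace density, whose logarithm log f(x) = -|x| - log 2 is concave and made up of k = 2 affine pieces, together with a numerical check that it integrates to one.

```python
# Minimal illustration (not from the paper): the standard Laplace density
#   f(x) = exp(-|x|) / 2,  i.e.  log f(x) = -|x| - log 2,
# has a concave log-density made up of k = 2 affine pieces. Densities of this
# form are the ones to which the paper shows the log-concave MLE adapts.

import numpy as np
from scipy.integrate import quad

def laplace_log_density(x):
    """Log-density of the standard Laplace distribution: concave and 2-affine."""
    return -np.abs(x) - np.log(2.0)

# Numerical sanity check that exp(log f) integrates to 1 over the real line.
total_mass, _ = quad(lambda x: np.exp(laplace_log_density(x)), -np.inf, np.inf)
print(f"integral of the 2-affine log-density: {total_mass:.6f}")  # approx 1.000000
```

Computing the log-concave maximum likelihood estimator itself from a sample is a separate convex optimisation problem, for which existing software (for example the R package logcondens) can be used; that step is not shown in this sketch.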

Keywords

Adaptation, bracketing entropy, log-concavity, maximum likelihood estimation, Marshall's inequality

Journal Title

ANNALS OF STATISTICS

Journal ISSN

0090-5364

Volume Title

46

Publisher

Institute of Mathematical Statistics

Sponsorship

Engineering and Physical Sciences Research Council (EP/J017213/1)
Leverhulme Trust (PLP-2014-353)
Engineering and Physical Sciences Research Council (EP/N031938/1)
Alan Turing Institute (unknown)
AKH Kim: National Research Foundation of Korea (NRF) grant 2017R1C1B5017344. A Guntuboyina: NSF Grant DMS-1309356. RJ Samworth: EPSRC Early Career Fellowship and a grant from the Leverhulme Trust.