
Federated Principal Component Analysis.

Accepted version
Peer-reviewed

Type

Conference Object

Change log

Authors

Mendoza-Smith, Rodrigo 
Crowcroft, Jonathon  https://orcid.org/0000-0002-7013-0121

Abstract

We present a federated, asynchronous, and (ε, δ)-differentially private algorithm for PCA in the memory-limited setting. Our algorithm incrementally computes local model updates using a streaming procedure and adaptively estimates its r leading principal components when only O(dr) memory is available with d being the dimensionality of the data. We guarantee differential privacy via an input-perturbation scheme in which the covariance matrix of a dataset X is perturbed with a non-symmetric random Gaussian matrix with variance in O((d/n)^2 log(d)), thus improving upon the state-of-the-art. Furthermore, contrary to previous federated or distributed algorithms for PCA, our algorithm is also invariant to permutations in the incoming data, which provides robustness against straggler or failed nodes. Numerical simulations show that, while using limited memory, our algorithm exhibits performance that closely matches or outperforms traditional non-federated algorithms, and in the absence of communication latency, it exhibits attractive horizontal scalability.
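The input-perturbation scheme described in the abstract can be sketched in a few lines: form the empirical covariance, add a non-symmetric Gaussian noise matrix whose variance scales as O((d/n)^2 log d), and extract the r leading components of the perturbed matrix. This is a minimal illustration only; the noise constants, the dependence on (ε, δ), and the function name `private_top_components` are assumptions for the sketch, not the paper's actual calibration, and the federated/streaming machinery is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_top_components(X, r, eps=1.0):
    """Sketch of input-perturbation DP-PCA: perturb the empirical
    covariance with a non-symmetric Gaussian matrix, then return the
    top-r left singular vectors. The noise scale mimics the
    O((d/n)^2 log d) variance mentioned in the abstract; the constant
    and the eps dependence are illustrative assumptions."""
    n, d = X.shape
    C = (X.T @ X) / n                            # empirical covariance (mean-zero data assumed)
    sigma = (d / n) * np.sqrt(np.log(d)) / eps   # std dev, so variance is in O((d/n)^2 log d)
    E = rng.normal(scale=sigma, size=(d, d))     # non-symmetric Gaussian perturbation
    U, _, _ = np.linalg.svd(C + E)               # leading subspace of the perturbed matrix
    return U[:, :r]
```

With a dominant direction in the data and a weak privacy requirement (large ε, hence little noise), the first returned component closely tracks the true leading principal direction.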

Description

Keywords

Journal Title

NeurIPS

Conference Name

34th Conference on Neural Information Processing Systems (NeurIPS 2020)

Journal ISSN

Volume Title

Publisher

Rights

All rights reserved

Sponsorship

Alan Turing Institute (unknown)