
A quantum search decoder for natural language processing

Published version
Peer-reviewed

Authors

Subramanian, Sathyawageeswar 
Piddock, Stephen 

Abstract

Probabilistic language models, e.g. those based on recurrent neural networks such as long short-term memory models (LSTMs), often face the problem of finding a high-probability prediction from a sequence of random variables over a set of tokens. This is commonly addressed with a form of greedy decoding such as beam search, where a limited number of highest-likelihood paths (the beam width) of the decoder are kept, and at the end the maximum-likelihood path is chosen. In this work, we construct a quantum algorithm to find the globally optimal parse (i.e. for infinite beam width) with high constant success probability. When the input to the decoder follows a power law with exponent k > 0, our algorithm has runtime R^{nf(R,k)}, where R is the alphabet size and n the input length; here f < 1/2, and f → 0 exponentially quickly with increasing k, making our algorithm always more than quadratically faster than its classical counterpart. We further modify our procedure to recover a finite-beam-width variant, which enables an even stronger empirical speedup while still retaining higher accuracy than is possible classically. Finally, we apply this quantum beam search decoder to Mozilla's implementation of Baidu's DeepSpeech neural net, which we show exhibits such a power-law word rank–frequency distribution.
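For readers unfamiliar with the classical baseline the abstract refers to, the following is a minimal sketch of beam search decoding over per-step token distributions. It is illustrative only: the function and variable names are not from the paper, and the quantum decoder itself is not reproduced here.

```python
import math

def beam_search(step_probs, beam_width):
    """Classical beam search over a sequence of per-step token
    distributions. step_probs[t] maps token -> probability at step t.
    At each step, only the beam_width highest log-likelihood partial
    paths are kept; an infinite beam width would recover the globally
    optimal parse the quantum algorithm targets."""
    beams = [((), 0.0)]  # (path, log-likelihood)
    for dist in step_probs:
        # extend every surviving path by every possible next token
        candidates = [
            (path + (tok,), score + math.log(p))
            for path, score in beams
            for tok, p in dist.items()
            if p > 0
        ]
        # prune to the beam_width most likely partial paths
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return max(beams, key=lambda b: b[1])

# toy two-step example over an alphabet of two tokens
dists = [{"a": 0.6, "b": 0.4}, {"a": 0.1, "b": 0.9}]
best_path, best_ll = beam_search(dists, beam_width=2)
# ("a", "b") wins: 0.6 * 0.9 = 0.54 beats ("b", "b") at 0.4 * 0.9 = 0.36
```

Note that with a small beam width the greedy pruning can discard a prefix whose continuations would have been globally optimal; this is the accuracy gap that the infinite-beam-width (and the modified finite-beam-width) quantum variants address.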

Description

Funder: Science and Engineering Research Board; doi: https://doi.org/10.13039/501100001843


Funder: Cambridge Commonwealth Trust; doi: https://doi.org/10.13039/501100003342

Keywords

Research Article, Recurrent neural networks, Quantum algorithms, Quantum search, Parsing, Natural language processing, Quantum speedups

Journal Title

Quantum Machine Intelligence

Journal ISSN

2524-4906
2524-4914

Volume Title

3

Publisher

Springer International Publishing

Sponsorship

Pembroke College, University of Cambridge (Draper's Research Fellowship)