Online representation learning in recurrent neural language models

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Rei, M 

Abstract

We investigate an extension of continuous online learning in recurrent neural network language models. The model keeps a separate vector representation of the current unit of text being processed and adaptively adjusts it after each prediction. Initial experiments give promising results, indicating that the method is able to increase language modelling accuracy while also reducing the number of parameters needed to store the model and the computation required at each step.
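
The abstract describes the mechanism only at a high level. Below is a minimal sketch of the general idea, not the paper's actual implementation: a language model conditions each prediction on an extra context vector for the current unit of text, and after every predicted word that vector alone is adjusted by a gradient step on the prediction loss. All names, dimensions, and parameter values are hypothetical stand-ins.

```python
import numpy as np

# Hypothetical sizes; the abstract does not specify any dimensions.
VOCAB, EMB, HID = 50, 16, 24
rng = np.random.default_rng(0)

# Stand-ins for pre-trained, fixed model parameters.
E = rng.normal(0, 0.1, (VOCAB, EMB))          # word embeddings
W_h = rng.normal(0, 0.1, (HID, EMB + HID))    # recurrent weights
W_o = rng.normal(0, 0.1, (VOCAB, HID + EMB))  # output weights over [hidden; context]
b_o = np.zeros(VOCAB)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def run_unit(words, lr_context=0.1):
    """Process one unit of text with an online-updated context vector."""
    h = np.zeros(HID)
    c = np.zeros(EMB)  # separate representation of the current unit of text
    total_logp = 0.0
    for t in range(len(words) - 1):
        x = E[words[t]]
        h = np.tanh(W_h @ np.concatenate([x, h]))
        # The prediction conditions on both the hidden state and the context vector.
        logits = W_o @ np.concatenate([h, c]) + b_o
        p = softmax(logits)
        target = words[t + 1]
        total_logp += np.log(p[target])
        # Online update: after each prediction, take one gradient step on the
        # cross-entropy loss with respect to the context vector only; all other
        # parameters stay fixed.
        dlogits = p.copy()
        dlogits[target] -= 1.0
        dc = W_o[:, HID:].T @ dlogits
        c -= lr_context * dc
    return total_logp

sentence = rng.integers(0, VOCAB, size=12)
print("log-likelihood:", run_unit(sentence))
```

The design choice illustrated here is that adaptation happens through a single small vector rather than through the full parameter set, so the model itself can stay fixed while the context vector tracks the text currently being processed.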

Journal Title

Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing

Conference Name

EMNLP 2015: Conference on Empirical Methods in Natural Language Processing

Publisher

Association for Computational Linguistics

Sponsorship

Cambridge Assessment (unknown)