
Attending to characters in neural sequence labeling models

Published version
Peer-reviewed

Type

Conference Object

Authors

Rei, M 
Crichton, GKO 
Pyysalo, S 

Abstract

Sequence labeling architectures use word embeddings to capture similarity, but suffer when handling previously unseen or rare words. We investigate character-level extensions to such models and propose a novel architecture for combining alternative word representations. Using an attention mechanism, the model can dynamically decide how much information to use from the word-level or character-level component. We evaluate different architectures on a range of sequence labeling datasets and find that character-level extensions improve performance on every benchmark. In addition, the proposed attention-based architecture delivers the best results even with fewer trainable parameters.

Journal Title

Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers

Conference Name

The International Conference on Computational Linguistics (COLING)

Sponsorship
Cambridge Assessment (unknown)