
dc.contributor.author: Barrett, M
dc.contributor.author: Bingel, J
dc.contributor.author: Hollenstein, N
dc.contributor.author: Rei, Marek
dc.contributor.author: Søgaard, A
dc.date.accessioned: 2019-01-15T00:31:23Z
dc.date.available: 2019-01-15T00:31:23Z
dc.date.issued: 2018
dc.identifier.isbn: 9781948087728
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/287996
dc.description.abstract: Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior, and in this paper, we show that human attention really does provide a good inductive bias on many attention functions in NLP. Specifically, we use estimated human attention derived from eye-tracking corpora to regularize attention functions in recurrent neural networks. We show substantial improvements across a range of tasks, including sentiment analysis, grammatical error detection, and detection of abusive language.
dc.publisher: Association for Computational Linguistics
dc.title: Sequence classification with human attention
dc.type: Conference Object
prism.endingPage: 312
prism.publicationDate: 2018
prism.publicationName: CoNLL 2018 - 22nd Conference on Computational Natural Language Learning, Proceedings
prism.startingPage: 302
dc.identifier.doi: 10.17863/CAM.35315
dcterms.dateAccepted: 2018-07-27
rioxxterms.versionofrecord: 10.18653/v1/k18-1030
rioxxterms.version: AM
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.licenseref.startdate: 2018-01-01
rioxxterms.type: Conference Paper/Proceeding/Abstract
pubs.funder-project-id: Cambridge Assessment (unknown)
pubs.conference-name: Proceedings of the 22nd Conference on Computational Natural Language Learning
pubs.conference-start-date: 2018-10
cam.orpheus.success: Thu Nov 05 11:53:19 GMT 2020 - Embargo updated
pubs.conference-finish-date: 2018-10
rioxxterms.freetoread.startdate: 2019-01-01
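The abstract describes using estimated human attention from eye-tracking corpora to regularize a model's attention function. A minimal sketch of that general idea is an auxiliary loss that penalizes the distance between the model's attention distribution and a human attention distribution. This is an illustrative assumption, not the paper's actual implementation: the function names, the mean-squared-error penalty, and the weight `lam` are all hypothetical choices.

```python
import math

def softmax(scores):
    # Normalize raw attention scores into a probability distribution.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_regularizer(model_scores, human_attention):
    # Mean squared error between the model's attention distribution
    # and the (estimated) human attention distribution over tokens.
    model_attention = softmax(model_scores)
    n = len(model_attention)
    return sum((m - h) ** 2 for m, h in zip(model_attention, human_attention)) / n

def combined_loss(task_loss, model_scores, human_attention, lam=0.1):
    # Joint objective: the task loss plus the attention-regularization
    # term, weighted by a hyperparameter lam (hypothetical value).
    return task_loss + lam * attention_regularizer(model_scores, human_attention)
```

For example, `combined_loss(0.5, [2.0, 1.0, 0.1], [0.5, 0.3, 0.2])` adds a small penalty to the task loss because the model's softmax attention differs from the human distribution; when the two distributions match, the penalty vanishes and only the task loss remains.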

