
dc.contributor.author: Rei, Marek
dc.date.accessioned: 2019-07-26T10:16:41Z
dc.date.available: 2019-07-26T10:16:41Z
dc.date.issued: 2015-09-19
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/294968
dc.description.abstract: © 2015 Association for Computational Linguistics. We investigate an extension of continuous online learning in recurrent neural network language models. The model keeps a separate vector representation of the current unit of text being processed and adaptively adjusts it after each prediction. The initial experiments give promising results, indicating that the method is able to increase language modelling accuracy, while also decreasing the parameters needed to store the model along with the computation required at each step.
dc.language.iso: en
dc.title: Online representation learning in recurrent neural language models
dc.type: Conference Object
prism.endingPage: 243
prism.publicationDate: 2015
prism.publicationName: Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing
prism.startingPage: 238
dc.identifier.doi: 10.17863/CAM.21361
dcterms.dateAccepted: 2015-07-29
rioxxterms.version: AM
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.licenseref.startdate: 2015-09-19
rioxxterms.type: Conference Paper/Proceeding/Abstract
pubs.funder-project-id: Cambridge Assessment (unknown)
cam.issuedOnline: 2015-09-19
pubs.conference-name: EMNLP 2015: Conference on Empirical Methods in Natural Language Processing
pubs.conference-start-date: 2015-09-19
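
As a rough illustration of the idea in the abstract above, the sketch below shows a recurrent language model that keeps a separate context vector for the current unit of text and adjusts it after every prediction. The layer sizes, the context-to-hidden matrix W_c, and the exponential-moving-average update rule are assumptions made for this sketch only; the paper's actual architecture and update rule may differ.

    # Minimal sketch (not the paper's exact formulation): an RNN language
    # model whose hidden state also receives a per-text context vector c,
    # which is adaptively adjusted after each prediction.
    import numpy as np

    rng = np.random.default_rng(0)

    V, E, H = 100, 16, 32               # vocabulary, embedding, hidden sizes (illustrative)
    W_e = rng.normal(0, 0.1, (V, E))    # word embeddings
    W_x = rng.normal(0, 0.1, (E, H))    # input-to-hidden weights
    W_h = rng.normal(0, 0.1, (H, H))    # recurrent weights
    W_c = rng.normal(0, 0.1, (E, H))    # context-to-hidden weights (assumed)
    W_o = rng.normal(0, 0.1, (H, V))    # hidden-to-output weights

    def softmax(z):
        z = z - z.max()                 # numerical stability
        e = np.exp(z)
        return e / e.sum()

    def score_sequence(words, decay=0.9):
        """Return the log-probability of a word-id sequence, updating the
        context vector online as the text is processed."""
        h = np.zeros(H)                 # recurrent hidden state
        c = np.zeros(E)                 # context vector for this unit of text
        total = 0.0
        prev = words[0]
        for w in words[1:]:
            # Hidden state depends on the previous word AND the context vector.
            h = np.tanh(W_e[prev] @ W_x + h @ W_h + c @ W_c)
            p = softmax(h @ W_o)
            total += np.log(p[w])
            # Adaptive adjustment after the prediction: fold the observed
            # word's embedding into c (an assumed EMA-style update).
            c = decay * c + (1.0 - decay) * W_e[w]
            prev = w
        return total

    print(score_sequence([3, 17, 42, 7, 99]))

The appeal of this design is that document-level information lives in one adaptively updated vector rather than in additional recurrent parameters, adding only a single extra input to the hidden layer per step, which is consistent with the abstract's note about reduced model storage and per-step computation.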

