
dc.contributor.author: Ragni, Anton (en)
dc.contributor.author: Wu, Chunyang (en)
dc.contributor.author: Gales, Mark (en)
dc.contributor.author: Vasilakes, J (en)
dc.contributor.author: Knill, Katherine (en)
dc.date.accessioned: 2018-03-23T14:54:23Z
dc.date.available: 2018-03-23T14:54:23Z
dc.date.issued: 2017-06-16 (en)
dc.identifier.isbn: 9781509041176 (en)
dc.identifier.issn: 1520-6149
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/274301
dc.description.abstract: © 2017 IEEE. Training neural network acoustic models on limited quantities of data is a challenging task. A number of techniques have been proposed to improve generalisation. This paper investigates one such technique, called stimulated training. It enables standard criteria, such as cross-entropy, to enforce spatial constraints on activations originating from different units. Having different regions active depending on the input unit may help the network discriminate better and, as a consequence, yield lower error rates. This paper investigates stimulated training for automatic speech recognition of a number of languages representing different families, alphabets, phone sets and vocabulary sizes. In particular, it looks at ensembles of stimulated networks to ensure that the improved generalisation withstands system combination effects. To assess stimulated training beyond 1-best transcription accuracy, this paper uses keyword search as a proxy for lattice quality. Experiments are conducted on IARPA Babel program languages, including the surprise language of the OpenKWS 2016 competition.
dc.title: Stimulated training for automatic speech recognition and keyword search in limited resource conditions (en)
dc.type: Conference Object
prism.endingPage: 4834
prism.publicationDate: 2017 (en)
prism.publicationName: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings (en)
prism.startingPage: 4830
dc.identifier.doi: 10.17863/CAM.21426
dcterms.dateAccepted: 2016-12-12 (en)
rioxxterms.versionofrecord: 10.1109/ICASSP.2017.7953074 (en)
rioxxterms.version: AM*
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved (en)
rioxxterms.licenseref.startdate: 2017-06-16 (en)
dc.contributor.orcid: Wu, Chunyang [0000-0002-0269-3555]
dc.contributor.orcid: Gales, Mark [0000-0002-5311-8219]
dc.contributor.orcid: Knill, Katherine [0000-0003-1292-2769]
rioxxterms.type: Conference Paper/Proceeding/Abstract (en)
pubs.funder-project-id: IARPA (4912046943)
rioxxterms.freetoread.startdate: 2018-06-19
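
The abstract above describes stimulated training as adding spatial constraints on hidden-unit activations to a standard criterion such as cross-entropy. The following Python sketch (assuming PyTorch) illustrates one common reading of that idea: hidden units laid out on a 2-D grid, with a penalty that pulls each class's activations towards a Gaussian bump at a class-specific grid point. All names and hyper-parameters here (StimulatedMLP, stimulated_loss, grid, beta, sigma) are hypothetical and not taken from the paper; this is an illustrative sketch, not the authors' implementation.

# Minimal sketch of a stimulated-training-style objective, assuming PyTorch.
# Illustrative only: the paper's exact regulariser, layouts and settings
# are not reproduced in this record.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StimulatedMLP(nn.Module):
    """Acoustic-model-like MLP whose hidden units sit on a 2-D grid."""
    def __init__(self, n_in, grid=16, n_out=100):
        super().__init__()
        self.hidden = nn.Linear(n_in, grid * grid)   # grid x grid hidden units
        self.out = nn.Linear(grid * grid, n_out)
        # One target point on the grid per output class (e.g. per phone);
        # placed at random here, whereas the paper uses structured layouts.
        self.register_buffer("centres", torch.rand(n_out, 2))
        ys, xs = torch.meshgrid(
            torch.linspace(0, 1, grid), torch.linspace(0, 1, grid), indexing="ij"
        )
        self.register_buffer("coords", torch.stack([xs, ys], dim=-1).reshape(-1, 2))

    def forward(self, x):
        h = torch.sigmoid(self.hidden(x))            # activations on the grid
        return self.out(h), h

def stimulated_loss(logits, h, labels, model, beta=0.1, sigma=0.2):
    """Cross-entropy plus a spatial penalty: activations are encouraged to
    match a Gaussian-shaped region centred at each label's grid point."""
    ce = F.cross_entropy(logits, labels)
    centres = model.centres[labels]                              # (B, 2)
    d2 = ((model.coords[None] - centres[:, None]) ** 2).sum(-1)  # (B, G*G)
    target = torch.exp(-d2 / (2 * sigma ** 2))                   # desired pattern
    return ce + beta * F.mse_loss(h, target)

# Toy usage with random "acoustic features" and labels.
model = StimulatedMLP(n_in=40)
x, y = torch.randn(8, 40), torch.randint(0, 100, (8,))
logits, h = model(x)
loss = stimulated_loss(logits, h, y, model)
loss.backward()

The design point the sketch tries to convey is that the spatial term is simply added to the standard criterion, so the same training pipeline applies and, per the abstract, the constraint can also be imposed consistently across the members of an ensemble.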

