A neural network multi-task learning approach to biomedical named entity recognition

cam.orpheus.success: Thu Jan 30 12:58:42 GMT 2020 - The item has an open VoR version.
dc.contributor.author: Crichton, Gamal
dc.contributor.author: Pyysalo, S
dc.contributor.author: Chiu, B
dc.contributor.author: Korhonen, Anna-Leena
dc.contributor.orcid: Crichton, Gamal [0000-0002-3036-0811]
dc.date.accessioned: 2017-08-02T14:07:13Z
dc.date.available: 2017-08-02T14:07:13Z
dc.date.issued: 2017-08-15
dc.description.abstract: $\textbf{Background}$ Named Entity Recognition (NER) is a key task in biomedical text mining. Accurate NER systems require task-specific, manually-annotated datasets, which are expensive to develop and thus limited in size. Since such datasets contain related but different information, an interesting question is whether it might be possible to use them together to improve NER performance. To investigate this, we develop supervised, multi-task, convolutional neural network models and apply them to a large number of varied existing biomedical named entity datasets. Additionally, we investigate the effect of dataset size on performance in both single- and multi-task settings. $\textbf{Results}$ We present a single-task model for NER, a Multi-output multi-task model and a Dependent multi-task model. We apply the three models to 15 biomedical datasets containing multiple named entities including Anatomy, Chemical, Disease, Gene/Protein and Species. Each dataset represents a task. The results from the single-task model and the multi-task models are then compared for evidence of benefits from Multi-task Learning. With the Multi-output multi-task model we observed an average F-score improvement of 0.8% over the single-task model, from an average baseline of 78.4%. Although there was a significant drop in performance on one dataset, performance improved significantly for five datasets by up to 6.3%. For the Dependent multi-task model we observed an average improvement of 0.4% over the single-task model. There were no significant drops in performance on any dataset, and performance improved significantly for six datasets by up to 1.1%. The dataset size experiments found that as dataset size decreased, the Multi-output model degraded less than the single-task model. Using 50, 25 and 10% of the training data resulted in average drops of approximately 3.4, 8 and 16.7% respectively for the single-task model, but only approximately 0.2, 3.0 and 9.8% for the multi-task model. $\textbf{Conclusions}$ Our results show that, on average, the multi-task models produced better NER results than the single-task models trained on a single NER dataset. We also found that Multi-task Learning is beneficial for small datasets. Across the various settings the improvements are significant, demonstrating the benefit of Multi-task Learning for this task.
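In a Multi-output multi-task setup of the kind the abstract describes, the lower layers are typically shared across all datasets while each dataset gets its own output layer, and training alternates over datasets so that each batch updates the shared encoder plus only that dataset's head. The following is a minimal sketch in PyTorch, not the authors' implementation; the class name, vocabulary size, embedding dimension, filter width, hidden size and tag counts are illustrative assumptions.

# Minimal sketch, assuming PyTorch. Illustrative only, not the authors' code;
# all hyperparameters below are placeholder assumptions.
import torch
import torch.nn as nn

class MultiOutputMultiTaskTagger(nn.Module):
    def __init__(self, vocab_size, tags_per_task, emb_dim=50, hidden=200):
        super().__init__()
        # Layers shared across every dataset/task.
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, hidden, kernel_size=3, padding=1)
        self.act = nn.ReLU()
        # One task-specific output layer per dataset.
        self.heads = nn.ModuleList([nn.Linear(hidden, n) for n in tags_per_task])

    def forward(self, token_ids, task_id):
        # token_ids: (batch, seq_len) integer word indices from one dataset.
        x = self.embed(token_ids).transpose(1, 2)    # (batch, emb_dim, seq_len)
        h = self.act(self.conv(x)).transpose(1, 2)   # (batch, seq_len, hidden)
        return self.heads[task_id](h)                # per-token tag scores

# Toy usage: a batch drawn from task 0 updates the shared layers and head 0 only.
model = MultiOutputMultiTaskTagger(vocab_size=10000, tags_per_task=[3, 5, 3])
tokens = torch.randint(0, 10000, (8, 40))            # fake batch of 8 sentences
gold = torch.zeros(8, 40, dtype=torch.long)          # fake gold tag indices
logits = model(tokens, task_id=0)                    # (8, 40, 3)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 3), gold.reshape(-1))
loss.backward()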
dc.description.sponsorship: This work was supported by the Medical Research Council [grant number MR/M013049/1] and the Cambridge Commonwealth, European and International Trust.
dc.identifier.doi: 10.17863/CAM.12244
dc.identifier.eissn: 1471-2105
dc.identifier.issn: 1471-2105
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/265855
dc.language.iso: en
dc.publisher: BioMed Central
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: multi-task learning
dc.subject: convolutional neural networks
dc.subject: named entity recognition
dc.subject: biomedical text mining
dc.title: A neural network multi-task learning approach to biomedical named entity recognition
dc.type: Article
dcterms.dateAccepted: 2017-07-31
prism.number: 368
prism.publicationDate: 2017
prism.publicationName: BMC Bioinformatics
prism.volume: 18
pubs.funder-project-id: Medical Research Council (MR/M013049/1)
rioxxterms.licenseref.startdate: 2017-08-15
rioxxterms.licenseref.uri: http://creativecommons.org/licenses/by/4.0/
rioxxterms.type: Journal Article/Review
rioxxterms.version: VoR
rioxxterms.versionofrecord: 10.1186/s12859-017-1776-8

Files

Original bundle (3 files)

Name: Crichton.pdf
Size: 877.46 KB
Format: Adobe Portable Document Format
Description: Published version
Licence: http://creativecommons.org/licenses/by/4.0/

Name: multi-task_learning.pdf
Size: 376.89 KB
Format: Adobe Portable Document Format
Description: Accepted version
Licence: http://creativecommons.org/licenses/by/4.0/

Name: supplementary.pdf
Size: 176.86 KB
Format: Adobe Portable Document Format
Description: Accepted version
Licence: http://creativecommons.org/licenses/by/4.0/

License bundle (1 file)

Name: DepositLicenceAgreement.pdf
Size: 417.78 KB
Format: Adobe Portable Document Format