Few-Shot Table-to-Text Generation with Prototype Memory
dc.contributor.author | Su, Y | |
dc.contributor.author | Meng, Z | |
dc.contributor.author | Baker, S | |
dc.contributor.author | Collier, N | |
dc.date.accessioned | 2022-05-27T23:30:38Z | |
dc.date.available | 2022-05-27T23:30:38Z | |
dc.date.issued | 2021 | |
dc.identifier.isbn | 9781955917100 | |
dc.identifier.uri | https://www.repository.cam.ac.uk/handle/1810/337574 | |
dc.description.abstract | Neural table-to-text generation models have achieved remarkable progress on an array of tasks. However, due to the data-hungry nature of neural models, their performance relies heavily on large-scale training examples, limiting their applicability in real-world settings. To address this, we propose a new framework, Prototype-to-Generate (P2G), for table-to-text generation under the few-shot scenario. The proposed framework utilizes retrieved prototypes, jointly selected by an IR system and a novel prototype selector, to help the model bridge the structural gap between tables and texts. Experimental results on three benchmark datasets with three state-of-the-art models demonstrate that the proposed framework significantly improves model performance across various evaluation metrics. | |
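The sketch below is a minimal, illustrative reading of the pipeline described in the abstract: retrieve candidate prototype sentences for a table, select one, and prepend it to the linearized table before generation. It is not the authors' released implementation; the token-overlap retriever stands in for the IR system, the value-coverage heuristic stands in for the trained prototype selector, and the helper names and the `<sep>` token are assumptions made for illustration.

```python
# Illustrative prototype-to-generate sketch; the retrieval, selection, and
# input-construction steps are simplified stand-ins, not the paper's code.
from collections import Counter
from typing import List, Tuple

Table = List[Tuple[str, str]]  # (attribute, value) pairs


def linearize_table(table: Table) -> str:
    """Flatten the table into a single source string for a seq2seq model."""
    return " ; ".join(f"{attr} : {value}" for attr, value in table)


def retrieve_candidates(table: Table, corpus: List[str], k: int = 5) -> List[str]:
    """IR stage (stand-in): rank corpus sentences by token overlap with the
    table content; a real system would use an index such as BM25."""
    query = Counter(linearize_table(table).lower().split())

    def score(sentence: str) -> int:
        return sum((Counter(sentence.lower().split()) & query).values())

    return sorted(corpus, key=score, reverse=True)[:k]


def select_prototype(table: Table, candidates: List[str]) -> str:
    """Prototype selector (stand-in): prefer the candidate covering the most
    table values; the paper instead trains a dedicated selector model."""
    values = [value.lower() for _, value in table]

    def coverage(sentence: str) -> int:
        lowered = sentence.lower()
        return sum(value in lowered for value in values)

    return max(candidates, key=coverage)


def build_generator_input(table: Table, corpus: List[str]) -> str:
    """Prepend the selected prototype to the linearized table; the combined
    string would then be fed to the downstream generation model."""
    prototype = select_prototype(table, retrieve_candidates(table, corpus))
    return f"{prototype} <sep> {linearize_table(table)}"


if __name__ == "__main__":
    table = [("name", "Alimentum"), ("area", "city centre"), ("food", "Italian")]
    corpus = [
        "Alimentum is an Italian restaurant in the city centre.",
        "The weather in Cambridge is mild in spring.",
        "Prices at the riverside cafe are moderate.",
    ]
    print(build_generator_input(table, corpus))
```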
dc.publisher | Association for Computational Linguistics | |
dc.rights | Attribution 4.0 International | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.title | Few-Shot Table-to-Text Generation with Prototype Memory | |
dc.type | Conference Object | |
dc.publisher.department | Department of Theoretical & Applied Linguistics | |
dc.publisher.department | Faculty of Modern and Medieval Languages and Linguistics | |
dc.date.updated | 2022-05-27T06:36:32Z | |
prism.endingPage | 917 | |
prism.publicationDate | 2021 | |
prism.publicationName | Findings of the Association for Computational Linguistics: EMNLP 2021 | |
prism.startingPage | 910 | |
dc.identifier.doi | 10.17863/CAM.84983 | |
dcterms.dateAccepted | 2021-08-25 | |
rioxxterms.versionofrecord | 10.18653/v1/2021.findings-emnlp.77 | |
rioxxterms.version | VoR | |
dc.contributor.orcid | Su, Yixuan [0000-0002-1472-7791] | |
dc.contributor.orcid | Collier, Nigel [0000-0002-7230-4164] | |
cam.issuedOnline | 2021-11 | |
pubs.conference-name | Findings of the Association for Computational Linguistics: EMNLP 2021 | |
pubs.conference-start-date | 2021-11 | |
cam.orpheus.success | 2022-08-30: Embargo removed | |
cam.orpheus.counter | 11 | |
cam.depositDate | 2022-05-27 | |
pubs.conference-finish-date | 2021-11 | |
pubs.licence-identifier | apollo-deposit-licence-2-1 | |
pubs.licence-display-name | Apollo Repository Deposit Licence Agreement | |
This item appears in the following Collection(s):
- Cambridge University Research Outputs (Research outputs of the University of Cambridge)