Few-Shot Table-to-Text Generation with Prototype Memory
Publication Date
2021-01-01
Journal Title
Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021
ISBN
9781955917100
Pages
910-917
Type
Conference Object
This Version
VoR (Version of Record)
Citation
Su, Y., Meng, Z., Baker, S., & Collier, N. (2021). Few-Shot Table-to-Text Generation with Prototype Memory. Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021, 910-917. https://doi.org/10.17863/CAM.84983
Abstract
Neural table-to-text generation models have achieved remarkable progress on an array of tasks. However, due to the data-hungry nature of neural models, their performance relies heavily on large-scale training examples, limiting their applicability in real-world settings. To address this, we propose a new framework, Prototype-to-Generate (P2G), for table-to-text generation under the few-shot scenario. The proposed framework utilizes retrieved prototypes, which are jointly selected by an IR system and a novel prototype selector, to help the model bridge the structural gap between tables and texts. Experimental results on three benchmark datasets with three state-of-the-art models demonstrate that the proposed framework significantly improves model performance across various evaluation metrics.
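To make the retrieve-then-select idea in the abstract concrete, the sketch below is a minimal, hypothetical Python illustration (not the authors' code): a linearized table is matched against a small text corpus with TF-IDF cosine similarity standing in for the IR system, a simple overlap heuristic stands in for the trained prototype selector, and the chosen prototype is concatenated with the table as extra context for a downstream generator. The corpus, table, and scoring function are placeholders for illustration only.

```python
# Illustrative sketch of prototype retrieval + selection for table-to-text.
# TF-IDF cosine similarity stands in for the IR system; a value-overlap
# heuristic stands in for the paper's learned prototype selector.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical prototype corpus (in practice, retrieved from in-domain text).
corpus = [
    "walter extra is a german award-winning aerobatic pilot.",
    "the team finished third in the 2010 season.",
    "she was born in 1970 and works as a chemistry teacher.",
]

# A toy table, linearized as "attribute : value" pairs.
table = {"name": "walter extra", "nationality": "german", "occupation": "pilot"}
table_text = " ; ".join(f"{k} : {v}" for k, v in table.items())

# Step 1: IR-style retrieval -- rank corpus sentences by TF-IDF cosine similarity.
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(corpus + [table_text])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
candidates = [s for _, s in sorted(zip(scores, corpus), reverse=True)[:2]]

# Step 2: prototype selection -- prefer candidates covering more table values
# (a stand-in heuristic; the paper trains a neural selector for this step).
def selector_score(sentence: str) -> float:
    tokens = set(sentence.split())
    values = " ".join(table.values()).split()
    return sum(v in tokens for v in values) / len(values)

prototype = max(candidates, key=selector_score)

# Step 3: the selected prototype is concatenated with the table as additional
# context for a pretrained generator (the generator call itself is omitted).
generator_input = f"{table_text} [PROTO] {prototype}"
print(generator_input)
```

The intent of the extra context is that the prototype already has the surface form of natural text, so the generator only needs to adapt its content to the table rather than invent sentence structure from scratch.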
Embargo Lift Date
2100-01-01
Identifiers
External DOI: https://doi.org/10.17863/CAM.84983
This record's URL: https://www.repository.cam.ac.uk/handle/1810/337574