Abstract
Advances in natural language processing, such as transfer learning from pre-trained language models, have also impacted how models are trained for programming language tasks. Previous research primarily explored code pre-training and expanded it through multi-modality and multi-tasking, yet the data for downstream tasks remain modest in size. Focusing on data utilization for downstream tasks, we propose and adapt augmentation methods that yield consistent improvements in code translation and summarization, by up to 6.9% and 7.5% respectively. Further analysis suggests that our methods work orthogonally and show benefits in output code style and numeric consistency. We also discuss test data imperfections.
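The abstract does not spell out the augmentation methods themselves, so the following is purely an illustrative aside: a minimal Python sketch of one widely used source-code augmentation transform, consistent identifier renaming, which varies a program's surface form while preserving its semantics. The helper name `rename_augment` and the naive scope handling (no nested scopes, globals, or attributes) are assumptions for this sketch, not the authors' method.

```python
import ast

class RenameLocals(ast.NodeTransformer):
    """Rename function arguments and locally assigned variables to
    placeholder names (v0, v1, ...). Naive sketch: ignores nested
    scopes, globals, and attribute accesses."""

    def __init__(self, bound_names):
        # Deterministic mapping from original names to placeholders.
        self.mapping = {name: f"v{i}" for i, name in enumerate(sorted(bound_names))}

    def visit_arg(self, node):
        node.arg = self.mapping.get(node.arg, node.arg)
        return node

    def visit_Name(self, node):
        node.id = self.mapping.get(node.id, node.id)
        return node

def rename_augment(source: str) -> str:
    """Hypothetical helper: return a semantics-preserving variant of
    `source` with local identifiers replaced by placeholders."""
    tree = ast.parse(source)
    # Collect names bound as function arguments or simple assignment targets.
    bound = {n.arg for n in ast.walk(tree) if isinstance(n, ast.arg)}
    bound |= {
        target.id
        for n in ast.walk(tree) if isinstance(n, ast.Assign)
        for target in n.targets if isinstance(target, ast.Name)
    }
    return ast.unparse(RenameLocals(bound).visit(tree))  # Python 3.9+

print(rename_augment("def add(a, b):\n    total = a + b\n    return total"))
# def add(v0, v1):
#     v2 = v0 + v1
#     return v2
```

Transforms of this kind are attractive for augmentation because the renamed variant is guaranteed to compile and behave identically, so paired translation or summarization labels stay valid.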
Original language | English |
---|---|
Title of host publication | Findings of the Association for Computational Linguistics: EACL 2023 |
Editors | Andreas Vlachos, Isabelle Augenstein |
Place of Publication | Dubrovnik, Croatia |
Publisher | Association for Computational Linguistics |
Pages | 1542-1550 |
Number of pages | 9 |
ISBN (Electronic) | 978-1-959429-47-0 |
DOIs | |
Publication status | Published - 1 May 2023 |
Event | The 17th Conference of the European Chapter of the Association for Computational Linguistics, Valamar Lacroma, Dubrovnik, Croatia; duration: 2 May 2023 → 6 May 2023; conference number: 17; https://2023.eacl.org/ |
Conference
Conference | The 17th Conference of the European Chapter of the Association for Computational Linguistics |
---|---|
Abbreviated title | EACL 2023 |
Country/Territory | Croatia |
City | Dubrovnik |
Period | 2/05/23 → 6/05/23 |
Internet address | https://2023.eacl.org/ |