Natural Language Generation with Transformer Self-Translation: A Case Study of Multilingual Translation Models
Abstract
In the context of large-scale deep learning models, natural language generation systems are required to handle multiple tasks concurrently. However, the addition of a new task typically necessitates retraining the model from scratch with both the original and new data, resulting in considerable time and resource consumption. Moreover, as the number of supported tasks grows, models with a fixed number of parameters may face capacity limitations, which can degrade overall performance. To address this challenge, this study introduces a Transformer-based self-back-translation approach for natural language generation, termed TransNMT, using multilingual translation as a case study. This method modularizes the model to enable dynamic scalability, effectively mitigating the capacity constraints posed by fixed parameters. Furthermore, a self-back-translation mechanism is designed for the TransNMT model, consisting of both forward and backward translation, which refines the model’s performance internally while reducing external noise. This approach allows the model to perform well, particularly in low-resource translation tasks. Experimental results demonstrate significant improvements in BLEU scores across four low-resource and three high-resource language datasets, with the highest improvement reaching 2.7 BLEU points in one of the low-resource languages.
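The self-back-translation mechanism summarized above can be illustrated with a short sketch. The snippet below is not the authors' implementation: the `translate` callable, the token-overlap filter, and the language codes are placeholder assumptions, and the modular TransNMT architecture itself is omitted. It only shows the general flow of generating synthetic pairs with the model's backward direction and pruning noisy pairs with a forward round trip.

```python
# Illustrative sketch of self-back-translation with a single multilingual model:
# the backward direction creates synthetic source sentences from target-side
# monolingual text, and a forward round trip is used as a simple noise filter.
# `translate` is a hypothetical stand-in for the model's decoding routine.

from typing import Callable, List, Tuple

Translator = Callable[[str, str, str], str]  # (text, from_lang, to_lang) -> text


def self_back_translate(
    monolingual_tgt: List[str],
    translate: Translator,
    src_lang: str,
    tgt_lang: str,
) -> List[Tuple[str, str]]:
    """Build synthetic (source, target) pairs from target-side monolingual text."""
    pairs = []
    for tgt_sentence in monolingual_tgt:
        # Backward direction: the same model translates target -> source,
        # so no external translation system is involved.
        synthetic_src = translate(tgt_sentence, tgt_lang, src_lang)
        pairs.append((synthetic_src, tgt_sentence))
    return pairs


def filter_by_round_trip(
    pairs: List[Tuple[str, str]],
    translate: Translator,
    src_lang: str,
    tgt_lang: str,
    min_overlap: float = 0.5,
) -> List[Tuple[str, str]]:
    """Keep pairs whose forward re-translation stays close to the original target.

    A crude token-overlap check stands in for whatever noise filter the paper
    actually uses; it only illustrates how combining forward and backward
    passes can prune noisy synthetic data.
    """
    kept = []
    for src, tgt in pairs:
        round_trip = translate(src, src_lang, tgt_lang)
        tgt_tokens, rt_tokens = set(tgt.split()), set(round_trip.split())
        overlap = len(tgt_tokens & rt_tokens) / max(len(tgt_tokens), 1)
        if overlap >= min_overlap:
            kept.append((src, tgt))
    return kept


if __name__ == "__main__":
    # Placeholder "model": echoes the input so the script runs end to end.
    dummy_translate: Translator = lambda text, src, tgt: text

    mono = ["a low resource sentence", "another target side sentence"]
    pairs = self_back_translate(mono, dummy_translate, "en", "xx")
    clean = filter_by_round_trip(pairs, dummy_translate, "en", "xx")
    print(clean)
```

In practice, the synthetic pairs that survive the filter would be mixed with the original parallel data when fine-tuning the forward direction, which is how back-translation schemes typically help low-resource language pairs.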
Keywords
Natural language processing, Natural language generation, Multilingual translation
Citation Format:
Yewei Zhang, Yatao Mu, Jiawei Zhang, "Natural Language Generation with Transformer Self-Translation: A Case Study of Multilingual Translation Models," Journal of Internet Technology, vol. 26, no. 6, pp. 767-774, Nov. 2025.
