The Role of Transfer Learning in Multilingual Neural Machine Translation

Authors

  • Sophie Martin, Alps Institute of Technology, Switzerland
  • Muhammad Ibrahim, Alps Institute of Technology, Switzerland

Abstract

Multilingual Neural Machine Translation (MNMT) benefits significantly from transfer learning, which leverages models pre-trained on high-resource languages to improve translation quality for low-resource languages. Cross-lingual knowledge sharing improves translation accuracy and fluency and mitigates data scarcity. Because pre-trained linguistic representations are reused, transfer learning also reduces the training data and computational resources required. This paper surveys transfer learning approaches such as zero-shot and few-shot translation and examines their impact on translation quality using metrics such as BLEU. Despite challenges such as negative transfer, transfer learning shows strong potential for optimizing MNMT systems and for fostering more inclusive multilingual communication. The findings suggest pathways for future research on model architectures, training strategies, and language pairings.
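
As a rough illustration of the pipeline the abstract describes, the sketch below translates with a publicly available pre-trained multilingual model and scores the output with BLEU via sacrebleu. The model choice (facebook/m2m100_418M), the en-fr language pair, and the toy data are assumptions for illustration only, not the paper's experimental setup.

    # Minimal sketch (assumed setup, not the authors' experiments):
    # translate with a pre-trained multilingual model, then score with BLEU.
    import sacrebleu
    from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

    model_name = "facebook/m2m100_418M"  # assumed model; any multilingual NMT model works
    tokenizer = M2M100Tokenizer.from_pretrained(model_name)
    model = M2M100ForConditionalGeneration.from_pretrained(model_name)

    # Toy evaluation pair: English source with a French reference (illustrative only).
    sources = ["Transfer learning improves low-resource translation."]
    references = ["L'apprentissage par transfert améliore la traduction à faibles ressources."]

    # Set the source language, then force the target-language BOS token at decoding.
    tokenizer.src_lang = "en"
    encoded = tokenizer(sources, return_tensors="pt", padding=True)
    generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("fr"))
    hypotheses = tokenizer.batch_decode(generated, skip_special_tokens=True)

    # Corpus-level BLEU: sacrebleu takes the hypotheses and a list of reference sets.
    bleu = sacrebleu.corpus_bleu(hypotheses, [references])
    print(f"BLEU: {bleu.score:.2f}")

In a transfer-learning setting, the same pre-trained checkpoint would be fine-tuned on whatever low-resource parallel data is available before evaluation; the zero-shot case skips that step entirely.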

Published

2024-04-15