IJCATR Volume 14 Issue 9

Transformer based Tibetan-Chinese Neural Machine Translation

Chao Tang, Zehua Lv, Ximing Yuan
10.7753/IJCATR1409.1003
Keywords: Tibetan-Chinese translation; beam search; recurrent mechanism; Transformer

Neural machine translation has achieved strong performance on many tasks, but its performance on low-resource languages remains unsatisfactory. To address this issue, this paper proposes a Transformer-based Tibetan-Chinese machine translation model. The model introduces a recurrent mechanism and temporal encoding on top of the Transformer, improving generalization ability and computational efficiency. On the decoding side, beam search is employed to optimize the generation process, and word-level language-model perplexity is used to screen the output for homophonic or phonetically similar errors. Experiments use BLEU as the evaluation metric; the results indicate that, under the constraint of limited Tibetan-Chinese parallel corpora, the proposed improved Transformer model effectively enhances translation quality, raising the BLEU score by 1.63%.
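The two decoding-side ideas named in the abstract, beam search and word-level perplexity scoring, can be sketched generically. The sketch below is illustrative only: `toy_step`, the token vocabulary, and all function names are stand-ins, not the authors' implementation.

```python
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=10):
    """Beam-search decoder sketch.

    step_fn(sequence) -> {token: probability} for the next token.
    Keeps the beam_width highest-scoring partial sequences by cumulative
    log-probability, avoiding the local optima of greedy decoding.
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-prob)
    completed = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, p in step_fn(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        # Prune to the top beam_width expansions.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_width]:
            (completed if seq[-1] == end_token else beams).append((seq, score))
        if not beams:
            break
    completed.extend(beams)
    return max(completed, key=lambda c: c[1])

def perplexity(log_probs):
    """Word-level perplexity from per-token natural-log probabilities.

    Lower perplexity means the language model finds the sequence more
    plausible; comparing candidates this way can flag homophone-style
    errors that yield fluent-looking but unlikely word sequences.
    """
    return math.exp(-sum(log_probs) / len(log_probs))

# Toy next-token distribution standing in for a Transformer decoder.
def toy_step(seq):
    table = {
        "<s>": {"a": 0.6, "b": 0.4},
        "a":   {"c": 0.3, "</s>": 0.7},
        "b":   {"c": 0.9, "</s>": 0.1},
        "c":   {"</s>": 1.0},
    }
    return table[seq[-1]]

best, score = beam_search(toy_step, "<s>", "</s>", beam_width=2)
```

With a beam width of 2 the toy model keeps both `a` and `b` hypotheses after the first step and selects the overall highest-probability completion, whereas a width-1 (greedy) search would commit to the first step's best token unconditionally.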
@article{c1492025ijcatr14091003,
  title   = "Transformer based Tibetan-Chinese Neural Machine Translation",
  author  = "Chao Tang and Zehua Lv and Ximing Yuan",
  journal = "International Journal of Computer Applications Technology and Research (IJCATR)",
  volume  = "14",
  number  = "9",
  pages   = "19--22",
  year    = "2025",
  doi     = "10.7753/IJCATR1409.1003"
}