Research on the robustness of neural machine translation systems in word order perturbation

Pre-trained language models are among the most important models in natural language processing, as the pretrain-finetune approach has become the standard paradigm for a wide range of NLP downstream tasks. Previous studies have shown that integrating pre-trained language models (e.g., BERT) into neural machine translation (NMT)...
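
The abstract is truncated in this record, but the title concerns NMT robustness under word-order perturbation. As a rough illustration only, not the authors' method, the sketch below perturbs a source sentence with random adjacent word swaps; the function name `perturb_word_order` and the swap-based scheme are assumptions introduced here for clarity.

```python
import random

def perturb_word_order(sentence, num_swaps=1, seed=None):
    """Return a copy of the sentence with `num_swaps` random adjacent
    word swaps, a simple form of word-order perturbation."""
    rng = random.Random(seed)
    tokens = sentence.split()
    for _ in range(num_swaps):
        if len(tokens) < 2:
            break
        i = rng.randrange(len(tokens) - 1)
        tokens[i], tokens[i + 1] = tokens[i + 1], tokens[i]
    return " ".join(tokens)

# Example: generate a few perturbed variants of a source sentence,
# which could then be fed to an NMT system to compare its outputs
# against the translation of the unperturbed input.
source = "the quick brown fox jumps over the lazy dog"
for k in range(3):
    print(perturb_word_order(source, num_swaps=2, seed=k))
```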

Bibliographic Details
Main Authors: Yuran ZHAO, Tang XUE, Gongshen LIU
Format: Article
Language:English
Published: POSTS&TELECOM PRESS Co., LTD, 2023-10-01
Series: 网络与信息安全学报 (Chinese Journal of Network and Information Security)
Online Access:http://www.cjnis.com.cn/thesisDetails#10.11959/j.issn.2096-109x.2023078