BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese

Nguyen Luong Tran, Duong Minh Le and Dat Quoc Nguyen

Abstract: We present BARTpho in two versions, BARTpho-word and BARTpho-syllable, the first public large-scale monolingual sequence-to-sequence models pre-trained for Vietnamese. BARTpho uses the "large" architecture and pre-training scheme of the sequence-to-sequence denoising model BART, making it especially suited to generative NLP tasks. Experiments on the downstream task of Vietnamese text summarization show that, in both automatic and human evaluations, BARTpho outperforms the strong baseline mBART and improves the state of the art. We release BARTpho to facilitate future research on and applications of generative Vietnamese NLP tasks. Our BARTpho models are available at: https://github.com/VinAIResearch/BARTpho.

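As a quick illustration of how the released checkpoints might be used, the sketch below loads BARTpho with the Hugging Face transformers library and encodes a Vietnamese sentence. The model identifier "vinai/bartpho-syllable" and the example sentence are illustrative assumptions, not details stated in the abstract.

    import torch
    from transformers import AutoModel, AutoTokenizer

    # Assumed Hugging Face Hub identifier for the syllable-level BARTpho model.
    tokenizer = AutoTokenizer.from_pretrained("vinai/bartpho-syllable")
    model = AutoModel.from_pretrained("vinai/bartpho-syllable")

    # Encode an example Vietnamese sentence and extract contextual features.
    sentence = "Chúng tôi là những nhà nghiên cứu."  # "We are researchers."
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        features = model(**inputs)
    print(features.last_hidden_state.shape)

For downstream generation tasks such as summarization, the same checkpoint would typically be loaded with a sequence-to-sequence head (e.g., AutoModelForSeq2SeqLM) and fine-tuned on task-specific data.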