BARTpho A0 Poster
Nguyen Luong Tran, Duong Minh Le, and Dat Quoc Nguyen (VinAI Research, Vietnam)
BARTPHO ARCHITECTURE
• Using the standard sequence-to-sequence Transformer architecture and employing the GeLU acti-