BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese

Nguyen Luong Tran, Duong Minh Le, Dat Quoc Nguyen

Abstract

We present BARTpho in two versions, BARTpho-syllable and BARTpho-word, the first public large-scale monolingual sequence-to-sequence models pre-trained for Vietnamese. BARTpho uses the "large" architecture and the pre-training scheme of the sequence-to-sequence denoising autoencoder BART, making it especially suitable for generative NLP tasks. We compare BARTpho with its competitor mBART on a downstream Vietnamese text summarization task and show that, in both automatic and human evaluations, BARTpho outperforms the strong baseline mBART and improves the state of the art. We further evaluate and compare BARTpho and mBART on Vietnamese capitalization and punctuation restoration, and again find BARTpho more effective than mBART on both tasks. We publicly release BARTpho to facilitate future research and applications in generative Vietnamese NLP. Our BARTpho models are available at https://github.com/VinAIResearch/BARTpho
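
As a quick illustration of how the released checkpoints can be used, here is a minimal sketch that loads the model through the Hugging Face transformers library. The Hub identifiers vinai/bartpho-syllable and vinai/bartpho-word are assumptions inferred from the VinAIResearch release and are not stated in the abstract above; the example sentence is a placeholder.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed Hub identifier; "vinai/bartpho-word" would be the
# word-level counterpart (requires sentencepiece to be installed).
tokenizer = AutoTokenizer.from_pretrained("vinai/bartpho-syllable")
model = AutoModel.from_pretrained("vinai/bartpho-syllable")

# Encode a Vietnamese sentence and run it through the seq2seq model.
sentence = "Chúng tôi là những nghiên cứu viên."  # "We are researchers."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Decoder's last hidden state: (batch, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```

For generation tasks such as summarization, one would instead fine-tune a conditional-generation head on top of these weights; the base checkpoints are pre-trained denoising autoencoders, not ready-made summarizers.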

Code Repositories

https://github.com/VinAIResearch/BARTpho

Benchmarks

Benchmark: abstractive-text-summarization-on-vietnews
Methodology: BARTpho
Metrics:
Rouge-1: 61.14
Rouge-2: 30.31
Rouge-L: 40.15
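
For reference, ROUGE scores of this kind can be reproduced with an off-the-shelf scorer. Below is a minimal sketch using Google's rouge-score package; the reference and candidate strings are hypothetical placeholders, not data from the VietNews benchmark.

```python
from rouge_score import rouge_scorer

# Hypothetical reference summary and model output, for illustration only.
reference = "bartpho is a pre-trained sequence-to-sequence model for vietnamese"
candidate = "bartpho is a sequence-to-sequence model pre-trained for vietnamese"

# Rouge-1/2 measure unigram/bigram overlap; Rouge-L measures the
# longest common subsequence between reference and candidate.
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"])
scores = scorer.score(reference, candidate)

for name, result in scores.items():
    print(f"{name}: F1 = {result.fmeasure:.4f}")
```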
