BARThez: a Skilled Pretrained French Sequence-to-Sequence Model

Moussa Kamal Eddine, Antoine J.-P. Tixier, Michalis Vazirgiannis

Abstract

Inductive transfer learning has taken the entire NLP field by storm, with models such as BERT and BART setting new state of the art on countless NLU tasks. However, most of the available models and research have been conducted for English. In this work, we introduce BARThez, the first large-scale pretrained seq2seq model for French. Being based on BART, BARThez is particularly well suited for generative tasks. We evaluate BARThez on five discriminative tasks from the FLUE benchmark and two generative tasks from a novel summarization dataset, OrangeSum, that we created for this research. We show BARThez to be very competitive with state-of-the-art BERT-based French language models such as CamemBERT and FlauBERT. We also continue the pretraining of a multilingual BART on BARThez's corpus, and show that the resulting model, mBARThez, significantly boosts BARThez's generative performance. Code, data and models are publicly available.

Code Repositories

Tixierae/OrangeSum (mentioned in GitHub)
moussaKam/BARThez (official implementation, PyTorch)
moussaKam/OrangeSum (mentioned in GitHub)
huggingface/transformers (PyTorch, mentioned in GitHub)
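
Since BARThez is distributed through the huggingface/transformers library listed above, a minimal usage sketch follows. It assumes the pretrained checkpoint is published on the Hugging Face Hub under the identifier "moussaKam/barthez", mirroring the repository naming; verify the exact name before use.

```python
# Minimal sketch: loading BARThez with Hugging Face transformers and
# generating text from a French input. The checkpoint identifier
# "moussaKam/barthez" is an assumption based on the repository names above.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("moussaKam/barthez")
model = AutoModelForSeq2SeqLM.from_pretrained("moussaKam/barthez")

# BART-style masked infilling: the pretrained (non fine-tuned) model
# reconstructs the masked span in the French sentence.
text = "Paris est la capitale de la <mask>."
inputs = tokenizer(text, return_tensors="pt")

output_ids = model.generate(**inputs, num_beams=4, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

For the summarization task reported below, the same loading pattern applies to a checkpoint fine-tuned on OrangeSum, with an article passed as input instead of a masked sentence.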

Benchmarks

| Benchmark | Methodology | Metrics |
|---|---|---|
| text-summarization-on-orangesum | BARThez (OrangeSum abstract) | ROUGE-1: 32.67 |
| text-summarization-on-orangesum | BARThez (OrangeSum abstract) | ROUGE-1: 31.44 |
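
The ROUGE-1 scores above measure unigram overlap between a generated summary and its reference, reported on a 0-100 scale. A hedged sketch of how such a score can be computed with the rouge_score package follows; the exact evaluation setup used for OrangeSum (tokenization, stemming) is not specified here.

```python
# Illustrative only: computing ROUGE-1 with Google's rouge_score package.
# The actual OrangeSum evaluation pipeline may configure this differently.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1"], use_stemmer=True)
reference = "the cat sat on the mat"
candidate = "the cat was sitting on the mat"

scores = scorer.score(reference, candidate)
# fmeasure is the unigram-overlap F1 in [0, 1]; multiply by 100
# to match the convention used in benchmark tables.
print(scores["rouge1"].fmeasure * 100)
```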
