mBART50
mBART50, short for multilingual BART 50, is a pretrained multilingual sequence-to-sequence model developed by Facebook AI for neural machine translation across 50 languages. It extends the original mBART model, which covered 25 languages, by using a single shared encoder-decoder and a common multilingual vocabulary, enabling translation between any language pair in the set with one model.
Architecture and training: The model follows a Transformer-based encoder-decoder architecture inherited from BART. It is initialized from mBART, which was pretrained with a denoising objective (reconstructing text corrupted by span masking and sentence shuffling) on monolingual corpora in 25 languages; mBART50 then extends the embedding layer with language tokens for 25 additional languages and fine-tunes the whole model on parallel translation data covering all 50 languages (multilingual fine-tuning). Special language-code tokens mark the source and target languages of each sequence.
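The role of the language-code tokens can be shown with a minimal, purely illustrative sketch. The token strings below are placeholders standing in for real SentencePiece subword ids; in mBART50, a language code prefixes both the source and the target sequence, and the target-side code is what selects the output language during decoding.

```python
# Illustrative sketch of mBART50's sequence format (simplified token
# strings, not real subword ids). A language-code token prefixes both
# the source and target sequences, followed by the text and an
# end-of-sentence marker.

EOS = "</s>"

def format_source(tokens, lang_code):
    # e.g. ["en_XX", "Hello", "world", "</s>"]
    return [lang_code] + tokens + [EOS]

def format_target(tokens, lang_code):
    # Decoding is forced to begin with the target language code,
    # which tells the shared decoder which language to produce.
    return [lang_code] + tokens + [EOS]

src = format_source(["Hello", "world"], "en_XX")
tgt = format_target(["Bonjour", "le", "monde"], "fr_XX")
print(src)  # ['en_XX', 'Hello', 'world', '</s>']
print(tgt)  # ['fr_XX', 'Bonjour', 'le', 'monde', '</s>']
```

Because every direction shares one encoder-decoder and one vocabulary, swapping the target language code is all that is needed to translate into a different language.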
Applications and use: mBART50 provides a strong foundation for multilingual machine translation and cross-lingual transfer in NLP. Released fine-tuned variants include one-to-many (English into the other 49 languages), many-to-one (into English), and many-to-many checkpoints, and the model is commonly accessed through the Hugging Face Transformers library.
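A typical usage pattern with the Hugging Face Transformers library looks like the sketch below. The checkpoint name `facebook/mbart-large-50-many-to-many-mmt` and the `MBart50TokenizerFast`/`MBartForConditionalGeneration` classes are the publicly documented API; the `translate` helper itself is a convenience wrapper written for this example, and calling it downloads the model weights on first use.

```python
def translate(text, src_lang="en_XX", tgt_lang="fr_XX"):
    """Translate `text` with the many-to-many mBART50 checkpoint.

    Illustrative wrapper; requires the `transformers` package and
    downloads the pretrained weights on first call.
    """
    from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

    name = "facebook/mbart-large-50-many-to-many-mmt"
    tokenizer = MBart50TokenizerFast.from_pretrained(name)
    model = MBartForConditionalGeneration.from_pretrained(name)

    # Tell the tokenizer which source language code to prefix.
    tokenizer.src_lang = src_lang
    encoded = tokenizer(text, return_tensors="pt")

    # Force decoding to start with the target language code,
    # which selects the output language.
    generated = model.generate(
        **encoded,
        forced_bos_token_id=tokenizer.lang_code_to_id[tgt_lang],
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

if __name__ == "__main__":
    print(translate("Hello, world!", src_lang="en_XX", tgt_lang="fr_XX"))
```

Language codes follow the `xx_XX` convention used by the mBART family (for example `en_XX`, `fr_XX`, `de_DE`), and the same wrapper works for any supported direction by changing the two codes.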
Limitations: Translation quality varies by language and data availability; lower-resource languages often lag behind higher-resource ones.
See also: mBART; multilingual BART; neural machine translation.