Tweeted by @Thom_Wolf
Nice work by @alex_conneau @kakemeister and co. on pretraining multilingual language models to overcome the curse of multilinguality.
Pretty impressive to see the resulting 100-languages model challenge strong English-only models like XLNet & RoBERTa 👇 https://t.co/vMvtAJ2tjP
— Thomas Wolf (@Thom_Wolf) November 7, 2019