Happy to share our latest paper: "Self-training Improves Pretraining for Natural Language Understanding"
— Alexis Conneau (@alex_conneau) October 15, 2020
We show that self-training is complementary to strong unsupervised pretraining (RoBERTa) on a variety of tasks.
Paper: https://t.co/Fi1N9UKao7
Code: https://t.co/SsPSENYw5L