I'm still looking through the enormous number of interesting NLP submissions to #ICLR2020, but I'm really excited to see *two* new pretraining methods that outperform XLNet/RoBERTa on NLU tasks with far fewer parameters/FLOPS: https://t.co/j2IarJd2lC https://t.co/O6VnnhuDsm
— Sam Bowman (@sleepinyourhat) September 26, 2019