Tweeted by @tunguz
Introducing MASS – A pre-training method that outperforms BERT and GPT in sequence to sequence language generation tasks https://t.co/BzuZd7MKIL
— Bojan Tunguz (@tunguz) June 24, 2019