New NLP News—BERT, GPT-2, XLNet, NAACL, ICML, arXiv, EurNLP https://t.co/4URmn0kd9e (via @revue)
— Sebastian Ruder (@seb_ruder) June 24, 2019
That is 4x the average salary in the US and 9.5x the poverty line. https://t.co/3OpWKfHZ8H
— Mark 🦑 Riedl (@mark_riedl) June 25, 2019
XLNet is released. Fewer parameters than the largest GPT-2, but trained for longer, with a more powerful architecture. Has anyone tried language generation with it? Is it comparable to GPT-2? (My guess: yes)
— Jeremy Howard (@jeremyphoward) June 25, 2019
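For anyone curious to try the experiment Jeremy raises, here is a minimal sketch of sampling text from XLNet with the Hugging Face transformers library. The checkpoint name, prompt, and sampling parameters are illustrative assumptions, not part of the tweet, and XLNet's permutation-LM training means left-to-right generation quality may differ from GPT-2's.

```python
# Minimal sketch: sampling a continuation from XLNet with Hugging Face transformers.
# Checkpoint name, prompt, and decoding settings below are assumptions for illustration.
from transformers import XLNetLMHeadModel, XLNetTokenizer

tokenizer = XLNetTokenizer.from_pretrained("xlnet-large-cased")
model = XLNetLMHeadModel.from_pretrained("xlnet-large-cased")

prompt = "In a shocking finding, scientists discovered a herd of unicorns"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a continuation; XLNet is not trained purely left-to-right,
# so generations may read differently from GPT-2's.
output = model.generate(
    input_ids,
    max_length=100,
    do_sample=True,
    top_k=40,
    temperature=0.8,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```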
If so, then the genie is already out of the bottle. https://t.co/Cohg2Rx8YX