Tweeted by @jeremyphoward
XLNet is released. Less parameters than largest GPT-2, but trained for longer, with a more powerful architecture. Has anyone tried language generation with it? Is it comparable to GPT-2? (My guess: yes)
— Jeremy Howard (@jeremyphoward) June 25, 2019
If so, then the genie is already out of the bottle. https://t.co/Cohg2Rx8YX