Currently working on the upcoming NAACL "Transfer Learning in NLP" tutorial with @seb_ruder @mattthemathman and @swabhz. Pretty excited!
— Thomas Wolf (@Thom_Wolf) May 18, 2019
And I've discovered you can write a Transformer model like GPT-2 in less than 40 lines of code now!
40 lines of code & 40 GB of data... pic.twitter.com/VVABKHNLB7
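The tweeted image showed the actual code, which isn't reproduced here. As a rough illustration of what a decoder-only Transformer fits into a few dozen lines, here is a hypothetical NumPy sketch of a single GPT-style block (causal multi-head attention, layer norm, MLP, tied output head). All sizes, initializations, and the ReLU nonlinearity are illustrative choices, not GPT-2's actual configuration, and this is a forward pass only, with random untrained weights.

```python
import numpy as np

# Illustrative sizes (not GPT-2's): vocab, seq length, model dim, heads.
rng = np.random.default_rng(0)
V, T, D, H = 50, 8, 16, 4

def layernorm(x, eps=1e-5):
    return (x - x.mean(-1, keepdims=True)) / np.sqrt(x.var(-1, keepdims=True) + eps)

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv, Wo):
    T, D = x.shape
    d = D // H
    # Project to per-head queries, keys, values: shape (H, T, d).
    q = (x @ Wq).reshape(T, H, d).transpose(1, 0, 2)
    k = (x @ Wk).reshape(T, H, d).transpose(1, 0, 2)
    v = (x @ Wv).reshape(T, H, d).transpose(1, 0, 2)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    # Causal mask: each position may only attend to itself and the past.
    mask = np.triu(np.ones((T, T)), k=1).astype(bool)
    scores[:, mask] = -1e9
    out = softmax(scores) @ v
    return out.transpose(1, 0, 2).reshape(T, D) @ Wo

def gpt_block(x, p):
    # Pre-norm residual attention, then a ReLU MLP (GPT-2 actually uses GELU).
    x = x + causal_self_attention(layernorm(x), *p["attn"])
    h = layernorm(x) @ p["W1"]
    return x + np.maximum(h, 0) @ p["W2"]

params = {
    "attn": [rng.normal(0, 0.02, (D, D)) for _ in range(4)],
    "W1": rng.normal(0, 0.02, (D, 4 * D)),
    "W2": rng.normal(0, 0.02, (4 * D, D)),
}
wte = rng.normal(0, 0.02, (V, D))  # token embeddings
wpe = rng.normal(0, 0.02, (T, D))  # learned positional embeddings

tokens = rng.integers(0, V, T)
x = wte[tokens] + wpe[np.arange(T)]
logits = layernorm(gpt_block(x, params)) @ wte.T  # weight-tied output head
print(logits.shape)  # one row of next-token logits per position
```

The full model is just this block stacked N times before the output head, which is why the line count stays so small; nearly all of GPT-2's size lives in the weights, not the code.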