🔥Pytorch-Transformers 1.0🔥
Six NLU/NLG architectures: BERT, GPT, GPT-2, Transfo-XL, XLNet, XLM
Total: 27 pretrained models
Still the same:
- SOTA scripts: GLUE, SQuAD, text generation
- Access to hidden states, attentions, ...
— Thomas Wolf (@Thom_Wolf) July 16, 2019
P.S. Journalists, stop treating Thiel like an Oracle. He lies (or makes mistakes, depending on your interpretation) on a regular basis and is not who anyone should turn to for impartial commentary on anything AI-related.
— Miles Brundage (@Miles_Brundage) July 16, 2019