‼️ 1.5B parameter GPT-2 model released, but not by OpenAI https://t.co/8tgjUWxjZo
— Mark 🦑. Riedl (@mark_riedl) August 22, 2019
This replication project trained a 1.5B parameter “OpenGPT-2” model on OpenWebTextCorpus, a 38GB dataset similar to the original training data, and showed results comparable to the original GPT-2 on various benchmarks. 👏🏼 https://t.co/m4ZMB8RmdS https://t.co/ZrqJ0IuHbw https://t.co/o3KBv5VXKJ pic.twitter.com/pGN0p00DBR
— hardmaru (@hardmaru) August 23, 2019
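For readers curious what working with a model of this size looks like, here is a minimal sketch of sampling from a 1.5B-parameter GPT-2-class model using the Hugging Face transformers library. Note the assumptions: OpenAI's `gpt2-xl` checkpoint is used purely as a stand-in for a 1.5B GPT-2-class model, since the replication's own OpenGPT-2 weights were distributed separately by its authors.

```python
# A minimal sketch of sampling from a 1.5B-parameter GPT-2-class model.
# Assumptions: the Hugging Face `transformers` library is installed, and
# the `gpt2-xl` checkpoint stands in for a 1.5B GPT-2-class model; the
# replication's OpenGPT-2 weights were distributed separately.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-xl")
model = GPT2LMHeadModel.from_pretrained("gpt2-xl")  # ~1.5B parameters

prompt = "Recycling is good for the world."
inputs = tokenizer(prompt, return_tensors="pt")

# Top-k sampling, the decoding strategy used in the original GPT-2 release.
outputs = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_k=40,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```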