Really impressed with these early fine-tuning results with GPT-2-117M, and great to see that there's also a Colab from @roadrunning01! Giving it a try this weekend :) https://t.co/sVJeZOI8es
— Miles Brundage (@Miles_Brundage) March 16, 2019
Some GPT-2-117M outputs after fine-tuning on memorable movie quotes from (https://t.co/vZyr8mjNki), using @roadrunning01's Colab. The format of the training data was: [movie title] [line break] [quote] [sometimes another quote], sometimes with "<p>"/"</p>" thrown in the mix. pic.twitter.com/CjnoAr8L7t
— Miles Brundage (@Miles_Brundage) March 16, 2019
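The training-data layout described in that tweet can be sketched in a few lines. This is a hypothetical reconstruction, not Brundage's actual preprocessing script: the function name and the choice to wrap every quote in `<p>`/`</p>` (the tweet says the tags appeared only sometimes) are assumptions for illustration.

```python
def format_examples(movies):
    """Build a fine-tuning corpus from (title, [quotes]) pairs:
    [movie title], line break, one quote per line, quotes wrapped
    in <p>/</p> (a simplification; the original mixed tagged and
    untagged quotes). Examples are separated by blank lines."""
    blocks = []
    for title, quotes in movies:
        lines = [title] + [f"<p>{q}</p>" for q in quotes]
        blocks.append("\n".join(lines))
    return "\n\n".join(blocks)

sample = [("Casablanca", ["Here's looking at you, kid."]),
          ("Jaws", ["You're gonna need a bigger boat."])]
print(format_examples(sample))
```

The resulting string would be written to a `.txt` file and pointed at from the Colab notebook, as the later tweet about swapping in "a different .txt file" suggests.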
Next and possibly final GPT-2-117M fine-tuning run of the weekend - quotations from this dataset: https://t.co/yOgEkaSsYe
— Miles Brundage (@Miles_Brundage) March 17, 2019
Reminder that it's super easy to use this Colab by following the instructions at the top, running the cells in order, and changing the dataset link and filename to a different .txt file :) https://t.co/mkyFTJ5V8Z
— Miles Brundage (@Miles_Brundage) March 17, 2019
https://t.co/9IHozCXzzc --- r/ML gets the small GPT2 to write imaginary news. I didn't expect the small model to be that coherent.
— Ilya Sutskever (@ilyasut) March 20, 2019
Fine-tuning GPT-2-117M on jokes - off to a rocky start:
— Miles Brundage (@Miles_Brundage) March 23, 2019
"I just saw this guy with six eyes going to the restroom. He looks like a ghost."
"My girlfriend keeps asking me what I think of her 'soul' ... I do wonder how she can be so narcissistic."
Things transformers say: https://t.co/L3hgSz5Wfe
— Ilya Sutskever (@ilyasut) May 8, 2019
I've released my web UI for GPT-2-117M that allows you to generate text from the original model, backed by Google Cloud Run for massive scalability at mostly no cost! https://t.co/LO4hHuQo2l pic.twitter.com/HNhg25x8bz
— Max Woolf (@minimaxir) June 10, 2019
We spent a few evenings last week building an interactive demo called *Write with Transformer*
— Thomas Wolf (@Thom_Wolf) June 13, 2019
It lets you interact in a very intimate way with GPT-2, call, control, question the model... and I just can't stop playing with it!
You can try it at https://t.co/EZhtCodnoi https://t.co/me75uCeJ9q
It's here! I've written a (lengthy!) blog post on how to finetune GPT-2 and generate text using gpt-2-simple, along with a history of GPT-2 finetuning and its future. https://t.co/kqLMXdL9IE
— Max Woolf (@minimaxir) September 4, 2019
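The gpt-2-simple workflow that Woolf's post walks through reduces to a handful of library calls. A minimal sketch, assuming a TensorFlow 1.x environment with `gpt_2_simple` installed; the dataset filename, step count, and sampling parameters below are placeholder choices, not values from the post:

```python
import gpt_2_simple as gpt2

# The 117M model was later renamed "124M" in OpenAI's release;
# gpt-2-simple uses the newer name.
gpt2.download_gpt2(model_name="124M")

sess = gpt2.start_tf_sess()

# quotes.txt is a placeholder for whatever .txt corpus you fine-tune on.
gpt2.finetune(sess, dataset="quotes.txt", model_name="124M", steps=500)

# Sample from the fine-tuned model, optionally seeding with a prefix.
gpt2.generate(sess, prefix="Casablanca", length=100, temperature=0.7)
```

Running this requires a GPU (or considerable patience) and downloads ~500 MB of weights, which is why the Colab notebooks referenced in the earlier tweets were the usual entry point.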
I've been trying to use a language generation model (GPT-2) to make sketches.
— Robbie Barrat (@DrBeef_) September 26, 2019
The process is heavily inspired by Sol LeWitt's - I use GPT-2 to generate a set of rules describing a drawing; then, based on my interpretation of those rules, I make a Processing sketch. pic.twitter.com/cVasjUGrd5
A GPT-2 written essay was submitted to the Economist's youth essay contest.
— Greg Brockman (@gdb) October 2, 2019
One judge, who did not know the essay was written by an AI, gave this review: "It is strongly worded and backs up claims with evidence, but the idea is not incredibly original." https://t.co/RbKrQvN8C0