We took a quick look at whether you can do something like @OpenAI GPT2 with far less resources. @GuggerSylvain trained a model on a single GPU for 20 hours. Here's the 1st response for the 1st thing we tried. (More details coming once we've done more research.) pic.twitter.com/VuCW68MtI1

— Jeremy Howard (@jeremyphoward) February 27, 2019