This is a nice diagram by Zhengyan Zhang and @BakserWang showing how many recent pretrained language models are connected. The GitHub repo contains a full list of the relevant papers: https://t.co/uQNRGqMJAA pic.twitter.com/gJeJkUKAvR
— Sebastian Ruder (@seb_ruder) October 8, 2019