GPT-3 has 175 billion parameters, trained on 300 billion tokens https://t.co/rE97CQclwl
— Mark Riedl (@mark_riedl) May 29, 2020