Tweeted By @ak92501
mGPT: Few-Shot Learners Go Multilingual
abs: https://t.co/9uxHVoqRXO
introduces two autoregressive GPT-like models, with 1.3 billion and 13 billion parameters, trained on 60 languages from 25 language families using Wikipedia and the Colossal Clean Crawled Corpus
— AK (@ak92501) April 19, 2022
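To make "few-shot" concrete for an autoregressive model like this: the task is specified entirely through demonstrations placed in the prompt, with no fine-tuning. The sketch below shows this pattern with Hugging Face Transformers; the checkpoint ID `ai-forever/mGPT` and the translation prompt format are assumptions for illustration, not details from the tweet.

```python
# A minimal sketch of few-shot prompting with an autoregressive multilingual LM.
# Assumption: the 1.3B mGPT checkpoint is hosted on the Hugging Face Hub as
# "ai-forever/mGPT"; substitute whichever checkpoint you actually use.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ai-forever/mGPT")
model = AutoModelForCausalLM.from_pretrained("ai-forever/mGPT")

# "Few-shot" means the task is conveyed purely by in-context demonstrations;
# no gradient updates are performed.
prompt = (
    "English: cat -> French: chat\n"
    "English: dog -> French: chien\n"
    "English: bird -> French:"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=5, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With greedy decoding (`do_sample=False`), the model simply continues the pattern established by the two demonstrations, which is the few-shot behavior the tweet's title refers to.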