Yuan 1.0: Large-Scale Pre-trained Language Model in Zero-Shot and Few-Shot Learning
— AK (@ak92501) October 12, 2021
abs: https://t.co/2bT9if0KTH
Singleton language model with 245B parameters; SOTA results on natural language processing tasks; trained on a high-quality 5TB Chinese text corpus.