Tweeted By @dennybritz
“Bigger Models Are More Label-Efficient” - They beat supervised models on ImageNet with just 10% of the labels: https://t.co/AigBnPDQil
— Denny Britz (@dennybritz) June 19, 2020
OpenAI’s Image GPT made a similar point. Interesting, because intuitively this is not obvious. Maybe bigger models learn more disentangled representations? pic.twitter.com/f1QJawYO0O