Tweeted By @stanfordnlp
Humans can do few-shot learning with the help of language (“Live Oaks have tiny spikes on their leaves like this”). @jayelmnop, @percyliang & Noah Goodman explore this at #acl2020nlp: Shaping Visual Representations with Language for Few-Shot Classification https://t.co/l14Cg19IS2
— Stanford NLP Group (@stanfordnlp) July 4, 2020