Tweeted By @ak92501
Symbolic Knowledge Distillation: from General Language Models to Commonsense Models
abs: https://t.co/tvnpkIUywh
symbolic knowledge distillation, a model-to-corpus-to-model pipeline for commonsense that does not require human-authored knowledge; instead, it uses machine generation
— AK (@ak92501) October 15, 2021