Tweeted by @stanfordnlp
Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models, e.g., getting SoTA results on DiscoEval—at #acl2020nlp by @dan_iter, @kelvin_guu, Larry Lansing & @jurafsky—#NLProc https://t.co/RjiaMCgKzS
— Stanford NLP Group (@stanfordnlp) July 2, 2020