New analysis paper from my group! We zoom in on some of @clark_kev et al.'s findings on syntax-sensitive attention heads in BERT (+RoBERTa, +...), and find interestingly mixed results. https://t.co/n5owXskqFB
— Sam Bowman (@sleepinyourhat) November 27, 2019