At ACL 2020 @aclmeeting: Finding Universal Grammatical Relations in Multilingual BERT by @ethanachi, @johnhewtt & @chrmanning shows how you can find a universal syntactic subspace in mBERT, where UD-like relations cluster despite the model never being trained on them https://t.co/eB8zSPTOgu pic.twitter.com/FoY8rKeypp
— Stanford NLP Group (@stanfordnlp) June 29, 2020
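The tweet refers to probing mBERT with a linear projection into a low-rank "syntactic subspace", in the style of the structural probe that this work builds on. A minimal sketch of the core distance computation, using NumPy with random stand-in vectors and a random projection purely for illustration (a real probe matrix would be trained so these distances approximate dependency-tree distances):

```python
import numpy as np

# Illustrative sketch only: H stands in for mBERT token vectors and B for a
# trained probe projection; both are random here, so the distances below do
# not reflect any real syntax.
rng = np.random.default_rng(0)
hidden_dim, probe_rank, n_tokens = 768, 32, 5

H = rng.normal(size=(n_tokens, hidden_dim))    # stand-in contextual embeddings
B = rng.normal(size=(probe_rank, hidden_dim))  # stand-in probe projection

def probe_distance(h_i, h_j, B):
    """Squared L2 distance between two token vectors after projecting
    into the probe's low-rank subspace."""
    diff = B @ (h_i - h_j)
    return float(diff @ diff)

# Pairwise distance matrix over the tokens; with a trained B, nearby words
# in the dependency tree would get small entries.
D = np.array([[probe_distance(H[i], H[j], B) for j in range(n_tokens)]
              for i in range(n_tokens)])
```

The projection rank (here 32) is much smaller than the hidden size, which is what makes the subspace interpretation meaningful: syntax is claimed to live in a low-dimensional part of the representation.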