Fresh out of the oven, a summary of XLM - @facebookai Cross-lingual BERT model https://t.co/lNd2UeuVPe
— Rani Horev (@HorevRani) February 11, 2019
A new summary of XLM - a new model that upgrades BERT to achieve SOTA results in cross-lingual classification and translation tasks. https://t.co/lNd2UeuVPe
— Rani Horev (@HorevRani) February 11, 2019
Great paper by @alex_conneau & @GuillaumeLample from @facebookai
New XLM model outperforms BERT on all GLUE tasks, trained on the same data.
— Yann LeCun (@ylecun) June 21, 2019
Get it here: https://t.co/cYYOETEeaj
Tweets from Guillaume & Alex:... https://t.co/2ysUltBH7f
Today, OpenAI released GPT-2 774M (English) and Facebook released XLM pre-trained models for 100 languages. Looks like a glut of #NLProc resources for everyone freely accessible. What a wonderful time to live! https://t.co/1OMd3D5xDo https://t.co/lyC2eKvp3J
— Delip Rao (@deliprao) August 20, 2019