A very thorough dive into transfer learning for NLP using ULMFiT, from @KeremTurgutlu https://t.co/d4lOvZ2TOD pic.twitter.com/OAQKn3TifI
— Jeremy Howard (@jeremyphoward) June 4, 2019
Why do I care so much about getting people to name the language they're working on (*cough* English *cough*)? Here's part of the reason, from my #SDSS2019 slides: https://t.co/cu3gJ9EChi #BenderRule #NLProc pic.twitter.com/JQxoc2hBfD
— Emily M. Bender (@emilymbender) June 4, 2019
Building a Language User Interface? Let Genie Generate It For You!
Blog: https://t.co/mIyBTaZAHZ
Paper: https://t.co/6u6SZF2LZm
One of the papers from my time as Stanford adj. prof last year. pic.twitter.com/5KjCXnjtg3
— Richard Socher (@RichardSocher) June 4, 2019
naacl_transfer_learning_tutorial - Repository of code for the NAACL tutorial on Transfer Learning in NLP https://t.co/kcg8oEQ0SP
— Python Trending (@pythontrending) June 2, 2019
Colab for our tutorial #NAACLTransfer @seb_ruder https://t.co/mmnQCS3UHV
— Thomas Wolf (@Thom_Wolf) June 2, 2019
Here are the materials for our @NAACLHLT tutorial on Transfer Learning in NLP with @Thom_Wolf @swabhz @mattthemathman:
Slides: https://t.co/54KVG0K85z
Colab: https://t.co/iqWPtVFSVg
Code: https://t.co/bka5EsuYtP #NAACLTransfer pic.twitter.com/6wPZu9bmc7
— Sebastian Ruder (@seb_ruder) June 2, 2019
Last reminder! Slides and notebooks for my #naacl2019 tutorial on Modeling Language Change are online. Follow all the action from the comfort of your own laptop. https://t.co/8cW8d342Ar
— Jacob Eisenstein (@jacobeisenstein) June 2, 2019
The paper argues that rather than attempting to debias embeddings, bias should be addressed at the point of action/decision in a system https://t.co/OPBNV501MI
— Rachel Thomas (@math_rachel) May 31, 2019
Was just rereading @aylin_cim @j2bryson @random_walker paper on bias in word embeddings. They use "small baskets" of words (from heavily cited psychology papers) to represent a concept, and compare the distance/similarity between different concepts. https://t.co/vkhxUuiJ8N https://t.co/OPBNV501MI
— Rachel Thomas (@math_rachel) May 31, 2019
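The "small baskets" idea above can be sketched in a few lines: represent each concept as a set of words and compare the mean pairwise cosine similarity between baskets. This is only an illustration of the general approach, with toy 3-d vectors standing in for real word embeddings; the basket words and values are made up for the example.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def basket_similarity(basket_a, basket_b, embeddings):
    """Mean pairwise cosine similarity between two word baskets."""
    sims = [cosine(embeddings[a], embeddings[b])
            for a in basket_a for b in basket_b]
    return sum(sims) / len(sims)

# Toy 3-d "embeddings" purely for illustration (not real vectors).
emb = {
    "flower":   np.array([0.9, 0.1, 0.0]),
    "rose":     np.array([0.8, 0.2, 0.1]),
    "pleasant": np.array([0.7, 0.3, 0.0]),
    "insect":   np.array([0.1, 0.9, 0.0]),
}

# With these toy vectors, the flower basket sits closer to "pleasant"
# than the insect basket does.
print(basket_similarity(["flower", "rose"], ["pleasant"], emb))
print(basket_similarity(["insect"], ["pleasant"], emb))
```

With real embeddings (e.g. GloVe or word2vec vectors) and the baskets from the psychology literature, the gap between such similarities is what the bias tests measure.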
Cross-lingual transfer is a powerful tool for low-resource NLP. But when you build a system for a new language (say Bengali), what language do you transfer from? Our #ACL2019 paper "Choosing Transfer Languages for Cross-lingual Learning" asks this: https://t.co/uXo1JHzx2i 1/7 pic.twitter.com/0gZjTMjqC1
— Graham Neubig (@gneubig) May 31, 2019
Wouldn't it be cool to have @chrmanning explain ELMo, ULMFiT, Transformers, and BERT in about an hour-long video? #NLP https://t.co/dzqS5RgYMq
— Xavier 🎗️ (@xamat) May 31, 2019
Important paper from Zellers et al. - "Defending Against Neural Fake News": https://t.co/HN2ju1qGHa
Great to see more technical work on this topic, as well as further discussion of appropriate language model publication norms.
— Miles Brundage (@Miles_Brundage) May 30, 2019