A Review of the Neural History of Natural Language Processing-- great neural NLP overview by @seb_ruder https://t.co/NmicL0THJU
— Rachel Thomas (@math_rachel) October 11, 2018
One of our students did something crazy with transfer learning: froze the randomly added fully connected layer, and only fine-tuned the pre-trained layers.
— Jeremy Howard (@jeremyphoward) October 11, 2018
What's really crazy: that resulted in a top-10 Kaggle finish. See the paper for details: https://t.co/1TZT2vfUXs
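The paper link above is the tweet's; below is a minimal PyTorch sketch of the inverted recipe the tweet describes, freezing the randomly initialized head while fine-tuning the pre-trained layers. The model choice, class count, and optimizer settings are illustrative assumptions, not the student's actual setup.

```python
import torch
import torchvision.models as models

# Start from a pre-trained backbone (an assumed example model).
model = models.resnet34(pretrained=True)

# Replace the final layer with a randomly initialized head for the new task
# (10 classes is an arbitrary assumption).
model.fc = torch.nn.Linear(model.fc.in_features, 10)

# The "crazy" part: freeze the freshly added head ...
for p in model.fc.parameters():
    p.requires_grad = False

# ... and fine-tune only the pre-trained layers.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=1e-3, momentum=0.9)
```

The frozen head then acts as a fixed random projection, so all adaptation happens in the pre-trained body.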
Our free online lecture series on #NLProc: NLP by @jurafsky & @chrmanning (2012 Coursera nlp-class) https://t.co/wK9Za3WRGo (missing from the web for ages for tedious reasons, but now back!) ❦ NLP with Deep Learning by @chrmanning & @RichardSocher (CS224n, 2017) https://t.co/yE6ufYHCyw
— Stanford NLP Group (@stanfordnlp) October 11, 2018
New 'BERT' model from Google just turned up on the https://t.co/ryDQeo2HU2 – Huge improvements on MNLI, CoLA, SST, ... pic.twitter.com/FBC4RokARF
— Sam Bowman (@sleepinyourhat) October 9, 2018
In AllenNLP v0.7.0: https://t.co/SVCKnJmmfj https://t.co/v8oqV6SNsz
— Stanford NLP Group (@stanfordnlp) October 9, 2018
PyText will be open sourced by @fb_engineering later this month, allows rapid prototyping and production deployment of @PyTorch NLP models pic.twitter.com/OU2meSlN86
— Peter Skomoroch (@peteskomoroch) October 2, 2018
New blog post: A Review of the Recent History of Natural Language Processing. The 8 biggest milestones in the last ~15 years of #NLProc. From our NLP session at @DeepIndaba. @_aylien https://t.co/7QwfV44fAZ pic.twitter.com/WkwfI7n8dz
— Sebastian Ruder (@seb_ruder) October 1, 2018
A great series of interviews on the Pedagogy of NLP by @david__jurgens and @lucy3_li, starting with NLP experts @jurafsky and @YejinChoinka. If you're interested in teaching, or in how NLP is changing in the age of DL, check these out! https://t.co/cMWrbbC1z5 https://t.co/FdqnkP8Yed
— Sebastian Ruder (@seb_ruder) September 28, 2018
#iclr2019 fun begins.
— harvardnlp (@harvardnlp) September 27, 2018
Some early keyword analysis and topic modeling: https://t.co/3MejoPrSkH pic.twitter.com/o4j9dmZdSu
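The linked analysis isn't reproduced here; as a rough illustration of what keyword and topic analysis of ICLR submissions might involve, here is a minimal scikit-learn LDA sketch. The toy abstracts and topic count are assumptions for demonstration only.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-in for submission abstracts; a real analysis would load
# the actual ICLR submission texts.
abstracts = [
    "we propose a new attention mechanism for neural machine translation",
    "graph neural networks improve molecule property prediction",
    "on the adversarial robustness of deep image classifiers",
    "unsupervised representation learning with deep generative models",
]

# Bag-of-words counts with English stop words removed.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(abstracts)

# Fit a small LDA model; the topic count is an arbitrary choice.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

# Show the top keywords per topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = weights.argsort()[::-1][:6]
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```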
Good practices in Modern TensorFlow for NLP: a notebook of best-practice code snippets covering eager execution, https://t.co/Dsgick98uu, and tf.estimator, by @roamanalytics https://t.co/23YjhlkW9f
— Sebastian Ruder (@seb_ruder) September 27, 2018
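For a flavor of the eager-execution style the notebook covers, a tiny sketch follows; the toy embedding lookup is an illustrative assumption, not taken from the notebook. In TF 1.x, current at the time, eager mode required an explicit opt-in; in TF 2.x it is the default.

```python
import tensorflow as tf

# In TF 1.x this needed tf.enable_eager_execution(); in TF 2.x eager
# execution is on by default, so ops run immediately and return values.

ids = tf.constant([[1, 3, 2], [2, 0, 0]])    # assumed batch of token ids
emb = tf.Variable(tf.random.normal([5, 8]))  # 5-word vocab, embedding dim 8
vecs = tf.nn.embedding_lookup(emb, ids)      # shape (2, 3, 8)
sent = tf.reduce_mean(vecs, axis=1)          # mean-pooled sentence vectors
print(sent.numpy().shape)                    # (2, 8); values inspectable directly
```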
Even more #NLProc QA data at #emnlp2018! HotpotQA—a Wikipedia-based dataset requiring multi-document information aggregation and comparisons by Zhilin Yang @qi2peng2 @Saizheng Bengio @professorwcohen @rsalakhu @chrmanning. Paper, data, leaderboard, etc.: https://t.co/x7zSGdtFBi pic.twitter.com/5PYO7LVOSO
— Stanford NLP Group (@stanfordnlp) September 26, 2018
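A short sketch of poking at the HotpotQA data follows, assuming the JSON distribution format described on the project page (per-example question, answer, multi-paragraph Wikipedia context, and supporting-fact annotations); the local filename is an assumption.

```python
import json

# Load the training split (filename assumed from the project's downloads).
with open("hotpot_train_v1.json") as f:
    examples = json.load(f)

ex = examples[0]
print(ex["question"], "->", ex["answer"])

# Each example carries several Wikipedia paragraphs; answering requires
# aggregating information across more than one of them.
for title, sentences in ex["context"]:
    print(title, len(sentences), "sentences")

# Gold supporting facts are (paragraph title, sentence index) pairs.
print(ex["supporting_facts"])
```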
How toxic are @HillaryClinton and @raelDonaldTromp tweets? A #MachineLearning approach with the help of @kaggle: https://t.co/X6DoiO6cwp
— Bojan Tunguz (@tunguz) September 24, 2018
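The linked kernel isn't reproduced here; below is a minimal sketch of one standard approach to such a toxicity model, TF-IDF features plus logistic regression trained on Kaggle's Toxic Comment data. The filename and column names are assumptions based on that competition's format.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Assumed input: the Toxic Comment Classification Challenge training set,
# with a `comment_text` column and a binary `toxic` label.
train = pd.read_csv("train.csv")

# Word and bigram TF-IDF features, capped for memory.
vectorizer = TfidfVectorizer(max_features=50_000, ngram_range=(1, 2))
X = vectorizer.fit_transform(train["comment_text"])

clf = LogisticRegression(max_iter=1000)
clf.fit(X, train["toxic"])

# Score new tweets by predicted toxicity probability.
tweets = ["example tweet text"]
probs = clf.predict_proba(vectorizer.transform(tweets))[:, 1]
print(probs)
```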