@Emil_Hvitfeldt takes {ggplot2} 3.2.0's legend options for a spin…
"Changing Glyph in legend in ggplot2" https://t.co/tv312fAnix #rstats #dataviz pic.twitter.com/u6IZX2RUwc
— Mara Averick (@dataandme) June 18, 2019
Nice talk from David Silver on AlphaStar: https://t.co/7wBczTZG6G
— Miles Brundage (@Miles_Brundage) June 18, 2019
"Competing helps develop an intuition for what might work or not." | Best practices from @el_PA_B after competing in the Freesound Audio Tagging competition. Read more π https://t.co/YnJLshK9RT via @TDataScience pic.twitter.com/s6eDttMnj1
β Kaggle (@kaggle) June 14, 2019
A curated list of decision, classification and regression tree research papers from the last 30 years with implementations. It covers NeurIPS, ICML, ICLR, KDD, ICDM, CIKM, AAAI, etc. https://t.co/JDj6bkTYth
— Jeremy Howard (@jeremyphoward) June 13, 2019
Generative Adversarial Networks: A Survey and Taxonomy
By @wangvilla @sheqi1991 @tomasward
Covering 7 architecture-variant GANs and 9 loss-variant GANs focusing on
(1) High-quality image generation
(2) Diverse image generation
(3) Stable training https://t.co/tI9o6Xepr4 pic.twitter.com/Mx6SRCgtzv
— ML Review (@ml_review) June 11, 2019
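For readers new to the loss-variant axis the survey organizes, here is a minimal sketch of the reference point those variants depart from: the standard non-saturating GAN losses from the original GAN paper (function names here are mine, written in PyTorch, not code from the survey):

```python
import torch
import torch.nn.functional as F

def discriminator_loss(real_logits, fake_logits):
    # Push D's logits toward 1 on real images and toward 0 on fakes.
    real = F.binary_cross_entropy_with_logits(
        real_logits, torch.ones_like(real_logits))
    fake = F.binary_cross_entropy_with_logits(
        fake_logits, torch.zeros_like(fake_logits))
    return real + fake

def generator_loss(fake_logits):
    # Non-saturating variant: maximize log D(G(z)) rather than
    # minimize log(1 - D(G(z))). Same fixed point, but much stronger
    # gradients early in training -- the earliest "stable training" fix.
    return F.binary_cross_entropy_with_logits(
        fake_logits, torch.ones_like(fake_logits))
```

The loss-variant GANs in the survey (Wasserstein, least-squares, hinge, etc.) all replace these two objectives while keeping the adversarial setup intact.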
New paper out looking into ELMo- and BERT/STILTs-style transfer from a huge range of source tasks! “Can You Tell Me How to Get Past Sesame Street?” https://t.co/tU9dfhyG7Y pic.twitter.com/m8sFdafGoY
— Sam Bowman (@sleepinyourhat) June 11, 2019
Hi all. I've posted my slides from my talk today https://t.co/mbl6kpfocm
Topics covered: (1) the potential benefits of narrative AI systems, (2) historical perspectives on story generation, (3) machine learning for story generation, (4) controlling neural text generation systems https://t.co/o7zx5OOd5N
— Mark Riedl (@mark_riedl) June 8, 2019
Notes on the "Limitations of the Empirical Fisher Approximation" by Kunstner et al.: an excellently written discussion paper. https://t.co/GDTbzec3gt
— Ferenc Huszár (@fhuszar) June 6, 2019
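The distinction the paper turns on is worth stating. In the usual notation (mine, not necessarily the paper's symbols), for data points (x_n, y_n) and model p_θ(y | x), the Fisher information samples labels from the model itself, while the "empirical Fisher" plugs in the observed labels:

```latex
% Fisher information: labels are sampled from the model's own predictive distribution
F(\theta) = \sum_{n=1}^{N} \mathbb{E}_{y \sim p_\theta(y \mid x_n)}
  \left[ \nabla_\theta \log p_\theta(y \mid x_n)\,
         \nabla_\theta \log p_\theta(y \mid x_n)^\top \right]

% Empirical Fisher: the observed labels y_n are plugged in instead
\widetilde{F}(\theta) = \sum_{n=1}^{N}
  \nabla_\theta \log p_\theta(y_n \mid x_n)\,
  \nabla_\theta \log p_\theta(y_n \mid x_n)^\top
```

The paper's argument is that these two quantities can behave very differently away from a well-specified model near its optimum, so using the cheaper empirical version as a curvature proxy in natural-gradient-style methods deserves more caution than it usually gets.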
A very thorough dive into transfer learning for NLP using ULMFiT, from @KeremTurgutlu https://t.co/d4lOvZ2TOD pic.twitter.com/OAQKn3TifI
— Jeremy Howard (@jeremyphoward) June 4, 2019
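For a sense of what that pipeline involves, here is a condensed sketch of the standard ULMFiT recipe in the fastai v1 API of the time; the path, file name, and hyperparameters are placeholders, not taken from the linked post:

```python
from fastai.text import *  # fastai v1-era API

path = Path('data')  # placeholder directory containing a texts.csv

# Stage 1: fine-tune a pretrained AWD-LSTM language model on the target corpus.
data_lm = TextLMDataBunch.from_csv(path, 'texts.csv')
learn_lm = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.3)
learn_lm.fit_one_cycle(1, 1e-2)
learn_lm.unfreeze()
learn_lm.fit_one_cycle(3, 1e-3)
learn_lm.save_encoder('ft_enc')

# Stage 2: reuse the fine-tuned encoder in a classifier, unfreezing
# gradually with discriminative learning rates per layer group.
data_clas = TextClasDataBunch.from_csv(path, 'texts.csv',
                                       vocab=data_lm.train_ds.vocab)
learn = text_classifier_learner(data_clas, AWD_LSTM, drop_mult=0.5)
learn.load_encoder('ft_enc')
learn.fit_one_cycle(1, 2e-2)
learn.freeze_to(-2)
learn.fit_one_cycle(1, slice(1e-2 / 2.6**4, 1e-2))
learn.unfreeze()
learn.fit_one_cycle(2, slice(1e-3 / 2.6**4, 1e-3))
```

The gradual unfreezing and per-group learning-rate slices are the parts that distinguish ULMFiT from naive fine-tuning.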
The Theory Behind Overfitting, Cross Validation, Regularization, Bagging, and Boosting: T... https://t.co/J5gPgPXOMp pic.twitter.com/LIVHEHYCiT
— arxiv (@arxiv_org) June 2, 2019
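As a toy illustration of the interplay between cross-validation and regularization that the paper formalizes (the dataset and numbers below are synthetic, not from the paper):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# A setting where the feature count rivals the sample count,
# so an (almost) unregularized fit is prone to overfitting.
X, y = make_regression(n_samples=100, n_features=80,
                       noise=10.0, random_state=0)

# Cross-validation estimates generalization error; the L2 penalty
# (alpha) trades a little bias for a large reduction in variance.
for alpha in [1e-6, 1.0, 100.0]:
    scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5, scoring='r2')
    print(f"alpha={alpha:8.2g}  mean CV R^2 = {scores.mean():.3f}")
```

Bagging and boosting attack the same bias/variance trade-off from the ensemble side rather than the penalty side.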
Gradient Flow: a new notebook that explains automatic differentiation using eager execution in @TensorFlow. I go over computational graphs, vector-valued functions, gradients, etc. https://t.co/BFfvvslz9b
— Zaid زيد (@zaidalyafeai) May 25, 2019
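The notebook walks through the details; as a minimal standalone illustration of reverse-mode autodiff under eager execution (my sketch, not code from the notebook):

```python
import tensorflow as tf  # TF with eager execution (default in 2.x)

# f(x, y) = x^2 * y + y + 2, a small scalar-valued function
x = tf.Variable(3.0)
y = tf.Variable(4.0)

with tf.GradientTape() as tape:
    # Ops run eagerly but are recorded on the tape, building the
    # computational graph needed for the backward pass.
    f = x**2 * y + y + 2.0

# Reverse-mode autodiff: df/dx = 2*x*y = 24, df/dy = x^2 + 1 = 10
grads = tape.gradient(f, [x, y])
print([g.numpy() for g in grads])  # [24.0, 10.0]
```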
To balance out yesterday's much more buzzy paper, today I read about analyzing the contributions of each head within Transformer's multi-headed attention, to understand what each is doing, and how necessary it is to performance. https://t.co/1ltTBhgkKP
— Cody Wild (@decodyng) May 25, 2019
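A simple way to get a feel for the "how necessary is each head?" question is ablation: zero out one head's slice of the concatenated attention output and re-measure task performance. A minimal sketch of that probe (names mine; the paper's actual analysis is more involved):

```python
import torch

def ablate_head(attn_output, head, n_heads):
    # attn_output: (batch, seq_len, n_heads * d_head), the concatenated
    # per-head outputs before the final output projection.
    # Zeroing one head's slice and re-running evaluation measures how
    # much that head contributes to task performance.
    d_head = attn_output.shape[-1] // n_heads
    out = attn_output.clone()
    out[..., head * d_head:(head + 1) * d_head] = 0.0
    return out

x = torch.randn(2, 5, 8 * 64)  # e.g. 8 heads of size 64
masked = ablate_head(x, head=3, n_heads=8)
print(masked[..., 3 * 64:4 * 64].abs().sum())  # tensor(0.)
```

Repeating this per head, then per pair of heads, is how one finds that a handful of heads carry most of the load while many can be removed with little loss.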