When does label smoothing help? https://t.co/GmyCERnssb
— Jeremy Howard (@jeremyphoward) June 13, 2019
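The technique the linked paper examines is simple to state: before computing cross-entropy, the one-hot target is mixed with a uniform distribution over classes. A minimal sketch in plain Python (function names are hypothetical, not from the paper):

```python
import math

def smooth_labels(target_idx, num_classes, alpha=0.1):
    """Replace a one-hot target with a smoothed distribution:
    (1 - alpha) extra mass on the true class, alpha spread
    uniformly over all classes."""
    uniform = alpha / num_classes
    dist = [uniform] * num_classes
    dist[target_idx] += 1.0 - alpha
    return dist

def cross_entropy(pred_probs, target_dist):
    """Cross-entropy between predicted probabilities and a soft target."""
    return -sum(t * math.log(p) for t, p in zip(target_dist, pred_probs))
```

With `alpha=0.1` and 4 classes, the true class gets probability 0.925 and each other class 0.025, so the model is never pushed toward fully saturated logits.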
And now we have KERMIT https://t.co/hVvQOknF4o to go along with ELMo, BERT, ERNIE, BigBird and Grover.
— Mark Riedl Mars (Moon) (@mark_riedl) June 12, 2019
Weight Agnostic Neural Networks
— hardmaru (@hardmaru) June 12, 2019
Inspired by precocial species in biology, we set out to search for neural net architectures that can already (sort of) perform various tasks even when they use random weight values.
Article: https://t.co/bpe6V3Rp9m
PDF: https://t.co/7OJGEsRnVV pic.twitter.com/El2uzgxS5I
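The idea in the abstract above can be illustrated in a few lines: fix a network topology, tie every connection to a single shared weight, and score the architecture by averaging task error over several randomly sampled values of that weight. A toy sketch (the topology, scoring metric, and all names are hypothetical illustrations, not the paper's code):

```python
import math
import random

def forward(x, shared_w):
    """Tiny fixed topology in which every connection reuses the same
    shared weight value, as in the weight-agnostic setting."""
    h1 = math.tanh(shared_w * x)
    h2 = math.tanh(shared_w * x + shared_w * h1)
    return shared_w * h2

def mean_score(xs, ys, n_samples=5):
    """Score the architecture by averaging squared error over several
    randomly sampled shared-weight values (toy evaluation)."""
    rng = random.Random(0)
    total = 0.0
    for _ in range(n_samples):
        w = rng.uniform(-2.0, 2.0)
        total += sum((forward(x, w) - y) ** 2 for x, y in zip(xs, ys))
    return total / n_samples
```

An architecture search would then keep topologies whose `mean_score` stays low across weight samples, rather than tuning individual weights.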
Congratulations to the Best Papers at the ongoing #ICML2019
— DataScienceNigeria (@DataScienceNIG) June 11, 2019
(1) Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations https://t.co/GKUNkNZOwY
(2) Rates of Convergence for Sparse Variational Gaussian Process Regression https://t.co/dyUQYoq40Y pic.twitter.com/0DlXj6gcWb
Interested in uncertainty applications of neural networks? See our work on examining its use for clinical care. @dusenberrymw Ed Choi Jonas Kemp @JvNixon @Ghassen_ML @kat_heller @iamandrewdai. Many of us will also be at ICML to chat! https://t.co/qvYAW5SUwR pic.twitter.com/HqW3tfmhtG
— Dustin Tran (@dustinvtran) June 11, 2019
New paper out looking into ELMo- and BERT/STILTs-style transfer from a huge range of source tasks! "Can You Tell Me How to Get Past Sesame Street?" https://t.co/tU9dfhyG7Y pic.twitter.com/m8sFdafGoY
— Sam Bowman (@sleepinyourhat) June 11, 2019
In our new paper (my first collaboration at DeepMind, yay!) with Cyprien, @ikekong, & @DaniYogatama, we leverage episodic memory during training (sparse replay) and inference (local adaptation) for continual learning (on QA and classification tasks). https://t.co/M7lgKhVwXZ pic.twitter.com/ZHdl3yAu72
— Sebastian Ruder (@seb_ruder) June 11, 2019
Computer Vision with a Single Robust Classifier
— hardmaru (@hardmaru) June 11, 2019
Interesting work: "Using only an adversarially trained classifier (standard ResNet w/ cross-entropy), we show that we can perform image generation, super resolution, inpainting, and interactive editing" https://t.co/VtmUohz2sZ https://t.co/6c1yxeZgJ9
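The generation recipe these results rest on is gradient ascent on the input: start from noise and repeatedly step the image toward a higher score for a target class under the robust classifier. A toy sketch with a linear scorer standing in for the robust ResNet (all names hypothetical):

```python
def synthesize(weight_row, x0, steps=100, lr=0.1):
    """Gradient-ascent sketch: for a linear score w . x, the gradient
    with respect to the input x is just w, so each step nudges x toward
    the target class. The paper does this with a robust ResNet instead
    of a linear map, which is why its ascended inputs look like images."""
    x = list(x0)
    for _ in range(steps):
        x = [xi + lr * wi for xi, wi in zip(x, weight_row)]
    return x
```

With a non-robust classifier the same procedure tends to produce adversarial noise; robust training is what makes the input-space gradients perceptually meaningful.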
We've compiled a meta-reading list for our meta-learning tutorial: https://t.co/3i5zohN4KM
— Sergey Levine (@svlevine) June 11, 2019
Short list of the main papers we covered in our meta-learning tutorial: https://t.co/g3eAcsO0vr https://t.co/TdWZNyn9kB
Lots of requests to @seb_ruder & I for full replication details for ULMFiT on IMDb. Here it is! And thanks to @GuggerSylvain it now runs in just 6 hours on a single GPU :) https://t.co/xbl3pIcctD
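One replication detail that often trips people up is ULMFiT's discriminative fine-tuning: each earlier layer group trains with the next group's learning rate divided by 2.6, the factor suggested in the ULMFiT paper. A sketch of that schedule (function name hypothetical):

```python
def discriminative_lrs(base_lr, num_groups, factor=2.6):
    """ULMFiT-style discriminative fine-tuning (sketch): starting from
    the top layer group's base_lr, each earlier group gets the previous
    rate divided by `factor` (2.6 in the ULMFiT paper)."""
    lrs = [base_lr]
    for _ in range(num_groups - 1):
        lrs.append(lrs[-1] / factor)
    return list(reversed(lrs))  # lowest LR first, for the earliest layers
```

For three layer groups and `base_lr=0.01`, the earliest group trains at roughly 0.0015, the middle at about 0.0038, and the top at 0.01.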
— Jeremy Howard (@jeremyphoward) June 11, 2019
I'm looking for moderation help on arXiv in machine learning. It would be great to have someone in Europe and someone in Asia so that we can moderate around the clock. DM me if interested.
— Thomas G. Dietterich (@tdietterich) June 9, 2019
In case you don't know (it took me 5 minutes to find the link), all 1,294 papers at #CVPR2019 are already published here. https://t.co/D6s9gtugdv
— Chip Huyen (@chipro) June 8, 2019