That's great. But this is still better: "A Few Useful Things to Know about Machine Learning" https://t.co/HRo1GyfsOv
— Dan ofer (@danofer) January 18, 2019
Thanks to everyone who watched my talk, “Building an A/B Testing Analytics system with R and Shiny!” Slides available here: https://t.co/Rd3AtrEUya. Recording will also be available at some point.
— Emily Robinson (@robinson_es) January 17, 2019
Don't miss @alexpghayes's slides, where he shows code examples of bootstrapping and examining residuals using broom: https://t.co/Zd2w2JOYWk #rstudioconf pic.twitter.com/mqBnabaWb0
— David Robinson (@drob) January 17, 2019
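The slides above demonstrate bootstrapping with R's broom package; the core idea is language-agnostic, so here is a minimal sketch of a percentile bootstrap confidence interval in plain Python (the data and statistic are made up for illustration, not taken from the slides):

```python
import random

random.seed(0)
data = [2.1, 2.5, 1.9, 3.0, 2.7, 2.2, 2.8, 1.8]

# Resample the data with replacement many times and record
# the statistic of interest (here, the mean) for each resample.
boot_means = []
for _ in range(1000):
    sample = [random.choice(data) for _ in data]
    boot_means.append(sum(sample) / len(sample))

# Percentile 95% confidence interval: take the 2.5th and 97.5th
# percentiles of the bootstrap distribution.
boot_means.sort()
ci_low, ci_high = boot_means[25], boot_means[974]
```

In broom, the equivalent workflow tidies each resampled model fit into a data frame and summarizes across resamples; the resampling-then-percentile logic is the same.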
How to (properly) teach Git. #programming https://t.co/N8VUhXkCWa pic.twitter.com/Q7CAbO2zKT
— Randy Olson (@randal_olson) January 17, 2019
Slides for my #rstudioconf talk “Box plots: a case study in debugging and perseverance” are here: https://t.co/IysLF1uGWF
— Kara Woo (@kara_woo) January 17, 2019
If you missed our #RStudioConf talk, you can get the slides, code, and blog posts all here! https://t.co/AWhBgYis5Y
— Jacqueline Nolis (@skyetetra) January 17, 2019
“Estimating the absolute performance of a model is probably one of the most challenging tasks in machine learning.” A great read on a foundational and oft-neglected topic. I learned stuff. https://t.co/VmKpbxvqWu
— Brandon Rohrer (@_brohrer_) January 17, 2019
Useful article “How to teach Git” by @bberrycarmen https://t.co/rDohwitvGz
— hardmaru (@hardmaru) January 17, 2019
Advanced Jupyter Notebooks: A Tutorial
— ML Review (@ml_review) January 17, 2019
By @BenjaminPryke
– %lsmagic (%timeit, %pdb, %debug, etc.)
– jupyter nbconvert --to pdf
– parameterisation with Papermill
– %load_ext sql
https://t.co/Is0oYhvVqI pic.twitter.com/8ZiXIWJj4I
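The magics listed above only work inside IPython/Jupyter, but %timeit is a thin wrapper over the standard library's timeit module, so the plain-script equivalent looks like this (the expression being timed is an arbitrary example):

```python
import timeit

# In a notebook you would write:  %timeit sorted(range(100))
# Outside a notebook, the stdlib timeit module does the same job:
# run the statement many times and return the total elapsed seconds.
elapsed = timeit.timeit("sorted(range(100))", number=1000)
```

The number argument controls how many times the statement runs; %timeit picks this automatically, while timeit.timeit defaults to 1,000,000 unless you set it.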
Transformer-XL: Combining Transformers and RNNs Into a State-of-the-art Language Model
— hardmaru (@hardmaru) January 17, 2019
Blog post by @HorevRani giving an overview of the model and key concepts such as the recurrence mechanism and the relative positional encoding scheme. https://t.co/ORv18GkZBv https://t.co/l1OJKvUNyc
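The relative positional encoding the post describes scores attention by the offset between query and key positions rather than by absolute position. A toy sketch of just the offset table (illustrative only, not the paper's or the post's code):

```python
# For a sequence of length 4, attention between query position i and
# key position j depends on the relative offset i - j, so the model
# only needs one embedding per possible offset, not per absolute position.
seq_len = 4
rel_offsets = [[i - j for j in range(seq_len)] for i in range(seq_len)]
# Row i lists the offsets from position i to every key position.
```

Because the table depends only on offsets, the same encodings remain valid when Transformer-XL reuses cached hidden states from a previous segment, which is what makes the recurrence mechanism work.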
Really proud to share "What is torch.nn, really?", which takes you from a neural net written from scratch through a step-by-step refactoring using all the key concepts in `torch.nn`.
— Jeremy Howard (@jeremyphoward) January 16, 2019
If you want to really understand how neural nets work in @PyTorch, start here! https://t.co/qJgsZPQnTL
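The tutorial's starting point is a forward pass written with no framework at all, which `torch.nn` then gradually replaces. A minimal pure-Python sketch of that from-scratch stage, one linear layer plus a ReLU (all names and sizes here are illustrative, not taken from the tutorial):

```python
import random

random.seed(42)

def linear(x, weights, bias):
    # y = W x + b for a single input vector:
    # each output is a dot product of one weight row with x, plus a bias.
    return [sum(xi * wij for xi, wij in zip(x, row)) + bj
            for row, bj in zip(weights, bias)]

def relu(v):
    # Elementwise max(0, v), the standard ReLU nonlinearity.
    return [max(0.0, vi) for vi in v]

# A tiny 3-input, 2-output layer with random weights.
weights = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
bias = [0.0, 0.0]
out = relu(linear([1.0, 2.0, 3.0], weights, bias))
```

In the tutorial, each of these hand-rolled pieces gets swapped for its `torch.nn` counterpart (`nn.Linear`, `F.relu`, `nn.Module`), which is exactly what makes the abstraction layers click.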
Lecture videos of @geoffreyhinton’s 2013 neural net MOOC are now on his website. I gained a lot of intuition from studying them years ago.
— hardmaru (@hardmaru) January 16, 2019
We now focus on making a specific subset of approaches work better, while back then, everything seemed less rigid. https://t.co/dzmKCyfnql