Check out this helpful walkthrough of Multi-GPU training with Custom Estimators by @krasul. Read it here ↓ https://t.co/SvVZbO3xbB
— TensorFlow (@TensorFlow) September 5, 2018
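If you want a feel for what the walkthrough covers before clicking through: with the Estimator API, multi-GPU training mostly comes down to passing a distribution strategy into the `RunConfig`. A minimal sketch, assuming the TensorFlow 1.x-era API the post targets; the toy model and parameters are illustrative, not taken from the article:

```python
import tensorflow as tf

def model_fn(features, labels, mode, params):
    # A tiny linear classifier; the real post walks through a fuller network.
    logits = tf.layers.dense(features["x"], units=2)
    predictions = tf.argmax(logits, axis=1)
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode, predictions=predictions)
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    optimizer = tf.train.AdamOptimizer(params.get("lr", 1e-3))
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

# Replicate the model across GPUs by handing a distribution strategy
# to the Estimator's RunConfig (tf.contrib location is TF 1.x-era).
strategy = tf.contrib.distribute.MirroredStrategy(num_gpus=2)
config = tf.estimator.RunConfig(train_distribute=strategy)
estimator = tf.estimator.Estimator(model_fn=model_fn, config=config,
                                   params={"lr": 1e-3})
```

MirroredStrategy keeps a copy of the model on each GPU and all-reduces the gradients; in later TensorFlow releases it moved out of contrib to `tf.distribute.MirroredStrategy`.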
⭐️ great guide by @aosmith16
"Getting started simulating data in R: some helpful functions and how to use them" https://t.co/cW34RAR05z #rstats
— Mara Averick (@dataandme) September 5, 2018
Python Pandas: Tricks & Features You May Not Know via @realpython https://t.co/TvszhnZkhA #python #pandas pic.twitter.com/7j1oiCHFkd
— Python Weekly (@PythonWeekly) September 3, 2018
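The article is a grab-bag of lesser-known pandas features. A couple of representative examples of the kind of thing it covers (these particular picks are mine, not necessarily the article's):

```python
import pandas as pd

df = pd.DataFrame({
    "city": ["Oslo", "Lima", "Oslo", "Lima"],
    "temp": [4.0, 22.5, 6.1, 24.0],
})

# Chainable filtering with query() instead of boolean-mask indexing.
warm = df.query("temp > 10")

# Categorical dtype: repeated strings stored as integer codes, saving memory.
df["city"] = df["city"].astype("category")
print(df["city"].cat.codes)
print(warm)
```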
Slowly. pic.twitter.com/EP6ZFiHyhI
— Chris Albon (@chrisalbon) September 3, 2018
Visual search on AWS—Part 1: Engine implementation https://t.co/kMH7vk67qk
— Nando de Freitas (@NandoDF) September 2, 2018
ICYMI, 📈 chart explanation on point!
"Exploring ggplot2 boxplots - Defining limits and adjusting style" 👩‍💻 @DeCiccoDonk https://t.co/Jx3DfOwUy0 via @USGS_R #rstats #dataviz pic.twitter.com/L1U8K99lOo
— Mara Averick (@dataandme) September 2, 2018
Scipy lecture notes new release: an open online book (also in PDF) on the scientific and data ecosystem in Python https://t.co/tAJoR0iycr
Minor update, mostly fixing links, typos, and adding support for new package versions. #pydata #scipy pic.twitter.com/8jNZ6n2VSh
— Gael Varoquaux (@GaelVaroquaux) September 1, 2018
R Code – Best practices https://t.co/WT8LKuNrrP #rstats #DataScience
— R-bloggers (@Rbloggers) September 1, 2018
"A Programmer's Intuition for Matrix Multiplication"
— Trask (@iamtrask) September 1, 2018
If you've ever wondered - "What is Matrix Multiplication *really* doing?" - this blogpost is for you.
It offers simple intuition for the op that makes up >50% of a neural network.#100DaysOfMLCode https://t.co/uwOLkY4Jsg pic.twitter.com/QDSmmH3UgM
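To make the "one op dominates a neural network" point concrete, here is a tiny sketch (mine, not from the post) showing that a fully connected layer's forward pass is just a matrix product: each output entry is an input row dotted with a column of the weight matrix.

```python
import numpy as np

# A batch of 3 inputs with 4 features each, and a layer mapping 4 -> 2 units.
X = np.array([[1., 0., 2., 1.],
              [0., 1., 1., 3.],
              [2., 2., 0., 1.]])          # shape (3, 4)
W = np.random.randn(4, 2)                 # shape (4, 2)
b = np.zeros(2)

# The entire forward pass of the dense layer is one matrix multiplication.
out = X @ W + b                           # shape (3, 2)

# Row-by-row view: output[i, j] is input row i dotted with column j of W.
manual = np.array([[X[i] @ W[:, j] + b[j] for j in range(2)]
                   for i in range(3)])
assert np.allclose(out, manual)
```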
So. Much. Useful. Stuffs. @SuzanBaert on scoped dplyr verbs! #rstats #AmstRday #satRday @satRdays_org
📽 https://t.co/mLrNvjBaYG pic.twitter.com/EIVqre9Gmj
— Mara Averick (@dataandme) September 1, 2018
This week's #KernelAwards winner uses Local Interpretable Model-Agnostic Explanations (LIME) to better understand the predictions of an ML model: https://t.co/FSvY4WtYqp pic.twitter.com/AdQIMzKX9h
— Kaggle (@kaggle) August 31, 2018
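If LIME is new to you, the basic recipe looks roughly like this; a sketch assuming the Python `lime` package and a scikit-learn classifier, with a placeholder dataset and model rather than the winning kernel's:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer

data = load_breast_cancer()
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(data.data, data.target)

# LIME perturbs one instance, watches how the black-box predictions change,
# and fits a small interpretable (linear) model around that instance.
explainer = LimeTabularExplainer(
    data.data,
    feature_names=list(data.feature_names),
    class_names=list(data.target_names),
    mode="classification",
)
exp = explainer.explain_instance(data.data[0], clf.predict_proba, num_features=5)
print(exp.as_list())   # top features pushing this prediction up or down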
If you've wondered - "Which Deep Learning optimizer should I use? SGD? Adagrad? RMSProp?" - this blogpost by @seb_ruder is the best explanation I've seen.
It's a surprisingly easy read! https://t.co/ASebqI7N4J
Definitely a good #100DaysOfMLCode project. pic.twitter.com/wrQ1aLVSvT
— Trask (@iamtrask) August 31, 2018
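As a taste of what the post compares, the core difference between those optimizers is just the update rule applied to each parameter's gradient. A quick, simplified numpy sketch of the three; Ruder's post gives the full derivations and many more variants:

```python
import numpy as np

lr, eps, decay = 0.01, 1e-8, 0.9

def sgd(w, g):
    # Plain gradient step with a fixed learning rate.
    return w - lr * g

def adagrad(w, g, cache):
    # Accumulate all past squared gradients: per-parameter step sizes shrink over time.
    cache = cache + g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache

def rmsprop(w, g, cache):
    # Exponential moving average of squared gradients instead of a full sum.
    cache = decay * cache + (1 - decay) * g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache
```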