Learn how to easily build an accurate, fine-tuned ASR model on GPUs using the Neural Modules (NeMo) toolkit in this walkthrough + notebook.
— NVIDIA AI Developer (@NVIDIAAIDev) December 18, 2019
We have a brand new guide on using Mixed Precision in TensorFlow 2.1: https://t.co/xdG7kOKXhh
Speed up training and inference on GPU by up to 3x (and up to 50% on TPU)
— François Chollet (@fchollet) December 17, 2019
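The guide's trick is to run the heavy math in float16 while keeping a float32 master copy of the weights, with loss scaling to protect small gradients. Here is a minimal NumPy sketch of that idea (a hypothetical illustration of the concept, not the TensorFlow API; the values and the `loss_scale` of 1024 are made up):

```python
import numpy as np

# Hypothetical mixed-precision sketch: float32 master weights,
# float16 compute, and loss scaling to keep tiny gradients representable.
rng = np.random.default_rng(0)
w_master = rng.standard_normal(4).astype(np.float32)  # float32 master copy
x = rng.standard_normal(4).astype(np.float16)

loss_scale = np.float16(1024.0)
w16 = w_master.astype(np.float16)                # cast weights down for compute
grad16 = x * loss_scale                          # stand-in gradient, in float16
grad32 = grad16.astype(np.float32) / float(loss_scale)  # unscale in float32
w_master -= np.float32(0.01) * grad32            # update the float32 master copy
```

In the real TensorFlow guide this bookkeeping is handled for you by a Keras dtype policy; the sketch only shows why the float32 master copy and the scale factor exist.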
Are you referring to the "From Predictive Modeling to Optimization" talk? If so, there's a more detailed article to go with it, if you're interested: https://t.co/sNMLLD9KCa
— Jeremy Howard (@jeremyphoward) December 17, 2019
Area under the curve pic.twitter.com/c5oVJtFtqp
— Chris Albon (@chrisalbon) December 16, 2019
A pretty cool and perhaps underappreciated resource for learning ML: the notebooks of Aurelien Geron's book "Hands-on Machine Learning with Scikit-Learn, Keras and TensorFlow", available on GitHub https://t.co/0lMlmloSiV
— François Chollet (@fchollet) December 16, 2019
Beautiful paper on HMMs and derivatives https://t.co/MZ0iDhyzFJ
— Andrew Gelman (@StatModeling) December 16, 2019
A nice summary of #NeurIPS2019: https://t.co/yiUNh4xroq. Interestingly, it's almost completely orthogonal to my experience...
— Adam Kosiorek (@arkosiorek) December 16, 2019
I've begun writing on the ggplot2 book again. New introductory chapter to extensions, to be followed by a number of chapters with examples: https://t.co/9Kro0M0LY5
— Thomas Lin Pedersen (@thomasp85) December 16, 2019
Tweetorial: How to Make Your Code Run Faster
I don't need to tell you why it's nice to have your code run fast. Here are some tricks I've found helpful, starting with the most bang for the buck. https://t.co/twMCXSh7Bo
— Brandon Rohrer (@_brohrer_) December 16, 2019
Bag of words pic.twitter.com/n9wrwH1Cie
— Chris Albon (@chrisalbon) December 15, 2019
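The flashcard's idea, a bag of words, represents each document by its word counts while discarding word order. A minimal sketch (the documents and vocabulary here are hypothetical, not from the card):

```python
from collections import Counter

# Bag of words: one count per vocabulary word, word order ignored.
docs = ["the cat sat", "the cat sat on the cat"]
vocab = sorted({w for d in docs for w in d.split()})   # ['cat', 'on', 'sat', 'the']
vectors = [[Counter(d.split())[w] for w in vocab] for d in docs]
# Each row is one document's counts over the shared vocabulary.
```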
A visual explanation of R² - which measures the decrease in error that your model provides over a baseline of guessing the mean. pic.twitter.com/2uHY0k5cpd
— Ted Petrou (@TedPetrou) December 14, 2019
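That definition of R² can be computed directly: compare the model's squared error against the squared error of always guessing the mean. A short sketch with made-up numbers (the arrays are hypothetical, not from the visual):

```python
import numpy as np

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.5, 5.5, 7.5, 8.5])

ss_res = np.sum((y_true - y_pred) ** 2)         # the model's residual error
ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # error of guessing the mean
r2 = 1.0 - ss_res / ss_tot                      # fraction of error removed
```

Here the baseline error is 20.0 and the model's is 1.0, so R² = 0.95: the model removes 95% of the error you'd get by always predicting the mean.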
I enjoyed Yoshua's #NeurIPS2019 talk! We still have a lot of work to do to define WHAT the benchmarks should be towards System 2 to research HOW to make progress on them. In ML, setting the right problem is often more important than its solution.
Talk: https://t.co/IbgNd3tAPb pic.twitter.com/vYEuW9ytyA
— Oriol Vinyals (@OriolVinyalsML) December 12, 2019