Bayesian hierarchical stacking: Some models are (somewhere) useful https://t.co/FCyazmxOy0
— Andrew Gelman et al. (@StatModeling) August 11, 2021
A blog post outlining my JuliaCon Keynote from Friday:
— Soumith Chintala (@soumithchintala) August 2, 2021
"Growing open-source: from Torch to PyTorch" https://t.co/OxgaAa2ltv
Some anecdotes and stories from my journey in open-source tied around four dimensions:
principles, scope & risk, measurement, scaling
Just released a new survey on prompting methods, which use language models to solve prediction tasks by providing them with a "prompt" like: "CMU is located in __"
— Graham Neubig (@gneubig) July 30, 2021
We worked really hard to make this well-organized and educational for both NLP experts and beginners, check it out! https://t.co/HLfLFbuINN
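The cloze-style prompt in the tweet ("CMU is located in __") can be illustrated with a toy sketch. This is not code from the survey: `score` is a hypothetical stand-in for a language model's log-probability, and `answer_cloze` simply picks the highest-scoring fill for the blank.

```python
def score(text: str) -> float:
    # Hypothetical scorer, a stand-in for a real LM's log-probability.
    # Here it is just a lookup table for illustration.
    known = {
        "CMU is located in Pittsburgh": -1.0,
        "CMU is located in Boston": -5.0,
    }
    return known.get(text, -10.0)

def answer_cloze(template: str, candidates: list) -> str:
    # Fill the blank with each candidate answer and keep the fill
    # the (hypothetical) language model scores highest.
    return max(candidates, key=lambda c: score(template.replace("__", c)))

print(answer_cloze("CMU is located in __", ["Pittsburgh", "Boston"]))
# prints "Pittsburgh"
```

Real prompting systems replace the lookup table with an actual pretrained model and often search over prompt templates as well; the survey categorizes those design choices.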
Interesting keynote talk by @iyadrahwan #ALIFE2021
— hardmaru (@hardmaru) July 23, 2021
In the Moral Machines project, they ask people from different countries about how they think self-driving cars ought to behave under various scenarios. They discuss whether such internal ethical rules need to be region-specific. https://t.co/F4UfDzlysn pic.twitter.com/q1AirYj1E4
Slides for my #SciPy2021 talk on PyNNDescent can be found here: https://t.co/FePhGpclhp
— Leland McInnes (@leland_mcinnes) July 16, 2021
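For context on what PyNNDescent speeds up: below is a brute-force exact k-nearest-neighbors sketch in plain Python (not PyNNDescent's algorithm). Its cost grows with the full dataset per query, which is the expense that approximate methods like NN-descent avoid by building a neighbor graph.

```python
import math

def knn(points, query, k):
    # Brute force: sort every point by Euclidean distance to the query
    # and keep the k closest. Approximate indexes avoid this full scan.
    ranked = sorted(points, key=lambda p: math.dist(p, query))
    return ranked[:k]

pts = [(0, 0), (1, 1), (5, 5), (0.5, 0.2)]
print(knn(pts, (0, 0), 2))  # prints [(0, 0), (0.5, 0.2)]
```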
Was there a main reason that led to Gradient Descent being the popular choice of optimization algorithms in the field of Machine Learning?
— hardmaru (@hardmaru) July 16, 2021
Good discussion 👇 https://t.co/YojG0kI2qb
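One commonly cited reason in that discussion is that gradient descent needs only first-order (gradient) information, which stays cheap as models grow to millions of parameters and works with minibatch estimates. A minimal sketch on a toy one-dimensional objective, f(w) = (w − 3)², illustrates the update rule:

```python
def grad(w):
    # Derivative of f(w) = (w - 3)^2.
    return 2.0 * (w - 3.0)

w, lr = 0.0, 0.1  # initial parameter and learning rate
for _ in range(100):
    w -= lr * grad(w)  # step opposite the gradient

print(round(w, 4))  # converges near the minimizer w = 3
```

The same loop, with gradients computed by backpropagation over minibatches, is the core of stochastic gradient descent in deep learning.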
I had a great time talking about https://t.co/GEOZuodrZj, AutoML, Software Engineering for ML with @dpbrinkm and Vishnu Rachakonda on the @mlopscommunity Coffee Sessions https://t.co/sbzl8xuuUM
— Jeremy Howard (@jeremyphoward) July 15, 2021
For 20 years I used a wide variety of machine learning and optimization algorithms to tackle predictive modeling challenges.
— Jeremy Howard (@jeremyphoward) July 14, 2021
Nowadays, #deeplearning is part of nearly everything I do. In this @MelbCtrDataSci talk, I explain why I made this my focus. https://t.co/ZORIflwNAu
Today's 💡: @tessy_muiruri is a Risk and Analytics professional from Nairobi, Kenya. She used tweets by Kenyans on political candidates to create an EDA and sentiment analysis model. Timely given Kenya's upcoming elections next year. https://t.co/qrCIs22cF6
— Kaggle (@kaggle) July 14, 2021
A Primer on Pretrained Multilingual Language Models
— Sebastian Ruder (@seb_ruder) July 13, 2021
This survey is a great starting point to learn about anything related to state-of-the-art multilingual models in NLP. https://t.co/HbV3puNa2g pic.twitter.com/mzmkrvOl91
Multi-Task Learning with Deep Neural Networks: A Survey
— Sebastian Ruder (@seb_ruder) July 12, 2021
I learned a lot reading this comprehensive overview by @CrichaelMawshaw. It categorizes recent work into architecture design, optimization methods, and task relationship learning. https://t.co/1f5afgga8T pic.twitter.com/7QZkQXVrqG
Introduction to Modern Statistics is an open-source introductory statistics textbook https://t.co/6w7A67WnG2
— Nathan Yau (@flowingdata) July 8, 2021