Pretrained NFNet model weights (F0-F5, F6+SAM) are now available at https://t.co/uNSzeA4uJt, along with a demo Colab! All models are pretrained on ImageNet. https://t.co/nadL5pEgvJ
— Andy Brock (@ajmooch) February 17, 2021
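If you want to poke at these from PyTorch rather than the official JAX/Haiku release, here is a minimal sketch assuming the timm port of the weights (the "dm_nfnet_f0" model name is timm's, not DeepMind's):

```python
# A minimal inference sketch for a pretrained NFNet, assuming the timm port
# of the weights; the official release is JAX/Haiku code.
import timm
import torch

model = timm.create_model("dm_nfnet_f0", pretrained=True)
model.eval()

# Classify a random image-shaped tensor (replace with a real, normalized image).
x = torch.randn(1, 3, 192, 192)
with torch.no_grad():
    logits = model(x)
print(logits.argmax(dim=-1))
```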
New code walkthrough on https://t.co/m6mT8Sa9M5: Switch Transformers, an architecture that makes it possible to increase the representational capacity of a Transformer while keeping its computational cost low. Implemented by Khalid Salama https://t.co/nkMu0QwPuo
— François Chollet (@fchollet) February 17, 2021
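The core trick is top-1 expert routing: each token is sent to a single feed-forward "expert" picked by a learned gate, so parameters scale with the number of experts while per-token compute stays roughly constant. A toy NumPy sketch of that routing rule (illustrative shapes and names, not the keras.io implementation):

```python
# Toy sketch of Switch-style top-1 routing: each token pays for one expert's
# FFN while total capacity grows with num_experts.
import numpy as np

rng = np.random.default_rng(0)
num_tokens, d_model, num_experts, d_ff = 8, 16, 4, 32

tokens = rng.normal(size=(num_tokens, d_model))
gate_w = rng.normal(size=(d_model, num_experts))           # router weights
experts_w1 = rng.normal(size=(num_experts, d_model, d_ff))
experts_w2 = rng.normal(size=(num_experts, d_ff, d_model))

# Router: softmax over experts, keep only the top-1 expert per token.
logits = tokens @ gate_w
probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
probs /= probs.sum(axis=-1, keepdims=True)
expert_idx = probs.argmax(axis=-1)

out = np.zeros_like(tokens)
for e in range(num_experts):
    sel = expert_idx == e
    if sel.any():
        h = np.maximum(tokens[sel] @ experts_w1[e], 0.0)   # expert FFN
        # Scale the expert output by its gate probability, per the Switch rule.
        out[sel] = (h @ experts_w2[e]) * probs[sel, e:e + 1]
print(out.shape)  # (8, 16): same per-token cost as a single dense FFN
```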
I am going to start the first lecture of my #DataScience collab software dev course with a watch party of this talk on RMarkdown Driven Development by @EmilyRiederer, because I think it is such a good narrative on how analysis code can evolve into packages: https://t.co/YZDtUFJyKg
— Tiffany Timbers (@TiffanyTimbers) February 17, 2021
🚨 NEW MODEL ALERT 🚨
Translate text to or between 50 languages with mBART-50 from @facebookai!
🇺🇳 One-to-Many model: translate from English to 49 other languages
↔️ Many-to-Many model: translate between any pair of 50 languages pic.twitter.com/qC5rEaSrfZ
— Hugging Face (@huggingface) February 17, 2021
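A short sketch of the Many-to-Many checkpoint via transformers, using the model id and language codes from its Hugging Face model card:

```python
# English -> French with the mBART-50 many-to-many checkpoint.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)
model = MBartForConditionalGeneration.from_pretrained(model_name)

tokenizer.src_lang = "en_XX"                      # source language code
inputs = tokenizer("The weather is nice today.", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"],  # target language
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```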
New release of dirty-cat, v0.1 ✨: facilitating machine learning on non-curated data.
Big new feature: the GapEncoder, which encodes dirty categories as interpretable latent categories inferred from recurrent substrings, and is robust to typos and other variations. https://t.co/Y5Azf61ERE pic.twitter.com/yy6JIVUsZo
— Gael Varoquaux (@GaelVaroquaux) February 17, 2021
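A rough sketch of what using it might look like, assuming the scikit-learn-style fit_transform API and the get_feature_names helper; the toy employee titles below are made up:

```python
# Encoding a dirty categorical column with dirty-cat's GapEncoder
# (a sketch, assuming the scikit-learn-style estimator API).
import numpy as np
from dirty_cat import GapEncoder

employees = np.array(
    [["Police Officer II"], ["Police Oficer III"],   # note the typo
     ["Firefighter"], ["Fire fighter/EMT"]]
)
enc = GapEncoder(n_components=2)
activations = enc.fit_transform(employees)
print(activations.shape)        # (4, 2): loadings on latent categories
print(enc.get_feature_names())  # latent categories summarized by substrings
```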
On map itself, very good labels, a nice informative touch. Map reports/shows vast scope of the extraordinary weather. Data-driven journalism at its best.
Every US reader sees their own local involvement, also in the national context.
Source: NYT/NWS, in New York Times today. pic.twitter.com/ovdSTqurKJ
— Edward Tufte (@EdwardTufte) February 17, 2021
GradInit: Learning to Initialize Neural Networks for Stable and Efficient Training
pdf: https://t.co/XY85zNdYdw
abs: https://t.co/ArRUtJfHeO pic.twitter.com/Bibpe7MwZi
— AK (@ak92501) February 17, 2021
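As I read the abstract, the idea is to keep the network's random init but learn one positive scale per weight tensor so that the loss after a single gradient step is small. A heavily simplified toy sketch of that inner/outer loop (it omits the paper's gradient-norm constraint, and all names are illustrative):

```python
# Toy GradInit-style loop: frozen random weights, learnable per-tensor scales,
# outer objective = loss after one simulated SGD step on the scaled init.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
w1 = torch.randn(10, 64) * 0.1          # frozen random init, layer 1
w2 = torch.randn(64, 2) * 0.1           # frozen random init, layer 2
s1 = torch.ones((), requires_grad=True)  # learnable scale, layer 1
s2 = torch.ones((), requires_grad=True)  # learnable scale, layer 2
x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))

def loss_with(v1, v2):
    return F.cross_entropy(F.relu(x @ v1) @ v2, y)

inner_lr = 0.1
opt = torch.optim.Adam([s1, s2], lr=1e-2)
for _ in range(100):
    v1, v2 = s1 * w1, s2 * w2
    l0 = loss_with(v1, v2)
    g1, g2 = torch.autograd.grad(l0, [v1, v2], create_graph=True)
    # Minimize the loss after one simulated gradient step on the scaled init.
    l1 = loss_with(v1 - inner_lr * g1, v2 - inner_lr * g2)
    opt.zero_grad()
    l1.backward()
    opt.step()
print(s1.item(), s2.item())
```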
The population of Russia declined by more than 12 million during WWII. Decades of population growth came to an end after the end of the Cold War. Continued decline on the horizon. Source: https://t.co/d0dQ41RPkR pic.twitter.com/kAaJgOi41I
— Simon Kuestenmacher (@simongerman600) February 16, 2021
To discover something, you must first expect to find it.
— François Chollet (@fchollet) February 16, 2021
Gumbel-Softmax, aka the Concrete distribution, maps Gumbel noise + a temperature scalar to a simplex. This neat paper on Invertible Gaussian Reparameterization shows that you can map other kinds of noise to the simplex, and it empirically works well! https://t.co/SihvRnaQLg
— Eric Jang 🇺🇸🇹🇼 (@ericjang11) February 16, 2021
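For reference, the baseline being generalized: a Gumbel-Softmax / Concrete sample adds i.i.d. Gumbel noise to the logits and pushes the result through a temperature-scaled softmax, landing on the simplex:

```python
# Standard Gumbel-Softmax / Concrete sampling.
import numpy as np

def gumbel_softmax_sample(logits, temperature, rng):
    u = rng.uniform(size=logits.shape)
    gumbel = -np.log(-np.log(u))         # Gumbel(0, 1) noise
    z = (logits + gumbel) / temperature
    e = np.exp(z - z.max())
    return e / e.sum()                   # lies on the probability simplex

rng = np.random.default_rng(0)
sample = gumbel_softmax_sample(np.log([0.2, 0.3, 0.5]), temperature=0.5, rng=rng)
print(sample, sample.sum())  # low temperature pushes mass toward one vertex
```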
Streambook (proof-of-concept https://t.co/723ubiihe3)
Edit Python code in any editor, see literate output in the browser, instantly ship as a notebook. (@streamlit + jupytext + watchdog)
(Can someone build this for real? Other languages have it and it is so nice.) pic.twitter.com/8Xpl9fGU8E
— Sasha Rush (@srush_nlp) February 16, 2021
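A toy sketch of the watch-and-rebuild half of that loop, using watchdog to re-export a script as a notebook with jupytext whenever it changes (the Streamlit rendering side is omitted, and the rebuild command is just one plausible choice):

```python
# Watch the current directory; re-export any modified .py as a notebook.
import subprocess
import time
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

class RebuildHandler(FileSystemEventHandler):
    def on_modified(self, event):
        if event.src_path.endswith(".py"):
            # jupytext converts the light-script .py into an .ipynb.
            subprocess.run(["jupytext", "--to", "notebook", event.src_path])

observer = Observer()
observer.schedule(RebuildHandler(), path=".", recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()
```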
Writing tests can often feel like a waste of time...
But tests let you refactor your code with more confidence, so you can later improve its readability, performance, or structure.
They also ensure external contributions are correct.
Tests help your future self a lot; use them!
— Jim Hester (@jimhester_) February 16, 2021
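Jim's context is R and testthat, but the point is language-agnostic. A minimal pytest-style illustration: once a test pins the behavior down, you can swap the naive implementation for a faster one with confidence:

```python
# test_mean.py: run with `pytest test_mean.py`.
def mean(xs):
    # Naive first version; any later refactor (e.g. switching to numpy)
    # must keep satisfying the test below.
    total = 0.0
    for x in xs:
        total += x
    return total / len(xs)

def test_mean():
    assert mean([1, 2, 3]) == 2.0
    assert mean([5]) == 5.0
```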