Captum is a library for model interpretability. Its algorithms include integrated gradients, conductance, SmoothGrad and VarGrad, and DeepLift. Learn more: https://t.co/IVdye9smGi
— PyTorch (@PyTorch) March 23, 2020
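To give a feel for the API, here is a minimal sketch of attributing a prediction with Captum's IntegratedGradients; the toy model and input are placeholders, not anything from the announcement.

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# Toy classifier standing in for your own model.
model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 2))
model.eval()

inputs = torch.randn(1, 10)
ig = IntegratedGradients(model)
# Attribute the class-1 score back to each input feature; delta estimates
# how well the path-integral approximation converged.
attributions, delta = ig.attribute(inputs, target=1,
                                   return_convergence_delta=True)
print(attributions.shape)  # torch.Size([1, 10])
```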
Announcing Stanza v1.0.0, the new packaging of our Python #NLProc library for many human languages (now including mainland Chinese), greatly improved and including NER. Documentation https://t.co/Lm2WdAddTz Github https://t.co/UyC24Qu5P3 PyPI https://t.co/fOtAHoeFLo (or conda) pic.twitter.com/f70MNxAh9T
— Stanford NLP Group (@stanfordnlp) March 17, 2020
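As a quick orientation, a minimal NER pipeline with Stanza might look like the sketch below; the example sentence is arbitrary.

```python
import stanza

stanza.download("en")                       # fetch English models (one-time)
nlp = stanza.Pipeline("en", processors="tokenize,ner")
doc = nlp("Barack Obama was born in Hawaii.")
for ent in doc.ents:
    print(ent.text, ent.type)               # e.g. "Barack Obama PERSON"
```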
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations by T. Chen et al. https://t.co/5SkbM49luU
— PyTorch Best Practices (@PyTorchPractice) March 12, 2020
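The core of SimCLR is the NT-Xent contrastive loss over two augmented views of each image. Below is a sketch of that loss written from the paper's description, not code taken from the linked repository.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent loss over 2N projected views (z1[i] and z2[i] are positives)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)    # (2N, d), unit norm
    sim = z @ z.t() / temperature                         # cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))                 # drop self-similarity
    # Row i (< n) should match row i + n, and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n),
                         torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```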
PyTorch + Cloud TPU + Colab: a set of code pointers and notebooks to get you started. https://t.co/HhG8vX3Q2R
— PyTorch (@PyTorch) March 10, 2020
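The gist of the PyTorch/XLA workflow in those notebooks is acquiring a TPU device and moving tensors to it. A minimal sketch, assuming torch_xla is installed (as it is on Colab TPU runtimes):

```python
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                   # a Cloud TPU core
model = torch.nn.Linear(10, 2).to(device)
x = torch.randn(4, 10, device=device)
print(model(x).shape)                      # torch.Size([4, 2])
# Inside a training loop, xm.optimizer_step(optimizer) replaces
# optimizer.step() so the lazily built XLA graph actually executes.
```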
ForwardTacotron - a simplified Tacotron without attention for Speech Synthesis, efficient, fast and robust.
🔈 Samples: https://t.co/xMq58HMQuC
🔤 Github: https://t.co/5Ew7u7AA2P
📕 Colab: https://t.co/uK1Qle1kwK https://t.co/x4q41lzEJh
— PyTorch (@PyTorch) March 10, 2020
From PyTorch to JAX: towards neural net frameworks that purify stateful code: https://t.co/Sg3k4XpzTD
Great writeup from @sjmielke on how to think about JAX programs and how it all works - from scratch.
— Denny Britz (@dennybritz) March 10, 2020
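The "purified" style the writeup describes keeps parameters as explicit function arguments rather than hidden object state. A minimal sketch of that idea (names and shapes are illustrative):

```python
import jax
import jax.numpy as jnp

# A "purified" linear model: parameters flow in and out explicitly.
def init_params(key, in_dim, out_dim):
    w_key, _ = jax.random.split(key)
    return {"w": jax.random.normal(w_key, (in_dim, out_dim)) * 0.01,
            "b": jnp.zeros(out_dim)}

def predict(params, x):
    return x @ params["w"] + params["b"]

def loss_fn(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)

params = init_params(jax.random.PRNGKey(0), 3, 1)
x, y = jnp.ones((4, 3)), jnp.zeros((4, 1))
grads = jax.grad(loss_fn)(params, x, y)  # gradients w.r.t. explicit params
```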
Props to A. K. Subramanian for putting out this reference library of variational autoencoder implementations. This is a fabulous resource. https://t.co/igsgsnUTSX pic.twitter.com/mKfAn4qZAi
— Brandon Rohrer (@_brohrer_) March 6, 2020
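For readers new to the family: the piece every VAE in such a collection shares is a reparameterized latent sample plus a KL penalty. A minimal sketch of that core (layer sizes are arbitrary, and real implementations use deeper encoders and decoders):

```python
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    """Minimal VAE core: encode to (mu, logvar), reparameterize, decode."""
    def __init__(self, x_dim=784, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, 2 * z_dim)
        self.dec = nn.Linear(z_dim, x_dim)

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        std = torch.exp(0.5 * logvar)
        z = mu + std * torch.randn_like(std)   # reparameterization trick
        recon = torch.sigmoid(self.dec(z))
        # KL divergence between q(z|x) and the unit Gaussian prior
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return recon, kl
```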
"pytorch-optimizer -- collections of ready to use optimization algorithms for PyTorch" https://t.co/4yuEIWqhn2
— Sebastian Raschka (@rasbt) March 2, 2020
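The library's optimizers are drop-in replacements for those in torch.optim. A hedged sketch, assuming the package installs as torch_optimizer and exposes RAdam (one of its documented algorithms):

```python
import torch
import torch_optimizer as optim  # pip install torch_optimizer

model = torch.nn.Linear(10, 2)
optimizer = optim.RAdam(model.parameters(), lr=1e-3)

loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()                 # same interface as torch.optim optimizers
optimizer.zero_grad()
```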
Ever wondered what @PyTorch nn.Module and nn.Parameter really do? And how hooks actually work? Here's a working implementation from scratch of their key functionality, in one tweet!
From our upcoming book and course:
- https://t.co/bLd3sEXTpV
- https://t.co/guKT7y9VfM pic.twitter.com/wYh1jyb1Bm
— Jeremy Howard (@jeremyphoward) February 27, 2020
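The tweet's image carries Jeremy's own implementation; as a rough sketch of the same idea (not his code), the key trick is intercepting attribute assignment so parameters register themselves:

```python
import torch

class Parameter:
    """Wraps a tensor and flags it as trainable, like nn.Parameter."""
    def __init__(self, data):
        self.data = data.requires_grad_()

class Module:
    """Registers any Parameter assigned as an attribute, like nn.Module."""
    def __setattr__(self, name, value):
        if isinstance(value, Parameter):
            self.__dict__.setdefault("_params", {})[name] = value
        super().__setattr__(name, value)

    def parameters(self):
        for p in self.__dict__.get("_params", {}).values():
            yield p.data

class Linear(Module):
    def __init__(self, n_in, n_out):
        self.w = Parameter(torch.randn(n_in, n_out) * 0.01)
        self.b = Parameter(torch.zeros(n_out))

    def __call__(self, x):
        return x @ self.w.data + self.b.data

lin = Linear(3, 2)
print(len(list(lin.parameters())))  # 2
```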
Torchmeta is a collection of extensions and data loaders for few-shot learning and meta-learning. It won first place at the Global PyTorch Summer Hackathon last year. Learn more in the blog post from Tristan Deleu, the project author: https://t.co/jFaPaF8GX0
— PyTorch (@PyTorch) February 21, 2020
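Torchmeta's helpers make episodic sampling look like ordinary data loading. A short sketch of a 5-way 1-shot Omniglot loader, following the project's documented helpers:

```python
from torchmeta.datasets.helpers import omniglot
from torchmeta.utils.data import BatchMetaDataLoader

# 5-way 1-shot episodes from Omniglot, with 15 query examples per class.
dataset = omniglot("data", ways=5, shots=1, test_shots=15,
                   meta_train=True, download=True)
loader = BatchMetaDataLoader(dataset, batch_size=16)

for batch in loader:
    train_x, train_y = batch["train"]   # support set for each of 16 tasks
    test_x, test_y = batch["test"]      # query set for each task
    break
```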
FastHugs - fastai-v2 and @HuggingFace Transformers https://t.co/Ht20zhfkDN pic.twitter.com/QqrMAd8EkA
— Jeremy Howard (@jeremyphoward) February 18, 2020
fastai: A Layered API for Deep Learning
"quickly and easily provide state-of-the-art results in standard deep learning domains"
https://t.co/j61JeIWakm https://t.co/bDlZs2saps
Brought to you by @jeremyphoward pic.twitter.com/67QXq0SbCN
— Thomas (@evolvingstuff) February 13, 2020
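The "layered API" claim is easiest to see at the top layer, where a full transfer-learning pipeline fits in a few lines. A sketch based on fastai v2's pets quickstart (the dataset, labeling rule, and settings follow that example):

```python
from fastai.vision.all import *

path = untar_data(URLs.PETS) / "images"
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2,
    label_func=lambda f: f.name[0].isupper(),   # cat breeds are capitalized
    item_tfms=Resize(224))
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)                              # transfer-learn from ImageNet
```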