Tongzhou Wang from MIT talks about the PyTorch data loading pipeline and components - the dataset, the sampler, and the dataloader. https://t.co/9u5xpAYj2W
— PyTorch (@PyTorch) May 4, 2020
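For context, a minimal sketch of how those three pieces fit together (the toy SquaresDataset below is my own illustration, not from the talk): the Dataset provides indexed access to examples, the Sampler decides the order of indices, and the DataLoader batches and collates them.

```python
import torch
from torch.utils.data import Dataset, DataLoader, RandomSampler

class SquaresDataset(Dataset):
    """Toy map-style dataset: index -> (x, x**2)."""
    def __len__(self):
        return 100

    def __getitem__(self, idx):
        x = torch.tensor(float(idx))
        return x, x ** 2

dataset = SquaresDataset()
sampler = RandomSampler(dataset)                              # picks the order of indices
loader = DataLoader(dataset, batch_size=8, sampler=sampler)   # batching + collation

for xs, ys in loader:
    print(xs.shape, ys.shape)  # torch.Size([8]) torch.Size([8])
    break
```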
Minimal PyTorch implementation of YOLOv4 https://t.co/PvXlrfLBcH #deeplearning #machinelearning #ml #ai #neuralnetworks #datascience #pytorch
— PyTorch Best Practices (@PyTorchPractice) April 30, 2020
Nice Medium on how to serve @huggingface BERT in production with pytorch/serve by MFreidank
Hat tip @joespeez https://t.co/cL3BYUi5a4 pic.twitter.com/ryyfbyZLji
— Julien Chaumond (@julien_c) April 27, 2020
torchaudio v0.5 - new transforms, functionals and datasets:
- Added the Griffin-Lim functional and transform
- Support for allpass, fade, bandpass, bandreject, band, treble, deemph, and riaa filters
- New datasets: LJSpeech and SpeechCommands https://t.co/L2xLf3my3Y
— PyTorch (@PyTorch) April 21, 2020
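A rough sketch of the new Griffin-Lim transform in use (the random waveform and the n_fft/power settings are illustrative assumptions): compute a magnitude spectrogram, then reconstruct a waveform from it.

```python
import torch
import torchaudio

# Stand-in waveform; in practice you might load real audio, e.g. from the new
# torchaudio.datasets.LJSPEECH or SPEECHCOMMANDS datasets.
waveform = torch.randn(1, 16000)

spec = torchaudio.transforms.Spectrogram(n_fft=400, power=2)(waveform)       # magnitude spectrogram
reconstructed = torchaudio.transforms.GriffinLim(n_fft=400, power=2)(spec)   # phase recovery
print(reconstructed.shape)
```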
TorchServe and TorchElastic for Kubernetes, new libraries for serving and training models at scale. Learn more: https://t.co/j6tSOaG6yU
— PyTorch (@PyTorch) April 21, 2020
v1.5: autograd API for Hessians/Jacobians, stable C++ frontend with 100% parity with Python, better performance on GPU and CPU with the 'channels last' tensor memory format, stable distributed.rpc, and custom C++ class binding.
Release notes: https://t.co/P3iisDOoRg
Blog: https://t.co/svnBmPyGHV
— PyTorch (@PyTorch) April 21, 2020
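The new autograd API mentioned above lives under torch.autograd.functional; a minimal sketch (the function f is just an arbitrary example):

```python
import torch
from torch.autograd.functional import jacobian, hessian

def f(x):
    # arbitrary scalar-valued example function
    return (x ** 3).sum()

x = torch.randn(4)
print(jacobian(f, x))  # shape (4,): 3 * x**2
print(hessian(f, x))   # shape (4, 4): diagonal matrix with entries 6 * x
```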
Another Transformer variant with lower computational complexity, suitable for long-range tasks, is Sparse Sinkhorn Attention (https://t.co/qWp2AJVdkd) by Yi Tay et al.
A GitHub Colab reimplementation in PyTorch (https://t.co/B5FcGuTZhy) also combined it with ideas from Reformer. https://t.co/WSwZuSRyPb pic.twitter.com/54fJrRbhEA
— hardmaru (@hardmaru) April 8, 2020
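Not the paper's full attention mechanism, but for readers unfamiliar with the name: the "Sinkhorn" part refers to iterative row/column normalization that turns a score matrix into an approximately doubly-stochastic one, roughly like this sketch:

```python
import torch

def sinkhorn(log_scores: torch.Tensor, n_iters: int = 8) -> torch.Tensor:
    # Alternately normalize rows and columns in log space; the result
    # approaches a doubly-stochastic matrix (rows and columns sum to 1).
    for _ in range(n_iters):
        log_scores = log_scores - torch.logsumexp(log_scores, dim=-1, keepdim=True)
        log_scores = log_scores - torch.logsumexp(log_scores, dim=-2, keepdim=True)
    return log_scores.exp()

P = sinkhorn(torch.randn(4, 4))
print(P.sum(dim=0), P.sum(dim=1))  # both close to all-ones
```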
Mimicry: a lightweight #PyTorch library aimed towards the reproducibility of GAN research
- standardized implementations of popular GANs
- baseline scores of GANs for comparison
- a framework for GAN training boilerplate code https://t.co/qPBkOesJcV pic.twitter.com/663dzlIJLq
— Alexandr Kalinin (@alxndrkalinin) April 6, 2020
"Effective PyTorch" -- A nice GitHub repo with some useful tips on "Optimizing runtime with TorchScript" and "Numerical stability in PyTorch" https://t.co/1d74kO05H8
— Sebastian Raschka (@rasbt) April 6, 2020
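A small sketch combining both of those topics (my own toy example, not taken from the repo): a numerically stable log-softmax compiled with torch.jit.script.

```python
import torch

@torch.jit.script
def stable_log_softmax(x: torch.Tensor) -> torch.Tensor:
    # Subtract the row max before exponentiating so large logits don't overflow;
    # torch.jit.script compiles the elementwise chain for faster runtime.
    max_vals, _ = x.max(dim=-1, keepdim=True)
    shifted = x - max_vals
    return shifted - shifted.exp().sum(dim=-1, keepdim=True).log()

print(stable_log_softmax(torch.tensor([[1000.0, 1001.0, 1002.0]])))
```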
torchlayers: an interesting new abstraction layer API on top of PyTorch, which is basically a "Sequential"-style object without the need to call it in the forward method. And, more interestingly, it performs automatic dimensionality and shape deduction: https://t.co/0jIV40V2be
— Sebastian Raschka (@rasbt) March 30, 2020
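Roughly how that shape deduction looks in use, if I recall the project's README correctly (tl.Conv, tl.Linear, and tl.build are assumptions here and may differ from the actual API): layers are declared with output sizes only, and the input dimensions are inferred from an example batch.

```python
import torch
import torchlayers as tl  # assumed import name

model = torch.nn.Sequential(
    tl.Conv(64),        # in_channels inferred later from the example batch
    torch.nn.ReLU(),
    torch.nn.Flatten(),
    tl.Linear(10),      # in_features inferred later as well
)
# Assumed entry point: pass an example batch so shapes can be deduced.
model = tl.build(model, torch.randn(1, 3, 32, 32))
```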
PyTorch supports 8-bit model quantization using the familiar eager mode Python API to support efficient deployment on servers and edge devices. This blog provides an overview of the quantization support in PyTorch and its integration with TorchVision. https://t.co/GqzkJLDlDp
— PyTorch (@PyTorch) March 26, 2020
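For example, the dynamic-quantization path of that eager mode API boils down to a single call (the toy model below is illustrative):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Convert Linear weights to int8; activations are quantized dynamically at runtime.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
print(quantized)  # Linear layers replaced by dynamically quantized versions
```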
To enable more efficient on-device ML, PyTorch supports an end-to-end workflow from Python to deployment on iOS and Android. Learn more about PyTorch Mobile: https://t.co/VFnCOXiagX
— PyTorch (@PyTorch) March 24, 2020
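The Python side of that workflow is essentially trace-and-save (the model choice and file name below are illustrative); the saved TorchScript file is then loaded from the iOS or Android runtime.

```python
import torch
import torchvision

model = torchvision.models.mobilenet_v2(pretrained=True).eval()
example = torch.rand(1, 3, 224, 224)

traced = torch.jit.trace(model, example)   # convert to TorchScript
traced.save("mobilenet_v2.pt")             # load this file from the mobile app
```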