fast-autoaugment - Official Implementation of 'Fast AutoAugment' in PyTorch. https://t.co/t9KnmjI4Qv
— Python Trending (@pythontrending) May 2, 2019
Huge, huge OSS release today. You can use state-of-the-art adaptive experimentation (Ax) and Bayesian optimization (BoTorch) software from Facebook to optimize your systems and products. https://t.co/lf99BjPnnH
— Sean J. Taylor (@seanjtaylor) May 1, 2019
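For a sense of what the Ax side of that release looks like, here is a minimal sketch of its managed optimization loop; the toy objective and the parameter names are made up for illustration, not taken from the release.

```python
# A minimal sketch of Ax's managed optimization loop; the Booth function
# and the parameter names "x"/"y" are hypothetical examples.
from ax import optimize

def booth(p):
    # Toy objective, minimized at (x, y) = (1, 3).
    x, y = p["x"], p["y"]
    return (x + 2 * y - 7) ** 2 + (2 * x + y - 5) ** 2

best_parameters, best_values, experiment, model = optimize(
    parameters=[
        {"name": "x", "type": "range", "bounds": [-10.0, 10.0]},
        {"name": "y", "type": "range", "bounds": [-10.0, 10.0]},
    ],
    evaluation_function=booth,
    minimize=True,
    total_trials=20,
)
print(best_parameters)
```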
Fast custom-RNNs using TorchScript and torch.jit
— PyTorch (@PyTorch) May 1, 2019
- describes how to write custom RNNs in PyTorch that run close to CuDNN speeds, powered by our torch.jit compiler
- details of how we do the optimization in the compiler: https://t.co/Jlwn2Nel6X
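The pattern from the tutorial boils down to scripting the cell so the compiler can fuse its pointwise ops. A minimal sketch (the shapes and names below are illustrative, not the tutorial's benchmark code):

```python
import torch
from torch import Tensor

@torch.jit.script
def lstm_cell(x: Tensor, h: Tensor, c: Tensor,
              w_ih: Tensor, w_hh: Tensor,
              b_ih: Tensor, b_hh: Tensor):
    # One LSTM step written as plain tensor ops; torch.jit can fuse the
    # pointwise part (sigmoid/tanh/mul/add) into a handful of kernels,
    # which is where the near-CuDNN speed comes from.
    gates = torch.mm(x, w_ih.t()) + b_ih + torch.mm(h, w_hh.t()) + b_hh
    i, f, g, o = gates.chunk(4, 1)
    c_next = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
    h_next = torch.sigmoid(o) * torch.tanh(c_next)
    return h_next, c_next

# Example shapes: batch 8, input size 32, hidden size 64 (so 4*64 = 256 gates).
x, h, c = torch.randn(8, 32), torch.randn(8, 64), torch.randn(8, 64)
w_ih, w_hh = torch.randn(256, 32), torch.randn(256, 64)
b_ih, b_hh = torch.randn(256), torch.randn(256)
h, c = lstm_cell(x, h, c, w_ih, w_hh, b_ih, b_hh)
```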
[v1.1.0] Official TensorBoard Support, Attributes, Dicts, Lists and User-defined types in JIT / TorchScript, Improved Distributed
— PyTorch (@PyTorch) May 1, 2019
Read more about the changes at https://t.co/s59g9uiC9H
As always, get the install commands on https://t.co/DeaBDSRxs8
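The headline feature is native TensorBoard logging via torch.utils.tensorboard. A minimal sketch (TensorBoard itself must be installed separately; the tags and values here are placeholders):

```python
import torch
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="runs/demo")  # creates the event file
for step in range(100):
    loss = 1.0 / (step + 1)  # stand-in for a real training loss
    writer.add_scalar("train/loss", loss, global_step=step)
writer.add_histogram("layer1/weights", torch.randn(1000), global_step=0)
writer.close()
# Then inspect with: tensorboard --logdir=runs
```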
Buried at the end of this post is a neat discussion of using SWA-Gaussian to get uncertainty estimates from deep learning models. I’m looking forward to checking that out. https://t.co/XU0arMvjGt
— Sean J. Taylor (@seanjtaylor) April 29, 2019
Stochastic Weight Averaging: a simple procedure that improves generalization over SGD at no additional cost.
— PyTorch (@PyTorch) April 29, 2019
Can be used as a drop-in replacement for any other optimizer in PyTorch.
Read more: https://t.co/IRhz40AZKU
guest blogpost by @Pavel_Izmailov and @andrewgwils
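The SWA class in the torchcontrib package (where the post's code lives) wraps an existing optimizer, which is what "drop-in replacement" means here. A rough sketch of the usage pattern, assuming a model, a data loader, and a loss_fn already exist:

```python
import torch
from torchcontrib.optim import SWA

# model, loader, and loss_fn are assumed to be defined elsewhere.
base_opt = torch.optim.SGD(model.parameters(), lr=0.1)
# Average weights every 5 steps once step 10 is reached, at learning rate 0.05.
opt = SWA(base_opt, swa_start=10, swa_freq=5, swa_lr=0.05)

for inputs, targets in loader:
    opt.zero_grad()
    loss_fn(model(inputs), targets).backward()
    opt.step()

opt.swap_swa_sgd()  # copy the averaged weights into the model
```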
https://t.co/qHvO6kotwe - An inline Bash script runner for Python. https://t.co/Y9G4pphxed #python #bash
— Python Weekly (@PythonWeekly) April 29, 2019
Stochastic Weight Averaging in Low Precision Training (SWALP)! Our new ICML paper (with PyTorch code). SWALP can match the performance of full-precision training, even with all numbers quantized down to 8 bits! https://t.co/eTWQlvUMYS
— Andrew Gordon Wilson (@andrewgwils) April 29, 2019
Pytorch implementation of Octave convolution https://t.co/6Hygcy8A0W #pytorch #deeplearning #neuralnetwork
— PyTorch Best Practices (@PyTorchPractice) April 24, 2019
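For context, an Octave convolution splits the channels into a full-resolution high-frequency branch and a half-resolution low-frequency branch, with four convolution paths exchanging information between them. A rough sketch of the idea (an illustration of the technique, not the linked repo's code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OctConv2d(nn.Module):
    """Octave convolution; alpha is the fraction of low-frequency channels."""
    def __init__(self, in_ch, out_ch, kernel_size=3, alpha=0.5, padding=1):
        super().__init__()
        in_lo, out_lo = int(alpha * in_ch), int(alpha * out_ch)
        in_hi, out_hi = in_ch - in_lo, out_ch - out_lo
        # Four paths: high->high, high->low, low->high, low->low.
        self.hh = nn.Conv2d(in_hi, out_hi, kernel_size, padding=padding)
        self.hl = nn.Conv2d(in_hi, out_lo, kernel_size, padding=padding)
        self.lh = nn.Conv2d(in_lo, out_hi, kernel_size, padding=padding)
        self.ll = nn.Conv2d(in_lo, out_lo, kernel_size, padding=padding)

    def forward(self, x_hi, x_lo):
        # The low branch runs at half resolution; the branches exchange
        # information by pooling high down and upsampling low back up.
        out_hi = self.hh(x_hi) + F.interpolate(self.lh(x_lo), scale_factor=2)
        out_lo = self.ll(x_lo) + self.hl(F.avg_pool2d(x_hi, 2))
        return out_hi, out_lo

x = torch.randn(1, 32, 64, 64)
layer = OctConv2d(32, 64, alpha=0.5)
x_hi, x_lo = x[:, :16], F.avg_pool2d(x[:, 16:], 2)
y_hi, y_lo = layer(x_hi, x_lo)  # (1, 32, 64, 64) and (1, 32, 32, 32)
```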
New model zoo in @PyTorch for image segmentation by Pavel Yakubovskiy. https://t.co/IkkjRJyRIq
— Vladimir Iglovikov (@viglovikov) April 22, 2019
[1] UNet, FPN, PSPNet Heads
[2] Pre-trained backbones (30+) vgg, densenet, dpn, resnet, seresne(x)t, senet, inceptionresnetv2, etc
[3] Example on how to train on the CamVid dataset.
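A minimal sketch of the package's API (import name segmentation_models_pytorch; the encoder choice and class count below are example values):

```python
import torch
import segmentation_models_pytorch as smp

# U-Net decoder on a pretrained ResNet-34 encoder, one output mask channel.
model = smp.Unet("resnet34", encoder_weights="imagenet", classes=1)
mask = model(torch.randn(1, 3, 256, 256))  # -> shape (1, 1, 256, 256)
```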
A Pytorch implementation of "Splitter: Learning Node Representations that Capture Multiple Social Contexts" (WWW 2019). https://t.co/17eVpoWT69 #deeplearning #machinelearning #ml #ai #neuralnetworks #datascience #pytorch
— PyTorch Best Practices (@PyTorchPractice) April 16, 2019
Pytorch implementation of Block Neural Autoregressive Flow https://t.co/6KGOSwLt9N #deeplearning #machinelearning #ml #ai #neuralnetworks #datascience #pytorch
— PyTorch Best Practices (@PyTorchPractice) April 16, 2019