Improved Techniques for Training Single-Image GANs
— roadrunner01 (@ak92501) March 25, 2020
github: https://t.co/NtTbBAPN7G
blog: https://t.co/zyhjO6Ht7n pic.twitter.com/HgaWOPQIPV
Can deep neural nets forecast weather accurately? Check out our #NeuralWeatherModel, MetNet, which outperforms physical models at up to eight-hour forecasts and runs in a matter of seconds. Learn more in the blog post below! https://t.co/7upa4rxrdQ
— Google AI (@GoogleAI) March 25, 2020
Deep Line Art Video Colorization with a Few References
— roadrunner01 (@ak92501) March 25, 2020
pdf: https://t.co/VnhjNVzIyo
abs: https://t.co/GIWd1ar1db pic.twitter.com/H6VPeTCl2r
NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis https://t.co/d3c882xViW impressive! https://t.co/jXjAEaSNzO
— Andrej Karpathy (@karpathy) March 20, 2020
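For context, the core of NeRF is an MLP that maps a 3D point and viewing direction to color and density, with the inputs passed through a sinusoidal positional encoding so the network can capture high-frequency detail. A minimal sketch of that encoding (my own illustration, not the authors' code):

```python
import numpy as np

def positional_encoding(p, num_freqs=10):
    """NeRF-style encoding: project each coordinate onto sines and cosines at
    frequencies 2^0 .. 2^(L-1) so a plain MLP can represent fine detail.
    p: array of shape (..., 3) with xyz coordinates, assumed roughly in [-1, 1]."""
    freqs = 2.0 ** np.arange(num_freqs) * np.pi           # (L,)
    angles = p[..., None] * freqs                          # (..., 3, L)
    enc = np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
    return enc.reshape(*p.shape[:-1], -1)                  # (..., 3 * 2L)

print(positional_encoding(np.zeros((5, 3))).shape)         # (5, 60)
```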
I like how it suggests that backprop may really be the best algorithm for training neural networks https://t.co/pCQJj8wFJ5
— Ilya Sutskever (@ilyasut) March 17, 2020
Semantic Pyramid for Image Generation
— roadrunner01 (@ak92501) March 16, 2020
pdf: https://t.co/ofYxkMhjKZ
abs: https://t.co/L4X2SIDfSL
project page: https://t.co/wTzWvJXQkt pic.twitter.com/p44oQBojBi
Neural Tangents is an open source library we've been working on to make it easy to build, train, and manipulate infinitely wide neural networks.
— Sam Schoenholz (@sschoenholz) March 13, 2020
To appear as a spotlight at ICLR.
code: https://t.co/Ic35RS52Z2
paper: https://t.co/2KqBv44KJt
colab: https://t.co/c8yWJxOJfg https://t.co/LFRLLRCqLw
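A minimal usage sketch, assuming the stax API from the repo's README (build a network, get the kernel function for its infinite-width limit); double-check argument names against the docs:

```python
import jax.numpy as jnp
from neural_tangents import stax

# Two-hidden-layer ReLU net; stax.serial returns (init_fn, apply_fn, kernel_fn),
# where kernel_fn evaluates the exact infinite-width NNGP / NTK kernels.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

x_train = jnp.linspace(-1.0, 1.0, 10).reshape(10, 1)   # toy 1-D inputs
x_test = jnp.linspace(-1.0, 1.0, 4).reshape(4, 1)

# Kernel between test and train points under the NTK parameterization
k_test_train = kernel_fn(x_test, x_train, 'ntk')
print(k_test_train.shape)                               # (4, 10)
```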
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations by T. Chen et al. https://t.co/5SkbM49luU
— PyTorch Best Practices (@PyTorchPractice) March 12, 2020
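The core of SimCLR is the NT-Xent contrastive loss computed over two augmented views of each image in a batch; a rough PyTorch sketch of that loss (my own simplification, not code from the linked repo):

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.
    z1, z2: (N, d) projection-head outputs for two augmented views of N images."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d), unit vectors
    sim = z @ z.t() / temperature                        # (2N, 2N) cosine similarities
    sim.fill_diagonal_(float('-inf'))                    # drop self-similarity terms
    # Row i's positive is the other view of the same image: i+N (or i-N).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

loss = nt_xent_loss(torch.randn(8, 128), torch.randn(8, 128))
```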
StyleGAN2 Distillation for Feed-forward Image Manipulation
— roadrunner01 (@ak92501) March 10, 2020
pdf: https://t.co/1Ah4u3PPeu
abs: https://t.co/ZFR5hYH2Ts
github: https://t.co/4OcHBiZbHw pic.twitter.com/78H8LTXkMx
Brilliant work on modifying neural networks to run faster on CPU than on GPU. The trick is sparsity, multithreading, and carefully managing overhead.
— Brandon Rohrer (@_brohrer_) March 7, 2020
GPUs are great, but you don't need them to start playing with neural networks. https://t.co/xpcANFi4Id
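To illustrate just the sparsity piece of that trick (a toy sketch, not the actual engine from the paper): keep a mostly-zero weight matrix in CSR format so the forward pass only touches the nonzero weights.

```python
import numpy as np
from scipy import sparse

# A 4096x4096 layer at 5% density: ~0.8M stored weights instead of ~16.8M.
w = sparse.random(4096, 4096, density=0.05, format='csr', random_state=0)
x = np.random.default_rng(0).standard_normal(4096)

y = np.maximum(w @ x, 0.0)   # sparse matvec + ReLU touches only stored weights
print(f"nonzeros: {w.nnz} of {4096 * 4096}")
```

The other two ingredients named in the tweet, multithreading and overhead management, are about how that sparse work gets scheduled on CPU cores; the sketch only shows why sparsity shrinks the arithmetic.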
I've written up a brief paper review of "Molecular Attention Transformers," an intriguing new model for molecular machine learning that blends graph convolutional methods with transformer architectures https://t.co/Xm4GMVzOWU
— Bharath Ramsundar (@rbhar90) March 6, 2020
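As I read it, the blend works by mixing standard self-attention weights with the molecular graph's adjacency matrix and an inter-atomic distance term before applying them to the values; a rough single-molecule sketch (the mixing coefficients and distance normalization here are illustrative, not the paper's exact choices):

```python
import torch
import torch.nn.functional as F

def molecule_attention(q, k, v, adjacency, distances,
                       lam_attn=0.5, lam_adj=0.3, lam_dist=0.2):
    """Self-attention augmented with graph structure, for one molecule.
    q, k, v: (n_atoms, d) projections; adjacency, distances: (n_atoms, n_atoms)."""
    d = q.size(-1)
    attn = F.softmax(q @ k.t() / d ** 0.5, dim=-1)   # usual transformer term
    dist = F.softmax(-distances, dim=-1)             # nearer atoms weighted higher
    weights = lam_attn * attn + lam_adj * adjacency + lam_dist * dist
    return weights @ v

out = molecule_attention(torch.randn(5, 16), torch.randn(5, 16), torch.randn(5, 16),
                         torch.eye(5), torch.rand(5, 5))
```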
"Limits, open questions & trends in Transfer Learning for NLP"
— Thomas Wolf (@Thom_Wolf) March 3, 2020
A subjective walk through recent papers on computational efficiency, model evaluation, fine-tuning, out-of-domain generalization, sample efficiency, common sense & inductive biases
Slides: https://t.co/QbeG0qb38s