Fine-tuning data-efficient GANs: 100-shot-obama to cartoonset
— AK (@ak92501) September 23, 2020
github: https://t.co/CZt3YeJ5Gd
dataset: https://t.co/IzSvkrsLyk pic.twitter.com/4vfzvmYpjd
In an effort to better understand how view selection affects #ContrastiveLearning models, we analyze the impact of viewpoint mutual information on downstream task performance. Learn more and grab the supporting code with pre-trained models at: https://t.co/6mEwWqT9pb pic.twitter.com/YCPGg8UhuP
— Google AI (@GoogleAI) August 21, 2020
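For context, contrastive methods of this kind are typically trained with an InfoNCE-style objective over pairs of views. The sketch below is a generic PyTorch version of that loss, not the paper's code; the function name and temperature value are illustrative.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.1):
    # z1, z2: (N, D) embeddings of two augmented views of the same N images.
    # Matching rows are positive pairs; every other pairing is a negative.
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                    # (N, N) scaled cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives lie on the diagonal
    return F.cross_entropy(logits, labels)
```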
Preprint alert!
— Karan Goel (@krandiash) August 18, 2020
"Model Patching: Closing the Subgroup Performance Gap with Data Augmentation" is now on arXiv!
📑Paper: https://t.co/IDSuIikvSe
🧑‍💻Code: https://t.co/C8opKih5Gm
📹Video: https://t.co/msrj7ZhPJc
✍️Blog: https://t.co/CF4CUQUKq9
Read on to learn more (1/9) pic.twitter.com/JGlPx2HouV
Slides and code from my #rstatsnyc talk on "Forecasting ensembles using fable" now available at https://t.co/wB1GhYXqE0. #rstats #forecasting Thanks to @mitchoharawild for the packages. pic.twitter.com/GTpKLQxz0Y
— Rob J Hyndman (@robjhyndman) August 14, 2020
Does weight-sharing NAS work?
— Hanxiao Liu (@Hanxiao_6) August 14, 2020
Yes!
Our study shows that weight-sharing NAS is better than random search, and the gap widens on larger and more challenging search spaces.
Paper and Video (CVPR 2020): https://t.co/9uLUf7cxx6
Code: https://t.co/VugdPDbhLE pic.twitter.com/A5CJqsGZUK
A PyTorch implementation of MobileNetV3 for real-time semantic segmentation, with state-of-the-art performance https://t.co/MYE7mwOWy4 #deeplearning #machinelearning #ml #ai #neuralnetworks #datascience #pytorch
— PyTorch Best Practices (@PyTorchPractice) August 12, 2020
No labeled data? No problem.
— Hugging Face (@huggingface) August 11, 2020
The 🤗 Transformers master branch now includes a built-in pipeline for zero-shot text classification, to be included in the next release.
Try it out in the notebook here: https://t.co/31Z6LR1NjK pic.twitter.com/IugPUj3sa1
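A minimal usage sketch of that pipeline ("zero-shot-classification" is the pipeline's task name in 🤗 Transformers; the input sentence and candidate labels below are invented for illustration):

```python
from transformers import pipeline

# Loads the zero-shot classification pipeline with its default NLI model.
classifier = pipeline("zero-shot-classification")

# Example sequence and labels are invented for illustration.
result = classifier(
    "The new release adds a built-in pipeline for text classification.",
    candidate_labels=["software", "sports", "cooking"],
)
print(result["labels"])  # candidate labels sorted by score, best first
print(result["scores"])  # corresponding probabilities
```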
AnimeGANv2
— AK (@ak92501) August 7, 2020
github: https://t.co/a74PV1427U pic.twitter.com/lHrlDA1STr
Let's build an #AIEconomist for the real world together!
— Richard Socher (@RichardSocher) August 6, 2020
The framework for building RL-friendly economic simulations is now open source!
This has so much potential for good.
https://t.co/AF9BVuPt8i
https://t.co/cipWhvj1PJ
https://t.co/5xferEiUkg pic.twitter.com/oa4asja3lY
The self-attention mechanism can be viewed as the update rule of a Hopfield network with continuous states.
— hardmaru (@hardmaru) August 6, 2020
Deep learning models can take advantage of Hopfield networks as a powerful concept comprising pooling, memory, and attention.
https://t.co/FL8PimjVo9
https://t.co/HT79M95lkn pic.twitter.com/Ld2eioVsDG
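As a sketch of that claim (using the paper's notation for the stored patterns X, state ξ, and inverse temperature β), the continuous-state Hopfield update rule is

$$
\xi^{\text{new}} = X \,\operatorname{softmax}\!\left(\beta\, X^{\top} \xi\right)
$$

With the stored patterns supplying keys and values, the state ξ acting as the query, and β = 1/√d_k, one step of this rule has the same form as transformer attention, softmax(QKᵀ/√d_k)V.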
Hopfield Networks is All You Need
— AK (@ak92501) August 6, 2020
pdf: https://t.co/SWFnVFNS8h
abs: https://t.co/erpgXRmPqJ
github: https://t.co/MWrtQlsNNO pic.twitter.com/0VmtHZK9QX
DeLighT: Very Deep and Light-weight Transformer
— AK (@ak92501) August 4, 2020
pdf: https://t.co/3BGksd53Bs
abs: https://t.co/QgKDzHmYy9
github: https://t.co/nZzapo7NCF pic.twitter.com/h0qsg58MRU