Interesting multitask text embeddings by @PinterestEng https://t.co/gABOGQVk0r
– Xavier (@xamat) September 6, 2019
Pytorch implementation of the paper "Class-Balanced Loss Based on Effective Number of Samples" https://t.co/i8zfOFPiQd #deeplearning #machinelearning #ml #ai #neuralnetworks #datascience #pytorch
– PyTorch Best Practices (@PyTorchPractice) September 5, 2019
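For reference, the core of that loss is a per-class weight derived from the "effective number of samples", E_n = (1 - beta^n) / (1 - beta). Below is a minimal sketch of that weighting applied to plain cross-entropy; the paper also combines it with sigmoid and focal losses, and the function and argument names here are my own.

```python
# Minimal sketch of class-balanced weighting via the effective number of samples.
import torch
import torch.nn.functional as F

def class_balanced_ce(logits, targets, samples_per_class, beta=0.9999):
    """logits: (N, C), targets: (N,), samples_per_class: list of length C."""
    counts = torch.tensor(samples_per_class, dtype=torch.float, device=logits.device)
    effective_num = (1.0 - beta ** counts) / (1.0 - beta)        # E_n = (1 - beta^n) / (1 - beta)
    weights = 1.0 / effective_num                                 # rarer classes get larger weights
    weights = weights / weights.sum() * len(samples_per_class)    # normalize to sum to C
    return F.cross_entropy(logits, targets, weight=weights)

# Usage on a toy, heavily imbalanced 3-class problem.
logits = torch.randn(8, 3)
targets = torch.randint(0, 3, (8,))
loss = class_balanced_ce(logits, targets, samples_per_class=[5000, 500, 50])
```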
Today we unveiled the latest version of Aristo, an AI system capable of scoring over 90% on an 8th grade science exam; this achievement is the result of years of research into machine reasoning and natural language processing. https://t.co/Bk9zWVHJYB
– Allen Institute for Artificial Intelligence (AI2) (@allen_ai) September 4, 2019
by @CadeMetz via @nytimes
Face-to-Parameter Translation for Game Character Auto-Creation
– hardmaru (@hardmaru) September 4, 2019
Their method allows the automatic creation of game characters from input face photos. An "imitator" is used to model the behavior of the game engine, making the entire pipeline differentiable. https://t.co/ghsZ9jia49 pic.twitter.com/hnntYWhZx4
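As a rough sketch of why a differentiable imitator matters: once a network stands in for the game engine, character parameters can be fit to a target photo by plain gradient descent. The networks, dimensions, and the simple feature-matching loss below are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: optimize character parameters through a pre-trained imitator network.
import torch

def fit_character(target_photo, imitator, descriptor, n_params=200, steps=500, lr=0.05):
    # imitator: character parameters -> rendered face (stand-in for the game engine)
    # descriptor: image -> face features (hypothetical, e.g. a face-recognition embedding)
    params = torch.zeros(1, n_params, requires_grad=True)
    target_feat = descriptor(target_photo).detach()
    opt = torch.optim.Adam([params], lr=lr)
    for _ in range(steps):
        rendered = imitator(params)                        # differentiable "rendering"
        loss = (descriptor(rendered) - target_feat).abs().mean()
        opt.zero_grad()
        loss.backward()                                    # gradients flow through the imitator into params
        opt.step()
    return params.detach()
```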
A single-layer RNN can approximate stacked and bidirectional RNNs, and topologies in between
– Thomas Lahore (@evolvingstuff) September 4, 2019
"single-layer RNN can perfectly mimic an arbitrarily deep stacked RNN under specific constraints on its weight matrix and a delay between input and output" https://t.co/nmeasSiw0S pic.twitter.com/qbR1ms0fos
FFORMPP: Feature-based forecast model performance prediction. https://t.co/ymyKcvK7U2 pic.twitter.com/m3tu35nLKO
– arxiv (@arxiv_org) September 3, 2019
A Generative Adversarial Network for Chinese Ink Wash Painting Style Transfer
– hardmaru (@hardmaru) September 3, 2019
Incorporating domain knowledge, such as voids, brush strokes, and ink wash tone, as constraints into style transfer helps address key techniques used in Chinese ink wash painting. https://t.co/1NKW28lr6w pic.twitter.com/P9RQDDgbOj
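A hedged sketch of what folding such constraints into a generator objective can look like: extra penalty terms are added alongside the adversarial loss. The constraint functions and weights below are hypothetical placeholders, not the paper's actual losses.

```python
# Sketch: adversarial loss plus hypothetical domain-constraint terms for the generator.
import torch

def generator_loss(fake_score, fake_img, content_img,
                   void_loss, brush_loss, ink_tone_loss,
                   w_void=1.0, w_brush=1.0, w_tone=1.0):
    adv = torch.nn.functional.softplus(-fake_score).mean()    # non-saturating GAN loss
    return (adv
            + w_void * void_loss(fake_img)                    # preserve empty "void" regions
            + w_brush * brush_loss(fake_img, content_img)     # brush-stroke structure
            + w_tone * ink_tone_loss(fake_img))               # ink wash tone statistics
```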
They trained an agent to play SimCity.
– hardmaru (@hardmaru) September 2, 2019
The results look kind of interesting! (cc @togelius) https://t.co/KVU1JjtUFu
I had a thread with other works that also point to the same type of result regarding random search for NAS when the search space includes hand-engineered priors such as the convolution operation. In particular, https://t.co/BrvrwhbkwS by Li and Talwalkar. https://t.co/lhmVXPRPeR
– hardmaru (@hardmaru) September 2, 2019
Random Search Outperforms State-Of-The-Art NAS Algorithms https://t.co/4lJXK2EJA6
– /MachineLearning (@slashML) September 1, 2019
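For context, the random search baseline in question is conceptually very simple: sample configurations uniformly from a conv-based search space, train each briefly, and keep the best by validation accuracy. The sketch below assumes a made-up search space and a hypothetical build_and_evaluate(config) helper, not the papers' exact setups.

```python
# Sketch of random search over an architecture configuration space.
import random

SEARCH_SPACE = {
    "n_layers": [4, 8, 12],
    "width":    [32, 64, 128],
    "op":       ["conv3x3", "conv5x5", "sep_conv3x3", "max_pool3x3"],
    "skip":     [True, False],
}

def random_search(build_and_evaluate, n_trials=50, seed=0):
    rng = random.Random(seed)
    best_cfg, best_acc = None, float("-inf")
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        acc = build_and_evaluate(cfg)          # train briefly, return validation accuracy
        if acc > best_acc:
            best_cfg, best_acc = cfg, acc
    return best_cfg, best_acc
```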
Exploring Weight Agnostic Neural Networks, where the architectures alone can be highly effective https://t.co/MrykybAbYk
– Ben Hamner (@benhamner) August 31, 2019
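The central evaluation trick in weight agnostic networks is to score a topology while every connection shares a single weight value, averaging over a few such values, so performance reflects the architecture rather than learned weights. A toy sketch under that assumption (the network, task, and weight values here are illustrative, not the paper's NEAT-style search):

```python
# Sketch: score a fixed topology with one shared weight for all connections.
import numpy as np

def forward(adjacency, shared_w, x):
    """Feed-forward pass where every enabled connection uses the same weight."""
    h = x
    for layer_mask in adjacency:                   # binary masks define the topology
        h = np.tanh((layer_mask * shared_w) @ h)
    return h

def weight_agnostic_score(adjacency, task_inputs, task_targets,
                          weight_values=(-2.0, -1.0, -0.5, 0.5, 1.0, 2.0)):
    scores = []
    for w in weight_values:
        preds = np.array([forward(adjacency, w, x) for x in task_inputs])
        scores.append(-np.mean((preds - task_targets) ** 2))   # higher is better
    return np.mean(scores)                         # average over shared weight values
```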
Dimensions of Dialogue
– hardmaru (@hardmaru) August 30, 2019
New writing systems are created by challenging two neural networks to communicate information using images. Results look really interesting! https://t.co/NtBjcyZWv4 https://t.co/HpsNkOppkG pic.twitter.com/CGhFqR45kg
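A minimal sketch of that kind of communication game, assuming a one-hot message space and simple MLP "speaker" and "listener" networks; the paper's actual architectures and training setup will differ.

```python
# Sketch: a speaker renders a discrete message as an image, a listener recovers it.
import torch
import torch.nn as nn

VOCAB, IMG = 32, 16 * 16

speaker = nn.Sequential(nn.Linear(VOCAB, 128), nn.ReLU(), nn.Linear(128, IMG), nn.Sigmoid())
listener = nn.Sequential(nn.Linear(IMG, 128), nn.ReLU(), nn.Linear(128, VOCAB))
opt = torch.optim.Adam(list(speaker.parameters()) + list(listener.parameters()), lr=1e-3)

for step in range(1000):
    msg = torch.randint(0, VOCAB, (64,))
    one_hot = nn.functional.one_hot(msg, VOCAB).float()
    image = speaker(one_hot)                       # the emergent "writing"
    logits = listener(image)
    loss = nn.functional.cross_entropy(logits, msg)
    opt.zero_grad(); loss.backward(); opt.step()
```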