Paying U-Attention to Textures: Multi-Stage Hourglass Vision Transformer for Universal Texture Synthesis
— AK (@ak92501) February 24, 2022
abs: https://t.co/apd0do8HZH
Self-Supervised Transformers for Unsupervised Object Discovery using Normalized Cut
— AK (@ak92501) February 24, 2022
abs: https://t.co/FotRPtnsBG
project page: https://t.co/ZoNZrniFcX
A New Generation of Perspective API: Efficient Multilingual Character-level Transformers
— AK (@ak92501) February 24, 2022
abs: https://t.co/hSzS6MkjDx
Monash time series forecasting repository https://t.co/Mcc6N4fiJL #rstats #forecasting
— Rob J Hyndman (@robjhyndman) February 23, 2022
On @huggingface we host more and more git repos, so we are switching the infrastructure behind our git server to Gitaly, an open-source infra project from @gitlab
— Julien Chaumond (@julien_c) February 22, 2022
The goal is to make the Hub scalable and robust for the next 5 years of ML 🔥
more info in this thread https://t.co/rwwy5ZYYfR
Visual Attention Network
— AK (@ak92501) February 22, 2022
abs: https://t.co/K0tUUFx3qk
github: https://t.co/wPaXMoXVwL
Great figure illustrating the different types of deep generative models via @lilianweng (https://t.co/4NJZzr9HKF) & a nice list of cons. GANs: unstable training, low diversity; VAE: surrogate loss; Flow-based: special architecture for reversible transforms; Diffusion: slow to sample.
— Sebastian Raschka (@rasbt) February 21, 2022
SGPT: GPT Sentence Embeddings for Semantic Search
— AK (@ak92501) February 21, 2022
abs: https://t.co/zLqojUmqwl
github: https://t.co/Rj3H5waw2t
Machine learning has a hallucination problem
— Gary Marcus (@GaryMarcus) February 20, 2022
new review from @pascalefung and others: https://t.co/P71maYqOhc
The world's shipping lanes and flight paths. #travel #dataviz
— Randy Olson (@randal_olson) February 19, 2022
Source: https://t.co/XaC31abRoO
How Do Vision Transformers Work?
— Jeremy Howard (@jeremyphoward) February 19, 2022
"...we propose AlterNet, a model in which Conv blocks at the end of a stage are replaced with MSA blocks. AlterNet outperforms CNNs not only in large data regimes but also in small data regimes." https://t.co/edPXnu0cn8
I played a bit recently with accelerating pandas workloads on a single machine and it seems the winning combo for me is:
— Radek Osmulski 🇺🇦 (@radekosmulski) February 18, 2022
✅ use swap for OOM
✅ use pandas (dask is nice but limited API, modin can be slow)
But I found this cool little gem — swifter 🔥
take a look 👇
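For context on the thread above: swifter is a third-party package that wraps pandas' `apply` and automatically picks a faster execution path (vectorization or parallelism) when it can. A minimal sketch of the pattern, using plain pandas with hypothetical sample data, and the swifter drop-in shown as a comment since it requires `pip install swifter`:

```python
import pandas as pd

# Hypothetical frame standing in for a real single-machine workload.
df = pd.DataFrame({"x": range(1_000)})

def slow_feature(v):
    # Stand-in for an expensive row-wise computation.
    return v * 2 + 1

# Plain pandas apply:
df["y"] = df["x"].apply(slow_feature)

# With swifter installed, the accelerated version is a drop-in change:
#   import swifter  # noqa: F401
#   df["y"] = df["x"].swifter.apply(slow_feature)

print(df["y"].iloc[-1])  # 1999
```

The appeal of this pattern is that the function and the call site stay the same; only the `.swifter` accessor is inserted, so it is easy to benchmark both paths on the same workload.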