Self-Attention Between Datapoints: Going Beyond Individual Input-Output Pairs in Deep Learning
— AK (@ak92501) June 7, 2021
pdf: https://t.co/vvC4sfE9i3
abs: https://t.co/mxlAWuAyiY
An architecture that takes the entire dataset as input and uses self-attention to model complex relationships between datapoints.
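The core idea, attending across datapoints (rows of the dataset) rather than only across the features of a single input, can be sketched in plain NumPy. This is a minimal illustration under assumed shapes and random weights, not the paper's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_between_datapoints(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention applied across the rows
    (datapoints) of X, so each datapoint's new representation can
    depend on every other datapoint in the dataset."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv           # each (n, d)
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # (n, n): datapoint-to-datapoint
    return softmax(scores, axis=-1) @ V        # (n, d)

rng = np.random.default_rng(0)
n, d = 8, 4                                    # 8 datapoints, 4 features (illustrative)
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = attention_between_datapoints(X, Wq, Wk, Wv)
print(out.shape)  # (8, 4)
```

The key difference from standard per-example processing is the `(n, n)` score matrix: it couples every datapoint to every other, which is what "taking the entire dataset as input" enables.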