Self-attention mechanism can be viewed as the update rule of a Hopfield network with continuous states.
— hardmaru (@hardmaru) August 6, 2020
Deep learning models can take advantage of Hopfield networks as a powerful concept comprising pooling, memory, and attention. https://t.co/FL8PimjVo9 https://t.co/HT79M95lkn pic.twitter.com/Ld2eioVsDG
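The correspondence the tweet refers to can be sketched in a few lines. A minimal, illustrative sketch (not from the tweet itself): the update rule of a modern continuous-state Hopfield network is `xi_new = X.T @ softmax(beta * X @ xi)`, which has exactly the form of attention with query `xi` and the stored patterns `X` acting as both keys and values. The function names and the choice of `beta` below are assumptions for demonstration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def hopfield_update(X, xi, beta=8.0):
    """One update step of a continuous-state Hopfield network.

    X  : (N, d) matrix of stored patterns (plays the role of keys and values)
    xi : (d,)   state vector (plays the role of the query)

    Similarity scores X @ xi are turned into attention weights via softmax,
    and the new state is the weighted sum of stored patterns -- the same
    computation as a single attention head.
    """
    return X.T @ softmax(beta * (X @ xi))

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 16))            # 5 stored patterns in 16 dims
xi = X[2] + 0.1 * rng.standard_normal(16)   # noisy query near pattern 2

retrieved = hopfield_update(X, xi)
```

With a large inverse temperature `beta`, the softmax is nearly one-hot and a single update step retrieves the stored pattern closest to the query, which is the associative-memory view of attention.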