SOFT: Softmax-free Transformer with Linear Complexity
— AK (@ak92501) October 25, 2021
abs: https://t.co/EralXVH5CZ
github: https://t.co/4miqmwAGcA
Introduces a softmax-free self-attention mechanism for linearizing the Transformer's complexity in space and time.
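
For intuition, here is a minimal PyTorch sketch of the idea: replace softmax attention with a Gaussian kernel over the queries (used as both queries and keys), then approximate the resulting kernel matrix with a low-rank, Nyström-style decomposition built from pooled landmark tokens, so attention costs O(n·m) with m ≪ n rather than O(n²). The function names, the landmark-pooling choice, and the use of `torch.linalg.pinv` are illustrative assumptions, not the paper's exact implementation (which differs in detail, e.g. computing the Moore-Penrose inverse iteratively).

```python
import torch

def gaussian_kernel(a, b):
    # Pairwise Gaussian kernel exp(-||a_i - b_j||^2 / 2)
    # a: (n, d), b: (m, d) -> (n, m)
    sq_dist = (a.unsqueeze(1) - b.unsqueeze(0)).pow(2).sum(-1)
    return torch.exp(-0.5 * sq_dist)

def softmax_free_attention(q, v, num_landmarks=32):
    # Softmax-free attention via a low-rank (Nystrom-style) kernel
    # approximation: sampling m << n landmark tokens keeps the cost
    # at O(n * m) instead of the O(n^2) of full softmax attention.
    n, d = q.shape
    m = num_landmarks
    assert n % m == 0, "sketch assumes sequence length divisible by m"
    # Landmarks via average pooling over contiguous segments
    # (an illustrative choice of how to sample landmarks)
    q_land = q.reshape(m, n // m, d).mean(dim=1)  # (m, d)
    A = gaussian_kernel(q, q_land)                # (n, m)
    B = gaussian_kernel(q_land, q_land)           # (m, m)
    C = gaussian_kernel(q_land, q)                # (m, n)
    # Approximate the n x n kernel as A @ pinv(B) @ C and multiply
    # right-to-left so the full matrix is never materialized.
    return A @ (torch.linalg.pinv(B) @ (C @ v))   # (n, d)

# Usage: 512 tokens, 64-dim heads
q = torch.randn(512, 64)
v = torch.randn(512, 64)
out = softmax_free_attention(q, v)
print(out.shape)  # torch.Size([512, 64])
```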