Check out Performer, a generalized attention framework based on the Transformer architecture, which implements the novel FAVOR+ algorithm to provide linearly scalable, low-variance and unbiased estimation of attention mechanisms. Read more at https://t.co/lLzhXJRIbh
— Google AI (@GoogleAI) October 23, 2020
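To make the tweet's claim concrete, below is a minimal NumPy sketch of random-feature attention in the spirit of FAVOR+. It is an illustration under simplifying assumptions, not the Performer implementation: the paper's FAVOR+ additionally uses orthogonal random features to reduce variance, and the function names and feature count here are chosen for the example. The sketch shows the two key properties the tweet mentions: the positive random features give an unbiased estimate of the softmax kernel exp(q·k / sqrt(d)), and attention can then be computed in time and memory linear in sequence length rather than quadratic.

```python
import numpy as np

def positive_random_features(x, proj):
    # Positive random features exp(w^T x - ||x||^2 / 2); rows of 'proj' are
    # drawn from N(0, I_d), so E[phi(q)^T phi(k)] = exp(q^T k / sqrt(d)).
    x = x / (x.shape[-1] ** 0.25)  # fold the 1/sqrt(d) softmax scaling into q and k
    sq_norm = np.sum(x ** 2, axis=-1, keepdims=True) / 2.0
    return np.exp(x @ proj.T - sq_norm) / np.sqrt(proj.shape[0])

def favor_style_attention(q, k, v, num_features=256, seed=0):
    # Linear-time attention approximation: O(L * m * d) instead of O(L^2 * d).
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((num_features, q.shape[-1]))  # iid features (FAVOR+ uses orthogonal ones)
    q_prime = positive_random_features(q, proj)   # (L, m)
    k_prime = positive_random_features(k, proj)   # (L, m)
    kv = k_prime.T @ v                            # (m, d_v), computed once, no L x L matrix
    normalizer = q_prime @ k_prime.sum(axis=0)    # (L,), approximates the softmax denominator
    return (q_prime @ kv) / normalizer[:, None]

# Example: 4096 tokens with 64-dim heads; memory stays O(L * m), never O(L^2).
L, d = 4096, 64
q, k, v = (np.random.randn(L, d) for _ in range(3))
out = favor_style_attention(q, k, v)
print(out.shape)  # (4096, 64)
```

Because the keys and values are aggregated into the small (m, d) matrix `kv` before being combined with the queries, no L-by-L attention matrix is ever materialized, which is where the linear scaling comes from.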