Transformer’s attention mechanism can be linked to other cool ideas in AI:

- Indirect Encoding in Neuroevolution https://t.co/G740mhjBv4
- Hopfield Networks with continuous states https://t.co/FL8PimjVo9
- Graph Neural Networks with multi-head attention https://t.co/PACMnKT50F

— hardmaru (@hardmaru) September 26, 2020
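The Hopfield connection is the most direct of the three: in a continuous-state Hopfield network, one retrieval step is a softmax-weighted average of the stored patterns, which has exactly the form of a Transformer attention head. Here is a minimal NumPy sketch of that update rule (the names `X`, `xi`, and `beta` are illustrative, not taken from the linked paper's code):

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # subtract max for numerical stability
    return np.exp(z) / np.exp(z).sum()

# Continuous-state Hopfield update: xi_new = X @ softmax(beta * X.T @ xi).
# X holds stored patterns as columns (playing the role of keys and values);
# xi is the current state (playing the role of the query). A single update
# step is thus a one-head attention computation with inverse temperature beta.
rng = np.random.default_rng(0)
d, n_patterns, beta = 8, 5, 1.0
X = rng.standard_normal((d, n_patterns))  # stored patterns
xi = rng.standard_normal(d)               # query / state vector

xi_new = X @ softmax(beta * (X.T @ xi))   # softmax-weighted recall of patterns
print(xi_new.shape)  # (8,)
```

With a large `beta`, the softmax sharpens and the update snaps to the single closest stored pattern, recovering classic Hopfield-style retrieval; with a small `beta`, it blends patterns the way a soft attention head does.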