Graph Kernel Attention Transformers
— AK (@ak92501) July 19, 2021
pdf: https://t.co/Uyy5ZcTD1I
abs: https://t.co/KTxHRuYvVV
Comparison of the method with 9 different GNN classes across tasks, showing consistent gains from GKATs pic.twitter.com/acoHALTKjv