Moving away from negative pairs in self-supervised representation learning: our new SotA method, Bootstrap Your Own Latent (BYOL), narrows the gap between self-supervised & supervised methods simply by predicting previous versions of itself.
— DeepMind (@DeepMind) June 16, 2020
See here: https://t.co/qyaSXnPQjN
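The phrase "predicting previous versions of itself" refers to BYOL's target network, whose weights are an exponential moving average (EMA) of the online network's weights rather than being trained by gradient descent. A minimal sketch of that EMA update follows; the weights are plain Python lists standing in for real network parameters, and the decay value is illustrative, not DeepMind's configuration:

```python
def ema_update(target, online, tau=0.99):
    """Move target weights toward online weights via an exponential
    moving average: target <- tau * target + (1 - tau) * online.
    The online network chases this slowly-moving target, which is
    what lets BYOL avoid negative pairs entirely."""
    return [tau * t + (1.0 - tau) * o for t, o in zip(target, online)]

# Stand-in parameter vectors (real BYOL uses full encoder weights).
online_w = [1.0, 2.0, 3.0]
target_w = [0.0, 0.0, 0.0]

# After one gradient step on the online network, the target tracks it.
target_w = ema_update(target_w, online_w, tau=0.9)
print(target_w)  # each entry is 10% of the online weight
```

Each training step, the online network is updated by gradient descent to predict the target network's representation of a different augmentation of the same image, and the target then takes one EMA step toward the online weights.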