Tweeted by @AndrewYNg
Just read this cool paper: Neural net pretraining keeps improving when you train on an unprecedented 3.5 billion (that's really big) labeled images and transfer to a new task. IMO we're still nowhere near the limits of pretraining/transfer learning. https://t.co/n92buesXph
— Andrew Ng (@AndrewYNg) August 15, 2018