One of our students did something crazy with transfer learning: froze the randomly added fully connected layer, and only fine-tuned the pre-trained layers.
— Jeremy Howard (@jeremyphoward) October 11, 2018
What's really crazy: that resulted in a top-10 Kaggle finish. See paper for details. https://t.co/1TZT2vfUXs
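The trick described in the tweet inverts the usual transfer-learning recipe: instead of freezing the pre-trained backbone and training the new head, the student froze the randomly initialized fully connected head and fine-tuned only the pre-trained layers. A minimal PyTorch sketch of that setup might look like the following (the tiny backbone and head here are hypothetical stand-ins; the tweet does not name a model):

```python
import torch.nn as nn

# Hypothetical stand-in for a pre-trained backbone (assumption: any
# feature extractor would do; the tweet does not specify one).
backbone = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
)

# The randomly added fully connected layer from the tweet.
head = nn.Linear(8, 10)

# Invert the usual recipe: freeze the random head, leave the
# pre-trained layers trainable so only they get fine-tuned.
for p in head.parameters():
    p.requires_grad = False

model = nn.Sequential(backbone, head)

# Only backbone parameters will receive gradient updates.
trainable = [p for p in model.parameters() if p.requires_grad]
```

An optimizer would then be built over `trainable` only, so gradient steps adjust the pre-trained weights while the frozen random head acts as a fixed projection.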