Here's a little table you might find handy if you're wondering which language wikipedias are likely to be useful for language model pre-training, e.g. for ULMFiT. It's sorted by "depth" (see link for definition) multiplied by number of articles. https://t.co/FOO2HvCH9v pic.twitter.com/TuKAJFaAUO
— Jeremy Howard (@jeremyphoward) June 5, 2019
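For reference, the ranking described in the tweet can be reproduced with a few lines of Python. This is a minimal sketch, not Jeremy's code: the language codes, article counts, and depth values below are illustrative placeholders, and "depth" here is simply taken as the per-wiki statistic defined on meta.wikimedia.org (linked from the tweet).

```python
# Minimal sketch: rank language Wikipedias by depth * article count,
# the score the tweet's table is sorted by.
# The numbers below are placeholders, not real Wikipedia statistics.
wikipedias = [
    # (language code, number of articles, "depth" per meta.wikimedia.org)
    ("en", 5_800_000, 880),
    ("fr", 2_100_000, 250),
    ("vi", 1_200_000, 20),
]

ranked = sorted(wikipedias, key=lambda w: w[1] * w[2], reverse=True)
for lang, articles, depth in ranked:
    print(f"{lang}: score = {articles * depth:,}")
```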