by seb_ruder on 2019-10-22 (UTC).

Most of the world’s text is not in English. We are releasing MultiFiT to train and fine-tune language models efficiently in any language.

Post: https://t.co/LXMo4KGcVS
Paper: https://t.co/3kh3fpfKcj
With @eisenjulian @PiotrCzapla Marcin Kardas @GuggerSylvain @jeremyphoward pic.twitter.com/QtcWhKqxyL

— Sebastian Ruder (@seb_ruder) October 22, 2019
nlp research
by jeremyphoward on 2019-10-23 (UTC).

One year ago, @seb_ruder asked the https://t.co/GEOZuodrZj community for help with multilingual language modeling. @eisenjulian @PiotrCzapla @misterkardas answered the call, and we now have MultiFiT, an EMNLP paper & code for multilingual training! https://t.co/FwY6QGV0gA pic.twitter.com/JYQRpz8wHT

— Jeremy Howard (@jeremyphoward) October 23, 2019
nlp research
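The recipe the tweets describe — pretrain a language model on text in a target language, then fine-tune it for a task — can be illustrated with a deliberately tiny sketch. The snippet below uses a smoothed character-bigram model purely as a stand-in (MultiFiT itself uses a subword-tokenized, QRNN-based ULMFiT architecture; none of the names below come from its codebase): "pretraining" collects counts from one corpus, and "fine-tuning" layers target-language counts on top, so the tuned model scores held-out target-language text higher.

```python
from collections import Counter, defaultdict
import math

def count_bigrams(text):
    """'Pretraining': collect character-bigram counts from a corpus."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def fine_tune(base, target):
    """'Fine-tuning': add target-corpus counts on top of the pretrained ones."""
    tuned = defaultdict(Counter)
    for a, c in base.items():
        tuned[a].update(c)
    for a, c in target.items():
        tuned[a].update(c)
    return tuned

def avg_log_prob(counts, text, alpha=1.0, vocab=128):
    """Average log-probability of `text` under an add-alpha smoothed bigram model."""
    total = 0.0
    for a, b in zip(text, text[1:]):
        c = counts.get(a, Counter())
        total += math.log((c[b] + alpha) / (sum(c.values()) + alpha * vocab))
    return total / max(len(text) - 1, 1)

# "Pretrain" on English, then fine-tune on a small Spanish corpus.
english = "the quick brown fox jumps over the lazy dog " * 20
spanish = "el rapido zorro marron salta sobre el perro perezoso " * 5
base = count_bigrams(english)
tuned = fine_tune(base, count_bigrams(spanish))

held_out = "el perro salta"
# The fine-tuned model scores held-out Spanish text higher than the base model.
print(avg_log_prob(base, held_out) < avg_log_prob(tuned, held_out))  # prints True
```

The same transfer logic is what makes the real method cheap: the expensive general-language statistics are learned once, and only a small target corpus is needed to adapt them.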
