by Thom_Wolf on 2019-02-11 (UTC).

PT-BERT 0.5 out💥
Pretty big release with not 1 but TWO new pretrained models:
- classic: OpenAI's GPT
- brand-new: Transformer-XL by Google/CMU
As always, both should be super easy to use.

So...BERT now stands for Big-&-Extending-Repository-of-Transformers😅

Happy Transfer Learning! pic.twitter.com/yIdTHHJKKA

— Thomas Wolf (@Thom_Wolf) February 11, 2019
pytorch nlp tool research w_code
by HorevRani on 2019-02-11 (UTC).

Fresh out of the oven, a summary of XLM - @facebookai Cross-lingual BERT model https://t.co/lNd2UeuVPe

— Rani Horev (@HorevRani) February 11, 2019
nlp research
