by Tim_Dettmers on 2020-11-28 (UTC).

I am curious why people are not talking more about the OpenAI scaling law papers. For me, they seem very significant. What I heard so far: "Too complicated. I don't understand and I don't care", "NLP is not physics". Other criticism? Any insights why people ignore it?

— Tim Dettmers (@Tim_Dettmers) November 28, 2020
nlp research
by Tim_Dettmers on 2020-11-29 (UTC).

I was made aware of two papers that are similar and preceded both OpenAI papers. I think these add more data points to scaling behavior for language (and also vision). These should be shared more widely! https://t.co/ZCXYCt3DgN https://t.co/QNn8KznjXe

— Tim Dettmers (@Tim_Dettmers) November 29, 2020
research nlp
