by ada_rob on 2020-02-11 (UTC).

New preprint: How Much Knowledge Can You Pack into the Parameters of a Language Model?

We show that T5 outperforms all previous open-domain QA systems *without using any external knowledge or context*.

Joint work w/ @colinraffel & Noam Shazeer. https://t.co/Ojg3wSUDQq
(1/5) pic.twitter.com/3adQ59LFYr

— Adam Roberts (@ada_rob) February 11, 2020
research, nlp

We evaluated on Natural Questions, WebQuestions, and TriviaQA, outperforming all previous open-domain systems on NQ and WQ.

(3/5) pic.twitter.com/Xv23zOBDm9

— Adam Roberts (@ada_rob) February 11, 2020
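Open-domain QA systems like the one in the thread are typically scored with an exact-match (EM) metric after SQuAD-style answer normalization. Below is a minimal sketch of that normalization and scoring, assuming the standard recipe (lowercase, strip punctuation and articles, collapse whitespace); the paper's exact scoring script may differ in details.

```python
# Hedged sketch of SQuAD-style exact-match scoring for open-domain QA.
# Not the paper's official evaluation code; function names are illustrative.
import re
import string


def normalize_answer(s: str) -> str:
    """Lowercase, remove punctuation and articles, collapse whitespace."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in string.punctuation)
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())


def exact_match(prediction: str, gold_answers: list[str]) -> bool:
    """A prediction counts as correct if it matches any gold answer."""
    pred = normalize_answer(prediction)
    return any(pred == normalize_answer(g) for g in gold_answers)


print(exact_match("The Eiffel Tower", ["eiffel tower"]))  # → True
print(exact_match("Paris, France", ["Paris"]))            # → False
```

Datasets such as Natural Questions and TriviaQA provide several acceptable gold answers per question, which is why the metric checks against a list rather than a single string.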
