by hardmaru on 2020-02-11 (UTC).

Turing-NLG: A 17-billion-parameter language model

“Any model with more than 1.3B parameters cannot fit into a single GPU (even one with 32GB memory)… The resulting T-NLG model has 78 Transformer layers with a hidden size of 4256 and 28 attention heads.” https://t.co/bRjEacrZma

— hardmaru (@hardmaru) February 11, 2020
nlp research
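
As a quick sanity check on the 17B headline (my own back-of-the-envelope, not from the announcement): a standard Transformer block carries roughly 12·d² weights, so 78 layers at hidden size 4256 land almost exactly on the quoted figure.

    # Rough parameter estimate for T-NLG from the quoted shape
    # (a sketch under standard Transformer assumptions, not Microsoft's numbers).
    layers, d_model = 78, 4256
    # ~4*d^2 for the attention projections (Q, K, V, output),
    # ~8*d^2 for the two feed-forward matrices (d -> 4d -> d)
    per_block = 12 * d_model ** 2
    print(f"~{layers * per_block / 1e9:.1f}B parameters")  # ~17.0B, before embeddings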
by ericjang11 on 2020-02-11 (UTC).

The search engine of the future will know "who Jason Mraz is engaged to" not by querying some semi-manually curated semantic triplet graph, but simply running a "fact answering neural network" on the raw question. https://t.co/3MfRxQVEfN

— Eric Jang 🇺🇸🇹🇼 (@ericjang11) February 11, 2020
research nlp
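
For concreteness, a minimal sketch of the contrast the tweet draws: exact lookup in a hand-curated triplet store versus a closed-book QA model run directly on the raw question. The fact string and the model choice below are placeholders of mine, not anything from the tweet.

    # Toy contrast between the two approaches in the tweet.

    # 1) Curated semantic triplet graph: facts stored as
    #    (subject, relation) -> object and answered by exact lookup.
    triples = {
        ("Jason Mraz", "engaged_to"): "<curated fact goes here>",  # placeholder entry
    }

    def graph_answer(subject, relation):
        return triples.get((subject, relation))

    # 2) "Fact answering neural network": a closed-book QA model that maps
    #    the raw question string straight to an answer. The model here is my
    #    assumption; any seq2seq LM fine-tuned for closed-book QA would do.
    from transformers import pipeline
    qa = pipeline("text2text-generation", model="google/flan-t5-small")
    print(qa("Who is Jason Mraz engaged to?")[0]["generated_text"])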

Tags

learning tutorial misc nlp rstats gan ethics research dataviz survey python tool security kaggle video thought bayesian humour tensorflow w_code bias dataset pytorch cv tip application javascript forecast swift golang rl jax julia gnn causal diffusion