by mark_riedl on 2019-08-22 (UTC).

‼️ 1.5B parameter GPT-2 model released, but not by OpenAI https://t.co/8tgjUWxjZo

— Mark 🦑. Riedl (@mark_riedl) August 22, 2019
Tags: nlp, research, tool
by hardmaru on 2019-08-23 (UTC).

This replication project trained a 1.5B parameter “OpenGPT-2” model on OpenWebTextCorpus, a 38GB dataset similar to the original, and showed comparable results to original GPT-2 on various benchmarks. 👏🏼 https://t.co/m4ZMB8RmdS https://t.co/ZrqJ0IuHbw https://t.co/o3KBv5VXKJ pic.twitter.com/pGN0p00DBR

— hardmaru (@hardmaru) August 23, 2019
Tags: nlp, research, tool
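For readers who want to try a released 1.5B-parameter GPT-2-class checkpoint themselves, here is a minimal sketch using the Hugging Face transformers library. The model name "gpt2-xl" (OpenAI's own 1.5B release) is used purely for illustration; replication weights such as OpenGPT-2 would be loaded the same way from a local directory, assuming they are provided in a compatible format.

```python
# Minimal sampling sketch for a 1.5B-parameter GPT-2-class model.
# "gpt2-xl" stands in here; a local path to replication weights
# (e.g. OpenGPT-2) could be substituted if available in HF format.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-xl")
model = GPT2LMHeadModel.from_pretrained("gpt2-xl")

prompt = "The replication of large language models"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Top-k sampling, as commonly used in GPT-2 text-generation demos.
output = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,
    top_k=40,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```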
