by hardmaru on 2018-08-30 (UTC).

Revisiting Character-Based Neural Machine Translation with Capacity and Compression, from @GoogleAI. “We show that deep models operating at the character level outperform identical models operating over word fragments.” https://t.co/RR7GOI2ku4

— hardmaru (@hardmaru) August 30, 2018
research nlp
by hardmaru on 2018-08-30 (UTC).

I find character-level language modeling much simpler and more elegant than traditional word-token pipelines. I also think studying language at the character level is a more interesting research problem than studying it at the word level.

— hardmaru (@hardmaru) August 30, 2018
thought nlp
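To make the "simple and elegant" point concrete, here is a toy sketch (mine, not from either tweet) of what operating at the character level looks like: a bigram count model whose entire "tokenizer" is just iterating over characters, with no word-segmentation pipeline at all. The corpus string is an arbitrary illustration.

```python
from collections import Counter, defaultdict

# Toy character-level bigram counts: which character tends to follow which.
# Tokenization is trivial -- every character is already a token.
corpus = "character level models are simple and elegant"

bigrams = Counter(zip(corpus, corpus[1:]))

# For each character, count its possible successors.
next_char = defaultdict(Counter)
for (a, b), n in bigrams.items():
    next_char[a][b] += n

print(next_char["e"].most_common(1))  # → [('l', 3)]
```

The entire preprocessing step is `zip(corpus, corpus[1:])`; a word-level pipeline would first need segmentation, normalization, and a fixed vocabulary with out-of-vocabulary handling.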

Tags

learning tutorial misc nlp rstats gan ethics research dataviz survey python tool security kaggle video thought bayesian humour tensorflow w_code bias dataset pytorch cv tip application javascript forecast swift golang rl jax julia gnn causal diffusion