by gneubig on 2018-08-28 (UTC).

#EMNLP2018 paper by @eaplatanios on Context Sensitive Parameter Generation for Universal NMT! https://t.co/93mLfECxnC
Inspired by @hardmaru's HyperNetworks, we learn to generate NMT parameters for the languages we want to translate. Nice results on multilingual and zero-shot MT! pic.twitter.com/QOYrSwoPMq

— Graham Neubig (@gneubig) August 28, 2018
research nlp
by hardmaru on 2018-08-28 (UTC).

They trained a model to generate the model parameters of the encoder and decoder LSTMs used for multilingual machine translation! https://t.co/B5kTkZ00HD

— hardmaru (@hardmaru) August 28, 2018
research nlp
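Since both tweets only gesture at the idea of generating the translation model's parameters from the language being translated, here is a minimal sketch of contextual parameter generation, assuming a PyTorch-style setup. The names (`ParamGenerator`, `lang_embed`, `lstm_cell_step`) and dimensions are hypothetical and the architecture is simplified relative to the paper: a small generator maps a language embedding to the flat weight vector of a recurrent encoder/decoder cell.

```python
# Sketch (not the paper's exact architecture): a hypernetwork-style generator
# produces the LSTM cell parameters for a given language id.
import torch
import torch.nn as nn


class ParamGenerator(nn.Module):
    """Generates the parameters of an LSTM cell from a language embedding."""

    def __init__(self, num_langs: int, lang_dim: int, input_dim: int, hidden_dim: int):
        super().__init__()
        self.lang_embed = nn.Embedding(num_langs, lang_dim)
        # An LSTM cell has 4 gates; each gate needs input->hidden and
        # hidden->hidden weights plus a bias.
        self.n_params = 4 * hidden_dim * (input_dim + hidden_dim + 1)
        self.generator = nn.Linear(lang_dim, self.n_params)
        self.input_dim, self.hidden_dim = input_dim, hidden_dim

    def forward(self, lang_id: torch.Tensor):
        flat = self.generator(self.lang_embed(lang_id))  # (n_params,)
        h, d = self.hidden_dim, self.input_dim
        w_ih = flat[: 4 * h * d].view(4 * h, d)
        w_hh = flat[4 * h * d : 4 * h * (d + h)].view(4 * h, h)
        bias = flat[4 * h * (d + h):]
        return w_ih, w_hh, bias


def lstm_cell_step(x, h, c, w_ih, w_hh, bias):
    """One LSTM step using the generated (functional) parameters."""
    gates = x @ w_ih.t() + h @ w_hh.t() + bias
    i, f, g, o = gates.chunk(4, dim=-1)
    c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
    h = torch.sigmoid(o) * torch.tanh(c)
    return h, c


# Usage: generate encoder parameters for a (hypothetical) language id 3,
# then run one recurrent step over a batch of source token embeddings.
gen = ParamGenerator(num_langs=10, lang_dim=8, input_dim=32, hidden_dim=64)
w_ih, w_hh, b = gen(torch.tensor(3))
x = torch.randn(16, 32)          # batch of token embeddings
h = c = torch.zeros(16, 64)
h, c = lstm_cell_step(x, h, c, w_ih, w_hh, b)
```

Because the languages share the generator, parameters for unseen language pairs can be produced from their embeddings alone, which is what makes the zero-shot setting mentioned in the first tweet possible.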
