Tweeted By @evolvingstuff
Transformer/LSTM hybrids!!

Language Models with Transformers: "we explore effective Transformer architectures for language model, including adding additional LSTM layers to better capture the sequential context while still keeping computation efficient" https://t.co/KVWjpsACwO

— Thomas Lahore (@evolvingstuff) April 23, 2019
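The idea in the quoted abstract — running an LSTM over the output of a Transformer stack so that sequential context is captured on top of the attention features — can be sketched roughly as below. This is an illustrative PyTorch sketch under assumed names and layer sizes, not the paper's actual architecture (the class `TransformerLSTMHybrid` and all hyperparameters here are made up for illustration):

```python
import torch
import torch.nn as nn

class TransformerLSTMHybrid(nn.Module):
    """Sketch of a hybrid: Transformer encoder layers followed by an
    LSTM layer, then a per-token projection to vocabulary logits.
    All sizes are illustrative, not taken from the paper."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4,
                 num_transformer_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.transformer = nn.TransformerEncoder(
            encoder_layer, num_layers=num_transformer_layers)
        # Additional LSTM layer on top of the self-attention features,
        # intended to model sequential (left-to-right) context.
        self.lstm = nn.LSTM(d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)       # (batch, seq, d_model)
        x = self.transformer(x)      # contextualized by attention
        x, _ = self.lstm(x)          # sequential refinement
        return self.out(x)           # per-token vocabulary logits

model = TransformerLSTMHybrid()
tokens = torch.randint(0, 1000, (2, 16))  # batch of 2 length-16 sequences
logits = model(tokens)
print(logits.shape)
```

Because the LSTM only adds a single recurrent pass over already-computed Transformer features, the extra cost is linear in sequence length, which is one way to read the abstract's claim of "keeping computation efficient."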