by peteskomoroch on 2020-01-16 (UTC).

Google Reformer: Transformer that can process text sequences of lengths up to 1 million words on a single accelerator using only 16GB of memory https://t.co/PoPANCE3PL via @googleai

— Peter Skomoroch (@peteskomoroch) January 16, 2020
research nlp
by GaryMarcus on 2020-01-16 (UTC).

Compare headline with the actual output:
“Google's AI language model Reformer can process the entirety of novels”
vs
“There was a time when the door, when anxious--he did most of all kicking his weary. It was a scarcely realisease talking ears fellow... https://t.co/78xVXHV4T9

— Gary Marcus (@GaryMarcus) January 16, 2020
misc
by pythontrending on 2020-01-18 (UTC).

reformer-pytorch - Reformer, the efficient Transformer, implemented in Pytorch https://t.co/SgcipMWpUQ

— Python Trending (@pythontrending) January 18, 2020
w_code nlp research
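
A minimal usage sketch of the reformer-pytorch library linked above, based on its README at the time; the exact argument names (num_tokens, dim, depth, max_seq_len, heads, causal) are assumptions and may differ across versions.

```python
# Minimal usage sketch of lucidrains' reformer-pytorch, based on its
# README at the time; argument names may have changed in later releases.
import torch
from reformer_pytorch import ReformerLM

model = ReformerLM(
    num_tokens = 20000,   # vocabulary size
    dim = 512,            # model dimension
    depth = 6,            # number of layers
    max_seq_len = 8192,   # LSH attention makes long contexts feasible
    heads = 8,
    causal = True         # autoregressive language modeling
)

x = torch.randint(0, 20000, (1, 8192))
logits = model(x)         # shape: (1, 8192, 20000)
```
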
by evolvingstuff on 2020-01-20 (UTC).

Reformer: The Efficient Transformer

"we replace dot-product attention by one that uses locality-sensitive hashing, changing its complexity from O(L^2) to O(L log L), where L is the length of the sequence"

paper: https://t.co/3o1scnoCCT

code: https://t.co/OjLbTyILln

— Thomas Lahore (@evolvingstuff) January 20, 2020
research nlp
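
The O(L log L) claim quoted above is the core trick: hash vectors so that similar queries and keys land in the same bucket, sort the sequence by bucket, and attend only within small chunks of the sorted sequence. Below is an illustrative PyTorch sketch of that bucketing; it deliberately omits the paper's multi-round hashing, causal masking, and attention across neighboring chunks, so treat it as a toy version of the idea rather than the paper's algorithm.

```python
# Illustrative sketch of Reformer-style LSH bucketing. Instead of
# comparing every query with every key (O(L^2)), vectors are hashed
# with random projections so nearby vectors share a bucket, then
# attention is restricted to chunks of the bucket-sorted sequence
# (roughly O(L log L) after sorting).
import torch
import torch.nn.functional as F

def lsh_hash(x, n_buckets, seed=0):
    """Angular LSH as in the Reformer paper: project onto random
    vectors and take argmax over the concatenation [xR, -xR]."""
    g = torch.Generator().manual_seed(seed)
    r = torch.randn(x.shape[-1], n_buckets // 2, generator=g)
    proj = x @ r                                             # (L, n_buckets/2)
    return torch.cat([proj, -proj], dim=-1).argmax(dim=-1)   # (L,)

def bucketed_attention(qk, v, n_buckets=16):
    L, d = qk.shape
    buckets = lsh_hash(qk, n_buckets)
    order = buckets.argsort()          # group similar vectors together
    inv = order.argsort()              # inverse permutation
    qk_s, v_s = qk[order], v[order]
    chunk = L // n_buckets
    out = torch.zeros_like(v_s)
    for i in range(0, L, chunk):       # attend only within each chunk
        q = qk_s[i:i+chunk]
        scores = q @ q.T / d ** 0.5
        out[i:i+chunk] = F.softmax(scores, dim=-1) @ v_s[i:i+chunk]
    return out[inv]                    # undo the sort

qk = torch.randn(1024, 64)   # shared query/key vectors, as in Reformer
v = torch.randn(1024, 64)
print(bucketed_attention(qk, v).shape)   # torch.Size([1024, 64])
```
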
by hardmaru on 2020-04-08 (UTC).

Another Transformer variant with lower computational complexity, suitable for long-range tasks, is Sparse Sinkhorn Attention (https://t.co/qWp2AJVdkd) by Yi Tay et al.

A GitHub Colab reimplementation in PyTorch (https://t.co/B5FcGuTZhy) also combined it with ideas from Reformer. https://t.co/WSwZuSRyPb pic.twitter.com/54fJrRbhEA

— hardmaru (@hardmaru) April 8, 2020
research nlp w_code pytorch
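
For context on the Sinkhorn idea mentioned in that tweet: the paper sorts blocks of the sequence with a learned "soft permutation" produced by Sinkhorn normalization, so each block of queries attends to one matched block of keys rather than the whole sequence. Below is a minimal sketch of just the Sinkhorn normalization step, which iteratively renormalizes rows and columns (in log space, for stability) until the matrix is approximately doubly stochastic; the block-scoring network around it is the paper's contribution and is not reproduced here.

```python
# Minimal sketch of Sinkhorn normalization, the core of Sparse Sinkhorn
# Attention's learned block sorting: repeated row/column normalization
# in log space turns a score matrix into an approximately doubly
# stochastic "soft permutation" over sequence blocks.
import torch

def sinkhorn(log_scores, n_iters=8):
    """Iterative row/column normalization in log space."""
    z = log_scores
    for _ in range(n_iters):
        z = z - z.logsumexp(dim=-1, keepdim=True)  # normalize rows
        z = z - z.logsumexp(dim=-2, keepdim=True)  # normalize columns
    return z.exp()

n_blocks = 8
scores = torch.randn(n_blocks, n_blocks)  # block-to-block affinities
perm = sinkhorn(scores)
print(perm.sum(dim=-1))   # each row sums to ~1
print(perm.sum(dim=-2))   # each column sums to ~1
```
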
