by ak92501 on 2021-05-18 (UTC).

Rethinking “Batch” in BatchNorm
pdf: https://t.co/ZfLGqlGxPv
abs: https://t.co/oJArBeNN90 pic.twitter.com/TgqI1HkQv7

— AK (@ak92501) May 18, 2021
research
by karpathy on 2021-05-18 (UTC).

This paper gives me anxiety. BatchNorm is the most deviously, subtly complex layer in deep learning. Many issues silently trace their root cause to it. Yet it is ubiquitous because it works well (it helps both optimization and regularization) and can be fused into affine layers at inference time. https://t.co/3EC2Abm8Ry

— Andrej Karpathy (@karpathy) May 18, 2021
research thought
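The fusion karpathy mentions works because BatchNorm in eval mode applies a fixed per-channel affine map, y = (x − μ)·γ/√(σ² + ε) + β, which can be folded into the weight and bias of a preceding convolution. Below is a minimal PyTorch sketch of that fold (the framework is assumed, the tweets don't name one, and `fuse_conv_bn` is a hypothetical helper, not from the paper):

```python
import torch
import torch.nn as nn

@torch.no_grad()
def fuse_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Fold an eval-mode BatchNorm2d into the preceding Conv2d.

    At inference BN computes y = (x - mu) * gamma / sqrt(var + eps) + beta,
    a fixed per-channel affine, so it can be absorbed into the conv's
    weight and bias.
    """
    fused = nn.Conv2d(
        conv.in_channels, conv.out_channels, conv.kernel_size,
        stride=conv.stride, padding=conv.padding,
        dilation=conv.dilation, groups=conv.groups, bias=True,
    )
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)  # gamma / sigma
    fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
    conv_bias = conv.bias if conv.bias is not None else torch.zeros_like(bn.running_mean)
    fused.bias.copy_((conv_bias - bn.running_mean) * scale + bn.bias)
    return fused

# Sanity check against conv -> bn in eval mode, with non-trivial running stats.
conv, bn = nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8)
bn.running_mean.normal_()
bn.running_var.uniform_(0.5, 2.0)
bn.eval()  # crucial: in train mode BN uses batch statistics and the fold is invalid
x = torch.randn(2, 3, 16, 16)
assert torch.allclose(fuse_conv_bn(conv, bn)(x), bn(conv(x)), atol=1e-5)
```

The `bn.eval()` line is exactly the kind of silent failure mode the tweet alludes to: with batch statistics active, the fused and unfused networks quietly diverge.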
