by ml_review on 2019-10-13 (UTC).

Mish: Self Regularized Non-Monotonic Activation Function
By @DigantaMisra1
f(x) = x · tanh(softplus(x))
Increased accuracy over Swish/ReLU
Increased performance over Swish

Github: https://t.co/RYzuj0xhDN
ArXiv: https://t.co/YJKTd4yKvr pic.twitter.com/zlyQ0hwggt

— ML Review (@ml_review) October 13, 2019
research w_code
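
The formula in the tweet above translates directly into a few lines of PyTorch. The sketch below is this page's own illustration (the Mish module name and the use of torch.nn.functional.softplus are assumptions here, not code taken from the linked repository):

import torch
import torch.nn as nn
import torch.nn.functional as F

class Mish(nn.Module):
    # Mish activation: f(x) = x * tanh(softplus(x))
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))

# Usage: drop it in wherever ReLU would go, e.g.
# model = nn.Sequential(nn.Linear(128, 64), Mish(), nn.Linear(64, 10))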
by jeremyphoward on 2019-10-14 (UTC).

This new activation function has seen quite a bit of success in the @fastdotai community already, including generating a forum discussion with over 500 posts (including many from the author of the paper)! https://t.co/lL05utISsp https://t.co/6LU0CiCXrg

— Jeremy Howard (@jeremyphoward) October 14, 2019
research
