Mish: Self Regularized Non-Monotonic Activation Function
By @DigantaMisra1
f(x) = x · tanh(softplus(x))
Increased accuracy over Swish/ReLU
Increased performance over Swish
GitHub: https://t.co/RYzuj0xhDN
arXiv: https://t.co/YJKTd4yKvr
— ML Review (@ml_review) October 13, 2019
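For readers who want to try the formula from the tweet, here is a minimal PyTorch sketch of the activation. The class name and test values are mine for illustration; the linked GitHub repo contains the author's reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Mish(nn.Module):
    """Mish activation: f(x) = x * tanh(softplus(x)).

    Illustrative sketch based on the formula in the tweet,
    not the author's reference implementation.
    """

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # softplus(x) = ln(1 + exp(x)) is always positive; tanh squashes
        # it into (0, 1), so the output is smooth and non-monotonic,
        # dipping slightly below zero for negative inputs, unlike ReLU.
        return x * torch.tanh(F.softplus(x))


# Quick sanity check on a few sample points.
x = torch.linspace(-3.0, 3.0, steps=7)
print(Mish()(x))
```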