by andrewgwils on 2019-04-29 (UTC).

Stochastic Weight Averaging in Low Precision Training (SWALP)! Our new ICML paper (with PyTorch code). SWALP can match the performance of full-precision training, even with all numbers quantized down to 8 bits! https://t.co/eTWQlvUMYS pic.twitter.com/QUjhFeigY7

— Andrew Gordon Wilson (@andrewgwils) April 29, 2019
Tags: pytorch, research
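SWALP's core recipe is to run SGD with weights, gradients, and activations stored in low-precision fixed point using stochastic (unbiased) rounding, while the SWA weight average is accumulated in higher precision. Below is a minimal sketch of such a quantizer; the function name, the 8-bit word with 6 fractional bits, and the simulate-in-float approach are illustrative assumptions, not the paper's released code.

    import torch

    def quantize_stochastic(x, bits=8, frac=6):
        # Map x onto a fixed-point grid with `frac` fractional bits, clipped
        # to the range of a signed `bits`-bit word, using stochastic rounding
        # so the quantization is unbiased in expectation.
        scale = 2.0 ** frac
        upper = (2.0 ** (bits - 1) - 1) / scale
        lower = -(2.0 ** (bits - 1)) / scale
        scaled = x * scale
        floor = torch.floor(scaled)
        rounded = floor + (torch.rand_like(x) < scaled - floor).float()
        return torch.clamp(rounded / scale, lower, upper)

    # Toy usage: quantized SGD on 0.5 * ||w - 1||^2, with the SWA average
    # accumulated in full precision as SWALP prescribes.
    w = quantize_stochastic(torch.randn(10))
    swa, n = torch.zeros_like(w), 0
    for step in range(200):
        grad = w - 1.0  # analytic gradient of the toy objective
        w = quantize_stochastic(w - 0.1 * quantize_stochastic(grad))
        swa, n = (swa * n + w) / (n + 1), n + 1

The unbiasedness of the rounding is what lets the high-precision average recover accuracy the individual 8-bit iterates lose.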
by PyTorch on 2019-04-29 (UTC).

Stochastic Weight Averaging: a simple procedure that improves generalization over SGD at no additional cost.
Can be used as a drop-in replacement for any other optimizer in PyTorch.
Read more: https://t.co/IRhz40AZKU
guest blogpost by @Pavel_Izmailov and @andrewgwils pic.twitter.com/yU0HKDYr7v

— PyTorch (@PyTorch) April 29, 2019
Tags: pytorch, w_code, learning, tool, survey
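The linked post shows SWA as a wrapper around an existing PyTorch optimizer via torchcontrib. A minimal sketch of that drop-in usage (the model, data, and hyperparameters here are placeholders):

    import torch
    import torchcontrib

    # Placeholder model and data; any nn.Module and DataLoader work the same way.
    model = torch.nn.Linear(20, 2)
    loader = [(torch.randn(32, 20), torch.randint(0, 2, (32,))) for _ in range(10)]

    base_opt = torch.optim.SGD(model.parameters(), lr=0.1)
    # Wrap the base optimizer: after step `swa_start`, a snapshot of the
    # weights is folded into a running average every `swa_freq` steps,
    # while training continues at learning rate `swa_lr`.
    opt = torchcontrib.optim.SWA(base_opt, swa_start=50, swa_freq=5, swa_lr=0.05)

    for epoch in range(20):
        for x, y in loader:
            opt.zero_grad()
            loss = torch.nn.functional.cross_entropy(model(x), y)
            loss.backward()
            opt.step()
    # Swap the averaged weights into the model before evaluation.
    opt.swap_swa_sgd()

For models with batch normalization, the post also recommends recomputing activation statistics under the averaged weights (torchcontrib provides a bn_update helper for this) before evaluating.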
by deliprao on 2019-04-29 (UTC).

One line change: pic.twitter.com/2DWLuQcjue

— Delip Rao (@deliprao) April 29, 2019
Tags: tip, tool
by seanjtaylor on 2019-04-29 (UTC).

Buried at the end of this post is a neat discussion of using SWA-Gaussian to get uncertainty estimates from deep learning models. I’m looking forward to checking that out. https://t.co/XU0arMvjGt

— Sean J. Taylor (@seanjtaylor) April 29, 2019
Tags: tool, pytorch
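SWA-Gaussian (SWAG) extends SWA from a point estimate to a distribution: the snapshots collected along the SGD trajectory give both the SWA mean and a covariance estimate, and sampling weights from the resulting Gaussian yields an ensemble whose averaged predictions carry uncertainty. A minimal diagonal-covariance sketch (function names are placeholders, and the full method additionally uses a low-rank covariance term):

    import torch

    @torch.no_grad()
    def swag_diag_fit(snapshots):
        # Fit a diagonal Gaussian to a list of flattened weight snapshots.
        stack = torch.stack(snapshots)                # (n_snapshots, n_params)
        mean = stack.mean(dim=0)                      # the SWA solution
        var = stack.pow(2).mean(dim=0) - mean.pow(2)  # E[w^2] - E[w]^2
        return mean, var.clamp_min(1e-8)

    @torch.no_grad()
    def swag_sample(mean, var):
        # Draw one weight vector w ~ N(mean, diag(var)).
        return mean + var.sqrt() * torch.randn_like(mean)

    # Toy usage: stand-ins for weight vectors saved along an SGD run.
    snapshots = [torch.randn(100) for _ in range(30)]
    mean, var = swag_diag_fit(snapshots)
    samples = [swag_sample(mean, var) for _ in range(10)]
    # At test time, load each sample into the model and average the
    # predictive distributions to get the uncertainty estimate.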
