Bayesian Methods Pros And Cons https://t.co/eZ2bbpDzwV pic.twitter.com/bw3OvywEOE
— Chris Albon (@chrisalbon) August 10, 2018
New Course: Fundamentals of Bayesian Data Analysis in R https://t.co/BIPmhlvRVQ #rstats #DataScience
— R-bloggers (@Rbloggers) August 9, 2018
New blog post: how (not) to introduce newcomers to #Bayesian analysis #teaching #statistics https://t.co/FjBetsUYnX
— Robert Grant (@robertstats) August 7, 2018
Variational Inference: A Review for Statisticians [41pp]
By @blei_lab
Mean-field variational inference
Bayesian mixture of Gaussians
Variational inference with exponential families https://t.co/uGI8awaWZ2 #MachineLearning
— ML Review (@ml_review) August 7, 2018
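For readers who want to see the review's running example in action, here is a minimal sketch of coordinate-ascent mean-field variational inference for a unit-variance Bayesian mixture of Gaussians. The update equations follow the standard CAVI derivation for this model; the function name, defaults, and toy data below are my own illustration, not code from the paper.

```python
import numpy as np

def cavi_gmm(x, K=2, prior_var=10.0, n_iter=50, seed=0):
    """Coordinate-ascent mean-field VI for a mixture of unit-variance
    Gaussians with component means mu_k ~ N(0, prior_var).
    Variational family: q(mu_k) = N(m_k, s2_k), q(c_i) = Categorical(phi_i).
    Returns variational means, variances, and responsibilities."""
    rng = np.random.default_rng(seed)
    m = rng.normal(size=K)   # variational means of the component means
    s2 = np.ones(K)          # variational variances of the component means
    for _ in range(n_iter):
        # update responsibilities: log phi_ik ∝ E[mu_k] x_i - E[mu_k^2]/2
        logits = np.outer(x, m) - 0.5 * (s2 + m**2)
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        phi = np.exp(logits)
        phi /= phi.sum(axis=1, keepdims=True)
        # update q(mu_k): precision-weighted average of assigned data
        nk = phi.sum(axis=0)
        s2 = 1.0 / (1.0 / prior_var + nk)
        m = s2 * (phi * x[:, None]).sum(axis=0)
    return m, s2, phi

# toy data: two well-separated clusters at -3 and +3
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
m, s2, phi = cavi_gmm(x, K=2)
print(sorted(m))  # variational means land near the true cluster centers
```

Each coordinate update here has a closed form because the model is conditionally conjugate, which is exactly the exponential-family structure the review emphasizes.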
ICYMI, 📃 rstanarm, RStan…
"bayesplot: cheatsheets for the Stan ecosystem" ✏️ Edward Roualdes
https://t.co/LTFm9wMReK #rstan #rstats #dataviz pic.twitter.com/cmN2HHwLgU
— Mara Averick (@dataandme) July 30, 2018
Awesome MCMC animation site by Chi Feng! On Github! https://t.co/uN9W1amPyi
— Andrew Gelman (@StatModeling) July 26, 2018
📃 rstanarm, RStan…
"bayesplot: cheatsheets for the Stan ecosystem" ✏️ Edward Roualdes
https://t.co/LTFm9wMReK #rstan #rstats #dataviz pic.twitter.com/I2D3OoDx50
— Mara Averick (@dataandme) July 23, 2018
#brms version 2.4 is now on CRAN with extended functionality for non-linear and mixture models, more options to amend the generated @mcmc_stan code and lots of other new features. Check out https://t.co/sxO4SRqNJ9 #rstats
— Paul Bürkner (@paulbuerkner) July 21, 2018
How do we specify priors for Bayesian neural networks? Check out our work on Noise Contrastive Priors at the ICML Deep Generative Models workshop 11:40am+. @danijarh, @alexirpan, Timothy Lillicrap, James Davidson https://t.co/OS3gmKin9g pic.twitter.com/cNhfeCYuNb
— Dustin Tran (@dustinvtran) July 15, 2018
Love this paper, it restores your faith in humanity.
Great challenge, great science, and brings together a number of ideas across different fields.
More of this type of work please! @dennisprangle #ICML2018 https://t.co/p4D6nx6TW0
— Neil Lawrence (@lawrennd) July 12, 2018
There is no better way to finish off some Bayesian optimisation than with a bit of local optimisation: congratulations to Mark McLeod on his #icml2018 paper! Featuring a novel stopping criterion and IMHO a killer acronym. https://t.co/SSV6gbXuU4 pic.twitter.com/fCEAzzOpcK
— Michael A Osborne (@maosbot) May 23, 2018
Visual guide to Bayesian Thinking https://t.co/GuedRgwD4J #MachineLearning #DataScience #Statistics
Now with Legos: https://t.co/8HlX82PVyA pic.twitter.com/rw1ZPj5DlN
— Kirk Borne (@KirkDBorne) September 1, 2017