by hardmaru on 2018-06-28 (UTC).

Guided Evolutionary Strategies: Escaping the curse of dimensionality in random search, from Google Brain: @Niru_M @Luke_Metz @GeorgeJTucker @JaschaSD.
Paper: https://t.co/6UmZdNyzLz
GitHub: https://t.co/BrKeRlkEBS
Notebook: https://t.co/AzuBF7coAP pic.twitter.com/HIGEEi7FkN

— hardmaru (@hardmaru) June 28, 2018
research
by hardmaru on 2018-06-28 (UTC).

This algorithm beats CMA-ES (kind of like the LSTM of the ES world) for a few tasks. CMA-ES (from Nikolaus Hansen) is still my algorithm of choice for blackbox optimisation. I wonder if this algo will consistently beat CMA-ES on a variety of different tasks and make me use it ..

— hardmaru (@hardmaru) June 28, 2018
research
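For readers unfamiliar with the baseline being discussed: CMA-ES maintains a Gaussian search distribution and adapts both its full covariance matrix and its step size from the ranking of sampled candidates. The sketch below is a compact, simplified (μ/μ_w, λ) CMA-ES in NumPy following the standard tutorial parameterization — it is not the authors' code, and the hyperparameters, sphere objective, and iteration budget are illustrative choices.

```python
import numpy as np

def cma_es(f, x0, sigma=0.5, iters=300, seed=0):
    """Minimal (mu/mu_w, lambda)-CMA-ES sketch (positive weights only)."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    lam = 4 + int(3 * np.log(n))            # population size
    mu = lam // 2                            # number of selected parents
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                             # recombination weights
    mueff = 1.0 / (w ** 2).sum()             # variance-effective selection mass
    # Standard learning rates and damping.
    cc = (4 + mueff / n) / (n + 4 + 2 * mueff / n)
    cs = (mueff + 2) / (n + mueff + 5)
    c1 = 2 / ((n + 1.3) ** 2 + mueff)
    cmu = min(1 - c1, 2 * (mueff - 2 + 1 / mueff) / ((n + 2) ** 2 + mueff))
    damps = 1 + 2 * max(0, np.sqrt((mueff - 1) / (n + 1)) - 1) + cs
    chiN = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))

    m = np.array(x0, dtype=float)            # distribution mean
    C = np.eye(n)                            # covariance matrix
    ps = np.zeros(n)                         # step-size evolution path
    pc = np.zeros(n)                         # covariance evolution path
    for g in range(iters):
        D2, B = np.linalg.eigh(C)            # C = B diag(D^2) B^T
        D = np.sqrt(np.maximum(D2, 1e-20))
        z = rng.standard_normal((lam, n))
        y = (z * D) @ B.T                    # y_i ~ N(0, C)
        x = m + sigma * y
        fit = np.array([f(xi) for xi in x])
        idx = np.argsort(fit)                # minimization: best first
        y_sel = y[idx[:mu]]
        y_w = w @ y_sel                      # weighted recombination
        m = m + sigma * y_w
        # Step-size path uses C^{-1/2} y_w = B diag(1/D) B^T y_w.
        ps = (1 - cs) * ps + np.sqrt(cs * (2 - cs) * mueff) * (B @ ((B.T @ y_w) / D))
        hsig = (np.linalg.norm(ps)
                / np.sqrt(1 - (1 - cs) ** (2 * (g + 1))) / chiN) < 1.4 + 2 / (n + 1)
        pc = (1 - cc) * pc + hsig * np.sqrt(cc * (2 - cc) * mueff) * y_w
        # Rank-one plus rank-mu covariance update.
        C = ((1 - c1 - cmu) * C
             + c1 * (np.outer(pc, pc) + (1 - hsig) * cc * (2 - cc) * C)
             + cmu * (y_sel.T * w) @ y_sel)
        sigma *= np.exp((cs / damps) * (np.linalg.norm(ps) / chiN - 1))
    return m

# Toy usage: minimize the 5-D sphere function.
f = lambda x: float(x @ x)
m = cma_es(f, x0=3 * np.ones(5))
print(f(m))
```

Note the two adaptation mechanisms that make CMA-ES hard to beat: the covariance update elongates the search distribution along recently successful directions, while the evolution-path step-size control prevents premature convergence.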
by jaschasd on 2018-06-28 (UTC).

Guided evolutionary strategies: escaping the curse of dimensionality in random search. A principled method to leverage training signals which are not the gradient, but which may be correlated with the gradient. Work with @niru_m @Luke_Metz @georgejtucker. https://t.co/LNPHDUrDFu

— Jascha (@jaschasd) June 28, 2018
research
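The core idea described above can be sketched in a few lines: Guided ES samples its perturbations from a Gaussian whose covariance mixes the full parameter space with a low-dimensional subspace spanned by the surrogate gradients, then uses an antithetic finite-difference estimator. The sketch below is a minimal NumPy illustration with a one-dimensional guiding subspace — the function names, hyperparameters, and the toy "surrogate gradient" (true gradient plus a fixed bias) are illustrative assumptions, not the paper's code.

```python
import numpy as np

def guided_es_step(x, f, surrogate_grad, n_pairs=16, sigma=0.1,
                   alpha=0.5, lr=0.05, rng=None):
    """One antithetic Guided ES update (k = 1 guiding subspace).

    Perturbations are drawn from N(0, sigma^2 * (alpha/n * I + (1-alpha) * U U^T)),
    where U is an orthonormal basis of the surrogate-gradient subspace.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    g = surrogate_grad(x)
    U = g / np.linalg.norm(g)                # orthonormal basis (k = 1)
    grad_est = np.zeros(n)
    for _ in range(n_pairs):
        # Sample via the covariance square root: full-space part + subspace part.
        eps = sigma * (np.sqrt(alpha / n) * rng.standard_normal(n)
                       + np.sqrt(1 - alpha) * U * rng.standard_normal())
        grad_est += (f(x + eps) - f(x - eps)) * eps   # antithetic pair
    grad_est /= 2 * sigma ** 2 * n_pairs
    return x - lr * grad_est

# Toy problem: minimize ||x||^2. The surrogate gradient is correlated with,
# but not equal to, the true gradient (a fixed bias is added).
f = lambda x: float(x @ x)
bias = np.full(100, 0.3)
sg = lambda x: 2 * x + bias

rng = np.random.default_rng(0)
x = np.ones(100)
for _ in range(200):
    x = guided_es_step(x, f, sg, rng=rng)
print(f(x))   # loss after 200 steps, starting from f = 100.0
```

The alpha parameter trades off exploiting the (possibly biased) surrogate direction against unbiased exploration of the full space, which is what lets the method tolerate training signals that are merely correlated with the gradient.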
by hardmaru on 2018-06-28 (UTC).

The great thing about this paper is how they made the results and method completely open, and we can easily try it out using the code in the colab notebook on other tasks: https://t.co/5Vh9lyAwOK

— hardmaru (@hardmaru) June 28, 2018
research
