by OpenAI on 2018-05-16 (UTC).

AI and Compute: Our analysis showing that the amount of compute used in the largest AI training runs has had a doubling period of 3.5 months since 2012 (net increase of 300,000x): https://t.co/YH4tZXirtU

— OpenAI (@OpenAI) May 16, 2018
misc
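The two headline figures in the tweet are mutually consistent: a 300,000x net increase at one doubling every 3.5 months implies a span of roughly five years, which matches the "since 2012" framing. A minimal sketch of that arithmetic (the 3.5-month and 300,000x numbers come from the tweet; the derived span is just log-base-2 bookkeeping):

```python
import math

# A 300,000x net increase corresponds to log2(300,000) doublings.
doublings = math.log2(300_000)   # ~18.2 doublings

# At one doubling every 3.5 months, that many doublings take:
months = doublings * 3.5         # ~64 months
years = months / 12              # ~5.3 years

print(f"{doublings:.1f} doublings over ~{years:.1f} years")
```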
by Smerity on 2018-05-16 (UTC).

Sitting at the forefront of computing, we like to imagine we have a good grip on rates of progress. Understanding that Moore's Law - hailed as a miracle enabler of the computing era - is a _snail_ compared to AI's compute trend is vital. Thanks for raising this @OpenAI. https://t.co/S3eR30HrPy

— Smerity (@Smerity) May 16, 2018
misc
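The "snail" comparison can be made concrete. Over the same span implied by the OpenAI figures (~64 months), transistor density under a classic Moore's Law cadence (assumed here as a ~24-month doubling, a common reading of the law) grows only single-digit multiples, versus the ~300,000x compute trend:

```python
months = 63.7  # span implied by the 3.5-month doubling / 300,000x figures

ai_growth = 2 ** (months / 3.5)    # ~300,000x over the period
moore_growth = 2 ** (months / 24)  # assumed 24-month doubling: ~6x

print(f"AI compute: ~{ai_growth:,.0f}x vs Moore's Law: ~{moore_growth:.1f}x")
```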
by Smerity on 2018-05-16 (UTC).

This doesn't state that brilliant research can't arise from efficient hardware use, it's simply saying that if brute compute can be used advantageously for training an agent (search, self play, ...) then it will be. Compute will become engrained within and across devices.

— Smerity (@Smerity) May 16, 2018
by Smerity on 2018-05-16 (UTC).

Being guarded against hype is important - but being cognizant of how quickly the landscape may change in our field of endeavor (and the industries and communities tied to it) is equally important.

— Smerity (@Smerity) May 16, 2018
