AI and Compute: Our analysis showing that the amount of compute used in the largest AI training runs has had a doubling period of 3.5 months since 2012 (net increase of 300,000x): https://t.co/YH4tZXirtU
— OpenAI (@OpenAI) May 16, 2018
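A quick sanity check of the arithmetic in the tweet above. The time span is not stated directly, only the doubling period and the net increase, so the span here is derived, not quoted:

```python
import math

# Claimed: compute in the largest AI training runs doubled every ~3.5 months
# since 2012, for a net increase of ~300,000x.
doubling_period_months = 3.5
net_increase = 300_000

# Number of doublings implied by a 300,000x increase.
doublings = math.log2(net_increase)            # ~18.2 doublings
span_months = doublings * doubling_period_months
span_years = span_months / 12                  # ~5.3 years

print(f"~{doublings:.1f} doublings over ~{span_years:.1f} years")
```

The two figures are mutually consistent: roughly 18 doublings at 3.5 months each spans a bit over five years, which matches late 2012 through the time of the post.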
Sitting at the forefront of computing, we like to imagine we have a good grip on rates of progress. Understanding that Moore's Law - hailed as a miracle enabler of the computing era - is a _snail_ compared to AI's compute trend is vital. Thanks for raising this @OpenAI. https://t.co/S3eR30HrPy
— Smerity (@Smerity) May 16, 2018
This doesn't state that brilliant research can't arise from efficient hardware use; it's simply saying that if brute compute can be used advantageously for training an agent (search, self play, ...), then it will be. Compute will become ingrained within and across devices.
— Smerity (@Smerity) May 16, 2018
Being guarded against hype is important - but being cognizant of how quickly the landscape may change in our field of endeavor (and the industries and communities tied to it) is equally important.
— Smerity (@Smerity) May 16, 2018