Sunday, June 24

Beyond Moore’s Law: AI training compute doubling every few months


OpenAI released an analysis showing that since 2012, the amount of compute used in the largest AI training runs has been increasing exponentially, with a 3.5-month doubling time.

By comparison, Moore’s Law had an 18-month doubling period.

Since 2012, this metric has grown by more than 300,000x (an 18-month doubling period would yield only a 12x increase).
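Those figures are self-consistent: a 300,000x increase at a 3.5-month doubling time spans roughly five and a third years, over which an 18-month doubling time compounds to only about 12x. A quick back-of-the-envelope check in Python (a sketch of the arithmetic, not OpenAI's own calculation):

```python
import math

growth = 300_000        # overall increase in training compute since 2012
fast_doubling = 3.5     # months per doubling, per OpenAI's analysis
moore_doubling = 18     # months per doubling, per Moore's Law

# How long does a 300,000x increase take at a 3.5-month doubling time?
doublings = math.log2(growth)       # ~18.2 doublings
months = doublings * fast_doubling  # ~64 months, about 5.3 years

# Compounding an 18-month doubling time over the same span:
moore_growth = 2 ** (months / moore_doubling)

print(f"{doublings:.1f} doublings over {months / 12:.1f} years")
print(f"Moore's Law growth over the same span: ~{moore_growth:.0f}x")  # ~12x
```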

Improvements in compute have been a key component of AI progress, OpenAI wrote in a blog post, so as long as this trend continues it is worth preparing for the implications of systems far outside today’s capabilities.

OpenAI is a non-profit AI research company that describes its aims as “discovering and enacting the path to safe artificial general intelligence”. It was co-founded by Elon Musk, who chaired the organization until he stepped down from its board earlier this year.

According to OpenAI, this was to avoid a conflict of interest with Musk’s work at Tesla and its AI-supported autonomous driving technology.

Compute advances AI

Three factors drive the advance of AI: algorithmic innovation, data (which can be either supervised data or interactive environments), and the amount of compute available for training.

Algorithmic innovation and data are difficult to track, but compute is unusually quantifiable, providing an opportunity to measure one input to AI progress.

“For this analysis, we believe the relevant number is not the speed of a single GPU, nor the capacity of the biggest datacenter, but the amount of compute that is used to train a single model — this is the number most likely to correlate to how powerful our best models are,” OpenAI wrote.
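For a concrete sense of what that number looks like, here is a minimal sketch of a GPU-time style estimate in OpenAI's unit of petaflop/s-days (the function, GPU counts, and utilization figure below are illustrative assumptions, not OpenAI's code):

```python
def training_petaflops_days(num_gpus, peak_flops, utilization, days):
    """Estimate total compute for one training run in petaflop/s-days.

    One petaflop/s-day = 10^15 operations per second sustained for a day,
    i.e. about 8.64e19 operations.
    """
    seconds = days * 24 * 3600
    total_ops = num_gpus * peak_flops * utilization * seconds
    return total_ops / (1e15 * 24 * 3600)

# Hypothetical run: 64 GPUs at 10 TFLOPS peak, ~33% utilization, 10 days.
print(training_petaflops_days(64, 10e12, 0.33, 10))  # ~2.1 petaflop/s-days
```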

[Chart: compute used in the largest AI training runs since 2012. Source: OpenAI]

OpenAI noted that AlphaGo Zero/AlphaZero is the most visible public example of massive algorithmic parallelism, but many other applications at this scale are now algorithmically possible and may already be happening in a production context.

Trends will likely continue

OpenAI said it sees multiple reasons to believe that the trend in the graph could continue.

For one, many hardware startups are developing AI-specific chips, some of which claim they will achieve a substantial increase in FLOPS/Watt (which is correlated with FLOPS/$) over the next one to two years.
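One way to see the link between the two metrics: for a fixed training run, FLOPS/Watt sets the energy bill, which is part of the dollar cost of compute. A toy calculation (every figure here is an assumption for illustration, not from the article):

```python
total_ops = 1.8e20      # ops in one training run (~2.1 petaflop/s-days)
flops_per_watt = 50e9   # assumed chip efficiency; FLOPS/W equals ops per joule
price_per_kwh = 0.10    # assumed electricity price in dollars

energy_joules = total_ops / flops_per_watt  # 3.6e9 J
energy_kwh = energy_joules / 3.6e6          # 1 kWh = 3.6e6 J -> 1,000 kWh
print(f"Energy cost of the run: ${energy_kwh * price_per_kwh:.2f}")  # $100.00
```

Doubling FLOPS/Watt halves that energy bill, which is one reason the two ratios tend to move together.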


Not surprisingly, one of the conclusions OpenAI draws from the analysis is that even a reasonable chance of rapid increases in capabilities makes it critical to start addressing both the safety and the malicious use of AI today.

But OpenAI also acknowledges that there’s no predicting the future.

“Past trends are not sufficient to predict how long the trend will continue into the future, or what will happen while it continues,” wrote OpenAI researchers.
