The post-exponential era of AI and Moore’s Law

My MacBook Pro is three years old, and for the first time in my life, a three-year-old primary computer doesn't feel like a crisis that must be resolved immediately.
True, this is partly because I'm waiting for Apple to fix their keyboard debacle, and partly because I still cannot stomach the Touch Bar. But it is also because three years of performance growth ain't what it used to be. It is no exaggeration to say that Moore's Law, the mind-bogglingly relentless exponential growth in our world's computing power, has been the most significant force in the world for the last fifty years.
So its slow deceleration and/or demise is a big deal, and not just because the repercussions are now making their way into every home and every pocket. We've all lived in hope that some other field would go exponential, giving us another, similar era, of course. AI/machine learning was the great hope, especially the distant dream of a machine-learning feedback loop: AI improving AI at an exponential pace for decades.
That now seems awfully unlikely. In truth, it always did.
A couple of years ago I was talking to the CEO of an AI company who argued that AI progress was basically an S-curve: we had already reached its top for sound processing, were nearing it for image and video, but were only halfway up the curve for text. No prize for guessing which one his company specialized in, but it seems to have been entirely correct. Earlier this week OpenAI released an update to their analysis from last year regarding how the computing power used by AI[1] is increasing.
The outcome? It "has been increasing exponentially with a 3.4-month doubling time (by comparison, Moore's Law had a 2-year doubling period). Since 2012, this metric has grown by more than 300,000x (a 2-year doubling period would yield only a 7x increase)." That's a lot of computing power to improve the state of the AI art, and it's clear that this growth in compute cannot continue.
Not "will not"; can not.
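To get a feel for the gap between those two doubling rates, here is a quick back-of-the-envelope sketch. The six-year window and the helper function are my own illustration, not OpenAI's methodology; the exact multiplier depends on the window chosen, which is why OpenAI's reported figure for its measured span is 300,000x rather than the number below.

```python
# Back-of-the-envelope: total multiplicative growth over a period,
# assuming a fixed doubling time throughout.
def growth_factor(months: float, doubling_time_months: float) -> float:
    """Growth after `months`, with one doubling every `doubling_time_months`."""
    return 2 ** (months / doubling_time_months)

period = 6 * 12  # roughly six years, in months (illustrative window)

ai = growth_factor(period, 3.4)    # OpenAI's reported 3.4-month doubling
moore = growth_factor(period, 24)  # Moore's Law: doubling every two years

print(f"AI compute over ~6 years: ~{ai:,.0f}x")   # on the order of millions
print(f"Moore's Law over ~6 years: ~{moore:.0f}x")  # 8x
```

The point isn't the precise numbers; it's that a 3.4-month doubling time compounds roughly seven times faster than Moore's Law, so the two curves diverge by orders of magnitude within just a few years.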
Sadly, the exponential growth in the computing power needed to train AI has happened almost exactly contemporaneously with the diminishment of the exponential growth of Moore's Law. Throwing more money at the problem won't help; again, we're talking about exponential rates of growth here, and linear expense adjustments won't move the needle. The takeaway is that, even if we assume great efficiency breakthroughs and performance improvements to reduce the rate of doubling, AI progress seems to be increasingly compute-limited at a time when our collective growth in computing power is beginning to falter.
Perhaps there'll be some sort of breakthrough, but in the absence of one, it sounds a whole lot like we're looking at AI/machine-learning progress leveling off, not long from now, and for the foreseeable future.

[1] It measures "the largest AI training runs," technically, but this seems trend-instructive.