If you look at artificial intelligence (AI) through an environmental lens, you can think of it much the way you think about cloud computing: an enormous and growing source of carbon emissions, one with the very real potential to choke off humans’ ability to breathe clean air long before a sentient and ornery AI goes all Skynet on us.
At the moment, data centers—the enormous rooms full of stacks and stacks of servers that juggle dank memes, fire tweets, your vitally important Google docs and all the other data that is stored somewhere other than on your phone and in your home computer—use about 2% of the world’s electricity.
Of that, servers that run AI—processing all the data and making the decisions and computations that a machine mimicking a human brain must handle in order to achieve “deep learning”—use about 0.1% of the world’s electricity, according to a recent MIT Technology Review article.
That figure, it turns out, is quite likely to grow.
Until recently, most scholarly dives into AI and the electricity grid focused on how AI could make power usage (and other carbon-emitting pastimes) smarter. Putting AI in charge of power loads over a “smart” grid would lead to greater efficiency, and perhaps less electricity use overall.
Left unaccounted for in such models, however, was how much electricity AI itself would require. If (almost) everything in an “internet of things” world is connected to the internet and also able to machine-learn—your car, your delivery drone, you name it—then everything will require some kind of data storage solution. And that will require electricity.
Put in global perspective, 0.1% of the total power load is less juice than is currently used by the American legal marijuana industry, which by most estimates draws about 1% of the nation’s electricity. (It’s also about half the demand of the Bitcoin network, whose infamous Chinese mines run on dirty coal power.)
Gary Dickerson is the president and CEO of Applied Materials, a major maker of semiconductor manufacturing equipment and thus an automatic expert in the demand for the circuitry required to handle all that data. According to MIT Technology Review, Dickerson recently told a conference audience in San Francisco that, unless far more efficient semiconductors are developed in the next five years, data centers handling AI workloads could account for 10% of the world’s electricity use by 2025, a hundred-fold increase in half a decade.
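As a quick sanity check on that arithmetic, using the article’s own figures (the script below is purely illustrative):

```python
# Dickerson's projection, using the figures quoted in the article.
current_share = 0.1    # AI data centers' share of world electricity today, in percent
projected_share = 10   # projected share by 2025, in percent

growth_factor = projected_share / current_share
print(growth_factor)   # 100.0, i.e. a hundred-fold increase
```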
Dickerson’s forecast is a worst-case scenario. Other tech execs have given estimates that vary wildly. Some think data centers, period, will suck up 10% of the global electricity load. Still others think that usage will remain relatively flat, in part because large companies can handle vast amounts of data in more efficient ways. Google, for example, is using AI to cool its data centers, reducing their cooling power demand by 40%. And Facebook runs “vast data arrays” in which more machines are each asked to do less, keeping power use roughly constant even as the data they handle grows.
At the same time, even flat demand for power poses a problem. Three of the world’s largest data centers, according to a recent estimate, are in areas that get very hot; two of them, in Utah and in Nevada, sit in literal deserts. And fossil fuels still account for almost two-thirds of the world’s electricity generation. That share is likely to hold steady or grow as the U.S. continues to export natural gas and as China builds coal-fired power plants around the world as part of its “Belt and Road” soft-power initiative, and at home to meet demand as its economy continues to grow.
Even a much more benign AI-demand scenario than Dickerson’s worst case will require a significant shift in technique if it is not to cause civilization-threatening havoc.
In the meantime, do enjoy the memes, and the gaming. They don’t come cheap.