The artificial intelligence arms race between nations dwarfs the nuclear weapons competition of decades past. Investments in AI are estimated to eclipse $7 trillion worldwide by 2030. The capital-intensive technology will require massive amounts of resources, including electricity, water and land.
Many countries are waking up to the perils of generative AI's explosive growth and rapid deployment. It is dawning on leaders at the national, state and local levels that the AI gold rush is fueling unprecedented challenges, with little time to adapt.
When AI first emerged, experts worried the technology would replace millions of jobs. That concern has temporarily taken a backseat as the world watches the relentless building boom of data centers. New AI data centers are expected to come online at a pace of more than one a day this year, totaling 504 by year's end.
The capital required to finance this rapid expansion of AI data centers is expected to hit $6.7 trillion by 2030, according to a study by McKinsey & Company. The price tag includes money for land, site development, power and cooling systems, hardware and human capital.
Private sector investment in AI topped $100 billion in the U.S. last year, nearly 10 times China's total. From 2013 to 2023, American private sector firms spent $470 billion, four times more than their Chinese counterparts. The U.S. government, mostly through defense programs, spent $5.2 billion over the same ten-year period.
Amazon leads the tech titans with capital expenditures projected to top $100 billion this year and potentially reach $118 billion. The corporate behemoth operates more than 100 data centers worldwide, each housing about 50,000 servers to support its cloud computing services.
This insatiable demand for capital is stressing corporate balance sheets and forcing a recalibration of the financial resources needed to build the backbone of the new economy. For perspective, that $7 trillion figure represents more than the Gross Domestic Product of every country but two: the U.S. and China.
Although data centers have been around since the 1940s, training and running AI models requires enormous amounts of computing power. AI workloads consume seven to eight times more energy than typical computing workloads, according to a Massachusetts Institute of Technology (MIT) study.
AI data centers house advanced computing, network and storage architectures, buttressed by energy and cooling systems built to handle high-density workloads. The facilities are crammed with graphics processing units (GPUs) that generate intense heat.
A distinctive feature of generative AI is how sharply its energy use fluctuates across the different phases of training a machine learning model. One study estimated that training a recent OpenAI model consumed 1,287 megawatt-hours of electricity, enough to power about 120 average U.S. homes for a year.
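For readers who want to check that comparison, here is a quick back-of-envelope sketch. The average household consumption figure is our assumption (in line with published U.S. averages), not a number taken from the study itself.

```python
# Back-of-envelope check of the "120 homes for a year" comparison.
# ASSUMPTION (not from the study): an average U.S. household uses roughly
# 10,700 kWh of electricity per year.

TRAINING_MWH = 1_287             # reported electricity used to train the model
AVG_HOME_KWH_PER_YEAR = 10_700   # assumed average annual household consumption

training_kwh = TRAINING_MWH * 1_000                        # 1 MWh = 1,000 kWh
homes_for_one_year = training_kwh / AVG_HOME_KWH_PER_YEAR  # roughly 120 homes

print(f"Enough to power about {homes_for_one_year:.0f} homes for a year")
```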
Scientists estimate that the power requirements of data centers nearly doubled between 2022 and the end of 2023. MIT researchers calculated that by 2026 the electricity consumption of all data centers will approach 1,050 terawatt-hours. One terawatt-hour equals one trillion watt-hours, or a billion kilowatt-hours.
A major new International Energy Agency (IEA) report calculates that electricity demand from data centers worldwide is expected to more than double by 2030, to around 945 terawatt-hours, less than the MIT estimate but still a hefty amount. That is more than the entire electricity consumption of Japan.
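To put those terawatt-hour projections in household terms, the sketch below applies the same assumed average consumption used above; the comparison is purely illustrative.

```python
# Rough comparison of the two projections cited above.
# ASSUMPTION (not from either report): ~10,700 kWh per average U.S. home per year.

AVG_HOME_KWH_PER_YEAR = 10_700

projections_twh = {
    "MIT, all data centers by 2026": 1_050,
    "IEA, data centers by 2030": 945,
}

def homes_for_a_year(twh: float) -> float:
    """Number of average homes a given amount of energy could power for a year."""
    kwh = twh * 1_000_000_000   # 1 TWh = 1 billion kWh
    return kwh / AVG_HOME_KWH_PER_YEAR

for label, twh in projections_twh.items():
    print(f"{label}: {twh} TWh, roughly {homes_for_a_year(twh) / 1e6:.0f} million homes for a year")

# The two estimates differ by about 105 TWh, a gap of roughly 10 percent.
```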
An already taxed electricity grid has prompted major technology companies to invest in their own energy facilities and to strike agreements for dedicated electricity resources. Microsoft, for instance, has agreed to purchase $16 billion in energy from the restarted Three Mile Island nuclear facility.
Google is collaborating with Kairos Power and the Tennessee Valley Authority to deploy advanced nuclear reactors that will supplement the grid powering its data centers in Tennessee and Alabama. Amazon is partnering with Talen Energy to secure nuclear power from the Susquehanna power station.
Data centers are already contributing to rising consumer electricity rates. The 13-state region served by PJM Interconnection is home to the largest concentration of data centers. Residential customers there were hit with a 20% spike in rates this summer as PJM's costs soared by $9 billion.
Today's hyperscale AI data centers can require 1,200 acres or more. To accommodate those acreage requirements, data centers are being built farther and farther from cities. Data centers have already gobbled up 282.8 million square feet of land in the U.S.
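As a point of reference, the sketch below converts that square-footage figure into acres and square miles; it is simple unit arithmetic, not an additional data point.

```python
# Unit conversion for the 282.8 million square feet cited above.
# Conversion factors are standard; the footprint figure comes from the article.

SQ_FT_PER_ACRE = 43_560
SQ_FT_PER_SQ_MILE = 27_878_400    # 5,280 ft x 5,280 ft

total_sq_ft = 282_800_000         # reported U.S. data center footprint

print(f"{total_sq_ft / SQ_FT_PER_ACRE:,.0f} acres")          # about 6,500 acres
print(f"{total_sq_ft / SQ_FT_PER_SQ_MILE:.1f} square miles")  # about 10 square miles
```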
Northern Virginia, home to a high concentration of data centers, reports 51 million square feet of land dedicated to the facilities. Operators require large tracts of land to develop multiple buildings over time. The acreage includes buffer zones for cooling plants, backup generators and electrical substations.
Water consumption is an often overlooked issue when it comes to AI data centers. Training AI models generates significant heat, increasing the need for water to cool equipment and keep humidity low. One study found that AI data centers could need as much as 720 billion gallons of water annually by 2028.
Google's data center in Henderson, Nevada, consumed 352 million gallons of water in 2024, according to data obtained by the Las Vegas Review-Journal. Google reported using more than 6 billion gallons of water in 2023 across all of its data centers.
The drain on water and power resources is already causing some cities to rethink their support for massive data center projects. Tucson's city council recently defeated a proposal for a 290-acre data center in Pima County over concerns about water and electricity consumption.
The project would have generated $250 million in tax revenue, created 3,000 temporary construction jobs and provided 180 permanent positions. Local officials identified the company behind the proposal as Amazon, but the firm declined to comment on the project.
This litany of thorny issues facing AI should not detract from its enormous potential. Goldman Sachs predicts AI will boost global GDP by $7 trillion over ten years. McKinsey projects generative AI will add between $2.6 trillion and $4.4 trillion annually to the world's economy.
The winners in the AI race will be those countries that encourage industry to address the requirements for power, water, acreage and financing before it is too late. The good news is that the AI transformation is already fueling cutting-edge solutions that will help fulfill the promise of this technological revolution.