
Monday, August 25, 2025

Harnessing AI's Power Without Sapping Resources

The Artificial Intelligence arms race between nations dwarfs the nuclear weapons competition of decades past. Investments in AI are estimated to eclipse $7 trillion worldwide by 2030. The capital-intensive technology will require massive amounts of resources, including electricity, water and land.

Many countries are waking up to the perils of the explosive growth of generative AI and the rapid deployment of the technology. Leaders at the national, state and local levels are realizing that the AI gold rush has spawned unprecedented challenges, with little time to adapt.

At the birth of AI, experts worried the technology would replace millions of jobs. That issue has temporarily taken a backseat as the world watches the relentless building boom of data centers. A new AI data center is expected to come online every day this year, totaling 504 by year's end.

Capital required to finance this rapid expansion in AI data centers is expected to hit $6.7 trillion by 2030, according to a study by McKinsey & Company.  The price tag includes money for land, site development, power and cooling generators, hardware and human capital.

Private sector investment in AI topped $100 billion in the U.S. last year, nearly 10 times as much as China. From 2013 to 2023, private sector firms spent $470 billion, four times more than China. The U.S. government, mostly on defense, spent $5.2 billion during the same ten-year period.

Amazon leads the tech titans with capital expenditures this year projected to top $100 billion and potentially reach $118 billion. The corporate behemoth operates more than 100 data centers worldwide, each of which houses about 50,000 servers to support cloud computing services.

This insatiable demand for capital is stressing corporate balance sheets and forcing a recalibration of the financial resources needed to build the backbone of the new economy.  For perspective, that $7 trillion figure represents more than the Gross Domestic Product of every country but two: the U.S. and China.

Although data centers have been around since the 1940s, training and using AI requires enormous amounts of computing power. AI centers consume seven to eight times more energy than a typical computing workload, according to a Massachusetts Institute of Technology (MIT) study.

AI data centers house advanced computing, network and storage architectures, buttressed by energy and cooling systems to handle high-density workloads. AI centers are crammed with graphics processing units (GPUs) that generate intense heat.

A unique feature of generative AI is the increased fluctuation in energy use over the different phases of training machine learning models. One study estimated that the training process for a recent OpenAI model consumed 1,287 megawatt-hours of electricity, enough to power about 120 average homes for a year.
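As a back-of-the-envelope check on that comparison, the arithmetic can be sketched as follows. The 10,500 kWh-per-year figure for an average U.S. household is an assumption of this sketch, not a number from the article:

```python
# Rough sanity check on the "about 120 homes" comparison.
# Assumption (not from the article): an average U.S. household uses
# roughly 10,500 kWh of electricity per year.
training_mwh = 1_287                       # reported training-run consumption, MWh
home_kwh_per_year = 10_500                 # assumed average household usage, kWh
homes_for_a_year = training_mwh * 1_000 / home_kwh_per_year
print(round(homes_for_a_year))             # on the order of 120 homes for a year
```

Small changes to the assumed household figure move the result, but the order of magnitude holds.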

Scientists estimate the power requirements for data centers nearly doubled between 2022 and the end of 2023. MIT researchers calculated that by 2026 the electricity consumption of all data centers will approach 1,050 terawatt-hours. One terawatt-hour equals one trillion watt-hours of electricity.

A major new International Energy Agency (IEA) report calculates that electricity demand from data centers worldwide will more than double by 2030, reaching around 945 terawatt-hours, less than the MIT estimate but still a hefty amount. That is more than the entire electricity consumption of Japan.
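For perspective on how far apart the two projections are, a quick calculation using the figures above:

```python
# Comparing the two data-center electricity projections cited above.
mit_2026_twh = 1_050   # MIT estimate for 2026, terawatt-hours
iea_2030_twh = 945     # IEA projection for 2030, terawatt-hours
gap_twh = mit_2026_twh - iea_2030_twh
gap_pct = 100 * gap_twh / mit_2026_twh
print(gap_twh, round(gap_pct))   # a 105 TWh gap, about 10 percent
```

The estimates cover different years, so the gap reflects differing assumptions as much as differing horizons.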

An already taxed electricity grid has prompted major technology companies to invest in their own energy facilities and to strike agreements for  dedicated electricity resources. Microsoft, for instance, has agreed to purchase  $16 billion in energy from the restarted Three Mile Island nuclear facility. 

Google is collaborating with Kairos Power and the Tennessee Valley Authority to deploy advanced nuclear energy to supplement the electricity grid powering its data centers in Tennessee and Alabama. Amazon is partnering with Talen Energy to secure nuclear power from the Susquehanna power station.

Current data centers are already contributing to rising consumer electricity rates. The 13-state region served by PJM Interconnection is home to the largest concentration of data centers. Residential consumers were hit with a 20% spike in rates this summer as PJM's costs soared $9 billion.

Today's hyperscale AI data centers can require 1,200 acres or more. To accommodate those acreage requirements, data centers are being constructed farther and farther from cities. Data centers have already gobbled up 282.8 million square feet in the U.S.

Northern Virginia, home to a high concentration of data centers, reports 51 million square feet of land dedicated to the facilities.  Operators require large tracts of land to develop multiple buildings over time. The acreage includes buffer zones for cooling plants, backup generators and electrical substations. 

Water consumption is an often overlooked issue when it comes to AI data centers. Training AI models generates significant heat, increasing the need for water for cooling and humidity control. One study found that as much as 720 billion gallons of water annually will be needed by 2028 for AI data centers.

Google's data center in Henderson, Nevada, consumed 352 million gallons of water in 2024, according to data obtained by the Las Vegas Review-Journal. Google reported using more than 6 billion gallons of water in 2023 for all of its data centers.

The water and power resource drain is already causing some cities to rethink support for construction of massive data center projects. Tucson's city council recently defeated a proposal for a 290-acre data center in Pima County over concerns about water and electricity consumption.

The project would have generated $250 million in tax revenue, created 3,000 temporary construction jobs and provided 180 permanent positions. Local officials identified the company as Amazon, but the firm declined to comment on the proposed facility.

This litany of thorny issues facing AI should not detract from its enormous potential.  Goldman Sachs predicts AI will boost the global GDP by $7 trillion over ten years.  McKinsey projects generative AI will add between $2.6 trillion and $4.4 trillion annually to the world's economy. 

The winners in the AI race will be those countries that encourage industry to address the requirements for power, water, acreage and financing before it's too late. The good news is the AI transformation is already fueling cutting-edge solutions that will help fulfill the promise of the technological revolution.

Monday, July 31, 2023

Artificial Intelligence Heralds Profound Changes

Artificial Intelligence (AI) is a disruptive technology that offers the promise to usher in the next industrial revolution. However, despite the boom in AI applications, there are deepening concerns that the technology may negatively impact jobs, national security and privacy, and spread misinformation.

AI is the next evolution in machine learning. AI technologies enable computers to perform a variety of advanced functions, including understanding and translating written language. The technology can create new content such as text, images or audio, as well as analyze vast amounts of data.

That's a layman's definition of the technology. AI is a broad field that encompasses many different disciplines, including computer science, data analytics and statistics, hardware and software engineering, linguistics, neuroscience and even philosophy and psychology.

AI burst into the American consciousness with the release of ChatGPT, a free app developed by OpenAI, an AI research company. The app facilitates an almost human-like conversation with a chatbot that answers questions and can assist with tasks such as composing emails, essays or even poetry.

Released in November of last year, ChatGPT is the fastest-growing app in history, garnering more than 100 million active users. Seemingly every teenager has downloaded the app on his or her phone. This writer has been experimenting with the app for months. Think Apple's Siri on steroids.

One measure of the AI boom is the stock price of firms operating in the AI space, including tech giants Microsoft, Google and OpenAI. No firm has benefited more than Nvidia, which has ridden the wave to a 222% increase in market value. Nvidia makes a powerful chip that has become the workhorse for AI.

AI has attracted the interest of Congress and a legion of critics and champions.  Yet there is no denying there are many innovative applications for AI to automate workflow and processes, reduce human errors and eliminate repetitive tasks.

AI is being deployed in the healthcare industry at a dizzying rate. Since health costs are nearly one-fifth (19.7%) of the total U.S. economy, AI applications from the administrative side to the delivery of care have enormous potential to reduce costs and improve efficiency.

The technology is already being used by the Centers for Disease Control to analyze public health data. Increasingly, AI is being deployed to assist in analyzing imaging data from MRIs and CT scans. AI can handle some tasks performed by radiologists, a profession facing a declining number of specialists.

In one example, AI is being used to analyze cell images to determine which drugs are most effective for patients with neurodegenerative diseases.  Conventional computers are too slow to spot changes in neurons when patients are treated with different drugs.

That's just for starters. AI is being deployed in medical training, to assist medical professionals in clinical settings, to remotely monitor patients and for diagnostics. One AI program can detect current issues and predict a patient's likelihood of developing breast cancer in the next several years.

Beyond healthcare, virtually every industry is looking at ways to incorporate AI into their business. Microsoft and Google are working to integrate cutting edge AI into their search engines.  Engineering firms are finding ways AI can make their teams more effective.  

The software industry is exploring ways AI can eliminate certain tasks generally performed by early-career or junior programmers. Digital news platforms are employing AI to create stories, edited by human journalists. Schools are eyeing ChatGPT as a tool for students and teachers.

There is no question AI potentially will replace some workers, especially with accelerated advancements in the technology and industry's rapid integration of AI into their businesses. 

Another issue virtually unreported is the energy and resource drain that will be created by the growth of AI. A report from the School of Engineering and Applied Science at the University of Pennsylvania raises concerns as AI applications begin to scale up exponentially.

An estimate from the Semiconductor Research Corporation predicts the increasing deployment of AI will "soon hit a wall where our silicon supply chains won't be able to keep up with the amount of data created." Computer memory is stored on components made from silicon.

Companies operating AI systems store data in massive facilities all over the country. These facilities' carbon emissions doubled between 2017 and 2020. These centers consume on the order of 20 to 40 megawatts of power, roughly enough to supply 16,000 households with electricity.
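A quick order-of-magnitude check on that household comparison. The continuous-draw figure for an average U.S. home is an assumption of this sketch (roughly 10,500 kWh per year spread over 8,760 hours), not a number from the report:

```python
# Sanity check on "20 to 40 megawatts ... 16,000 households".
# Assumption (not from the article): an average U.S. home draws about
# 1.2 kW continuously (roughly 10,500 kWh per year / 8,760 hours).
low_end_mw = 20                            # low end of the cited range
avg_home_kw = 10_500 / 8_760               # ~1.2 kW continuous draw, assumed
households = low_end_mw * 1_000 / avg_home_kw
print(round(households))                   # close to the cited 16,000
```

At the 40 MW high end the same arithmetic yields roughly twice as many households, so the cited figure sits at the conservative end of the range.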

Like many technologies, there is a dark side to AI. Google's CEO Sundar Pichai is among a growing number of business leaders flagging the capability of the technology to fabricate images of public figures and average Americans that are nearly indistinguishable from reality.

Imagine in a political election AI is used to produce a video and audio fake of a candidate making racist or anti-American statements.  What if the forgery goes viral on social media before it can be detected? These so-called deepfakes have nabbed the attention of the Department of Defense.

In the context of national security, a fake could dupe military or intelligence personnel into divulging sensitive information to an adversary posing as a trusted colleague. The Pentagon recently awarded a contract to startup DeepMedia to design deepfake detection technology.

Many in Congress have been calling for guardrails to regulate AI. Before Congress could act, the White House announced  that seven of the nation's top AI developers agreed to guidelines aimed at ensuring the "safe" deployment of AI.  

Amazon, Google, Meta, Microsoft, OpenAI, Anthropic and Inflection agreed to the outline. The guidelines are voluntary and there are no penalties for violating the open-ended agreement.

Regulation often can restrain new technologies in the U.S., while foreign competitors are unleashed to push forward and leapfrog American companies. However, in this instance, there needs to be some rules to prevent the misuse of a relatively new technology by a few bad actors. 

Once AI becomes embedded in every business, it will be too late to govern the technology's applications without major business and political upheaval.