Operators of the so-called hyperscale data centers running ChatGPT and other AI chatbots are gobbling up energy much faster than renewable sources can supply it. Experts estimate a huge energy gap. Those who close it will profit.
One expert, Forrester Research senior analyst Abhijit Sunil, told me in an October 20 interview about two groups of publicly traded companies — data centers and providers of so-called immersion cooling systems — that could benefit from the soaring demand for computing created by Generative AI.
Publicly traded data center operators include Equinix, Digital Realty, and Akamai. Publicly traded providers of immersion cooling systems are Vertiv, Schneider Electric, and Lenovo, Sunil told me.
Of these, Equinix and Vertiv appear to have the most upside investment potential due to their expectations-beating growth, market leadership, and significant investments to satisfy growing demand.
Generative AI’s Recent And Projected Energy Demand
Generative AI has resulted in a significant increase in demand for energy. While experts agree on that point, they differ on how much energy demand will change in the future and what changes in technology will help to keep the demand for energy from outstripping the supply.
Here is what I take away from their comments:
- Datacenter energy demand has grown much faster than the supply of renewable energy available to satisfy it.
- That trend is likely to continue over the next decade due to the power-hungry computing requirements to train and operate Large Language Models.
- New computing strategies and technological innovations could mute the growth in that energy demand.
Datacenter Energy Demand Growing Faster Than Renewable Supply
Datacenter energy demand by hyperscale providers — e.g., Microsoft, Google, Amazon, and Meta — has grown faster than the renewable energy supply. As Benjamin C. Lee, a professor in the University of Pennsylvania’s departments of Electrical and Systems Engineering and Computer and Information Science, wrote me in an October 5 email, “Datacenter energy usage has increased at astonishing rates.”
To back that up, Lee and Harvard professor David Brooks found that between 2015 and 2021, operators of hyperscale datacenters increased their electricity usage at a 25% compound annual growth rate. Lee’s email indicated this “is a concern because investments in renewable energy have only grown by 7% per year on average, as reported by the U.S. Energy Information Administration (EIA).”
Lee expects hyperscale electricity demand to grow faster in the future. Since ChatGPT was launched after the time frame of their analysis, Lee does not know how much additional electricity consumption is due to Generative AI — which took off in November 2022 with the launch of ChatGPT. The effect of Large Language Models on electricity demand “is not yet visible in sustainability reports, but the 25% historical growth likely provides a lower bound on future growth (i.e., numbers will only get bigger),” Lee told me.
The Inflation Reduction Act could spur further investment in renewable energy generation to “mitigate the carbon footprint associated with electricity usage,” Lee wrote. “But I believe growth in electricity demand will outpace growth in carbon-free supply for the foreseeable future,” he concluded.
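At those growth rates, the gap compounds quickly. Here is a minimal Python sketch using the 25% demand and 7% renewable-supply growth rates cited above; the normalized starting values (demand and supply both set to 1.0 in year zero) are an assumption for illustration, not figures from the article:

```python
# Illustrative projection: datacenter electricity demand compounding at 25%/yr
# vs. renewable supply compounding at 7%/yr (rates cited by Lee above).
# Starting values are normalized to 1.0 — an assumption for illustration only.

def project(start: float, annual_rate: float, years: int) -> float:
    """Compound `start` at `annual_rate` for `years` years."""
    return start * (1 + annual_rate) ** years

for year in (1, 5, 10):
    demand = project(1.0, 0.25, year)
    supply = project(1.0, 0.07, year)
    print(f"Year {year:2d}: demand {demand:5.2f}x, supply {supply:4.2f}x, "
          f"ratio {demand / supply:4.2f}x")
```

After a decade at those rates, demand grows to more than nine times its starting level while renewable supply roughly doubles — which is why Lee sees the historical 25% figure as a lower bound on the problem.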
Generative AI Adoption Could Boost Energy Demand
Many analysts agree Generative AI will result in a significant boost to energy demand. They argue demand for ChatGPT and its peers will soar, Generative AI is a power hog, and the chips used to operate Generative AI models are energy-intensive. Despite their efforts to use more renewable energy, analysts suggest hyperscalers will need to use the traditional electricity grid to meet demand.
Here are the key points:
- AI adoption will grow rapidly in the next three years. AI adoption is just getting started. “We’re maybe at 1% of where the AI adoption will be in the next two to three years,” said Arijit Sengupta, founder and CEO of Aible, an enterprise AI solution company. “The world is actually headed for a really bad energy crisis because of AI unless we fix a few things,” Sengupta told Yahoo! Finance.
- ChatGPT is 10 to 100 times more power hungry than email. Energy consumption will dramatically increase. University of Washington research shows that hundreds of millions of ChatGPT queries can consume the equivalent of the energy used by 33,000 U.S. households — around one gigawatt-hour a day, according to Yahoo! Finance. A ChatGPT inquiry is “probably 10 to 100 times more power hungry” than an email, professor of electrical and computer engineering Sajjad Moazeni told Yahoo! Finance.
- GPUs require up to 15 times more power. Data centers are increasingly shifting from simpler central processing units to more advanced graphics processing units — made by companies such as Nvidia (NVDA) — which are far more energy intensive. “GPUs consume 10 to 15 times the amount of power per processing cycle than CPUs do. They’re very energy intensive,” Brady Brim-Deforest, CEO of Formula Monks, an AI technology consulting company, explained to Yahoo! Finance.
- Hyperscalers are investing in renewable energy to match their energy consumption. Google Cloud, Microsoft Azure, and Amazon Web Services all invest in renewable energy to match their annual electricity consumption. Microsoft’s Azure says it has removed as much carbon as it emitted — i.e., it has been 100% carbon neutral — since 2012 and will be carbon negative by 2030. Amazon has said it expects renewable energy to power 100% of its operations by 2025. Google aims to achieve net-zero emissions across all of its operations by 2030, reported Yahoo! Finance. Lee said hyperscalers might still need to draw energy from the grid because they cannot produce enough renewable energy during certain demand-intensive times of day. He is studying ways to shift computing to, say, the middle of the night when energy demand is lower, he told Yahoo! Finance.
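The household comparison above is easy to sanity-check. In this back-of-the-envelope sketch, the roughly 30 kWh/day figure for an average U.S. household is my assumption (approximately in line with EIA national averages), not a number from the article:

```python
# Back-of-the-envelope check of the "33,000 households ≈ 1 GWh/day" claim.
# Assumption: an average U.S. household uses roughly 30 kWh per day
# (approximately the EIA national average); this figure is not from the article.

households = 33_000
kwh_per_household_per_day = 30  # assumed average, kWh

total_kwh = households * kwh_per_household_per_day
total_gwh = total_kwh / 1_000_000  # 1 GWh = 1,000,000 kWh

print(f"{total_gwh:.2f} GWh per day")  # prints "0.99 GWh per day"
```

The result lands almost exactly on the one-gigawatt-hour-a-day figure cited by Yahoo! Finance, so the comparison holds up.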
Generative AI Computing Could Be Redesigned To Flatten Its Energy Consumption
Technical changes could flatten the growth of Generative AI computing’s energy demand.
One approach is to redesign how LLMs are built and operated. Generative AI systems perform three computing activities that consume roughly equal amounts of electricity. As Lee explained, based on the results of a study co-authored with Facebook, these activities consume the following percentages of total electricity:
- Pre-processing: 30%. Pre-processing collects and prepares data for machine learning.
- Training: 30%. Training learns the hundreds of billions of parameter values aimed at enabling a model to respond to queries with precision.
- Serving: 40%. Serving deploys a trained model so users and applications can issue queries and receive responses.
Changes to the ways training and serving are performed could alter Generative AI’s demand for energy. Lee envisions less frequent training computations as increases in the size of LLMs result in diminishing marginal increases in model accuracy. Under this scenario, energy demand growth might slow down.
However, a spike in demand for smaller models tuned to specific users and applications could lower the costs of building individual LLMs while increasing overall energy demand due to the computing requirements of satisfying the proliferation of demand for these “many mice,” Lee noted.
When it comes to serving LLMs, energy demand could increase or decrease. If most LLM queries come from developers querying LLMs through application programming interfaces, Lee expects energy costs to rise. If AI is instead integrated directly into software — e.g., Microsoft Copilot — energy demand could flatten.
New hardware could flatten growth in power consumption in the next decade. “We expect flat power consumption over the next decade. Older infrastructure is being discontinued. Newer chips are more energy efficient because they are more dense and consume less power,” Forrester’s Sunil told me.
The immediate problem of spiking Generative AI energy demand is “somewhat overhyped,” Sunil said. “Despite the increase from the new workloads, the flat energy consumption is a result of the greater efficiency of new workloads,” he added.
Startups are working on ways to reduce Generative AI energy consumption. Using server resources only when needed — dubbed serverless technology — is one way to cut energy consumption. “We can literally cut down energy use … in these [types] of AI workloads by one-third by just moving to serverless,” Aible’s Sengupta told Yahoo! Finance.
How Investors Can Profit From Generative AI’s Computing Demand Growth
Investors may be able to profit from Generative AI’s hunger for computing power by investing in data centers and/or publicly-traded providers of technology for cooling those data centers.
One analyst sees data center operators as winners. “The actual data usage and how all this comes together is going to be more concentrated to a couple of companies out there,” Angelo Zino, vice president and senior industry analyst at CFRA Research, told Yahoo! Finance.
“More and more companies are essentially renting space in the cloud rather than kind of investing and building their own data centers, just because in the future I think it’s going to be a lot more costly in nature,” he added.
While growth in demand for data centers may be more self-evident, the demand for so-called liquid cooling is less obvious.
Densely packed servers doing calculations all day long generate a great deal of heat, which must be removed to keep the equipment from overheating.
Liquid cooling — which circulates water or other coolants through heat exchangers to absorb the heat generated by computer components — is more efficient than fans or air conditioning, KPMG managing director Brian Lewis told Network World.
Liquid cooling has practical implications for data center operators. “Liquid cooling adds weight because it is sitting on the floor and is embedded into the circuit boards. It is heavier than air cooled which has big fans blowing in the data center,” Sunil told me.
As I noted earlier, Equinix and Vertiv strike me as interesting investment opportunities in the datacenter and liquid cooling markets, respectively.
Equinix’s Performance And Prospects
Equinix — a Redwood City, Calif.-based operator of 248 data centers in 31 countries — is a leader in the “global colocation data center market,” according to the company.
Its recent performance and prospects appear compelling. In its third quarter, which ended September 2023, Equinix reported strong growth — driven by the “integration of AI into enterprise business strategies” — along with rising profits and new customer wins.
Here are the key numbers from GuruFocus:
- Q3 revenue: $2.06 billion — up 12% from the year before.
- Q3 net income: $276 million — up 30%.
- Q3 deals closed: 4,200 “across more than 3,100 customers.”
- Q3 cash dividend: $4.26 per share, up 25%.
- 2023 revenue outlook: between $8.166 billion and $8.206 billion — up 12% to 13%.
- 2023 adjusted EBITDA outlook: between $3.680 billion and $3.710 billion — up 14% to 15%.
Equinix is proud of its results. “We delivered another solid quarter of results and continue to drive strong value creation on a per share basis, raising … our dividend for the full year,” Charles Meyers, President and CEO, said in a statement.
“We expect Equinix’s broad portfolio of offerings, in tandem with our key technology partners, will allow us to capture high-value opportunities across the AI value chain, positioning Platform Equinix to be the place where private AI happens and allowing customers to place compute resources in proximity to data,” he added.
Equinix is continuing to invest in global data centers. The company has 56 major projects underway across 39 markets in 23 countries. New projects added in the third quarter include “new builds in Madrid, Osaka, Sao Paulo, and Silicon Valley [and a] $42 million investment in its fourth International Business Exchange (IBX) data center in Mumbai,” reported GuruFocus.
Vertiv’s Performance And Prospects
Liquid cooling represents a fast-growing market opportunity and Vertiv is well-positioned to win. According to Polaris Market Research, the global data center liquid cooling market was valued at $1.81 billion in 2021 and is forecast to grow at a 24% average annual rate over the next five years.
Vertiv — a Westerville, Ohio-based provider of cooling and power management technology for datacenter customers — posted better than expected results in its most recent quarter.
Here are the key numbers:
- Q3 revenue: $1.74 billion — up 17.6% from the year before, according to Zacks Equity Research.
- Q3 earnings per share: $0.52 — up 126% from 23 cents a share in the previous year’s third quarter and 18.2% above investor expectations, according to Zacks Equity Research.
- Fiscal 2023 revenue forecast: $6.82 billion, up nearly 20%, according to analysts polled by FactSet, reported Investor’s Business Daily.
- Fiscal 2023 earnings per share forecast: $1.61, up 203%, noted IBD.
This fall, Deutsche Bank and Evercore ISI expressed optimism about Vertiv’s stock.
Deutsche Bank wrote Vertiv “clearly caught the AI wave,” in a Sept. 14 research note. The firm — which raised its price target to $48 — sees potential for over 50% further upside “if data center investments continue growing at a double-digit clip.”
Meanwhile, Evercore ISI called Vertiv an industry leader. In an Oct. 1 note cited by IBD, the firm wrote Vertiv “has the top market share for thermal management solutions and is a powerhouse in the arena.”
Evercore added that the company’s association with Liebert — the inventor of precision cooling — gives Vertiv an advantage over rivals such as Eaton and APC. Moreover, Evercore — which has a $50 price target on the company’s shares — noted Vertiv offers contracts on energy savings “to earn higher payouts, if a higher percentage of operating expenses is saved,” IBD reported.