When Microsoft (MSFT) pulled the plug on planned data centers in Ohio last month and a Wells Fargo (WFC) report suggested Amazon (AMZN) Web Services was reconsidering some leases, market watchers quickly diagnosed the symptoms: AI bubble concerns, demand uncertainty, and the inevitable cooldown after years of breakneck expansion.
There was just one problem with that analysis: The companies building these data centers say it’s wrong.
Rather than signaling doubt about AI’s future, recent data center adjustments by Amazon and Microsoft reflect an industry confronting harsh realities: power grids that take years to expand, land speculators inflating prices sixfold, and utilities overwhelmed with requests for more electricity than actually exists. The question isn’t whether AI infrastructure demand is real — it’s whether the real estate market and power grid can handle what’s coming.
Fixating on each lease update from the tech giants reflects a fundamental misunderstanding of how the data center market operates, said Andy Cvengros, a 20-year data center industry veteran at JLL (JLL) who represents major tech companies in their real estate deals. Unlike typical commercial tenants, hyperscalers work with the same partners across multiple markets and treat their portfolios holistically. That means cancellations in one location often coincide with expansions elsewhere. Moreover, the massive scale and long timelines involved make these adjustments routine business rather than strategic retreats.
“This stuff happens all the time,” Cvengros said. “Whereas two years ago, nobody followed any of this.”
The tech CEOs also reject the retreat narrative. Amazon’s Andy Jassy recently told shareholders they would be “very happy” with the company’s $100 billion AI infrastructure spending. Microsoft’s Satya Nadella dismissed data center adjustments as routine business that simply gets more attention now. “We’ve always been making adjustments to build, lease, what pace we build all through the last 10, 15 years,” Nadella said on an earnings call. “It’s just that you all pay a lot more attention to what we do quarter-over-quarter nowadays.”
That aligns with what Cvengros sees on the ground. Rather than pulling back, hyperscalers are “pausing to re-architect their strategy, ultimately to come back and continue doing what they’re doing in a large way,” he said. “We have not seen them slow down by any means.”
The real constraint isn’t wavering demand but basic infrastructure. Power grids across the country are struggling (or failing) to keep up with AI’s explosive energy requirements.
The scale of the challenge is unprecedented. In 2023, data centers consumed more than 4% of American electricity, and that could rise to 12% by 2028, according to the Department of Energy. New facilities are regularly requesting 500 megawatts or more — enough to power hundreds of thousands of homes. In Virginia alone, the state’s largest utility has connected 75 new data centers since 2019, driving statewide electricity sales up 7% and prompting projections of 85% demand growth over the next 15 years.
The tech industry is adjusting its buildout plans in response. “We are finding the best right sizing,” said Henrique Cecci, a Gartner (IT) analyst who tracks the data center market. Beyond infrastructure constraints, the data center sector is moderating its energy demand forecast, going from five- to six-fold growth expectations to a more realistic three- to four-fold increase. “Nobody knows exactly how big AI is, but everybody agrees it’s a big thing,” Cecci said.
The supply side tells a grimmer story. Utilities ordering necessary grid technology like combustion turbines today won’t receive them until 2029, according to the Electric Power Research Institute (EPRI), an independent energy research organization. Traditional grid buildout takes four to seven years under normal circumstances, but supply chain bottlenecks have made the situation worse.
The industry has navigated major technological shifts before. When virtualization was introduced in the early 2000s, companies could suddenly consolidate dozens of physical servers into just a few machines, dramatically increasing efficiency. Data centers built in that era suddenly found themselves using only 10% to 20% of their footprint, Cvengros said.
A similar adaptation could happen with AI infrastructure if efficiency improvements outpace usage growth. The emergence of vastly more efficient AI models like China’s DeepSeek has already prompted industry discussions about whether massive buildouts are necessary, with some companies reconsidering the scale of their planned expansions.
But the technology’s trajectory suggests otherwise. Nvidia’s (NVDA) roadmap shows future server racks will consume 600 kilowatts each — roughly a 30-fold increase over the 10- to 20-kilowatt standard that prevailed for the past two decades. Nvidia CEO Jensen Huang has signaled that while chips will be more efficient, they’ll also be more powerful and require more energy, telling the industry to start building for higher power demands now.
The demand is also shifting toward inference — running AI models for end users — rather than just training them. Unlike training, which happens in intensive but limited bursts to build models, inference runs continuously every time someone uses ChatGPT, asks Siri a question, or gets AI-powered search results. This usage could scale dramatically as AI applications reach broader adoption, creating sustained rather than cyclical demand for computing power.
The companies navigating this transition successfully will be those with real business models and deep pockets, not land speculators. “We’re nearing a 2% vacancy nationally,” Cvengros said, describing an extremely tight market where serious operators continue building while speculation gets shaken out. “I don’t necessarily see a bubble popping.”
Source: Quartz