How data centers get built
Posted: September 08, 2025

$7 trillion. That’s the amount McKinsey estimates could be spent on data centers by 2030. For context, that’s more than the combined GDP of the U.K. and France, the world’s sixth- and seventh-largest economies.
What effect such spending will have on the global economy—and society—is still largely unknown. For now, the most tangible downstream effect has been on the grid. In the U.S.—where the global data center build-out is at its most frenzied—electricity prices are increasing at double the rate of inflation. Analysis by the U.S. Department of Energy in 2024 found that data centers are expected to consume between 6.7% and 12% of the country’s electricity by 2028—up from 4.4% in 2023.

Meeting this new source of demand is proving to be a challenge for utilities, which are used to making investment decisions on a 30-year horizon. For hyperscalers, meanwhile, power availability is increasingly the number one consideration when selecting the location of a new data center—and its unavailability is a major bottleneck to their growth. “In May, Mr. Jassy told investors that Amazon could have sold more cloud services if it had more data centers,” reported the New York Times, referring to Amazon CEO Andy Jassy. “’The single biggest constraint is power,’ Mr. Jassy said.”
Given Amazon will spend over $100 billion on data center construction this year, Jassy likely knows what he’s talking about—but the size of that figure also indicates how much else, besides power, goes into getting a data center up and running.
From the server racks and switchgears to the cooling systems and uninterruptible power supplies, data centers are whirring symphonies of expensive, specialized machinery. And because data centers are essential engines of the digital economy, the symphony inside them can never stop.
Here's how these essential pieces of modern infrastructure get built.
Server racks and their power densities
The basic unit of a data center is a “rack.” A rack contains servers, and servers contain chips. In this industry, racks aren’t measured by their physical size. What matters is their power demand—which is growing rapidly.
The size of a data center (also measured in power) is determined by the power density of racks multiplied by the number of racks proposed. If a data center contained 1,000 racks, with each rack requiring 17 kilowatts of power, its size would be 17 megawatts.
Until recently, a data center of this size would have been considered large, but new AI “hyperscale” data centers are often 100 MW or more.
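The sizing arithmetic above can be sketched in a few lines (the function name and figures are illustrative, taken from the example in the text):

```python
def data_center_size_mw(num_racks: int, kw_per_rack: float) -> float:
    """Facility IT load in megawatts: rack count times per-rack power density."""
    return num_racks * kw_per_rack / 1000  # convert kW to MW

# The example above: 1,000 racks at 17 kW each
print(data_center_size_mw(1000, 17))  # 17.0 MW
```

A modern AI hyperscale facility reaches 100 MW with fewer racks than you might expect, because per-rack densities for AI workloads are climbing well past the 17 kW used here.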
Planning and design: power, water and site selection
The size of a data center naturally determines much of its requirements. It needs enough electricity to power both the chips and the cooling systems—which is a lot. According to the International Energy Agency (IEA), the annual electricity consumption of a 100 MW data center can be equivalent to 100,000 homes.[1]
The cooling systems may also require vast quantities of fresh water. The IEA estimates that on average a 100 MW data center consumes 2 million liters of water a day—the same amount as 6,500 households.[2] As Bloomberg reports, water availability is becoming an increasing issue when selecting sites for data centers.
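As a rough sanity check on the IEA comparisons above, the implied per-home and per-household figures can be back-calculated (this assumes the facility runs at full load year-round, which is a simplification):

```python
HOURS_PER_YEAR = 8760

# Electricity: a 100 MW facility at full utilization
facility_mw = 100
annual_kwh = facility_mw * 1000 * HOURS_PER_YEAR
homes = 100_000
print(annual_kwh / homes)  # implied ~8,760 kWh per home per year

# Water: 2 million liters a day spread over 6,500 households
liters_per_day = 2_000_000
households = 6_500
print(liters_per_day / households)  # implied ~308 liters per household per day
```

Both implied figures are in the right ballpark for average residential consumption, which suggests the IEA comparisons are straightforward full-utilization arithmetic rather than anything more exotic.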
But the challenges associated with cooling go beyond water availability. Those designing data centers aren’t just thinking about how to cool existing racks—they also have to factor in how to cool the racks of the future.
“The biggest challenge is designing for flexibility and to be futureproof,” Wendy Torell, Senior Research Analyst, Data Center Research & Strategy at Schneider Electric, told Our Industrial Life. “There's so much change happening on the IT side so rapidly, and it's impacting the way you power and cool. But cooling is the most disruptive because there's this big shift from air cooling to liquid cooling. And the operating temperatures of systems are changing fast.”
The need for power and water, along with physical space and a fiber internet connection, also narrows down the potential locations of data centers. To date, they are mostly located in major cities. However, a scarcity of space in these markets is leading developers to explore new locations.
Construction and commissioning
Building the shell of a data center tends to be the most straightforward part of construction. More complicated is the installation of specialized electrical and mechanical equipment, which often has long lead times.
The people doing this highly technical design and installation work are mechanical, electrical and plumbing (MEP) engineers. MEP engineers are essential in many kinds of construction projects (such as hospitals and offices), but they are especially important in data centers for two reasons.
The first is that, in data centers, the systems that MEP engineers are responsible for are highly interconnected. The second is that each system is individually essential and simply cannot be allowed to fail.
“What’s the difference between a data center and an office building?” says Victor Avelar, Chief Research Analyst, Data Center Research & Strategy at Schneider Electric. “If an office building goes down, it’s not the end of the world. If some of these data centers go down, it could alter the economy.”
Because the uptime of each system is essential, a high degree of redundancy is built into MEP systems. Many data centers have more than double the required equipment, to ensure that even a catastrophic failure would not lead to any downtime.
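The logic of doubling up equipment can be illustrated with a simple probability sketch (the failure rate below is invented for illustration, and the independence assumption is a simplification—real redundancy design also accounts for common-cause failures):

```python
def downtime_probability(unit_failure_prob: float, redundant_units: int) -> float:
    """Probability that ALL redundant units are down at once,
    assuming failures are independent (a simplification)."""
    return unit_failure_prob ** redundant_units

# Illustrative: a unit that is unavailable 1% of the time
single = downtime_probability(0.01, 1)   # no redundancy: down 1% of the time
doubled = downtime_probability(0.01, 2)  # fully duplicated ("2N"): down 0.01% of the time
print(single, doubled)
```

This is why duplicating equipment is so effective: each added redundant unit multiplies the chance of total failure by the (small) failure probability of one unit.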
Mechanical engineers are responsible for a data center’s cooling systems. Their work covers things like the design of HVAC systems, cooling methods (most notably, air cooling versus liquid cooling), and planning the flow and separation of hot and cold air.
Electrical engineers are concerned with power supply and distribution. Their scope includes everything from drawing enough power from the grid and supplying it to server racks, to the installation of uninterruptible power supplies (UPS) and back-up diesel generators.
Plumbing engineers are responsible for water systems. These are critical in data centers because the transportation of chilled water throughout the building (and the subsequent evaporation of hot water) plays a central role in temperature control. Another important facet of plumbing systems is fire prevention: every floor of a data center is fitted with sprinklers. In a cutting-edge facility belonging to Equinix, the sprinkler system is linked to an ultra-precise smoke detection system, to ensure that, in the event of a fire, only necessary sprinkler heads go off.[3]
MEP engineers are highly sought after. “The data center industry has faced shortages of staff and skills for more than a decade,” says the Uptime Institute in its most recent report. 46% of firms are struggling to find qualified candidates and “talent shortfalls in electrical and mechanical trades continue to be significant.”
Once the data center is fully constructed, the commissioning process begins with rigorous testing. The entire system is brought up to full capacity and monitored to ensure it operates correctly under load. During this phase, thermal imaging is used to identify and fix any hot spots that could lead to equipment failure. Only after these tests are completed and all issues are resolved is the data center fully commissioned and ready for use.
The cost of building a data center
Most of the cost is tied up in the IT equipment and other machinery—so-called “critical capacity.” The long lead times for some of this equipment can, as a recent report by Cushman & Wakefield noted, “delay project timelines, causing a chain reaction across all construction stages.”
CBRE estimates that in the U.S., data centers cost $10–$14 million/MW of capacity. At those prices, a single 100 MW facility costs $1–$1.4 billion. Multiply that across the global build-out, and McKinsey’s $7-trillion forecast perhaps isn’t as far-fetched as it initially seems.
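The per-megawatt arithmetic is worth making explicit (function name is illustrative; the cost figures are CBRE's estimates as cited above):

```python
def facility_cost_usd(capacity_mw: float, cost_per_mw_usd: float) -> float:
    """Total build cost: capacity times cost per megawatt."""
    return capacity_mw * cost_per_mw_usd

# A 100 MW facility at CBRE's $10-14 million/MW range
low = facility_cost_usd(100, 10e6)   # $1.0 billion
high = facility_cost_usd(100, 14e6)  # $1.4 billion
print(low, high)
```

At roughly a billion dollars per 100 MW facility, a $7-trillion global spend implies thousands of facilities of this scale being financed, sited, and built within the decade.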
Whatever the final number ends up being, one thing’s for sure: we aren’t close to it yet. “Eventually investment will tail off,” says Schneider’s Victor Avelar, “but right now we're still very early days—there's still a lot of build out.”
[1] IEA, Energy and AI, April 2025, p. 38.
[2] IEA, Energy and AI, April 2025, p. 242.
[3] Linus Tech Tips, I Can’t BELIEVE They Let Me in Here!, 19:15 onwards.