
Data centers, AI’s lifeline

John Denslinger is a former executive VP of Murata, president of SyChip Wireless, and president/CEO of ECIA, the industry’s trade association. His career spans 40 years in electronics.

In this article, John Denslinger argues that, good or bad, AI is fast becoming inseparable from human activity, and that AI’s lifeline is definitely the data center.

For the past year, much has been written about the explosive growth of AI. The enablers are savvy hardware and software companies, often called the darlings of Wall Street. You know the names. They top the NASDAQ leaderboard almost daily. But while these component companies garner the headlines, data centers are the real AI lifeline.

Each and every online action lives on in a data center, making data centers indispensable in the age of digitalization. But not all data centers are the same. Actually, there are four common types: onsite, colocation, hyperscale and edge. An onsite data center is typically located on company grounds and exclusively serves that company’s needs. A colocation data center offers space to one or more host businesses, with each tenant providing its own IT equipment. The hyperscale data center is probably the one most talked about: these are massive, mission-critical facilities associated with big-data companies like Amazon, Microsoft, IBM, Alphabet and Meta. Last is edge, perhaps the most sophisticated in terms of location. Edge architecture performs best moving time-sensitive data between client and server where latency and bandwidth are real concerns, so the facility is ideally situated between the data source and the data processing. Which types attract the most investment? In the last two years, 90 per cent of new builds were colocation and hyperscale. It would seem there is a data center option fitting every need.

Today, roughly 5,400 data centers operate in the US. Vacancy in primary markets sits at a record low of 3.7 per cent according to CBRE. Demand for more capacity is definitely strong, but energy availability ranks as the number one constraint on new installations. Major utilities are limiting future commitments to two to five-year timelines, recognizing the enormous headwinds facing traditional energy generation from fossil fuels and nuclear. Sustainable energy alternatives remain the preference, but continuity of supply is a concern. And while it is widely expected that AI may actually improve America’s grid efficiency, additional generation is still needed.

Data centers consume lots and lots of power. BCG reports that data centers consumed 2.5 per cent of total US electricity output in 2022 and projects that figure may grow to 7.5 per cent by 2030, with GenAI and IoT the overwhelming drivers of demand. That might seem surprisingly high, but here are just a few of the underlying dynamics worth noting, with a rough back-of-envelope translation of the BCG figures after the list:

  • A ChatGPT query consumes 10 times more power than a simple Google search (BCG)

  • Google’s anticipated AI search application could again double or triple ChatGPT’s power consumption (BCG)

  • AI chips and central processors will consume seven times more power than previous generations (WSJ)

  • Nvidia’s next-gen B100 AI GPU could draw an astounding 1,000W (Dell Technologies)

  • Nvidia’s Blackwell GPUs require liquid cooling to reach full potential. That won’t be a problem for newer data centers built for AI clusters, but existing facilities may need reconfiguration (GTC2024 / The Register)

  • Over the next five years, twice as much data will be created and captured as in the previous 10 years (JLL)

  • Major tech companies will spend $1 trillion on data centers over the next five years (Business Insider)

  • 50 billion IoT connections by 2025 (McKinsey)
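To put BCG’s 2.5 and 7.5 per cent shares in rough perspective, here is a minimal back-of-envelope sketch in Python. The total US generation figure of roughly 4,200 TWh and the assumption that output stays flat through 2030 are assumptions for illustration, not figures from the article or from BCG.

```python
# Back-of-envelope: translate BCG's data-center share of US electricity
# into absolute terms. The ~4,200 TWh total and flat output through 2030
# are assumptions for illustration only.
US_GENERATION_TWH = 4_200          # assumed annual US electricity output

share_2022 = 0.025                 # BCG: 2.5% consumed by data centers in 2022
share_2030 = 0.075                 # BCG projection: 7.5% by 2030

dc_2022_twh = share_2022 * US_GENERATION_TWH   # ~105 TWh
dc_2030_twh = share_2030 * US_GENERATION_TWH   # ~315 TWh

print(f"2022 data-center consumption: ~{dc_2022_twh:.0f} TWh")
print(f"2030 projected consumption:  ~{dc_2030_twh:.0f} TWh "
      f"({dc_2030_twh / dc_2022_twh:.1f}x today's level)")
```

Under these assumptions, the BCG projection implies data-center consumption roughly tripling, from on the order of 100 TWh to on the order of 300 TWh, which is why energy availability tops the list of constraints.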

The outlook: US data center demand will grow 10 per cent annually through 2030, according to a January 2023 McKinsey assessment. That demand spans: (1) new facilities; (2) industrial equipment like cooling apparatus and electrical and plumbing systems; (3) IT hardware and software; (4) management services; and (5) connectivity to utilities on one end and clients on the other.
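A quick sketch of what McKinsey’s 10 per cent annual growth rate implies over the projection window. Treating 2023 as the base year and normalizing starting demand to 1.0 are assumptions made purely to illustrate the compounding arithmetic.

```python
# Compound 10% annual growth in US data-center demand, 2023-2030.
# The 2023 base year and normalized starting demand of 1.0 are assumptions.
growth_rate = 0.10
base_year, end_year = 2023, 2030

demand = 1.0                         # normalized demand in the base year
for year in range(base_year + 1, end_year + 1):
    demand *= 1 + growth_rate

print(f"Demand multiple by {end_year}: {demand:.2f}x")   # ~1.95x, i.e. nearly double
```

Under that assumption, demand roughly doubles over the period, which is consistent with the breadth of investment categories listed above.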