How to Build an AI Data Center

ifp.org/how-to-build-an-ai-data-center..., by the Institute for Progress (IFP), June 20, 2024

This piece is the first in a new series called Compute in America: Building the Next Generation of AI Infrastructure at Home. In this series, we examine the challenges of accelerating the American AI data center buildout. Future pieces will be shared here.

| Indicator | 2015 | 2022 | Change |
| --- | --- | --- | --- |
| Internet users | 3 billion | 5.3 billion | 78% |
| Internet traffic | 0.6 ZB | 4.4 ZB | 600% |
| Data center workloads | 180 million | 800 million | 340% |
| Data center energy use | 200 TWh | 240–340 TWh | 20–70% |
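
The Change column is simple percentage growth over the 2015 value. A quick sketch of the arithmetic, using the rounded figures from the table (small differences from the table's percentages reflect rounding in the inputs):

```python
def pct_change(old: float, new: float) -> float:
    """Percentage growth from an old value to a new one."""
    return (new - old) / old * 100

# Values taken from the table above (2015 -> 2022).
print(f"Internet users:        {pct_change(3.0, 5.3):.0f}%")   # ~77%
print(f"Internet traffic:      {pct_change(0.6, 4.4):.0f}%")   # ~633%
print(f"Data center workloads: {pct_change(180, 800):.0f}%")   # ~344%
print(f"Energy use (low end):  {pct_change(200, 240):.0f}%")   # 20%
print(f"Energy use (high end): {pct_change(200, 340):.0f}%")   # 70%
```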

The Influence of AI

We can divide the likely impact of AI on data centers into two separate questions: the impact on individual data centers and the regions where they're built, and the impact of data centers overall on aggregate power consumption.

For individual data centers, AI will likely continue driving them to become larger and more power-intensive. As we noted earlier, training and running AI models requires an enormous amount of computation, and the specialized computers designed for AI are extremely power-hungry. While a rack in a typical data center consumes on the order of [5 to 10 kilowatts of power](https://www.datacenterfrontier.com/design/article/55020771/data-center-world-experts-drill-down-for-ai-facility-design-and-construction-case-study), a rack in an Nvidia DGX SuperPOD containing 32 H100s (specialized graphics processing units, or GPUs, designed for AI workloads, which Nvidia is selling by the millions) can consume more than 40 kilowatts. And while Nvidia's new GB200 NVL72 can train and run AI models more efficiently, it consumes far more power in absolute terms, drawing an astonishing 120 kilowatts per rack. Future AI-specific chips may push power consumption higher still: even if they are more computationally efficient (and they likely will be), they will likely still draw considerably more power in absolute terms.
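
To put those rack-level figures in context, here is a back-of-the-envelope sketch of what they imply per GPU and at facility scale. The per-rack wattages come from the text; the per-GPU split and the 1,000-rack facility size are illustrative assumptions, and the totals ignore cooling and other overhead outside the rack.

```python
# Per-rack power figures cited in the text, in kilowatts.
rack_kw = {
    "typical rack": 7.5,                      # midpoint of the 5-10 kW range
    "DGX SuperPOD rack (32x H100)": 40.0,
    "GB200 NVL72 rack": 120.0,
}

# GPU counts per rack for the two AI configurations mentioned above.
gpus_per_rack = {
    "DGX SuperPOD rack (32x H100)": 32,
    "GB200 NVL72 rack": 72,
}

# Power per GPU implied by the rack-level numbers (includes CPUs,
# networking, and other in-rack overhead, so it exceeds the GPU's own TDP).
for name, gpus in gpus_per_rack.items():
    print(f"{name}: {rack_kw[name] / gpus:.2f} kW per GPU")

# Hypothetical facility of 1,000 racks: IT load only, before cooling
# and power-delivery losses.
n_racks = 1_000  # illustrative assumption, not from the article
for name, kw in rack_kw.items():
    print(f"{n_racks:,} x {name}: {n_racks * kw / 1_000:.0f} MW")
```

At that scale, the gap compounds: 1,000 conventional racks draw roughly 7.5 MW of IT load, while 1,000 GB200 NVL72 racks would draw around 120 MW, before accounting for cooling.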
