Artificial intelligence (AI)

1 APR 2024 · 12 MIN READ

The Race to Power the AI Revolution: A Global Struggle for Datacenter Supremacy


The explosive growth of artificial intelligence, particularly the rise of powerful generative models, is putting unprecedented strain on the global datacenter industry. Demand for AI computing power is surging at a breakneck pace, with estimates showing total AI datacenter critical power demand nearly doubling from 49 gigawatts (GW) in 2023 to a staggering 96 GW by 2026. This trajectory far outpaces the industry's ability to build out new datacenter capacity, setting the stage for a looming power crunch that threatens to constrain the continued advancement of AI.
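As a quick sanity check on that trajectory, the 49 GW to 96 GW estimate implies roughly 25% compound annual growth. A minimal sketch, using only the two figures cited above:

```python
# Implied compound annual growth rate (CAGR) of AI datacenter critical
# power demand, from the 2023 and 2026 estimates cited above.
start_gw, end_gw = 49.0, 96.0
years = 2026 - 2023

# CAGR = (end / start)^(1 / years) - 1
cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 25% per year
```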

Figure 1: IEA Electricity 2024

At the heart of this challenge lie the unique requirements of AI workloads. Training large language models and other cutting-edge AI systems is an extraordinarily power-hungry endeavor, with individual GPU servers capable of drawing over 10 kilowatts (kW). Because these workloads are also less latency-sensitive than traditional enterprise or cloud applications, operators are free to site them wherever abundant, low-cost electricity is available.

Regions that can provide stable, low-carbon power grids and the logistical capability to rapidly scale up power generation capacity will have a significant advantage in hosting the AI computing infrastructure of the future. In this regard, the United States is emerging as a frontrunner, leveraging its ample natural gas reserves, growing renewable energy capacity, and relatively low electricity prices. In contrast, other potential AI hubs like Europe and Asia face significant headwinds, with higher power costs, greater reliance on imported fossil fuels, and more challenging regulatory environments for rapid datacenter buildouts.

Figure 2: US EIA, Various National and Regional Electrical Distribution Organizations

Power Density Challenges and the Race to Innovate Cooling

The immense power demands of AI compute are also pushing the limits of traditional datacenter designs. Whereas typical enterprise or cloud workloads might see power densities of 12-15 kW per rack, the next generation of AI-optimized hardware can draw over 10 kW per server. This means datacenters will need to support power densities of 30-40 kW per rack or higher to accommodate these advanced systems.
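The arithmetic behind those rack budgets is straightforward: a rack sized for enterprise densities simply cannot hold more than one AI server. A small sketch, using the per-server and per-rack figures quoted above (real layouts also depend on cooling and physical space, which this ignores):

```python
# Servers per rack at the power figures quoted in the text: ~10 kW per
# AI server versus rack budgets from enterprise (12-15 kW) to
# AI-optimized (30-40 kW). Illustrative only.
server_kw = 10.0
for rack_budget_kw in (15, 30, 40):
    servers = int(rack_budget_kw // server_kw)
    print(f"{rack_budget_kw} kW rack budget -> {servers} x {server_kw:.0f} kW servers")
```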

Figure 3: Server deployment positions for various rack densities

Achieving these power densities will require the widespread deployment of innovative cooling solutions, such as direct-to-chip liquid cooling, which can reduce per-rack power consumption by up to 10% and improve overall power usage effectiveness (PUE) by 0.2-0.3. Specialized air cooling techniques, like rear-door heat exchangers, will also be essential for handling the heat loads.
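To put the PUE improvement in perspective: because PUE is the ratio of total facility power to IT power, shaving 0.3 off it saves power in direct proportion to the IT load. A minimal sketch, where the 10 MW IT load and the two PUE values are illustrative assumptions rather than measured figures:

```python
# Total facility power for a fixed IT load at two PUE values, showing
# what a 0.3 PUE improvement (the upper end of the range above) is worth.
# The 10 MW IT load and both PUE values are illustrative assumptions.
it_load_mw = 10.0
pue_air, pue_liquid = 1.5, 1.2  # hypothetical air-cooled vs. liquid-cooled

for label, pue in (("air-cooled", pue_air), ("liquid-cooled", pue_liquid)):
    # PUE = total facility power / IT equipment power
    print(f"{label}: {it_load_mw * pue:.1f} MW total for {it_load_mw:.0f} MW of IT")

savings_mw = it_load_mw * (pue_air - pue_liquid)
print(f"Facility-level power saved: {savings_mw:.1f} MW")
```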

Retrofitting existing datacenters to handle these power densities can be extremely challenging and costly, as operators must reconfigure electrical infrastructure, plumbing, and physical layouts. Companies like Meta have recognized this challenge and have pivoted to designing new, AI-optimized facilities from the ground up rather than trying to adapt their legacy footprint.

Figure 4: NVIDIA DGX SuperPOD Datacenter Design

The Carbon Footprint of AI and the Sustainability Imperative

While the power-hungry nature of AI compute poses significant operational challenges, the environmental impact of these workloads is also under increasing scrutiny. Training large language models can carry a substantial carbon footprint: one GPT-3 training run has been estimated to generate 588.9 metric tons of CO2, equivalent to the annual emissions of 128 passenger vehicles.

As model sizes and training compute requirements continue to grow exponentially, managing the environmental impact of AI will be a critical industry-wide challenge. Factors like the carbon intensity of the local power grid, the mix of renewable and fossil fuel-based generation, and the efficiency of the training process will all play a key role in determining the carbon footprint of AI deployments.

Regions with access to cleaner power, such as France with its heavy reliance on nuclear energy, will have a natural advantage in this regard. Datacenters in these locations can achieve much lower emissions per kilowatt-hour of electricity consumed than regions with a higher proportion of coal- or gas-fired generation.
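The grid-mix effect reduces to simple arithmetic: emissions scale linearly with the carbon intensity of the electricity consumed. A sketch under stated assumptions, where the training-energy figure and the intensities are illustrative, not measured values:

```python
# Emissions of one hypothetical training run under different grid mixes:
# emissions (tCO2) = energy consumed (MWh) x grid intensity (tCO2/MWh).
# The 1,300 MWh figure and all three intensities are illustrative.
train_energy_mwh = 1_300
grid_tco2_per_mwh = {
    "nuclear-heavy (France-like)": 0.06,
    "mixed gas/renewables":        0.35,
    "coal-heavy":                  0.80,
}
for grid, intensity in grid_tco2_per_mwh.items():
    print(f"{grid}: {train_energy_mwh * intensity:,.0f} tCO2")
```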

Addressing the environmental impact of AI will require a multi-pronged approach, encompassing both technological and policy-driven solutions. Continued advancements in energy-efficient chip design, advanced cooling systems, and renewable power integration will be crucial on the technology front. Meanwhile, governments and regulatory bodies will need to establish guidelines and incentives to encourage the development of sustainable AI practices.

The Race for AI Datacenter Supremacy

To emerge as a true "AI Superpower" capable of supporting the exponential growth of AI computing, countries and regions must possess a unique combination of attributes related to their energy infrastructure and power supply:

  1. Inexpensive, abundant electricity: Low power prices are crucial given the immense energy demands of AI workloads. Regions with access to reliable, low-cost power generation will have a significant competitive advantage.

  2. Stable, robust energy supply: The ability to rapidly scale up power generation capacity and withstand geopolitical or weather-related disruptions to the energy supply chain is key. Countries that can guarantee the consistent availability of electricity will be better positioned to host mission-critical AI infrastructure.

  3. Low-carbon power mix: A greener grid, with a high proportion of renewable or nuclear energy sources, is essential for reducing the environmental impact of AI and aligning with the sustainability goals of significant technology companies.

The United States, with its abundant shale gas, rapidly expanding renewable energy capacity, and relatively low electricity prices, is well positioned to lead on all three fronts. Europe and Asia, as noted above, face steeper hurdles across each of them.

Figure 5: US EIA, Various National and Regional Electrical Distribution Organizations

Underlying this global competition is the critical importance of semiconductor export controls, which have effectively tilted the playing field in favor of the US and its allies. By restricting China's access to the most advanced AI chips, these nations have cemented their position as the dominant players in the field of AI computing.

As the industry continues to evolve, the ability to rapidly scale up power generation capacity, maintain grid stability, and minimize the carbon footprint of AI deployments will be the key determinants of which countries and regions emerge as the true leaders in this transformative technology. The "Real AI Superpowers" of the future will be those that can harmonize their energy infrastructure, semiconductor access, and sustainability efforts to unlock the full potential of artificial intelligence.

Hani Zahirović / Chief Technology Officer





