

Mar 16 2026
Artificial Intelligence

AI Is Causing a Power and Cooling Headache for Agencies

Plus, hyperscalers are making it harder to acquire the components to cure it.

The latest NVIDIA processors present a challenge for agencies, from the server to the grid, because the high-intensity artificial intelligence workloads they enable come with major power and cooling demands.

Agencies require either trusted partners or assessments to determine the generator, uninterruptible power supply and transfer switch needs for their particular missions.
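That sizing exercise can be sketched in rough terms. The following is a minimal, illustrative back-of-envelope calculation, not a substitute for a formal assessment; every figure and headroom factor in it is an assumption, not a vendor specification.

```python
# Hedged back-of-envelope power sizing for an AI deployment.
# All numbers below are illustrative assumptions -- a real assessment
# must account for site power, cooling load, redundancy and mission needs.

def size_power_train(racks, kw_per_rack, ups_headroom=1.25, gen_headroom=1.5):
    """Estimate UPS and generator capacity for a set of AI racks.

    racks        -- number of racks (assumes a uniform load per rack)
    kw_per_rack  -- IT load per rack in kW; dense GPU racks draw far more
                    than the 5-10 kW typical of traditional server racks
    ups_headroom -- assumed 25% oversizing margin for the UPS
    gen_headroom -- assumed 50% margin for the generator, which must also
                    carry mechanical (cooling) load during an outage
    """
    it_load_kw = racks * kw_per_rack
    ups_kw = it_load_kw * ups_headroom
    gen_kw = it_load_kw * gen_headroom
    return it_load_kw, ups_kw, gen_kw

# Example: four hypothetical 80 kW GPU racks.
it_kw, ups_kw, gen_kw = size_power_train(racks=4, kw_per_rack=80)
print(f"IT load: {it_kw} kW, UPS: {ups_kw} kW, generator: {gen_kw} kW")
```

Even this crude arithmetic shows why a transfer switch and generator sized for a legacy server room rarely survive contact with an AI workload.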

Further complicating matters: Private-sector organizations aren’t balking at price, even when AI infrastructure costs exceed $1 billion, but they do balk when estimated timelines stretch to 17 months. Industry wants to move quickly, while federal budget cycles and uncertainty around long-term program funding remain constant constraints, which can leave agencies at a disadvantage.

Speed will continue to be a problem for all parties involved because AI-driven hyperscaler purchases are fueling memory, processor and hard drive shortages. One of CDW Government’s top partners, Schneider Electric, just allocated $2.3 billion to two hyperscalers; fortunately, we have the most diverse partner ecosystem in the industry, an emerging technology program for adding new ones quickly, and a dedicated power and cooling team.


AI Power and Cooling Decisions Agencies Face

Many organizations debate whether to put AI-ready infrastructure in an existing facility or to colocate, renting the space inside a third party’s data center. The government restricts “colo” in many cases, however, so a third option is a prefabricated data center.

CDW suppliers Schneider Electric, Eaton and Vertiv all offer prefab data centers, which are essentially placed outside an existing facility with liquid cooling built in.

Liquid cooling is another component agencies need to decide on because it’s required for intense AI workloads, and data centers must be retrofitted to support it. Going with the prefab option eliminates 85% of that work.
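The liquid-cooling decision above comes down to per-rack heat load. The sketch below makes that tradeoff concrete; the kW thresholds are illustrative assumptions chosen for the example, not figures from the article or from any vendor.

```python
# Illustrative cooling-method decision by per-rack IT load.
# The thresholds are assumptions for this sketch, not vendor specs:
# air cooling becomes impractical somewhere in the tens of kW per rack,
# at which point direct-to-chip liquid or immersion cooling takes over.

AIR_COOLING_LIMIT_KW = 20    # assumed practical ceiling for air cooling
LIQUID_DTC_LIMIT_KW = 100    # assumed ceiling for direct-to-chip liquid

def cooling_method(rack_kw):
    """Pick a cooling approach for a given per-rack IT load in kW."""
    if rack_kw <= AIR_COOLING_LIMIT_KW:
        return "air"
    if rack_kw <= LIQUID_DTC_LIMIT_KW:
        return "direct-to-chip liquid"
    return "immersion liquid"

for load in (10, 60, 120):
    print(f"{load} kW rack -> {cooling_method(load)}")
```

A prefab data center bakes this decision in: the liquid-cooling loop ships with the unit, which is where the retrofit savings come from.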


DOD Power and Cooling Needs at the Tactical Edge

CDW Government boasts a dedicated Department of Defense practice. When it comes to DOD’s mission, supporting the warfighter is top of mind, and power and cooling requirements at the tactical edge are different. High-powered compute must reside closer to warfighters on ships, airplanes and battlefields.

The first thing DOD must decide with any project is whether there is time for an assessment at all; there often isn’t when a military operation demands immediate support.

Edge-specific needs often aren’t addressed as they should be during the design phase of a DOD solution. For instance, shipboard servers, compute and storage come in a ruggedized box that can be airdropped from a C-130 transport plane. But it’s harder for the unit to dissipate heat in that confined space, demanding a power and cooling workaround.

Large data centers tend to use single-phase, direct-to-chip cooling, but some CDW solutions come with power and cooling built into the rack to meet DOD’s needs. A popular option is single-phase immersion cooling: a 10U rack that comes with services and the cooling liquid built in, and that could go on a ship as long as it doesn’t pose a spill risk.

Anything airdropped from a plane can’t rely on traditional power supplies, because there is no guarantee it will end up in a data center.

Other logistical issues DOD must sort out at the tactical edge: How is it getting power, via nuclear reactor or microgrid? And does the environment demand air or liquid cooling? Air cooling won’t work in a desert.

This article is part of FedTech’s CapITal blog series.

