

Oct 30 2025
Artificial Intelligence

Government Supercomputers Are Evolving to Handle AI Workloads

New machines feature a combination of CPUs, GPUs and other accelerators to accomplish more science.

The Department of Energy’s National Laboratories and their supercomputing counterparts within the National Science Foundation exchange information whenever one of them gets a new machine, boosting scientific output across the board.

Los Alamos National Lab and Lawrence Berkeley National Lab’s National Energy Research Scientific Computing Center have a “long history” of collaboration, ever since they realized they were buying similar supercomputing technologies, said Jim Lujan, HPC platforms and projects program director at LANL, during a panel discussion at NVIDIA GTC on Wednesday.

These facilities deal with the same pool of vendors (including NVIDIA and Dell) to build their supercomputers, and their learning experiences with applications in, say, physics on these systems are mutually beneficial.

“It is a little bit of a sewing circle,” said NERSC-10 Project Director Hai Ah Nam.


AI Is Driving the Future of Supercomputing and Benchmarks

The competition among supercomputing centers lies in getting the attention of Congress to fund their research into scientific problems.

The LINPACK Benchmark measures how fast supercomputers can solve a dense system of linear equations and is used to rank supercomputers in the TOP500 list.
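The snippet below is a minimal sketch of the kind of computation LINPACK times, not the benchmark itself: the actual TOP500 runs use the highly tuned HPL implementation on far larger problem sizes. The problem size `n` and the use of NumPy here are illustrative assumptions.

```python
import time

import numpy as np

# Illustrative only: solve a dense linear system Ax = b, the core
# operation the LINPACK Benchmark measures, and report a rough
# floating-point rate. Real HPL runs are distributed and heavily tuned.
n = 2000  # problem size; production HPL runs use vastly larger n
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)  # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

# Standard operation count for an LU-based dense solve
flops = (2 / 3) * n**3 + 2 * n**2
print(f"~{flops / elapsed / 1e9:.1f} GFLOPS on a {n}x{n} dense system")
```

Systems on the TOP500 list are ranked by exactly this kind of sustained floating-point rate, measured in petaflops or exaflops rather than the gigaflops a single workstation achieves.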

“We’ve given the LINPACK Benchmark a lot of grief over the years, but having something to rank machines has probably driven more growth than anything else we could have possibly done,” said Dan Stanzione, associate vice president for research and executive director at the Texas Advanced Computing Center. “If you have a way to rank them, and you’re behind another country, you can ask for more money.”

That competitive pressure has benefited DOE and NSF, but like any benchmark, LINPACK is not representative of all the scientific workloads they see. And artificial intelligence workloads now drive the direction of supercomputer design, Stanzione said.

New benchmarks are emerging in the AI era to determine how much science these supercomputers can accomplish, with raw speed becoming less of a priority. New machines feature a combination of central processing units, graphics processing units and other accelerators to handle AI applications and modeling workloads, Lujan said.


Faster Time-to-Insight at a Time of Government Efficiency

AI tools don’t necessarily run as well in a traditional high-performance computing environment as they do in the cloud. That’s why supercomputing centers are working to create the ideal hybrid environment, Nam said.

The National Nuclear Security Administration is deploying two new supercomputers — Mission (in the classified workspace) and Vision (in the unclassified workspace) — at LANL. Unlike LANL’s traditional modeling and simulation environment for managing the national nuclear stockpile, AI tools will play a big part in the new one because they offer faster time-to-resolution at a time of federal fiscal awareness.

“It’s all geared around time-to-insight,” Lujan said. “The future is more about, how do we enable workflows?”

Photo Courtesy of Lawrence Livermore National Laboratory