
Apr 22 2014
Hardware

Welcome to the Dawn of GPU Computing

A technology used to make games run faster can now help solve agencies’ Big Data challenges.

The U.S. Defense Department collects a massive amount of surveillance data via sensors: still images, infrared images, video and more. It then checks the data for anomalies. But for now, not all of that data can be sorted in real time.

"There's too much in the pipe for the current computers to handle all at once," says Mark Barnell, high-performance computing director in the High Performance Systems branch of the Air Force Research Lab.

But Barnell sees a fix on the horizon: running the data through computers accelerated by graphics processing units (GPUs). The processors are a cost-effective way of supercharging the military's computing power, he says. With their help, analysts could access relevant surveillance data more quickly, intervene proactively and perhaps even prevent terror attacks such as the Boston Marathon bombings.

"That would be one of the goals," Barnell says. "If you see somebody put a backpack in a garbage can, wouldn't it be nice if a computer system could flag that and immediately notify the right authorities? You can't put a human at every camera."

Parallel Power

GPUs were once the domain of computer gaming, where they rendered complex graphics and virtual worlds, but scientists saw the potential in their processing power and programmed the devices to crunch large data sets as well. "Before that, it was like trying to make a washing machine compute," says Steve Conway, research vice president in IDC's High Performance Computing group.

It turned out the highly parallel structure of GPUs is well suited to certain types of computational problems. And depending on the problem, using GPUs as accelerators could make computing "anywhere from two to 30 times faster" than using CPUs alone, Conway says.
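
To make the idea concrete, here is a minimal sketch in CUDA C of the kind of data-parallel work a GPU handles well (the array size and values are hypothetical, and nothing here is drawn from the systems in this story): the same independent operation applied to every element of a large array, so thousands of GPU threads can run at once.

// Hypothetical sketch: a SAXPY kernel in CUDA C. Array size and values are made up.
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles exactly one element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;  // one million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover every element at once.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}

Each thread touches one element and never waits on another, which is why problems with this shape tend to see the largest speedups.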

As a result, GPU computing has been a hit with agencies such as the DOD, the Energy Department and the National Institutes of Health. And it has been the focus of considerable federal investment. In the future, as more agencies explore ways of handling Big Data analytics, GPUs could put high-performance computing power in many more hands.

When scientists at the Energy Department's Oak Ridge National Laboratory upgraded the facility's Jaguar supercomputer, they added nearly 19,000 Nvidia Tesla K20 GPU accelerators to boost performance.

"There have been other supercomputers that have been built using GPUs, but this is one of the first that has been really large and available for the general [research] public," says Buddy Bland, project director at the facility.

The overhauled computer, which is now called Titan and finished testing last June, is 10 times faster than its predecessor. It is currently the fastest computer in the United States and sits at No. 2 in the Top 500 world rankings. "We went from a machine that was about 2.6 petaflops [floating-point operations per second] to one that's 27 petaflops," Bland says.

The National Science Foundation has provided funding for several GPU-accelerated infrastructure projects. The increased computational power offered by GPU acceleration is important, but so is the technology's relatively low energy consumption, says Irene Qualters, director for the Division of Advanced Cyberinfrastructure at NSF.

77 percent of high-performance computing sites employed co-processors, including GPUs, in 2013, according to IDC's Worldwide Study of HPC End-User Sites (October 2013).

Qualters explains that energy bills for supercomputers can be astronomical, and that power infrastructure is often built right alongside data centers. "This issue of power is not just a side issue," she says. "If you're going to operate at scale, you need to be able to manage your power."

Because of its energy efficiency, GPU computing could be "one of the steps" on the way toward building an exascale machine, says Bland. Exascale computing is seen by many as the holy grail of supercomputing and represents a thousandfold performance increase over the petascale threshold, which system designers crossed in 2008. Many experts cite excessive power consumption as a major inhibitor in the push to exascale supercomputers.

"Whether it's exactly a GPU, or just some of the features used to design GPUs that bring about exascale computing," Bland says, "that's still several years out."

Can't Get Something for Nothing

Not all computing problems can be accelerated significantly with help from GPUs. Because the configuration requires that data be sent from the CPU to the GPU for computation, and then back to the CPU, a bottleneck can form in certain types of problems, says Jack Dongarra, professor of computer science at the University of Tennessee and an expert in supercomputing.

"An example would be if you had n pieces of data and were doing n calcu­lations," Dongarra says. "You'd do the calculations quickly, then spend time moving data back to the host side."

Another challenge is that scientists have to do more programming to make their problems work in a hybrid CPU-GPU computing model. "It's very manpower-intensive," says Barry Schneider, formerly an advanced cyberinfrastructure director for NSF and now a research scientist at the National Institute of Standards and Technology. "You don't get something for nothing."

Still, the extra programming is getting easier. Nvidia built a platform that allows researchers to use a language they're familiar with, and there are now about 300 standardized applications designed for GPU computing, such as programs that simulate molecular behavior.
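
One way that easier path can look in practice, sketched here with Nvidia's cuBLAS library (a hypothetical example; the matrix sizes and values are made up): a researcher hands a standard matrix multiplication to a GPU-accelerated library call and never writes a kernel.

// Hypothetical sketch: GPU-accelerated matrix multiplication via cuBLAS.
// Build with: nvcc example.cu -lcublas
#include <cstdio>
#include <cuda_runtime.h>
#include <cublas_v2.h>

int main()
{
    const int n = 512;  // multiply two 512 x 512 matrices
    size_t bytes = (size_t)n * n * sizeof(float);

    float *A, *B, *C;
    cudaMallocManaged(&A, bytes);
    cudaMallocManaged(&B, bytes);
    cudaMallocManaged(&C, bytes);
    for (int i = 0; i < n * n; ++i) { A[i] = 1.0f; B[i] = 2.0f; C[i] = 0.0f; }

    cublasHandle_t handle;
    cublasCreate(&handle);

    // C = alpha * A * B + beta * C, computed on the GPU by the library.
    const float alpha = 1.0f, beta = 0.0f;
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                n, n, n, &alpha, A, n, B, n, &beta, C, n);
    cudaDeviceSynchronize();

    printf("C[0] = %f\n", C[0]);  // expect 1024.0 (512 * 1.0 * 2.0)
    cublasDestroy(handle);
    cudaFree(A); cudaFree(B); cudaFree(C);
    return 0;
}

The library hides the kernel launches and memory layout decisions, which is much of what makes the extra programming "easier" than it was a few years ago.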

Like Schneider, Bland cautions that GPU computing is not going to make all computing challenges run faster. "But if you have applications that can use GPUs, and there are more of those every day," he says, "they are a cost-effective way of increasing the performance of your applications dramatically."

