What Is High-Performance Computing?
“High-performance computing is the aggregation of computing power,” says Frank Downs, a member of the ISACA Emerging Trends Working Group.
“While all computing power is somewhat aggregated,” he says, “HPC is the aggregation of many different computers and systems to tackle one problem.”
Downs highlights the use of HPC frameworks to help create the first-ever picture of a black hole. Aggregating individual compute instances made it possible to sift through massive amounts of stellar data and stitch together the results to create a historic image.
Cameron Chehreh, CTO and vice president of presales engineering at Dell EMC Federal, offers a similar assessment. He notes that HPC “is the practice of combining the total computing power of multiple computers to handle larger amounts of data and solve large problems.”
HPC, Chehreh notes, “has origins in the 1960s and has been critical to increasing innovation and supporting discovery across industries.”
In practice, HPC systems typically take the form of large clusters made up of individual computing nodes. According to Chehreh, these may include processing power through CPUs and GPUs on servers; tools such as NVIDIA and Intel software development kits; frameworks including TensorFlow, MXNet and Caffe; and platforms such as Kubernetes and Pivotal Cloud Foundry.
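The core pattern Downs and Chehreh describe, splitting one problem across many workers and aggregating the results, can be sketched in miniature. The example below is an illustration only: it uses Python's standard multiprocessing module on a single machine, whereas a real HPC cluster would distribute chunks across many networked nodes through tools such as MPI or a job scheduler. The function names are hypothetical, chosen for this sketch.

```python
# Minimal sketch of the HPC divide-and-aggregate pattern: one large
# problem (here, a sum of squares) is split into chunks, each chunk is
# handled by a separate worker, and the partial results are combined.
from multiprocessing import Pool


def partial_sum(bounds):
    """Compute the sum of squares over one chunk of the problem."""
    start, end = bounds
    return sum(i * i for i in range(start, end))


def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into chunks, farm them out, aggregate the results."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs any remainder
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    print(parallel_sum_of_squares(1_000_000))
```

The same logic scales up: an HPC cluster replaces the local worker pool with thousands of nodes, and the aggregation step stitches their outputs together, much as the black hole imaging project combined results from many independent compute instances.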
Cloud solutions play a critical role in new HPC deployments as a way to decouple performance from on-premises hardware. Robust and reliable public, private or multicloud services now make it possible to create customized computing nodes designed to deliver a more unified HPC approach.
How Can HPC Benefit Government Agencies?
For federal government agencies, HPC solutions offer multiple benefits including:
Increased speed. “One of the biggest benefits of HPC is speed,” Chehreh says. This is especially critical as the volume of data processed by government agencies increases exponentially — “having HPC solutions store and analyze data at increased speeds allows decisions to be made quicker and with more accuracy.”
Reduced waste. Federal agencies can also reduce IT waste with the adoption of HPC models. “Agencies can find use for older systems and technologies by bringing them into HPC clusters,” Downs says. This piece-by-piece approach also offers performance benefits, according to Downs: “If one part breaks, you don’t lose your computing power.”
Improved agility. While legacy tools and technologies remain commonplace for many federal agencies, they’re not up to the challenge of today’s data-driven IT environments. “1,000 times the data created by 1,000 times more users will break traditional IT infrastructure,” Chehreh says. The ability to handle these data volumes in real time is now critical to delivering relevant, actionable insight.