Agencies making the transition to a cloud environment are using enterprise container platforms to help bolster their legacy platforms, increase security and make their work more efficient.
One of the leading players in the space, Docker, defines a container as “a standard unit of software that packages up code and all its dependencies, so the application runs quickly and reliably from one computing environment to another.”
End users can work inside these portable, self-contained environments to conduct research, develop software and generally make more effective use of their legacy technologies within a modern framework.
At the National Institutes of Health, for example, containers are helping to support high-level scientific research. In a heterogeneous IT environment, containers help researchers bypass legacy hurdles that might have stymied their work.
“For example, we have an application that someone developed on one flavor of Linux that cannot easily run on another flavor of Linux. It may require different libraries,” says Susan Chacko, lead scientist in the high-performance computing group at NIH’s Center for Information Technology.
Containerization breaks that logjam. “The container itself will run the flavor of Linux that the application was written for, even though the operating system on the main computer may be completely different. So, it enables a kind of application availability that we didn’t have before,” Chacko says.
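The mechanism Chacko describes can be sketched as a container image definition. This is a hypothetical example (the base image, package names and application name are all placeholders, not NIH's actual setup): the image pins the Linux flavor and libraries the application was written for, regardless of what the host cluster runs.

```dockerfile
# Hypothetical example: the application was built against CentOS 7 and its
# libraries, even though the host may run a completely different distribution.
FROM centos:7

# Install the specific libraries the application expects.
# (Package names here are illustrative.)
RUN yum install -y gcc-gfortran libpng

# "analyze" is a placeholder application name.
COPY analyze /usr/local/bin/analyze

ENTRYPOINT ["/usr/local/bin/analyze"]
```

One wrinkle worth noting: HPC sites often execute such images under runtimes like Singularity/Apptainer rather than the Docker daemon, because those runtimes do not require root privileges on a shared cluster.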
Containers Help Agencies Upgrade Technology
Some agency IT shops go a step further, offering Containers as a Service, which includes security and governance tools.
“Containers are recognized as specifically helpful in avoiding compatibility issues,” says Gene Moran, author of Pitching the Big Top: How to Master the 3-Ring Circus of Federal Sales.
Federal agencies can leverage the technology in cases where legacy systems won’t interoperate, yet cannot readily be replaced, he adds.
“The cumbersome budget and funding process necessitates these considerations, as wholesale replacement and upgrades across enterprises are rarely an option in federal acquisitions,” Moran says. “In this light, container technology is a proven method to help overcome the challenges of government systems that have been largely cobbled together over time.”
In 2019, the NIH Biowulf computing cluster staff built 70 different applications in containers, representing 10 percent to 15 percent of the newly installed applications on the Linux supercomputing cluster, which spans more than 105,000 cores and 45 petabytes of storage.
Those container-based efforts have engendered new efficiencies by enabling NIH to deploy cutting-edge applications in isolation from the existing operating system.
“We manage about 700 applications on the Biowulf cluster, and if we make a major change on the cluster operating system, it could affect any number of those applications,” Chacko says. Containers help developers sidestep that potential peril.
Containers Help Feds Avoid Breaking Apps
Sometimes researchers develop scientific applications on a leading-edge version of Linux. “To install that application, we wouldn’t have to upgrade the whole cluster and potentially break other applications,” Chacko says. “Instead, that would be something we’d put into a container.”
The National Institutes of Health uses containers in its Biowulf supercomputing system to avoid software conflicts, says HPC Lead Scientist Susan Chacko. Source: NIH
It’s common for one application to require several libraries or dependencies. A container saves the IT team from having to maintain dozens of versions. “The container system keeps it all very clean,” Chacko says.
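The dependency-isolation point can be made concrete with a second hypothetical image definition (the versions, packages and script name are placeholders): each image pins its own library versions, so two applications that need conflicting versions can coexist on the same cluster without the IT team maintaining either set on the host.

```dockerfile
# Hypothetical example: this image pins exact dependency versions.
# Another application's image can pin different versions of the same
# libraries with no conflict on the shared host.
FROM python:3.9-slim

RUN pip install numpy==1.21.6 scipy==1.7.3

# "pipeline.py" is a placeholder for the application's entry script.
COPY pipeline.py /opt/pipeline.py
CMD ["python", "/opt/pipeline.py"]
```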
Containers Help with DevSecOps
The Department of Defense is looking to containers to drive DevSecOps, an organizational software engineering culture that aims to unify software development, security and operations.
The DOD is migrating 37 programs to DevSecOps, which mandates container use, says Nicolas Chaillan, chief software officer for the Air Force and DOD enterprise DevSecOps co-lead. A typical DOD DevSecOps stack has dozens of products that include Docker container management; Splunk for monitoring; and cloud platforms such as Microsoft Azure and Google Cloud.
“We don’t want to be locked into any cloud platform,” says Chaillan.
The main orchestration tool is open-source Kubernetes (Docker's Swarm mode is another). “It’s what manages the containers — running them, restarting them, updating them, making sure they scale. The orchestrator does the management; not so much what you are running, but how you run it.”
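Those orchestration duties map directly onto a Kubernetes Deployment manifest. This is a minimal hypothetical sketch (the name and image are placeholders): Kubernetes keeps the declared number of container replicas running, restarts any that fail and scales the count up or down on demand.

```yaml
# Hypothetical Deployment: Kubernetes runs three replicas of the container,
# restarts failed ones and handles scaling and updates.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app                 # placeholder name
spec:
  replicas: 3                    # "making sure they scale"
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
      - name: demo-app
        image: registry.example.com/demo-app:1.0   # placeholder image
```

Scaling then becomes a one-line change (or a `kubectl scale deployment demo-app --replicas=10` command) rather than provisioning new machines.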
Containers Give Software Developers More Flexibility
Containers give software developers new agility. “You can spin up a container faster than a virtual machine. Most VMs take five to 30 seconds to spin up based on size, while a container takes a few milliseconds. That is a big part of the attraction, because it makes you more flexible,” Chaillan says.
This should prove helpful when it comes to software development. “You can work on an individual piece in a way that is modular and flexible, so you can try new things just by swapping containers,” he says. “The container will behave the same regardless of whether it is a weapons system or a business system. We get the same behavior.”
That same neutrality gives DOD a big win when it comes to leveraging containers in support of legacy systems.
“We are moving pretty much every software system, everything from an F-16 jet to an F-35, into this containerized model,” Chaillan says. “Software continuously evolves, even on these legacy systems, and the container allows us to centrally update regardless of the database or the programming language that’s being used.
“If you use Java or C as a programming language, you can have a container for that, and you can update to the latest version without any downtime.”
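The zero-downtime update Chaillan describes corresponds to a Kubernetes rolling update. As a hedged sketch, a Deployment's update strategy can be tuned so the old containers stay in service until their replacements are healthy (the values below are illustrative, and the new image would be rolled out with a command such as `kubectl set image`):

```yaml
# Hypothetical rolling-update strategy fragment for a Deployment spec:
# at most one extra pod is created during the update, and the full
# replica count stays available, so the new container version (say, a
# newer Java runtime) goes live with no downtime.
strategy:
  type: RollingUpdate
  rollingUpdate:
    maxSurge: 1
    maxUnavailable: 0
```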