Migrating to New Technology While Maintaining Old IT
Federal agencies hold on to IT systems like few other organizations.
The research firm IDC estimates that more than 75 percent of all federal technology spending goes toward maintaining legacy systems, and the reasons are often both complex and unavoidable.
Agencies face a host of technological and cultural issues when it comes to modernization, not the least of which is ensuring that new technologies can run in tandem with older systems needed to keep services operating.
Gartner has even coined a term for these hybrid environments of new technology paired with legacy systems: bimodal IT. Analysts say bimodal describes how organizations keep their existing systems reliable and secure while deploying new technologies to enable innovation and agility. The accompanying challenges scale in lockstep with the number of systems in an organization, Gartner says.
For many federal IT chiefs, that has led to a hard realization: Cloud services offer great potential, but only if they can coexist with COBOL.
Reducing the Static
“When you’re modernizing a system that’s in active use, it’s like rebuilding an airplane while flying,” says John Skudlarek, Federal Communications Commission deputy CIO.
Between new and old, the FCC currently has about 100 technology systems. That’s down from a high of more than 200, but it’s still about one for every 17 employees. And on top of everything else, most of the FCC’s legacy systems (half of which are 10 years old or older) use custom code.
To tackle these challenges, agencies like the FCC are turning to cloud-based systems built on open-source code. That strategy enables centralized service catalogs that multiple organizations within an agency can choose from, rather than each building and maintaining a separate, custom system.
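As a rough illustration of that idea, the sketch below (in Python, with entirely hypothetical service and bureau names) shows a shared catalog that bureaus subscribe to instead of commissioning one-off systems. It is an editorial example of the pattern, not FCC code.

```python
# Hypothetical sketch of a centralized service catalog: bureaus request an
# instance of an approved, shared service instead of building a custom system.
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    name: str          # shared service, e.g. "consumer-help-desk"
    delivery: str      # delivery model, e.g. "SaaS"
    open_source: bool  # built on open-source code

CATALOG = {
    "consumer-help-desk": CatalogEntry("consumer-help-desk", "SaaS", True),
    "records-management": CatalogEntry("records-management", "SaaS", True),
}

def request_service(bureau: str, service: str) -> str:
    """Return a subscription record for an existing catalog service."""
    entry = CATALOG.get(service)
    if entry is None:
        raise KeyError(f"{service} is not in the shared catalog; "
                       "add it there rather than building a one-off system")
    return f"{bureau} subscribed to {entry.name} ({entry.delivery})"

print(request_service("Enforcement Bureau", "consumer-help-desk"))
```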
Skudlarek says the FCC took a Software as a Service approach with its new consumer help desk, which will become the model for future projects to replace legacy systems.
“The new systems are easier to maintain, more agile, more resilient and more scalable,” Skudlarek says. “Through cloud computing, we deliver solutions faster to our mission customers via an approach that’s usually less expensive than attempting to build a custom solution on-premises.”
Sidebar stat (figure not reproduced): estimated amount the federal government spends on IT operations and maintenance.
SOURCE: E-Commerce Times, “Feds Put Big Money Into Innovations,” March 2015
Cutting Over Without Cutting Off
The team developing the Intelligence Community IT Environment (IC ITE) initiative is taking a similar tack. The program aims to unify systems in use at multiple intelligence agencies, starting with the National Geospatial-Intelligence Agency, which will move to a new common desktop service.
“The major challenge was to create a baseline that was compatible with the installed desktop hardware at NGA,” says Deputy CIO Shishu Gupta. To help ease the transition, the agency brought in a contractor for the IC ITE project that it had used on a previous system rollout.
“That ensured a good understanding of the architecture and underlying technical details,” Gupta says. “It also allowed the team to improve upon the stability and security of the existing systems.”
The result? At the start of summer, NGA was halfway through its migration with almost no disruption in service to users, he says.
For the project, NGA has also begun moving mission application programs to the IC cloud. To ensure success, Gupta’s team wanted to confirm that the new system could stand on its own before the agency had to rely on it.
“By creating a copy of the existing program on an unclassified cloud, the team was able to test out the capability before moving it to our classified cloud,” Gupta says. “This will all be invisible to our customers, who will not know these services are coming from the cloud.”
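The pattern Gupta describes, proving out a copy in a lower environment before depending on it, can be pictured with a simple promotion gate. The smoke-test sketch below uses hypothetical endpoints and checks; it is not NGA's actual test suite, just one way such a gate might look.

```python
# Hypothetical sketch: run the same smoke checks against the copy deployed to
# the lower (unclassified) cloud, and only promote when every check passes.
import requests  # assumes the 'requests' package is installed

STAGING = "https://helpdesk.unclass.example.gov"  # placeholder staging copy
CHECKS = ["/health", "/api/v1/tickets", "/api/v1/search?q=test"]

def smoke_test(base_url: str) -> bool:
    """Return True only if every endpoint responds with HTTP 200."""
    for path in CHECKS:
        resp = requests.get(base_url + path, timeout=10)
        if resp.status_code != 200:
            print(f"FAIL {path}: HTTP {resp.status_code}")
            return False
        print(f"OK   {path}")
    return True

if __name__ == "__main__":
    if smoke_test(STAGING):
        print("Copy on the unclassified cloud looks healthy; schedule the classified move.")
    else:
        print("Hold the migration until every check passes.")
```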
Drawing the Line
While this bimodal IT approach has worked well for NGA, Gupta points out that maintaining a legacy system will not always be the right tactic. Sometimes, a legacy system can’t pair well with a newer system and will actually prevent innovation.
“In cases where the technology differences are quite large, it often makes sense from a cost and schedule standpoint to rewrite or replace an application,” he says. “Gartner talks about these as opportunities to rethink the application itself. In doing so, this provides tremendous opportunities to use the capabilities of the new environment.”
Sometimes, the IT team must draw a line in the software stack, says Gunnar Hellekson, chief strategist for Red Hat’s U.S. public sector unit. An agency then would keep standardized and commoditized elements that provide efficiency, and ditch antiquated custom elements in favor of new development that drives innovation, he explains.
“Customized software can mature and graduate to become a below-the-line standardized part,” Hellekson says. “That evolutionary model is what makes a bimodal approach so powerful: It permits an IT shop to learn and mature over time.”
Hellekson points to Oak Ridge National Laboratory, which had dozens of research projects running on siloed software stacks.
“They used OpenStack to standardize everything from the operating system down,” Hellekson says. “That means less time waiting for hardware, less time configuring operating systems and more time doing research tied to their mission.”
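To make concrete what standardizing “from the operating system down” buys, here is a minimal self-service provisioning sketch using the openstacksdk Python library: a project gets a server from a shared, standardized image instead of waiting on hardware and hand-built operating systems. The cloud profile, image, flavor and network names are placeholders for illustration, not Oak Ridge’s configuration.

```python
# Minimal sketch of self-service provisioning on OpenStack via openstacksdk;
# all names below are placeholder assumptions.
import openstack

# Connect using a profile defined in clouds.yaml (placeholder profile name).
conn = openstack.connect(cloud="research-cloud")

image = conn.compute.find_image("rocky-9-base")      # standardized OS image
flavor = conn.compute.find_flavor("m1.large")        # standardized sizing
network = conn.network.find_network("project-net")   # shared project network

server = conn.compute.create_server(
    name="research-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(f"{server.name} is {server.status}; no hardware wait, no manual OS build.")
```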
Looking Ahead
Such success stories should comfort agencies still developing or just beginning to develop bimodal IT strategies. It’s a lot of work, Skudlarek says, but it’s also inevitable if agencies want to move forward.
“There’s a lot riding on this to make sure the FCC is properly prepared for the future,” Skudlarek points out. “With the right partners, the right ‘fuel’ to fund the modernization and the right team to get this done, we will pull it off.”