How Agencies Can Optimize Their Data Centers via Hybrid IT
The Justice Department continues to operate its own data centers because of the sensitive nature of its work, but the agency is also taking advantage of secure commercial cloud services.
DOJ has adopted public cloud services such as Microsoft Office 365 (for email and collaboration) and Box (for online file storage and sharing).
Some of the agency’s components have also migrated mission-critical applications to government-only cloud providers that meet stringent security and availability requirements.
Some applications and data are still kept in-house, however, because of the sensitivity of the information.
“There’s no one-size-fits-all,” says David Rubin, the department’s director for service engineering. “We are continually evolving and assessing the best way to house data. Where it makes sense, we will move to the cloud.”
The information that remains under agency control often lives in consolidated data centers that have been made as efficient as possible, as required by the Data Center Optimization Initiative.
Federal IT leaders are deploying virtualization and technologies such as hyperconverged IT infrastructure to save space and reduce power and cooling costs.
“We are constantly fine-tuning efficiencies and looking to increase virtualization density on our hardware to get the most performance and power efficiency out of the same space,” says Matt Conner, CISO for the National Geospatial-Intelligence Agency (NGA).
Social Security Administration Embraces Hybrid Cloud
Federal IT leaders use different criteria to determine whether to move workloads to the cloud or keep them in-house. The decision depends on circumstances, such as an agency’s mission, performance requirements and economics, they say.
The Social Security Administration, for example, has fully consolidated into two main data centers, but it’s still trying to improve optimization. One way to do that is to implement a hybrid cloud, says CIO Rajive K. Mathur.
Some SSA applications already run in a commercial cloud. But the agency is also building a private, on-premises cloud and plans to begin using an additional commercial cloud provider's services (Microsoft Azure and Office 365) by the end of fiscal 2018, Mathur says.
Many applications and data will stay in the SSA’s private cloud for better performance, he says.
“The main driver for deciding which cloud an application fits in will be the proximity to the data,” Mathur says. “Many of our applications will continue to interface with legacy applications and data that may not tolerate the added latency of connecting to an off-premises cloud.”
NSA Aims to Make Its Clouds Secure
The National Security Agency has embraced the hybrid cloud and has no qualms about using a public provider for unclassified work when it’s complemented by appropriate network and security monitoring.
For classified work, the NSA uses a combination of government-only variants of commercial clouds designed for the intelligence community’s secret and top secret information, such as one sponsored by the CIA, as well as unique NSA-built, secure private clouds.
These cloud variants are protected by high-grade cryptography and security solutions designed to isolate national security systems from the rest of the world, says NSA CIO Greg Smithberger.
“We are trying to get the same benefits of leveraging the commercial technology as aggressively as we can while maintaining our security boundaries,” he says.
The agency uses government-only variants of commercial utility clouds for applications, such as administrative and training modules, which spike in usage at certain predictable periods. The agency can spin capacity up or down when needed, Smithberger says.
But not everything can go off-premises. For example, the NSA has created private, secure cloud services that provide data analytics for itself and other agencies in the intelligence community. And sometimes it’s more affordable to keep apps and data in-house.
“For applications that are running at full bore all the time with no down period, it may not be cost-effective to implement in an external cloud. When you operate at the scope and scale of the NSA, you can buy the hardware and run it just as cheaply yourself,” Smithberger says.
Agencies Take Steps to Optimize Data Centers
Agency IT leaders say data center optimization is a continuous work in progress. The SSA, for example, is optimizing its two data centers beyond building a private cloud.
The agency is currently deploying an all-flash, Tier 1 storage system, which will provide faster performance and reduce energy and cooling requirements by 30 to 40 percent compared with hard-disk systems, Mathur says.
The agency also is implementing a data center infrastructure management tool to automatically collect and report data on power usage effectiveness, energy metering, facility utilization, virtualization and server utilization.
To further boost efficiency, SSA is installing a hot-aisle containment solution and increasing the computer room air handler temperature to improve PUE. It also plans to raise the temperature of the chilled water it uses for cooling.
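Power usage effectiveness, the metric SSA is tracking with its infrastructure management tool, is simply total facility energy divided by the energy consumed by IT equipment alone; a ratio of 1.0 would mean every watt goes to computing. A minimal sketch of the calculation, using illustrative numbers rather than actual SSA figures:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy (IT load plus
    cooling, lighting, power distribution losses) divided by IT equipment
    energy. 1.0 is ideal; lower values indicate a more efficient facility."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a facility drawing 1,500 kWh overall while its
# servers and storage consume 1,000 kWh has a PUE of 1.5 -- i.e., half
# a watt of overhead (mostly cooling) for every watt of compute.
print(round(pue(1500.0, 1000.0), 2))
```

Measures such as hot-aisle containment and warmer chilled water reduce the cooling term in the numerator, which is why they improve PUE without touching the IT workload itself.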
“We feel we have made significant progress modernizing and optimizing our data center infrastructure,” Mathur says.
DOJ Moves to Close Data Centers
The DOJ has made steady progress with its consolidation effort. The department, which started with 110 data centers, is on track to consolidate to three core enterprise data centers by fiscal 2019. It has already closed 78 data centers and is working to eliminate the remaining 29, Rubin says.
Today, most systems are virtualized. While closing data centers, the DOJ is making sure it’s not just moving hardware.
An in-house DOJ team prioritizes cloud migrations, assessing whether applications and data can move to a more secure, government-only cloud service. If not, the team ensures servers and storage are fully utilized during the migration to a core data center.
“The goal is not to forklift a rack of equipment and drop it in there,” Rubin says. “We want to maximize everything we can within a rack and not have a less utilized server sitting by itself. So, we explore ways to optimize through virtualization, moving it to the cloud or to a shared environment.”