Aug 03 2022
Data Center

Virtualization, Consolidation Help Agencies Cut Back on Physical Data Centers

Through virtualization and other technologies, federal agencies have minimized or eliminated their physical data centers.

Turn back the clock a decade or so, and you would see a far different data center landscape in the federal space than exists today. 

For one, there would be more data centers —many more, in fact— and they would be largely running applications on bare-metal servers. Those servers would have shockingly low utilization rates by today’s standards, meaning agencies overspent on inefficient and sometimes redundant environments. 

Since that time, the federal government has supported efforts to reduce and optimize data center resources. Those efforts have been so successful that data center closures, once a key metric, may be dropped from future Federal Information Technology Acquisition Reform Act (FITARA) scorecards. 

“This is probably one of the best success stories you’re going to hear in government,” says Dave Powner, director for strategic engagement and partnerships at MITRE, a nonprofit that manages federally funded research and development centers that support federal agencies.

“We had all these data centers all over the place, and we were severely underutilizing them. The big lesson is that if you have the right metrics, clear targets, transparency and oversight, you can accomplish a lot.”


Natural Disaster Sparks USPTO Data Center Move 

The Alexandria, Va., headquarters of the U.S. Patent and Trademark Office are in a flood zone. “That has caused us problems in the past,” says Ian Neil, chief of USPTO’s server and storage services and the data center relocation lead for the agency. 

“Every year during hurricane season, we have to group together and make a decision about whether we shut the data center down,” he says. “And that’s not a good place to be.”

Officials decided to close USPTO’s three remaining physical data centers (two in Alexandria and a disaster recovery site in Pennsylvania) and move IT resources to a colocation center in Manassas, Va., about 30 miles from headquarters. The move will consolidate what was previously 43,000 square feet of infrastructure into a 10,000-square-foot space. 

Although the agency had long since virtualized most of its data center environment, its facilities were simply not equipped to handle densely packed infrastructure. 

“The Alexandria data centers were built in the early 2000s, and we didn’t have the power and cooling capabilities within those data centers to consolidate and condense into a high-compute environment,” Neil says. “If we had a rack of servers, we were only able to fill it halfway because we ran out of power and cooling.” 

READ MORE: Federal website consolidation eases the online journey for citizens. 

Data Center Consolidation Provides Resiliency and Savings

To consolidate its three existing data centers into the smaller colocation space, USPTO is relying on several technologies, including flash storage. Most of the compute infrastructure will be Cisco UCS blade servers (at a higher density than is possible at the agency’s existing facilities), but the agency is also planning to use hyperconverged infrastructure (HCI).

Databases are being upgraded to Oracle Database 19c or higher, so the agency can leverage the vendor’s Global Data Services (GDS) capabilities. The organization’s plan is to migrate as much of its lab environment to the public cloud as feasible, both to improve resiliency and save on costs, since the agency will be able to spin down testing resources when they aren’t being used. 

“My opinion is that we need to have a hybrid environment,” Neil says. “We need to be able to move things from on-premises to the cloud and from the cloud to on-premises. I don’t like putting all our eggs in one basket.” 

Neil says the move is scheduled to be completed around March 2023, with minimal impact to users. While the move is a major step, the path was paved by earlier optimization efforts, he notes. 

“We were probably at about 30 percent resource utilization 11 or 12 years ago, and now we’re up to 60 to 70 percent due to increased virtualization,” he says. “We’ve done 50 percent more work with the same amount of equipment.”

$6.6 Billion

Cost savings between 2012 and 2021 from federal data center optimization efforts

Source: Government Accountability Office, “Data Center Optimization: Agencies Continue to Report Mixed Progress against OMB’s Targets,” March 2022

More Efficient Infrastructure through Virtualization 

The General Services Administration once had more than 140 data centers, but the agency no longer operates any of its own physical data centers. 

Mark Robinson, director of the infrastructure integration division in GSA’s Office of Enterprise Infrastructure, notes that this is the result of a long, gradual process. 

“GSA started by participating in the Federal Data Center Consolidation Initiative, with a focus on reducing the cost of data center hardware, software and operations through efficient computing platforms while promoting the use of green IT,” Robinson says. “Next came FITARA, which mandated agencies to review IT investments to reduce duplication and waste.

“Finally, the Data Center Optimization Initiative served as a framework for achieving data center consolidation and optimization goals via performance metrics, cost and savings targets, and closure targets.”

To achieve resource reduction and optimization goals, GSA migrated services from agency-owned data centers to more flexible cloud environments, as well as colocation centers. 

DISCOVER: How agencies can best implement zero-trust architecture.

During the past five years, GSA has shifted away from traditional three-tier infrastructure toward HCI, with built-in virtualization and software-defined networking (SDN). Ultimately, the organization opted to consolidate equipment in colocated multiagency data centers that are managed by the Environmental Protection Agency and NASA.

“By consolidating to an HCI/SDN architecture with its integrated scalability attributes, in combination with the high availability features that come with virtualization, GSA has consolidated and reduced the number of physical servers and previously hosted locations,” Robinson says. 

“GSA has also deployed high-speed connections to be able to burst into the cloud for compute and storage,” he adds. “This has allowed the agency to better manage operations and to use compute and network resources more efficiently in the data center to support multiple high-performing workloads.”

A Long-Term Effort at Optimization 

The State Department started the 2021 fiscal year with 141 data centers, had a goal of shutting down two of them and ended up closing three, achieving a savings of $32 million. Yet the department’s data center optimization efforts date back to at least 2006, when officials began to enthusiastically embrace virtualization. 

“It’s been a long-term effort,” says C. Melonie Cannon, acting director of the State Department’s Systems Integration Office. “IT is ever changing.” 

The State Department made a concerted effort to optimize and consolidate its infrastructure, powered by solutions from VMware and other vendors, but much of the change was also cultural, Cannon says. 

“At first, people were hugging onto their ‘pizza box’ physical servers like we were trying to take something from them,” she says. “Now, they understand that we’re trying to do the right thing for the department.” 

Along with virtualization, Cannon says, the State Department has embraced both the public cloud and HCI, relying on vendors like Microsoft Azure and Nutanix. “The goal is to have our customers in the cloud as much as possible,” she says. 

DIVE DEEPER: Take a measured approach to software-defined networking.


Virtualization Contributes to Data Center Consolidation

The department has seen success consolidating its three enterprise data centers. One opened around 2009 with roughly 5,000 physical servers taking up 11,000 square feet of space. Consultants brought in to analyze the facility reported that resource utilization was under 10 percent for more than half of the infrastructure in the data center.

“That kicked us into gear,” Cannon says. “We said, ‘We’re wasting money, we’re burning energy. We need to come up with a better method.’”

By embracing a virtualization-first policy, the department was able to shrink that facility by nearly 90 percent, from 11,000 square feet to about 1,200 square feet. 

Cost savings from the improvements often went directly back to individual State Department bureaus, Cannon notes. Also, cloud and HCI investments have allowed the department to become more agile and flexible. 

For instance, the department now spins up resources to support seasonal demands for visa lotteries and internship programs, and then dials down that infrastructure during slower periods. 

“We don’t have to keep that up and running at full capacity,” Cannon says. “We’re not paying for 12 months of service to cover only four months of utilization. It’s your tax dollar and my tax dollar. So, we want to do the right thing.”

