The federal government has clearly made progress on consolidating its data centers, but now the Office of Management and Budget wants to go after more than just the easy wins of shuttering aging data center facilities.
Late in November, OMB issued updated guidance that effectively forms a new data center optimization policy under the Trump administration. The guidance, which is open to public comment through Dec. 26, would replace the Data Center Optimization Initiative (DCOI) memorandum M-16-19, which was issued in August 2016.
Under the new guidance, OMB is prioritizing the increased virtualization of federal IT systems. The guidance also imposes a broad freeze on the construction of new data centers, stating that agencies “may not budget any funds or resources toward initiating a new data center or significantly expanding an existing data center without approval from OMB,” although there is an exemption for data centers that qualify as “key mission facilities for data management.”
Meanwhile, as FCW notes, the guidance emphasizes that consolidation and closure of data centers remain the highest priority, followed by optimization, which encompasses the shift to virtualization technologies, availability, energy metering and server utilization. The guidance also updates several metrics used to evaluate the success of consolidation efforts, according to FedScoop.
“After eight years of work in consolidating and closing Federal data centers, OMB has seen diminishing returns from agencies resulting from their closures,” Federal CIO Suzette Kent wrote in the memo. “Much of the ‘low-hanging’ fruit of easily consolidated infrastructure has been picked, and to realize further efficiencies will require continued investment to address the more complex areas where savings is achievable.”
“While optimization will be the new priority, consolidation and closures should continue wherever applicable,” Kent wrote. “OMB will focus on targeted improvements in key areas where agencies can make meaningful improvements and achieve further cost savings.”
Virtualization Takes Center Stage in New Policy
According to the federal IT Dashboard, agencies have closed 3,216 smaller, nontiered data centers out of a goal of 4,477, and closed 210 larger tiered data centers out of a goal of 471. The Agriculture Department, NASA, the Office of Personnel Management, the Education Department and the Social Security Administration are among those agencies that have met their targets.
The new policy says that, as with the old policy, “agencies shall continue to principally reduce application, system, and database inventories to essential enterprise levels by increasing the use of virtualization to enable pooling of storage, network and computer resources, and dynamic allocation on-demand.”
Agencies are also supposed to evaluate options for the consolidation and closure of existing data centers where practical, in alignment with the Cloud Smart strategy. The Cloud Smart strategy says agencies should use risk-based decision-making and service delivery when considering cloud technologies. The policy states that agencies should consider “transitioning to provisioned services, including cloud technologies, to the furthest extent practical; migrating to inter-agency shared services, intra-agency shared services, or collocated data centers; and migrating to more optimized data centers within the agency’s data center inventory.”
In terms of virtualization, the new guidance says OMB “prioritizes the increased virtualization of Federal systems as critical for IT modernization efforts, to drive efficiency and application portability.” OMB expects “all new agency applications to use virtualization whenever possible and appropriate.”
The DCOI policy did not account for the fact that the number of virtual client applications fluctuates with demand, which made it impractical to calculate those applications as a ratio to physical hosts. The older policy also did not count mainframes capable of supporting virtualization, nor servers that host containers through popular tools.
Now, OMB will “require agencies to report the number of servers and mainframes that are currently serving as hosts for virtualized or containerized systems in their agency-managed data centers.” Agencies will also need to report their cloud investments as part of their data center inventories.
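As a rough illustration of the new counting rule, an agency inventory tool might tally hosts along these lines. This is a hypothetical sketch; the memo does not prescribe a data schema, so the record format and field names here are assumptions for illustration only.

```python
# Hypothetical sketch of the updated reporting rule: count servers and
# mainframes that host virtualized or containerized systems in
# agency-managed data centers, plus systems reported under cloud
# investments. Records and field names are illustrative, not from the memo.

def count_virtualization_hosts(inventory):
    """Count servers and mainframes running virtualized or containerized
    workloads, whether in an agency data center or reported as a cloud
    investment."""
    total = 0
    for asset in inventory:
        is_host_type = asset["type"] in ("server", "mainframe")
        runs_virtual = asset.get("virtualized") or asset.get("containerized")
        in_scope = asset["location"] in ("agency_data_center", "cloud")
        if is_host_type and runs_virtual and in_scope:
            total += 1
    return total

inventory = [
    {"type": "server", "virtualized": True, "location": "agency_data_center"},
    {"type": "mainframe", "virtualized": True, "location": "agency_data_center"},
    {"type": "server", "containerized": True, "location": "agency_data_center"},
    {"type": "server", "virtualized": True, "location": "cloud"},  # cloud investment
    {"type": "server", "virtualized": False, "location": "agency_data_center"},
]

print(count_virtualization_hosts(inventory))  # 4
```

Note how the cloud-hosted system counts toward the total, reflecting the policy's allowance for reporting cloud investments alongside data center hosts.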
“Given that transitioning applications to the cloud may reduce the count of virtual hosts in their data centers, and given that cloud providers use virtualized systems by definition, agencies may report systems under their cloud investments towards this total count to more accurately reflect the state of their virtualized portfolio,” the policy states.

Jeff Reilly, senior director of Americas commercial presales at Dell EMC, told the “Innovation in Government” show that software-defined networking technology allows agencies to redefine their IT infrastructure and become more nimble, according to Federal News Network.
“Most organizations are 80 or 90 percent virtualized, and all of that has really provided us a way to add infrastructure capacity on demand, be able to move a virtual machine operating systems or applications wherever we need them to be based on service levels or infrastructure needs,” Reilly said. “We are seeing cloud adoption and data centers being a part of that cloud adoption strategy. Data centers are a part of a hybrid model, where I can run things on premise where I need security, policies and to control my data. And then how can I use the public cloud for burst capacity or put cloud native workloads out to those providers.”
The trend toward SDN and IT automation is also encouraging the private and public sectors to adopt more hyperconverged infrastructure solutions, Reilly said. HCI platforms combine computing, storage, networking and virtualization capabilities into a single appliance, all pre-integrated and controlled by one management layer.
“It’s growing at 150 percent across most industries and it’s one of the hottest areas in the federal government because of the simplicity of deployment. If I need to add capacity or get a workload out more quickly, hyper convergence is a very easy way of doing that,” Reilly said, according to Federal News Network. “What we are seeing is it started out in what we call appliance models where people are buying one or two or eight appliances. Now you are scaling it up to where you need a switch, you are tying it into your network and you need to make it bigger. You are really buying into a software strategy.”
However, he said, agencies need to ensure they have a long-term strategy for how to use software and automation, since most will remain in a hybrid cloud environment for the near future, according to Federal News Network.