Aug 04 2017
Data Center

Why Federal Agencies Need to Future-Proof Their Data Centers

Here’s how agencies can ensure they’re getting the most out of their data center dollars.

Data centers account for nearly 2 percent of all electric energy consumption in the United States and 10 percent across the federal government. The Energy Efficient Government Technology Act, proposed legislation that unanimously passed the House of Representatives this year, aims to reduce those figures by making agency-owned data centers more energy efficient. The measure could produce projected savings of more than $5 billion by the end of 2020.

Regardless of whether the bill clears both chambers and becomes law, Congress's desire to reduce energy consumption is not expected to wane. IT managers should recognize that reality and consider several approaches today to prepare their agencies for the future.

Make Data Centers More Energy-Efficient 

Under the Data Center Optimization Initiative, agencies have made steady progress consolidating data centers, a trend that is likely to continue. One measure of data center efficiency that DCOI considers is power usage effectiveness. As electricity flows from the power grid through a data center's mechanical and electrical infrastructure, not all of the energy consumed translates directly into IT services; PUE, the ratio of the total energy a data center uses to the energy actually delivered to IT equipment, provides a benchmark for measuring energy efficiency.
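As a back-of-the-envelope illustration (the meter readings below are hypothetical), PUE is simply the total energy a facility draws divided by the energy that reaches IT equipment, and a value closer to 1.0 means less overhead:

```python
# Minimal sketch: computing power usage effectiveness (PUE) from
# hypothetical monthly meter readings. Figures are illustrative only.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / energy delivered to IT equipment."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Example: 1,200,000 kWh drawn from the grid, 800,000 kWh reaching IT gear.
print(f"PUE: {pue(1_200_000, 800_000):.2f}")  # PUE: 1.50
```

A PUE of 1.5 in this hypothetical case means that for every kilowatt-hour delivered to servers, another half kilowatt-hour goes to cooling, power distribution and other overhead.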

Two approaches to improving this ratio are gaining traction: elevated data center temperatures and air-side economizing. The American Society of Heating, Refrigerating and Air-Conditioning Engineers reports that modern IT equipment in data centers can tolerate ambient temperatures of up to 80 degrees Fahrenheit, as much as 15 degrees higher than the setpoints some IT leaders have used in the past. Raising temperatures to this level can yield significant PUE improvements and reduce total cost of ownership (TCO).

Meanwhile, air-side economizing avoids traditional HVAC infrastructure and costs by simply supplying filtered outside air to servers. Even in warmer climates, this can be an effective approach for as many as 5,000 hours per year. In regions such as the Pacific Northwest, this approach is successful year-round.

The National Center for Atmospheric Research, in partnership with the National Science Foundation, operates a data center in Cheyenne, Wyo., designed with a PUE under 1.1. Wyoming has one of the lowest average temperatures in the country, which allows NCAR to rely on air-side economizing for more than 350 days a year. The federal facility also uses the warmth generated by IT equipment to heat office space and to melt ice and snow on exterior walkways. Not coincidentally, Microsoft also takes advantage of the IT-friendly climate and runs a data center nearby.

Embrace Virtualization to Speed Data Center Consolidation 

Agency IT managers can support further data center consolidation by turning to technologies that reduce dependencies on physical location. For example, DCOI sets metrics for using virtualization, and as a result the White House tracks and publishes the number of operating systems on each physical server in an agency’s data center. This is part of an effort to maximize each agency’s use of available servers and applaud those agencies, such as the General Services Administration and the Environmental Protection Agency, that have taken steps to future-proof their data centers.
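As a simple illustration of the metric being tracked (the inventory below is invented), the measure works out to operating systems per physical server:

```python
# Illustrative sketch: the virtualization measure described above, i.e.,
# operating systems running per physical server. Inventory is hypothetical.

inventory = {
    "host-01": 14,  # a well-virtualized host running many guest OSes
    "host-02": 9,
    "host-03": 1,   # an unvirtualized server drags the average down
}

total_os = sum(inventory.values())
ratio = total_os / len(inventory)
print(f"{total_os} operating systems across {len(inventory)} physical servers "
      f"= {ratio:.1f} per server")
```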

More important, the published figures can be a starting point for consolidation because cost, effort and downtime are easier to minimize when agencies are migrating virtual (rather than physical) servers. Hyperconverged deployments may even simplify the overall IT environment.

In addition, container technologies can help boost utilization in virtualized environments — a first step toward a cloud-based environment. The Federal Risk and Authorization Management Program (FedRAMP) keeps a list of approved Infrastructure as a Service, Platform as a Service and Software as a Service providers that could be used where a public cloud is needed.

In those cases, staff should perform careful analyses to ensure the costs of a public cloud solution do not exceed the costs of owning an on-premises solution. As an alternative, agencies may employ a hybrid cloud model in which data center resources handle the majority of application requirements, while public cloud resources absorb the inevitable demand peaks.
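As a rough sketch of that kind of analysis (every figure below is hypothetical, and real comparisons involve many more cost categories), staff can model on-premises TCO against projected cloud spending over the same planning horizon:

```python
# Hedged sketch: comparing hypothetical on-premises TCO with projected
# public cloud costs over a planning horizon. All numbers are illustrative.

def on_prem_tco(hardware: float, annual_power: float, annual_staff: float,
                annual_maintenance: float, years: int) -> float:
    """Rough on-premises total cost of ownership over the planning period."""
    return hardware + years * (annual_power + annual_staff + annual_maintenance)

def cloud_cost(monthly_spend: float, years: int) -> float:
    """Projected public cloud spending over the same period."""
    return monthly_spend * 12 * years

YEARS = 5
on_prem = on_prem_tco(hardware=400_000, annual_power=60_000,
                      annual_staff=150_000, annual_maintenance=40_000,
                      years=YEARS)
cloud = cloud_cost(monthly_spend=45_000, years=YEARS)

print(f"On-premises TCO over {YEARS} years: ${on_prem:,.0f}")
print(f"Public cloud cost over {YEARS} years: ${cloud:,.0f}")
print("Cloud is cheaper" if cloud < on_prem else "On-premises is cheaper")
```

In this invented example the on-premises option comes out ahead; with different workloads or pricing, the cloud option easily could.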

In this scenario, agency leaders can find a solution that achieves lower TCO, meets server and data center utilization metrics mandated by DCOI and is flexible enough to respond to spikes in demand. But they also must carefully consider the agency’s mission and how users will rely upon the infrastructure.

Software-Defined Networking Provides Flexibility

As agencies continue to adopt cloud computing, they can implement advanced networking solutions to prepare for the future. Layering cloud environments on top of software-defined networks lets applications specify network requirements.

This way, as the need for more capacity is identified, cloud environments can add new physical servers to meet demand.
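As a conceptual sketch only (the requirement and fabric descriptions here are hypothetical and not tied to any vendor's SDN API), an application might declare its network needs and let the orchestration layer decide when to provision additional capacity:

```python
# Conceptual sketch only: an application declares its network requirements,
# and a hypothetical controller decides whether current capacity suffices
# or more resources must be brought online. Not a real vendor API.

from dataclasses import dataclass

@dataclass
class NetworkRequirement:
    app_name: str
    bandwidth_mbps: int   # sustained bandwidth the application needs
    max_latency_ms: int   # latency ceiling the application can tolerate

@dataclass
class FabricState:
    available_bandwidth_mbps: int
    typical_latency_ms: int

def plan_capacity(req: NetworkRequirement, fabric: FabricState) -> str:
    """Return a simplified provisioning decision for the declared needs."""
    if req.bandwidth_mbps > fabric.available_bandwidth_mbps:
        return f"{req.app_name}: add capacity (bandwidth shortfall)"
    if req.max_latency_ms < fabric.typical_latency_ms:
        return f"{req.app_name}: add capacity (latency target not met)"
    return f"{req.app_name}: current fabric meets requirements"

print(plan_capacity(NetworkRequirement("analytics", 2_000, 10),
                    FabricState(available_bandwidth_mbps=1_500,
                                typical_latency_ms=5)))
```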

Use Disaster Recovery as a Service to Gain Efficiencies  

Resources allocated to disaster recovery or business continuity functions are another source of low utilization. Because these systems sit largely idle until a disaster or a test, they can drag down the utilization figures that count against agency metrics. Consider outsourcing this function to a Disaster Recovery as a Service provider.

Then, test the solution regularly and update alongside the production environment as part of the normal change management process.

David Vogin