Dec 12 2016

The 4 Biggest Trends in Federal IT in 2016

There were several major developments in federal technology this year, including IT modernization, data center optimization, efforts to move to the cloud — and, of course, cybersecurity.

The federal government is still humming, though at a much slower pace, now that Congress has left town for the holidays, having passed an 11th-hour continuing resolution to keep the government funded through April 28.

In the waning days of the Obama administration, it’s worth taking stock of the major technology developments that dominated 2016. Many of the biggest trends will likely loom large after the Trump administration takes office on Jan. 20. That’s because some may be the subject of congressional action, such as IT modernization, while others, such as data center optimization and efforts to enhance cybersecurity, are long-running federal initiatives.

There were other significant happenings in the world of federal IT this past year, including the push toward category management to save money on mobile services, software purchases and computer hardware. But the trends below dominated the news and will likely demand attention from federal IT leaders in 2017 and beyond.

Do you think we missed a major federal IT trend from 2016? Please let us know in the comments! Here is our recap of federal technology from this year:

The Push for IT Modernization

As part of President Obama’s Cybersecurity National Action Plan, outlined in early February alongside the fiscal 2017 budget proposal, the administration proposed a $3.1 billion revolving fund to speed up IT modernization. The fund, formally known as the Information Technology Modernization Fund, or ITMF, was designed to address the fact that the federal government spends roughly 80 percent, or about $64 billion, of its $80 billion annual IT budget on maintaining legacy systems, many of them built simply to automate existing processes and some of them decades old.

In early April, the White House proposed legislation to create the ITMF, which would be administered by the General Services Administration (GSA). Federal CIO Tony Scott became a tireless advocate for the fund. In June, Scott noted that by spending more and more money each year on simply maintaining legacy systems, agencies “have missed multiple generations of advancements in technology,” especially because every five years or so technology enhancements deliver “double the capacity or the capability for the same dollar.”

Congress did not kill the proposal; instead, lawmakers sought to put their own stamp on it. In September, the House of Representatives passed the Modernizing Government Technology Act of 2016, which didn’t appropriate any new money but would authorize working capital funds at the 24 agencies governed by the Chief Financial Officers Act of 1990. As FCW reported, these funds “drive IT modernization and bank the savings achieved from retiring expensive legacy IT and shifting to managed services.”

The bill would also authorize a governmentwide revolving fund, akin to the ITMF, that the GSA would manage. The Senate did not act on the bill, shelving the idea for now. However, its proponents argue that the need for IT modernization is not going away, and the next Congress is likely to take up the matter.

The Data Center Optimization Initiative

In March, the Office of Management and Budget issued a memo outlining the Data Center Optimization Initiative (DCOI), designed to supersede the 2010 Federal Data Center Consolidation Initiative (FDCCI) and shift agencies toward making their data center operations more energy-efficient. The final policy, released on Aug. 1, contains several elements.

Scott said it will require “agencies to implement strategies to consolidate inefficient infrastructure, optimize existing facilities, improve security posture, achieve cost savings, and transition to more efficient infrastructure, such as cloud services and interagency shared services.”

To comply with DCOI, agencies will have to meet five metrics for tiered data centers by Sept. 30, 2018 (see the compliance-check sketch after this list). Those metrics are:

  • Install energy-metering tools in all tiered data centers to measure power consumption.
  • Maintain a Power Usage Effectiveness (PUE) score of less than 1.5, but preferably less than 1.2.
  • House at least four virtual servers per physical server.
  • Use at least 80 percent of a tiered data center’s floor space.
  • Achieve a server utilization rate of at least 65 percent.
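Taken together, these checks are straightforward to express programmatically. Below is a minimal sketch, in Python, of how an agency might self-assess a tiered data center against the five metrics; the TieredDataCenter structure and its field names are illustrative assumptions, not part of OMB’s actual reporting process.

```python
# Hypothetical DCOI self-assessment sketch. The data structure and field
# names are illustrative assumptions, not OMB's reporting format.
from dataclasses import dataclass

@dataclass
class TieredDataCenter:
    has_energy_metering: bool       # energy-metering tools installed
    total_facility_kwh: float       # total facility energy consumption
    it_equipment_kwh: float         # energy delivered to IT equipment
    virtual_servers: int
    physical_servers: int
    floor_space_used: float         # occupied floor space (sq. ft.)
    floor_space_total: float        # total floor space (sq. ft.)
    avg_server_utilization: float   # 0.0 to 1.0

def dcoi_checks(dc: TieredDataCenter) -> dict[str, bool]:
    # PUE = total facility energy / energy delivered to IT equipment
    # (lower is better; 1.0 would mean zero overhead for power and cooling)
    pue = dc.total_facility_kwh / dc.it_equipment_kwh
    return {
        "energy_metering_installed": dc.has_energy_metering,
        "pue_below_1_5": pue < 1.5,
        "at_least_4_vms_per_physical_server": dc.virtual_servers >= 4 * dc.physical_servers,
        "floor_space_at_least_80_pct_used": dc.floor_space_used / dc.floor_space_total >= 0.80,
        "server_utilization_at_least_65_pct": dc.avg_server_utilization >= 0.65,
    }
```

A facility drawing 1,400 kWh in total against 1,000 kWh of IT load, for instance, has a PUE of 1.4: under the 1.5 ceiling but short of the preferred 1.2.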

The new policy will have far-reaching implications. It will likely spur cloud adoption, thanks to its mandate on virtualization. As agencies shutter their data centers, they will need to retrain staff to perform other IT tasks. DCOI will also likely push agencies to bring more connected sensors into data centers to monitor their servers, power and cooling infrastructure in an attempt to boost efficiency. And it is likely to lead agencies to increase their adoption of software-defined data centers and networking, giving them more granular control over their infrastructure.

Agencies are already taking heed. The Defense Department, which is behind schedule on data center closures, said in August that it would launch a “data center closure team to assess and recommend closures of the costliest and least efficient facilities beginning in the first quarter of fiscal year 2017.”

Revamping FedRAMP

For more than six years, the federal government has been pursuing a formal “Cloud First” policy that requires agencies to default to using a cloud-based technology if they can find a secure, reliable and cost-effective solution.

The General Services Administration’s Federal Risk and Authorization Management Program, better known as FedRAMP, is tasked with providing a standardized, governmentwide approach to security assessment, authorization and continuous monitoring of the cloud service providers (CSPs) that work with agencies.

However, federal IT officials had expressed frustration with the FedRAMP approval process, prompting the program in March to unveil changes designed to speed it up. The new process, called "FedRAMP Accelerated," requires CSPs that want to work with the Joint Authorization Board and get FedRAMP approval to team up with a third-party assessment organization, or 3PAO. That organization conducts an initial capabilities assessment before the CSP provides detailed documentation to FedRAMP. If the 3PAO approves the CSP and the FedRAMP team agrees, the CSP is declared “FedRAMP Ready.”
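In effect, the sequence reduces to a short readiness gate. The sketch below models that flow in Python; the step names paraphrase the article, and nothing here is an official FedRAMP artifact or API.

```python
# Illustrative model of the "FedRAMP Accelerated" flow described above.
# Step names paraphrase the article; this is not an official FedRAMP tool.
ACCELERATED_STEPS = (
    "CSP teams up with a third-party assessment organization (3PAO)",
    "3PAO conducts an initial capabilities assessment",
    "CSP provides detailed documentation to FedRAMP",
)

def readiness_decision(threepao_approves: bool, fedramp_team_agrees: bool) -> str:
    # The "FedRAMP Ready" designation requires both the 3PAO's sign-off
    # and the FedRAMP team's concurrence.
    if threepao_approves and fedramp_team_agrees:
        return "FedRAMP Ready"
    return "Not ready"
```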

In late June, the GSA unveiled the “High Baseline Requirements” for FedRAMP, designed to increase cloud adoption for highly sensitive applications and systems. On Aug. 6, GSA released the FedRAMP Readiness Assessment Report Template, which serves as a pre-audit for CSPs, letting them demonstrate their readiness to achieve a FedRAMP authorization.

In September, the efforts started to pay off. FedRAMP authorized the first cloud service under the new process, Microsoft Dynamics CRM Online, in just 15 weeks, compared with the two years its previous authorization took. Looking ahead to 2017, FedRAMP wants to continue to transform the security authorization process and increase the number of CSPs agencies can choose from.

Enhancing Cybersecurity Protections

The president’s last budget proposal sought funding for a broad, $19 billion cybersecurity initiative, which included multiple elements aimed at enhancing security for the federal government’s systems, as well as U.S. networks and data more broadly. The funding level represented a 35 percent increase from the previous fiscal year’s $14 billion.

The additional funding and focus made sense, given that the government was rocked in 2015 by the disclosure of security breaches that targeted the Office of Personnel Management (OPM), in which the personal information of 22.1 million current, former and prospective federal employees was stolen.

As agencies worked to combat insider threats, they also continued to work with the Department of Homeland Security on its Continuous Diagnostics and Mitigation initiative. CDM is designed to provide federal departments and agencies “with capabilities and tools that identify cybersecurity risks on an ongoing basis, prioritize these risks based upon potential impacts, and enable cybersecurity personnel to mitigate the most significant problems first.”
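That “prioritize, then mitigate” loop is essentially a ranking problem. Below is a minimal Python sketch of the idea under stated assumptions: the Finding structure and the impact-times-exploitability score are illustrative, not part of DHS’s actual CDM tooling.

```python
# Minimal sketch of CDM-style triage: rank continuously collected findings
# by potential impact so the most significant get mitigated first.
# The Finding structure and the scoring formula are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Finding:
    asset: str
    description: str
    impact: float          # estimated potential impact, 0.0 to 10.0
    exploitability: float  # estimated likelihood of exploitation, 0.0 to 1.0

def prioritize(findings: list[Finding]) -> list[Finding]:
    # Highest expected impact first, so personnel address the worst problems first.
    return sorted(findings, key=lambda f: f.impact * f.exploitability, reverse=True)
```

In practice, CDM tools feed agency dashboards rather than a script like this, but the ordering principle is the same.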

CDM is just one of an array of tools that agencies are deploying to provide them with predictive threat intelligence. Still, it’s very clear that agencies have a great deal of work ahead of them to improve security.

In September, the White House named Gregory Touhill, a retired Air Force brigadier general and a current official at the Department of Homeland Security, as the first federal CISO. Touhill’s job, and that of his successor, is to coordinate the government’s cybersecurity policies. Touhill said earlier this month that White House cyber officials have identified 63 different policy directives, regulations or other requirements they plan to retire.

Last week, an Obama-commissioned panel urged the Trump administration to take a series of actions to enhance cybersecurity within the government and in concert with private industry. Obama has also ordered U.S. intelligence agencies to deliver a report to him by Jan. 20 on “lessons learned” regarding Russian cyber intrusions intended to influence the 2016 presidential election.
