Effective data analysis can help agencies identify trends they weren’t aware of, says the FAA’s Elliott Black.

Feb. 20, 2018
Data Analytics

FAA, CMS and GSA Retool to Take Advantage of Big Data

Agencies can use Big Data to gain insights that help citizens and make their operations more efficient, but they must upgrade IT infrastructure to do so.

Officials at the Federal Aviation Administration want to make the most strategic, well-informed capital planning decisions possible about airport facilities. But facility requirements shift along with the airline industry, whether through mergers and acquisitions or through changes in the size of the aircraft carriers operate and in their underlying business models.

To improve their decision-making, FAA executives such as Elliott Black, director of the FAA’s Office of Airport Planning and Programming, are combing through terabytes of current and historical information that promise new insights for forecasting.

“I love data,” Black says. “By taking an open and honest look at our information, we can identify trends or problems that we weren’t aware of previously.”

Leaders at the FAA and counterparts at agencies such as the Centers for Medicare and Medicaid Services (CMS) and the General Services Administration realize that to effectively harvest insights from their expanding volumes of diverse data, they must re-evaluate their underlying data management and analytics capabilities.

“Agencies that want to take real advantage of Big Data, analytics and artificial intelligence will eventually need to upgrade their older systems,” says Shawn McCarthy, research director for IDC Government Insights.


It’s a good time to make a change. Steady innovation is bringing about new analytics capabilities derived from emerging technologies such as machine learning, as well as enhancements to open-source tools and commercial applications. “We’re seeing a number of new analytical tools out there that make it easier to build custom reports on the fly,” Black says. “This could reduce the workload for our people and enable them to spend more time doing the substantive analyses we need to do.”

For the past 15 years, the FAA has relied on its System of Airports Reporting (SOAR) to help manage and forecast capital improvement investments for the approximately 3,300 airports across the country that are eligible for federal grants. SOAR centralizes a wealth of information: 35 years of historical funding data, as well as current project activity and capital needs information provided by individual airports, regional offices and state aeronautical agencies.

The FAA’s data management currently consists of government-developed technology with hardwired connections among the database, user interface and reporting modules, making it difficult to slice and dice the data. The agency is upgrading the system, replacing those hardwired connections with industry-standard application programming interfaces and commercial technology. “By better integrating the modules and building in better business analytics, we want to make it easier to perform complex analyses,” Black says.
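
In concrete terms, the new pattern looks something like the sketch below: a reporting module pulls project data through a documented REST interface instead of a private connection into the database. This is a minimal illustration only; the endpoint, fields and parameters are hypothetical, not the actual SOAR API.

```python
# Minimal sketch of the API-based pattern the FAA describes: modules talk
# through a standard REST interface instead of hardwired connections.
# The endpoint, fields and parameters below are hypothetical, not the
# actual SOAR API.
import requests

BASE_URL = "https://api.example.gov/soar/v1"  # hypothetical endpoint

def get_capital_projects(airport_id: str, fiscal_year: int) -> list[dict]:
    """Fetch capital-project records for one airport and fiscal year."""
    resp = requests.get(
        f"{BASE_URL}/airports/{airport_id}/projects",
        params={"fiscal_year": fiscal_year},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["projects"]

# A reporting module can now consume the same data as the user interface,
# without its own private connection into the database.
projects = get_capital_projects("ORD", 2018)
total = sum(p["estimated_cost"] for p in projects)
print(f"{len(projects)} projects, ${total:,.0f} estimated")
```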


The systems will include a commercial database management system as well as commercial business analytics and reporting applications. “Our goal is for the airport community to be able to enter their information directly, which will save them time and enhance data consistency,” Black says.

Numerous Infrastructure Choices for Processing Big Data 

Innovation isn’t limited to analytics tools; CIOs also have new options for building out IT infrastructure to support the efficient processing of large data sets. For example, organizations can select processing that is optimized for specific database platforms.

“Lenovo servers have long been the reference platform SAP uses for developing HANA,” says Charles King, principal analyst at Pund-IT. “Plus, virtually every x86 server or system vendor has built solutions that can be applied to Big Data problems and workloads. Dell EMC offers tailored solutions for SAP HANA, Oracle Database and Microsoft SQL Server data, as well as open-source data analysis platforms, such as Apache’s Hadoop and Spark.”

In addition, storage vendors are delivering Big Data solutions that capitalize on all-flash and flash-optimized storage arrays, King says. Flash storage delivers much better performance than traditional spinning-disk drives, which can speed up data analysis.

CMS Crunches Numbers, Saves Lives

In the Office of Minority Health at CMS, it’s understood that gleaning new analytical insight from routinely collected data can produce life-changing results for citizens. By sifting through large volumes of payment and demographic data, the office helps health officials better serve the unique needs of minority populations, people with disabilities and those in rural areas.

For example, infant mortality rates for African-Americans are nearly double the nationwide average; Hispanics show disproportionately higher rates of diabetes than the national average; and deaths from opioids are greatest among non-Hispanic whites. “These disparities show why it’s important to disaggregate data to understand the specific challenges facing various populations,” says Director Cara James. “That helps us target limited healthcare resources to the areas of greatest need.”
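
For illustration, the disaggregation step James describes can be as simple as grouping claims data by population before computing rates. The sketch below uses pandas on made-up numbers, not actual CMS data, to show how a single national rate can mask large differences between subgroups.

```python
# A minimal sketch of disaggregating health data by population group,
# using pandas on illustrative (not real CMS) numbers.
import pandas as pd

claims = pd.DataFrame({
    "group":    ["White", "Black", "Hispanic", "White", "Black", "Hispanic"],
    "state":    ["GA", "GA", "GA", "OH", "OH", "OH"],
    "patients": [9000, 2400, 1800, 11000, 2100, 900],
    "cases":    [810, 430, 340, 950, 360, 150],  # e.g., diabetes diagnoses
})

# The aggregate rate hides the variation that targeting depends on.
national_rate = claims["cases"].sum() / claims["patients"].sum()

# Disaggregate: recompute the rate per population group.
by_group = (claims.groupby("group")[["cases", "patients"]].sum()
                  .assign(rate=lambda d: d["cases"] / d["patients"]))

print(f"National rate: {national_rate:.1%}")
print(by_group["rate"].map("{:.1%}".format))
```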

One outgrowth of this effort is the Mapping Medicare Disparities Tool, which shows outcomes and other data for 18 chronic diseases, such as diabetes and heart disease.


A collection of technologies supports the map. Applications from various vendors, such as Microsoft, extract Medicare fee-for-service data and feed the results into a Microsoft Excel spreadsheet. An open-source JavaScript library and a cloud-based data analysis platform are then used to produce the final visualizations.
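
A pipeline like that might be sketched as follows: tabular data in an Excel extract is reshaped into county-level records for a JavaScript mapping library to plot. The file name and column names below are hypothetical, not the tool’s actual schema.

```python
# A minimal sketch of an Excel-to-map pipeline like the one described.
# File name and columns are hypothetical. Requires: pip install pandas openpyxl
import json
import pandas as pd

df = pd.read_excel("medicare_ffs_extract.xlsx")  # hypothetical extract

# Compute a per-county prevalence rate for one condition.
df["diabetes_rate"] = df["diabetes_cases"] / df["beneficiaries"]

records = (df[["county_fips", "state", "diabetes_rate"]]
             .to_dict(orient="records"))

# The JSON output is what the front-end mapping layer would consume.
with open("disparities.json", "w") as f:
    json.dump(records, f)
```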

“One of the biggest goals of the tool is to increase awareness and understanding of disparities at the state and local levels,” James says. “Local officials can then use that information to inform their decision-making.”

GSA Visualizes Data in the Cloud 

Kris Rowley, chief data officer for the General Services Administration, and his team are developing a long-term strategy for an enterprise data management and analytics platform that relies on Oracle and SAP solutions.

To achieve that goal, Rowley plans to update the reporting tools the agency has implemented. “There’s been rapid development in visualization technology to make information more presentable and help executives more easily grasp insights from the data,” he says.

The agency is moving much of its data to public cloud repositories to capitalize on the computing capabilities available with those models. As they do this, officials want latitude in choosing which analytical tools stakeholders can use. “I want to be able to plug any visualization application into cloud data sets and know there won’t be any migration costs,” Rowley says. “That means getting away from traditional solutions that integrate the reporting tool with where the data is stored.”
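
The decoupling Rowley describes can be sketched roughly as follows: the data sits in a cloud warehouse behind a standard SQL interface, so any front end that speaks SQL can connect without migrating the data. The connection string and table below are hypothetical, not GSA’s actual environment.

```python
# A minimal sketch of storage/reporting decoupling: data lives behind a
# standard SQL interface, and any visualization tool can query it.
# The connection string and table are hypothetical.
import pandas as pd
from sqlalchemy import create_engine, text

# Swapping warehouses means changing this one URL, not the reporting tools.
engine = create_engine("postgresql://user:pass@cloud-warehouse.example.gov/gsa")

query = text("""
    SELECT region, fiscal_year, SUM(obligation) AS total_obligation
    FROM lease_spend             -- hypothetical table
    GROUP BY region, fiscal_year
""")

df = pd.read_sql(query, engine)
# Any BI or visualization front end can issue the same query directly.
print(df.head())
```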

The GSA evaluations also take emerging technology into account. “Everything we’re doing will create a foundation for moving to machine learning,” Rowley says. “Machine learning will support the enterprise by empowering the workforce with predictive modeling and the ability to forecast what may happen next.”
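
As a simple illustration of that kind of predictive modeling, the sketch below fits a regression to historical figures and forecasts the next period. The numbers are illustrative, not GSA data.

```python
# A minimal forecasting sketch: fit on historical values, predict the
# next period. Figures are illustrative, not GSA's.
import numpy as np
from sklearn.linear_model import LinearRegression

years = np.arange(2010, 2018).reshape(-1, 1)
spend = np.array([4.1, 4.3, 4.2, 4.6, 4.8, 5.0, 5.3, 5.5])  # $B, made up

model = LinearRegression().fit(years, spend)
forecast = model.predict([[2018]])
print(f"Forecast for 2018: ${forecast[0]:.1f}B")
```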
