Sep 29 2020
Data Analytics

Agencies Are Not Ready for the Data Tsunami, Report Indicates

A recent report from data analytics firm Splunk reveals the challenges public sector agencies face in managing the coming onslaught of data growth.

If data is the 21st century’s version of oil, government agencies seem unprepared for a coming gusher. However, federal IT leaders can make changes to the technologies they use and the culture within their agencies to get ahead of the wave.

According to “The Data Age Is Here. Are You Ready?,” a recent report from data analytics company Splunk, the public sector as a whole is “behind the curve in understanding and adopting” emerging technologies such as 5G wireless, the Internet of Things, artificial intelligence and machine learning, augmented and virtual reality, blockchain, and edge computing.

Public sector respondents expect data volumes to grow by a factor of 3.5 over the next five years, compared with a nearly fivefold expected increase across all industries.

Mike Saliter, vice president of industries and specialization at Splunk, says that because the public sector is behind in terms of adopting emerging technologies such as artificial intelligence and blockchain, agencies have not yet factored in how much data they will actually be seeing in the years ahead as those technologies become more entrenched. “They almost don’t know what they haven’t seen yet,” he says. “In one to two years they will probably catch up with that projection.”

According to the report, while 60 percent of public sector respondents expect that the value of data will grow in the near future, “they are facing significant challenges in putting data to work, including integrating data from multiple sources, managing the volume of data and overcoming a lack of resources.”

How Federal Agencies Can Better Manage Data

Federal agencies can tackle these challenges through the traditional trifecta of people, processes and technology, Saliter says.

Anecdotally, he says, the public sector knows that it has “not been on the cutting edge of going after and attracting” personnel with skill sets in emerging technologies. Agencies need to do more to attract such workers and compete more with the private sector, according to Saliter.

In terms of technology, agencies need tools that are capable of performing data integration and handling higher volumes of data. “We as an industry need to be able to handle much more volume than exists today,” Saliter says.

Agencies that are at capacity in terms of computing performance or data storage need to rethink their IT architectures, he adds. IT leaders also need to invest in technologies that enable real-time analysis and decision-making.

Historically, agencies relied on traditional data warehouses or data lakes, dumping data into a central repository and then using reports or business intelligence tools to extract value and insights from it.

“There is a time and place for that,” Saliter says, but in general, “that architecture is not going to be able to keep up with the volume and rapid decision-making that is expected going forward.”

Now, agencies need solutions that can ingest the real-time performance indicators, known as metrics and traces, that applications emit. Saliter also says agencies should invest in streaming technology to handle data in motion and analyze it in near real time. Another path forward is to invest in AI and machine learning to detect anomalies and perform projections.
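To make the anomaly detection idea concrete, here is a minimal sketch of one common approach: flagging a streamed metric value when it deviates sharply from a rolling baseline (a z-score test over a sliding window). This is an illustrative example only, not Splunk's method; the class name, window size and threshold are all assumptions.

```python
from collections import deque
import statistics

class StreamingAnomalyDetector:
    """Flags metric readings that deviate sharply from a rolling baseline."""

    def __init__(self, window=60, threshold=3.0):
        self.values = deque(maxlen=window)  # keep only the most recent readings
        self.threshold = threshold          # z-score cutoff for "anomalous"

    def observe(self, value):
        """Return True if the new reading is anomalous versus the current window."""
        anomalous = False
        if len(self.values) >= 2:
            mean = statistics.fmean(self.values)
            stdev = statistics.pstdev(self.values)
            if stdev > 0 and abs(value - mean) / stdev > self.threshold:
                anomalous = True
        self.values.append(value)  # update the baseline with every reading
        return anomalous

# Simulated metric stream: steady readings followed by a sudden spike.
detector = StreamingAnomalyDetector(window=30, threshold=3.0)
readings = [100, 101, 99, 100, 102, 98, 100, 500]
flags = [detector.observe(v) for v in readings]  # only the spike is flagged
```

Because the detector holds only a fixed-size window of recent values, it runs in constant memory per metric, which is what makes this style of check feasible on high-volume data in motion rather than on data at rest in a warehouse.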

Agencies need top-level management support of these initiatives, Saliter says. “They need to make sure this becomes a cultural change. They need to bring in diversity in thinking and diversity in skills.”

Some of that will come by agencies hiring younger workers and data scientists out of college who can bring fresh thinking, Saliter says.

READ MORE: Find out how feds are learning new data science skills.

Federal Data Strategy Can Help Evolve Agency Efforts

The federal government is in the midst of implementing a 20-point action plan to start making the Federal Data Strategy a reality. The strategy “provides a common set of data principles and best practices in implementing data innovations that drive more value for the public,” according to the Office of Management and Budget.

The Chief Data Officers Council has also provided a platform for agency CDOs to collaborate and share best practices. According to a survey of federal CDOs released in August, more than half of the CDOs surveyed reported improvements in data quality (64 percent), assessment of staff capabilities and needs (57 percent), migration to cloud-based services (57 percent), and availability of metadata (54 percent).

A combination of more empowered roles for CDOs and regulations or policy can have an impact on agencies’ ability to handle large data volumes, Saliter says.

The Federal Data Strategy itself is “going to stimulate and put fuel on the fire to move toward a more analytics- and data-driven approach,” Saliter says. “I would echo that policy is critical to this happening even faster and further.”

MORE FROM FEDTECH: What is predictive analytics and how can it help agencies?

