The federal government is awash in data, and to make sense of all of that data and derive value from it, agencies need analytics tools.
Bill Brantley, the former training administrator for the U.S. Patent and Trademark Office’s Global Intellectual Property Academy, notes for DigitalGov that the government “is probably one of the biggest (if not the biggest) producers of data. Every day, thousands of federal workers collect, create, analyze, and distribute massive amounts of data from weather forecasts to economic indicators to health statistics.”
How much data do agencies actually generate? It is difficult to say, but the volume is enormous. Writing at Nextgov in 2018, Dan Tucker, vice president for digital solutions at Booz Allen Hamilton, and George Young, vice president of U.S. public sector at Elastic, noted that “the petabytes-on-petabytes of data that agencies generate, collect and retain is typically scattered across IT silos.”
Analytics platforms can help agencies gather, analyze and leverage valuable data that can drive improvements and efficiencies in their operations and missions. “Everyone deals with data — in the case of government, huge amounts of data,” Nick Desbarats, an educator and consultant at Practical Reporting, a data visualization consultancy, recently told FedTech. “For audiences of decision-makers and the public, data visualization is a powerful communications tool.”
How to Set Up a Data Analytics Practice in Your Agency
There is a wide variety of analytics platforms and use cases agencies can employ. Such tools can power predictive analytics and predictive maintenance, determining whether and when equipment will fail before it does. Analytics can also sift through vast quantities of data to deliver insights that can be shared across an agency or with a partner in real time.
However, to take advantage of data analytics, agencies need to know what they want to get out of such tools.
The Advanced Technology Academic Research Center offered some handy tips for agency IT leaders in its report “Harnessing Big Data within the Federal Government – Findings and Recommendations of ATARC’s Big Data Innovation Lab.” ATARC advises agencies to:
- Create a data-centric culture: Agencies should build a collaborative environment among all chief data officers (CDOs), including federal and agency CDOs, and create a sharing culture that focuses on end-user benefits, sets expectations and establishes internal champions.
- Empower agency CDOs: IT leaders, including CDOs, should develop a strategic plan to capitalize on their agency’s data, covering governance, security, data models, access and analytics tools; define an agile program to bring value to the agency; and implement cross-agency data safeguarding standards.
- Enable data-driven decision-making: IT leaders should make sure the data being analyzed adds value to agency decision-making. They should also ensure the data is of appropriate quality for the decisions being made and create metrics for assessing data quality and value.
- Focus on improving services, creating efficiencies and meeting mission: Leaders should figure out which data they want to analyze and how to use it, identify the economic impact of opening specific datasets and use data to improve citizens’ digital service experience.
- Use the right technology: As much as possible, IT leaders should automate data collection and aggregation to reduce costly manual workload, eliminate errors and ensure trust.
Data Analytics Enable Agencies to Enhance Their Missions
What can agencies do with data analytics platforms? The possibilities are limited only by the datasets they can access and the tools they use.
The National Ignition Facility, located at the Lawrence Livermore National Laboratory in Livermore, Calif., is home to the world’s largest laser. NIF uses analytics tools from Splunk, including Splunk Enterprise and Splunk IT Service Intelligence, to maximize the uptime of its systems and proactively monitor and respond to IT and security challenges. The tools also allow NIF to improve control systems’ reliability, increase availability and maintain the infrastructure and systems that supported doubling the number of laser shot experiments from 200 to 400 annually.
Similarly, Sandia National Laboratories, a multimission U.S. National Nuclear Security Administration research and development lab, used Splunk tools to develop the High-Fidelity Adaptive Deception and Emulation System (HADES), a cybersecurity application that provides an automation-driven collaborative framework for fast and consistent threat identification and response. HADES allows Sandia analysts to produce and share threat intelligence while interacting with attacks in real time, according to Sandia. The solution can also deceive adversaries, covertly profiling them to expose their tactics and giving IT security analysts a significant advantage.
Another example of data analytics at work comes from the U.S. Geological Survey, which operates a real-time system that detects earthquakes using only tweets from regular Twitter users, utilizing analytics tools from Elastic.
“The main benefit of the tweet-based detections is speed, with most detections occurring between 20 and 120 seconds after the origin time,” according to Elastic. “This timeframe is significantly faster than seismically derived detections in poorly instrumented regions of the world.”
The tools, which add real-time search and visualization to analysis of the 32 million “earthquake” tweets the system has compiled so far, also allow the USGS to rapidly characterize earthquake damage.
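The core idea behind tweet-based detection can be illustrated with a simple sliding-window burst detector: flag an event when an unusual number of keyword tweets arrive in a short interval. This is only a minimal sketch; the USGS system’s actual algorithm is not described here, and the function name, window size and threshold below are hypothetical.

```python
from collections import deque

def detect_burst(timestamps, window_seconds=60, threshold=10):
    """Return the start time of the first burst, or None.

    A burst is declared when at least `threshold` keyword tweets
    fall within any sliding window of `window_seconds`. The
    parameters are illustrative, not USGS's real tuning.
    """
    window = deque()
    for t in sorted(timestamps):
        window.append(t)
        # Drop tweets that have aged out of the window.
        while t - window[0] > window_seconds:
            window.popleft()
        if len(window) >= threshold:
            return window[0]  # approximate onset of the burst
    return None

# A steady trickle (one tweet every 30 seconds) never triggers:
print(detect_burst(range(0, 600, 30)))  # → None
# A sudden cluster of 20 tweets in 20 seconds does:
print(detect_burst(range(20)))          # → 0
```

Because a burst of only a few tweets can precede seismic confirmation, a detector like this can fire within seconds of the first reports, which is consistent with the 20-to-120-second detection times Elastic describes.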
Analytics tools have clear benefits for agencies; IT leaders simply need to figure out how they want to use them.