May 31 2022

For NASA, Selective Service and DISA, Data Protection and Storage Take Priority

Data protection and storage take priority across federal agencies.

NASA receives at least 4 terabytes of data each day, and the agency is constantly looking for ways to “liberate” its data in ways that will make the information more valuable for decision-making.

“We’re really here to assist with providing the hindsight, insight and foresight to impact and improve mission outcomes,” says Ron Thompson, the agency’s chief data officer and deputy digital transformation officer.

“We have pockets of excellence but lack the holistic view of data across the agency. For example, in our scientific community, the desire is to have the ability to cut across all our science divisions,” he adds. “Seeing that data across domains is not something that people had in mind when the work was initially formed and built.” 

At the same time, of course, agencies need to protect their data and systems from cyberattackers. This balance between availability and security has always been a challenge, but it’s becoming an even more pressing one as data analytics becomes more critical to government operations, notes Axel Domeyer, an associate partner at McKinsey.

“Governments struggle to realize the full value of their data because it is stored in scattered silos, making it difficult for agencies to access information owned by other agencies when needed,” Domeyer says. “Even when sharing data is technically possible, the information is often formatted in a way that impedes a combination with other data and joint processing.

“The priority for governments should be to identify the data sharing use cases that create the most value, specify the data sets required to implement these use cases, and set up the technical infrastructure and governance required for sharing the data across the government,” he says.


Enabling Real-Time Data Management with the Right Tools

NASA uses data management tools from Cohesity and Rubrik, but the agency is also in the process of building its own enterprise data platform (EDP), with the goal of making data accessible across the enterprise.

The platform, scheduled to go live this summer, is designed to offer low- or no-code solutions for data visualization, and it utilizes tools from vendors including Tableau and MuleSoft.

“Overall, the design is not to lock into any tools,” Thompson says. “The idea is to use the best of the breed.”

The goal of the system, he adds, is to speed up and simplify access to data, and to eventually power predictive analytics applications.

“Right now, a lot of our data is collected through manual data calls,” he says. “Those requests can take several weeks to come back, and the data quality is not as accurate. The EDP will allow us to go right into the systems of record and pull the data in real time.”

Finding the Right Solutions to Best Protect Data

Men between the ages of 18 and 25 are required to register for the Selective Service System (SSS), not only to avoid legal consequences but also to be eligible for benefits including certain job opportunities and student aid. This means that the agency is responsible for managing and protecting data from millions of young Americans.

CISO Scott Jones notes that the agency needs to not only safeguard data but also put in place systems and practices to ensure continuous availability and disaster recovery.

“Data is not just seen as a commodity to protect but an asset to leverage,” Jones says. “Data can’t be leveraged unless it’s continuously available. Continuity of operations for our network, systems and data is IT’s No. 1 priority.”

As a smaller agency with a broad mission, SSS must be very deliberate in choosing its data management and security tools, says Deputy CIO Daniel Mira.

“When we select a solution, it needs to be the type of solution that’s going to be compliant and meet standards for emerging cybersecurity operations now while anticipating future requirements,” he says.

“We have to choose solutions within our budget and make sure they meet security requirements and zero-trust guidelines put out by the National Institute of Standards and Technology, the Cybersecurity and Infrastructure Security Agency and the Office of Management and Budget.”

The agency uses tools from Qualys to inventory its assets, CrowdStrike for anti-virus and VMware Carbon Black for endpoint detection and response. Also, for the past two years, SSS has used Cohesity for backup and recovery of its primary data sets.

“The benefit of a product like Cohesity is that you’re no longer dealing with a significant tape library footprint,” says CISO Ruben Ramos. “Now you have an on-demand disaster recovery capability that meets encryption standards and is highly scalable.”

DISA's Efforts to Implement Zero-Trust Solutions

Like many agencies, the Defense Information Systems Agency (DISA) is aggressively implementing zero-trust principles to safeguard sensitive data while still keeping it accessible to those who need it.

Building on its efforts over the past several years, the agency is now working with contractors to build a multimillion-dollar prototype of a zero-trust security solution called Thunderdome.

The tool will utilize technologies such as secure access service edge and software-defined WAN and will integrate with the agency’s existing endpoint protection and identity management initiatives.

In early 2021, DISA and the National Security Agency published a Zero-Trust Reference Architecture for the Department of Defense.

“It’s really about how you move toward making sure that you are explicitly verifying all of the actions that are happening on your network,” says Drew Malloy, technical director for DISA’s Cyber Security and Analytics Directorate. “It’s not a light switch, where one day you’re not zero trust and the next day you are.”

84%

The percentage of IT leaders who say demand for data insights within their organizations has grown since the COVID-19 pandemic began

Source: Experian, “2021 Global Data Management Research,” February 2021

Historically, defense agencies have been extremely effective at protecting classified data. “Through classification, we’ve traditionally had separate networks and separate systems,” says Brandon Iske, chief engineer overseeing the zero-trust architecture team within DISA’s Cyber Security and Analytics Directorate.

“Data protection has always been our foundational capability, but what we’re starting to evolve to is granularity and protections for each of those classifications,” he adds.

Data sharing and zero trust follow parallel paths: Data must be correctly tagged to share properly, and data that needs to be protected — that may not be available for sharing — must also be tagged correctly.

“There’s been a lot of work on standardization and looking at how to do proper tagging. But it’s been very high-level,” Malloy says. “When you use technologies like data-loss prevention tools, which apply policies that control how you manage your data, you need to have those data classes tagged appropriately. That’s really the next step that we need to take.”
