
Feb 19 2024
Security

WEST 2024: Agencies Adapt Plans for Data Security as the Amount of Information Grows

Sorting, analyzing and moving data becomes more complex with increased volume.

Defense agencies are evolving their protocols for collecting and using data as they realize just how much they can capture, how difficult it is to protect in transit, and that the edge may not be the best place to analyze it after all.

“Information is combat power. It will be the decision advantage of the future,” said Navy CIO Jane Rathbun, speaking at WEST 2024 in San Diego. “The future state for me is where the warfighter is given decision-ready information from which to act and doesn’t have to fuse it at the tactical edge.

“That information's coming at us very quickly, a lot of information. We need to focus on understanding our data, understanding our information, tagging it, managing it and making it transparent.”

Secure data is one of the five pillars of the Cybersecurity and Infrastructure Security Agency’s zero-trust maturity model. At minimum, agencies are expected to manually categorize and inventory data, among other requirements. Optimal compliance includes extremely strong encryption, automated data tagging and continuous inventory.


Growth in Data Volume Creates Unexpected Issues

“Data tagging is super important. If we don't have consistency on our data tagging, it's going to be very hard as we move into zero-trust networks to make sure that the access to that data is consistent,” said Timothy Hess, technical lead for identity, credential and access management for the Defense Information Systems Agency.

“Is a tank a physical tank that the Army uses? Is a tank something that flies in the air to fuel airplanes, or is it a water tank for the Navy? The standards of data and tagging are going to be very important.”
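The ambiguity Hess describes can be sketched in miniature. In this hypothetical example (the tag names, documents and clearance levels are illustrative, not DISA's actual schema), consistent tags let a zero-trust policy check decide access without guessing which kind of "tank" a record describes:

```python
# Hypothetical tag store: each record carries a domain, an asset type
# and a classification tag applied at ingest. Consistent tagging is what
# lets the access check below work the same way across services.
TAGS = {
    "doc-001": {"domain": "army", "asset": "armored_vehicle", "classification": "secret"},
    "doc-002": {"domain": "navy", "asset": "water_tank", "classification": "unclassified"},
}

LEVELS = ["unclassified", "confidential", "secret", "top_secret"]

def can_access(user_clearance: str, doc_id: str) -> bool:
    """Grant access only if the user's clearance covers the document's tag."""
    doc_level = TAGS[doc_id]["classification"]
    return LEVELS.index(user_clearance) >= LEVELS.index(doc_level)

print(can_access("secret", "doc-001"))        # True
print(can_access("confidential", "doc-001"))  # False
```

If "tank" were tagged inconsistently across services, the same policy would grant different access to equivalent data, which is the problem Hess warns about.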

The data ecosystem may actually be overstuffed, suggested Terry Halvorsen, IBM vice president for federal client development. “Why? The data systems don’t integrate well. And we copy everything. How do you choose the right processes to eliminate some of the data or eliminate repetitive data? That gets harder because we’re expanding our databases.”

The cost of data also becomes an issue as the volume grows. “You’ll want to use things like deduplication, compaction and tiering of data to a lower-cost medium,” said Jim Cosby, field CTO for NetApp’s U.S. public sector team.

“If I can take 5 terabytes and shrink it down to 1TB and it still looks like 5 to all the apps and users,” he said, “I've saved 80 percent of my cost, my footprint and my time, and more than likely I'm going to speed up the execution of my mission.”
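Cosby's arithmetic checks out: shrinking 5 TB of physical storage to 1 TB, while applications still see the full logical 5 TB, saves 80 percent of the footprint. A quick sketch of the calculation:

```python
# Worked example of Cosby's figures: 5 TB logical data deduplicated
# and compacted down to 1 TB of physical storage.
logical_tb = 5
physical_tb = 1
savings = 1 - physical_tb / logical_tb
print(f"Storage saved: {savings:.0%}")  # Storage saved: 80%
```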

LEARN MORE: The U.S. Navy is working to establish fleetwide connectivity.

Securing and Sharing Data Remains Complicated

Once an agency — defense or civilian — gets that aspect of data under control, it still needs to figure out how to share it securely. Technology can get in the way, said Daniel Corbin, technical director and deputy commandant for information (C4) for the U.S. Marine Corps.

“You’ve got to have some common framework under which to share that data from a technical perspective,” he said. “We are generally doing that through means that are traditionally the way we’ve done it all along, which is: we buy a bunch of stuff and try to integrate it. There are many challenges in that model.”

Agencies that deal in sensitive and classified information have another issue: how to share data among networks at different security levels. “There is no cross-domain solution that allows that,” said Keegan Mills, engineering and cyber technology lead for the Marine Corps Systems Command.

“To me, that is a showstopper, a blocker that prevents us from even developing and fielding a technical solution. That’s something we need to solve, and we need to solve it fast,” he added.

READ ON: Learn how to build a flexible and accessible data platform.

Consider Sending Analyzed Data to the Edge

Rathbun believes that artificial intelligence and large language models can help with the data aggregation issues, if not the technological ones.

“But we have to rethink how we manage our information,” she said. “If I’m going to build models that can be consumed at the tactical edge that will need data from all those sources, I have to rethink how I house, compute and source data.”

Her “sacrilegious” (as she put it) point of view on data in general is that perhaps it doesn’t need to move to the edge to be useful at the speed of mission. “Maybe we train models and move our models to the tactical edge, consume new data and then move everything back to the mother ship,” she said.

“I also won't be doing my warfighters — the marines and sailors at the tactical edge — any favors because I required them to do fusion of data to make a decision. We have to think about getting to the point where we can maneuver data, combine data and hide the provenance of data if something is a sensitive source.”

To learn more about WEST 2024, visit our conference page. You can also follow us on X (formerly Twitter) at @FedTechMagazine to see behind-the-scenes moments.

Photography by Mike Carpenter & Jesse Karras