Feb 01 2022

The Evolution of Edge Computing and Data Analytics in Federal IT

Government agencies are exploring analyzing information in situ, which can increase efficiency and yield other benefits.

Instead of transporting large amounts of information into data lakes to be analyzed, several federal agencies are now processing data as and where it’s produced, via edge computing.

Data from devices or sensors can be processed by software in real time in an edge-based architecture, allowing agencies to access and act on findings immediately for faster decision-making.

“If an organization has thousands of live camera feeds, and each camera has a computer vision model that sends an alert when it recognizes an object or activity, we no longer need to send the livestream, the data-intensive video, back to a centralized processing hub,” says Kyle Michl, chief innovation officer at Accenture Federal Services. “We can take action immediately or send relevant clips to the cloud for AI model training.”
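As a rough illustration of the pattern Michl describes, an edge device might run detection locally and transmit only alerts and short clips rather than the full stream. This is a minimal sketch, not any agency’s actual system: the camera source, model and alert endpoint are hypothetical stand-ins.

```python
# Minimal sketch of edge-side video filtering: score each frame on-device,
# send only high-confidence alerts (and the relevant clip) upstream.
# read_frame, detect_objects and send_alert are hypothetical stand-ins.
import time

CONFIDENCE_THRESHOLD = 0.8  # report detections at or above this score


def read_frame(camera_id: str) -> bytes:
    """Stand-in for grabbing one frame from a local camera feed."""
    return b"raw-frame-bytes"


def detect_objects(frame: bytes) -> list[dict]:
    """Stand-in for an on-device computer vision model."""
    return [{"label": "vehicle", "score": 0.93}]


def send_alert(camera_id: str, detection: dict, clip: bytes) -> None:
    """Stand-in for sending a small alert plus clip to the cloud."""
    print(f"{camera_id}: {detection['label']} ({detection['score']:.2f})")


def monitor(camera_id: str, max_frames: int = 3, interval_s: float = 0.1) -> None:
    for _ in range(max_frames):
        frame = read_frame(camera_id)
        for det in detect_objects(frame):
            if det["score"] >= CONFIDENCE_THRESHOLD:
                # Only the relevant clip leaves the device, not the livestream.
                send_alert(camera_id, det, clip=frame)
        time.sleep(interval_s)


monitor("lobby-cam-3")
```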

Many artificial intelligence-enabled solutions can support an edge computing system. Dell EMC PowerEdge servers powered by Intel Xeon processors, integrated VxRail hyperconverged infrastructure solutions and Dell Edge Gateways, for example, allow agencies to collect, secure and analyze data from multiple devices and sensors.

Networking solutions, such as Dell EMC’s Virtual Edge Platform and SD-WAN Solution, powered by VMware, can help integrate edge devices into an agency’s existing network.

“The edge is quickly rivaling data centers and public clouds as the location where organizations are gaining valuable insights,” says Jeff Boudreau, president and general manager of Dell Technologies’ infrastructure solutions group. “By putting compute, storage and analytics where data is created, we can deliver those data insights in real time and create new opportunities.”

DISCOVER: Find out how solutions from Dell can help your agency with edge computing.

Having compute, data storage and connectivity resources closer to where information is being gathered reduces latency and can offer other benefits, according to Bill Wright, senior director of North American government affairs for Splunk.

“The more exciting part is bringing advanced analytics — machine learning, artificial intelligence — to the network edge,” Wright says. “This will enable smarter systems that can operate more autonomously while also parsing data and sharing it upstream, which will reduce bandwidth and storage needs. That’s becoming increasingly important because of the sheer quantity of data.”
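A minimal sketch of that “parse locally, share upstream” idea, assuming a simple aggregation step: the edge node reduces a raw window of readings to a handful of summary values before anything crosses the network. The sensor name and upstream call below are illustrative assumptions.

```python
# Summarize raw sensor readings at the edge and forward only the aggregate,
# trading a large raw stream for a few numbers. share_upstream is a stand-in
# for whatever transport a real deployment would use.
import statistics


def summarize(readings: list[float]) -> dict:
    """Reduce a window of raw readings to a small, shippable summary."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "min": min(readings),
        "max": max(readings),
    }


def share_upstream(sensor_id: str, summary: dict) -> None:
    """Stand-in for sending the summary to a central system."""
    print(f"{sensor_id}: {summary}")


raw_window = [21.4, 21.6, 22.1, 35.9, 21.5]  # e.g., one minute of readings
share_upstream("sensor-07", summarize(raw_window))
```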

How Performing Analytics at the Edge Aids Agencies

A suite of edge computing solutions has enabled agencies to “take the power of edge computing and tactical cloud into areas where you haven’t been able to take them before, so that you can produce great results from a mission perspective,” Cameron Chehreh, vice president and CTO of Dell Technologies Federal Systems, told FedTech. “And then, when you can connect back to networks, you can harness further the data you have collected and the information you have analyzed already at the tactical edge, but now fuse it with your enterprise data to create a more holistic picture.”

Nine out of 10 federal leaders view edge solutions as a very or extremely important element in meeting their agencies’ mission-related needs, according to Accenture Federal Services research.

The U.S. Postal Service is one agency that’s adopted an edge-based analytics system. By using technology company NVIDIA’s EGX platform to analyze billions of images from its processing centers, the agency can now find lost packages in hours instead of days.

The Agriculture Department’s FarmBeats project helps farms obtain an overview of current conditions using AI models and data fed from sensors, drones and other sources to an edge device and Microsoft cloud solution via the unused broadcasting frequencies between television channels.

NASA is also trying out an edge computing approach to process some of its International Space Station data — a change from the agency’s previous method of obtaining, storing and transmitting or bringing data back from missions to be processed, according to Bryan Dansberry, associate program scientist for the ISS Research Integration Office.

“We can do the analysis there, or we can do preliminary work and then use it as a secondary tool, being able to get the data to the ground and analyze it on the ground,” Dansberry says.

One of NASA’s first attempts at edge-based analytics involved data from DNA sequencing performed in space, according to Dansberry. That sequencing data could take a significant amount of time to process, even after crew members began sequencing samples aboard the station instead of collecting, freezing and physically transporting them to Earth.

RELATED: How can agencies optimize for data ingestion at the edge?

“When we didn’t have the edge computing in place, you still have to find a time where you can schedule a download of that large chunk of data,” he says. “That could be a week or two, and then the research may also slow things down another week or two at whatever facilities are being used. So, you were still looking at weeks or months; that’s now down to minutes.”

NASA has also used edge data processing capabilities and proprietary AI software to let two ISS instruments communicate directly, rather than relying on astrophysics facilities that work different hours to get in touch when a new X-ray object suddenly erupts in the sky.

The MAXI sky survey instrument, operated by one team, can alert the Neutron Star Interior Composition Explorer (NICER) instrument, operated by the other facility, to focus on the area and take more detailed measurements.

“You’ve taken the human out of the loop, which was a delaying factor,” Dansberry says. “When one team is awake, the other’s asleep; sometimes, you’d send a message that wouldn’t be received until the next day. If you have a new stellar event taking place, missing six, seven or eight hours before the next instrument picks it up and turns to it can be critical to getting really interesting data for the scientists.”
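The handoff Dansberry describes could be sketched as a simple event queue that takes the human relay out of the loop. NASA’s actual software is proprietary, so every name and structure here is an illustrative assumption.

```python
# Hypothetical sketch of automated instrument-to-instrument alerting: a
# detection from the survey instrument immediately queues a pointing request
# for the follow-up instrument, with no human message-passing in between.
from dataclasses import dataclass
from queue import Queue


@dataclass
class Transient:
    ra_deg: float      # right ascension of the new X-ray source
    dec_deg: float     # declination
    brightness: float  # rough flux estimate from the survey pass


pointing_requests: Queue = Queue()


def on_survey_detection(event: Transient) -> None:
    """Called by the survey pipeline the moment a transient is flagged."""
    pointing_requests.put(event)


def followup_scheduler() -> None:
    """Drain queued targets and point the second instrument at each one."""
    while not pointing_requests.empty():
        target = pointing_requests.get()
        print(f"slewing to RA={target.ra_deg}, Dec={target.dec_deg}")


on_survey_detection(Transient(ra_deg=83.6, dec_deg=22.0, brightness=4.2))
followup_scheduler()
```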

EXPLORE: How are federal agencies leveraging edge computing in their missions?

Edge-Based Analytics Is Increasing as Tech Advances

Just a few years ago, only about 10 percent of enterprise-generated data was created and processed outside a traditional data center or cloud; Gartner has predicted that share will reach 75 percent by 2025.

The growing interest in edge processing has been fueled in part by the introduction of more powerful Intel and other processors and AI-optimized chipsets, which have made running advanced analytics at the point of use feasible, according to Accenture’s Michl.

“We are seeing an increase in federal agencies capitalizing on advancements in edge computing technology,” Michl says. “Organizations need to determine what data makes sense and is feasible to move and process in the cloud versus locally at the edge. As technology capabilities continue to advance and data volumes increase, we will see processing across the cloud continuum, with more processing at the edge.”

Edge computing, particularly as 5G connectivity enables even more Internet of Things devices, could have numerous applications in government settings, according to Wright, ranging from running analytics in a remote area during a war to helping federal energy officials perform inspections on oil rigs.

“For the public sector, edge computing has a wide range of use cases,” he says. “That edge is a refugee camp, a battlefield, a secure facility. The government in particular is best positioned to take advantage of edge computing, given the wide array of missions agencies are assigned.”
