Predictive Analytics Helps Optimize the Federal Supply Chain
The Defense Logistics Agency uses a predictive analytics tool to forecast demand for materials such as weapons, gear, rations and other equipment. This helps ensure optimum readiness by predicting and preventing shortages rather than scrambling to fill gaps in the supply chain, says Jeanie Parrish, business analytics branch chief at the DLA Analytics Center of Excellence.
“We’re trying to support our warfighter in the readiness arena. So, the more that we can use these emerging technologies to get better predictions, to get faster information for the decision-makers to identify innovative solutions, the better it is,” says Parrish, who oversees DLA’s enterprise-level strategic operations research and analytic efforts.
The homegrown material-availability predictor model forecasts demand up to 90 days in advance; the agency recently brought on IBM to help optimize its algorithm and push that prediction window as far as 12 months.
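DLA's material-availability predictor itself is not public, but the underlying idea of short-horizon demand forecasting can be illustrated with a minimal sketch. The smoothing method, parameter values, and demand figures below are all illustrative assumptions, not the agency's actual model.

```python
# Minimal demand-forecasting sketch using simple exponential smoothing.
# DLA's actual algorithm is proprietary; all values here are illustrative.

def exp_smooth_forecast(history, alpha=0.3):
    """Return a one-step-ahead demand forecast from a demand history.

    alpha controls how heavily recent observations are weighted.
    """
    level = history[0]
    for demand in history[1:]:
        level = alpha * demand + (1 - alpha) * level
    return level

# Hypothetical 90-day history of daily requisitions for one item:
daily_demand = [12, 15, 11, 14, 13, 16, 12, 18, 14, 15] * 9  # 90 values
forecast = exp_smooth_forecast(daily_demand)
print(round(forecast, 1))
```

Extending the prediction window from 90 days to 12 months, as the article describes, typically means moving from simple smoothing like this to models that capture seasonality and lead-time variability.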
“We’re pulling contractors that have the data science skills to apply those types of applications to determine if we can get better predictions about lead time, estimation and demand forecasting,” Parrish says.
For example, the DLA analytics team uses quality and procurement data to assess the “health” of any given item.
“There are a lot of different item variables that come into play when you’re trying to find root causes that would put an item on back order,” Parrish explains. A Qlik visualization dashboard, with data that’s refreshed daily, helps DLA analysts see trends, conduct research and perform root cause analyses — across the supply chain or down to the item level.
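The article doesn't enumerate the item variables DLA tracks, but the kind of item-level root-cause check a dashboard like this surfaces can be sketched. The field names and thresholds below are hypothetical stand-ins, not DLA's actual data model.

```python
# Hypothetical item-level "health" check of the kind a back-order
# root-cause dashboard might surface; field names are illustrative.

def backorder_risk_factors(item):
    """Return a list of the risk factors present for one supply item."""
    factors = []
    if item["on_hand"] < item["reorder_point"]:
        factors.append("stock below reorder point")
    if item["actual_lead_days"] > item["quoted_lead_days"]:
        factors.append("supplier lead time slipping")
    if item["quality_rejects"] > 0:
        factors.append("recent quality rejections")
    return factors

item = {"on_hand": 40, "reorder_point": 100,
        "quoted_lead_days": 30, "actual_lead_days": 45,
        "quality_rejects": 2}
print(backorder_risk_factors(item))  # all three factors fire
```

A real pipeline would score many such variables statistically rather than with fixed rules, but the shape of the analysis, per-item variables rolled up into risk flags, is the same.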
“It’s a very robust tool. But even being as robust as it is, we want to make sure it’s accurate and that the algorithms we’ve chosen are the right ones. So that’s why we have a contractor coming in,” Parrish says.
“If we can get a 12-month prediction that’s more accurate than our 90-day prediction, that gives our analysts the ability and more time to prevent the back order from happening.”
Automation and Analytics Speed Up Workflows
AI and related technologies have the capability to analyze the vast amounts of data the federal government gathers. Automation and predictive data management further speed the work. That, in turn, allows employees to spend their time doing what they do best, rather than poring over raw numbers outside their expertise.
“We want to use AI to make human decisions in a singular high-volume function, such as going over very large amounts of data,” says Chakib Chraibi, chief data scientist and acting associate director of the Office of Data Services at the National Technical Information Service.
“You want to filter that data and make sure that when humans are involved, they are really involved in what their expertise is,” he says.
For example, the Department of Energy and the National Nuclear Security Administration, responsible for maintaining the safety and security of the nuclear weapons stockpile, require highly skilled specialists.
“Predictive analytics optimizes logistics to effectively use those highly skilled resources,” Chraibi says. “You’re going to be more proactive than reactive, to predict if there’s going to be a failure somewhere.
“You optimize the use of your resources, and you address problems before they become crises.”
Effective use of specialists’ time is also a priority at DLA. Analysts “can do all kinds of things with mathematics, and we can do all kinds of things with modeling and simulation,” Parrish says. “Our people know the business, but they don’t know the data interpretation.
“So our biggest obstacle is trying to get people trained to know if the algorithm that we use is the right one that would give us more accurate predictions. That’s why we’re bringing in contractors.”
Saving Taxpayer Dollars with Data Analytics
NTIS also works with the Department of Health and Human Services and HHS’ Office of the Inspector General to fight healthcare-related fraud, waste and abuse.
“They want to make sure that every dollar spent is properly spent in an approved transaction, and they also were interested in supporting better service outcomes at lower cost,” Chraibi says. “We used artificial intelligence and advanced data analytics to help them identify suspicious transactions by detecting anomalies and unusual patterns, which can result in identifying overpayments, improper payments or fraudulent schemes.”
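The anomaly-detection approach Chraibi describes can be sketched in miniature. The NTIS/HHS pipeline is far more sophisticated; this is only a toy illustration using z-scores over claim amounts, with invented figures, to show how an unusual transaction surfaces from a batch.

```python
# Toy anomaly-detection sketch: flag claim amounts that sit far from
# the batch mean. Real fraud detection uses many features and richer
# models; the claims below are invented for illustration.
import statistics

def flag_anomalies(amounts, threshold=2.0):
    """Return indices of amounts more than `threshold` standard
    deviations from the mean of the batch."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [i for i, a in enumerate(amounts)
            if abs(a - mean) / stdev > threshold]

claims = [120.0, 98.5, 110.0, 105.0, 130.0, 99.0, 15000.0, 115.0]
print(flag_anomalies(claims))  # → [6]  (the $15,000 claim)
```

Flagging is only the first step: as Chraibi notes, the point is to route flagged transactions to human experts, who decide whether an anomaly is an overpayment, an improper payment, or fraud.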
In 2018, improper payments accounted for 5 percent of Medicare’s net costs of $616.8 billion, according to the Centers for Medicare and Medicaid Services. Since 2011, the agency has used a predictive analytics tool that flags suspicious claims.
The tool analyzes about 4.5 million Medicare prepaid claims each day. By 2016, the agency had saved some $1.5 billion. In 2019, the program contributed to the first-ever national ROI of $11.60 for every dollar the federal government spends on CMS’ integrity program.