
Aug 28 2024
Data Analytics

What Is ModelOps, and How Can It Help Agencies Innovate AI?

Its capabilities give agencies the tools they need to clean data and manage the machine learning models that power AI services.

Greater artificial intelligence disruption, and greater opportunity, appears to be on the horizon as agencies look to integrate the technology more deeply into their research efforts.

To make the most of their AI investments, agencies need tools for managing machine learning models, governing and cleaning the data feeding into them, and adjusting them when new data becomes available.

Agencies such as the Department of Agriculture already use AI to strengthen crop estimates, process geospatial data, model potential disease outbreaks and more. Managing the models behind these advanced efforts falls under the rubric of model operationalization, or ModelOps, a concept developed within the past decade. ModelOps involves the use of tools, technologies and processes to manage the lifecycle of machine learning models.

As federal IT leaders incorporate AI more deliberately into agency missions, ModelOps will become an essential approach.


What Is ModelOps, and How Does It Work?

ModelOps is an umbrella term that includes tools that allow organizations to derive greater value from their AI models, says Terry Halvorsen, vice president of federal client development at IBM. That can include DevOps, DataOps, ITOps and MLOps.  

Importantly, ModelOps also involves tools related to data management and data cleaning. Ideally, those tools will leverage automation, Halvorsen says, “because one of the big problems with all of this — and implementing enterprise AI and cleaning up your data — is that there aren’t enough skilled people” to do the work.

“So, how do I use these techniques to automate things and, say, reduce my requirement for data scientists?” Halvorsen asks. “I could do that with this set of tools and processes.”
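The automated data cleaning Halvorsen describes can be sketched in a few lines. This is an illustration only; the record fields, values and rules below are hypothetical, not drawn from any agency system.

```python
# Hypothetical records digitized from paper forms; field names are illustrative.
records = [
    {"id": "A1", "crop": "corn", "acres": "120"},
    {"id": "A1", "crop": "corn", "acres": "120"},   # duplicate scan
    {"id": "B7", "crop": "", "acres": "-5"},        # missing field, bad value
    {"id": "C3", "crop": "wheat", "acres": "88"},
]

def clean(rows):
    """Automated first-pass cleaning: drop duplicates and invalid rows."""
    seen, cleaned = set(), []
    for row in rows:
        if row["id"] in seen:
            continue  # duplicate record
        if not row["crop"]:
            continue  # required field missing
        try:
            acres = float(row["acres"])
        except ValueError:
            continue  # non-numeric measurement
        if acres <= 0:
            continue  # physically impossible value
        seen.add(row["id"])
        cleaned.append({**row, "acres": acres})
    return cleaned

print(clean(records))  # only the two valid rows survive
```

Encoding rules like these once, rather than reviewing records by hand, is the kind of automation that reduces the demand for scarce data science staff.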

ModelOps is focused on the governance of the full lifecycle of ML models and ensuring that the models are updated when they start to become less stable and lose their predictive value, says Jenn Atlas, director of global presales at Minitab, a data analytics firm that offers ModelOps services.

“From a big-picture standpoint, its job is to make sure that the model is good, holding its own and alerting the data scientists and other people who are using that model [to issues],” Atlas says.

ModelOps can also be used to swap in new models when an agency’s main model needs fine-tuning or replacement. The capability encompasses safety and ensuring that models are not using biased data that will lead to biased outcomes, Atlas says.
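The monitoring and model-swapping pattern Atlas describes can be sketched as a drift check: compare a model's recent error rate to the baseline measured at deployment and alert when it drifts too far. The function name, threshold and figures below are hypothetical, for illustration only.

```python
import statistics

def should_swap(recent_errors, baseline_error, tolerance=0.10):
    """Flag a model for replacement when its recent error rate drifts
    more than `tolerance` above the baseline measured at deployment."""
    return statistics.mean(recent_errors) > baseline_error * (1 + tolerance)

# A "champion" model whose error has crept up as new data arrived.
champion_baseline = 0.05
champion_recent = [0.06, 0.07, 0.08]

if should_swap(champion_recent, champion_baseline):
    print("alert: route traffic to a challenger model for re-evaluation")
```

In practice, a ModelOps platform would run a check like this continuously and alert the data scientists responsible for the model, as Atlas notes above.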


ModelOps vs. MLOps: What’s the Difference?

Seen as a subset of ModelOps, MLOps is a set of tools focused on helping data scientists and their collaborators communicate as they automate or adjust ML models, Atlas says. It is concerned with testing ML models and ensuring that the algorithms are producing accurate results.

MLOps is also more narrowly focused on factors such as the costs of data engineering and model training, as well as on checking the data feeding into the ML model.

“It’s the part that makes it worthwhile or not if ModelOps can be successful,” Atlas says. 

Federal ModelOps Use Cases 

There are many use cases for ModelOps in government, AI experts say. 

ModelOps is particularly helpful in the federal context for data management and improving data quality, Halvorsen says. That’s in large part because agencies house reams of legacy data, much of it on paper, that they are digitizing.

That data likely includes many errors that were created in the past, and these could impact models that are trained on the data. Errors could be discovered through DataOps and ITOps tools, Halvorsen says.

Many organizations, including agencies, use ML models to analyze drone footage and other surveillance imagery to detect changes from previous observations, Atlas says. Automating that through ModelOps could be useful to agencies including USDA, the Army Corps of Engineers and others that perform observations in the field and analyze data.

DISCOVER: Data poisoning is evolving alongside AI.

ModelOps also helps agencies check whether the data they are collecting and using for models is current enough for the desired application. “If I’m targeting, it better be current data and not something based on a geographic survey from three years ago,” says Halvorsen, who is a former Department of Defense CIO.

ModelOps can determine data viability as well, Halvorsen says. Data viability refers to the shelf life of data, or how long it can be stored and still be useful. 
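The currency and viability checks Halvorsen describes amount to comparing a dataset's collection date against a shelf life set by the mission. A minimal sketch, with shelf-life values that are purely illustrative:

```python
from datetime import date, timedelta

# Illustrative shelf lives; real values depend on the mission and data type.
SHELF_LIFE = {
    "satellite_imagery": timedelta(days=30),
    "geographic_survey": timedelta(days=365),
}

def is_viable(kind, collected_on, today=None):
    """Return True if the data is still within its shelf life."""
    today = today or date.today()
    return (today - collected_on) <= SHELF_LIFE[kind]

# A three-year-old geographic survey fails the currency check.
print(is_viable("geographic_survey", date(2021, 8, 1), today=date(2024, 8, 28)))
# prints False
```

A ModelOps pipeline could run a check like this before a dataset is fed into training, flagging stale data rather than letting it silently degrade a model.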

Additionally, ModelOps tools can account for whether the volume of data being used for a model will limit the impact of any errors within it, Halvorsen says. The opposite may be true if an agency uses a smaller but more accurate data set to train a model. 

How Can Agencies Realize the Benefits of ModelOps?

ModelOps is not particularly difficult to implement, but it often fails when IT leaders don’t invest enough in cleaning their data.

“People don’t have a good understanding of their data, and they frankly don’t want to pay to restructure and in some cases rearchitect the data to make it more valuable for use in an AI development,” Halvorsen says.

To get the benefits of ModelOps, there must be strong partnerships and communication among data scientists, engineers, IT security teams and other technologists, Atlas says.

MORE FROM FEDTECH: AIOps helps administrators gain an edge over network disruptions.

“The tricky part of ModelOps is that you are usually crossing a couple of different departments that have competing priorities, but it behooves them to work together to get that benefit,” she says.

It helps to have a leader such as a chief data officer to bridge those gaps and make sure the teams are all working together, Atlas says. 

The title of the convening figure — whether it’s a CIO, CDO or chief AI officer — matters less than it being someone who “has the money and the authority to follow the fixes” and implement needed changes in the data or models, Halvorsen says. Given the current state of budgeting, that will probably continue to be CIOs, he says.

With that leadership in place, the agency should track how much of its data it is actually using and, if large amounts are lying fallow when it comes to training models, make sure it gets the most out of that data.

“You have to increase the value you’re getting from your data and how much data you’re using, but you also have to make sure that data is high quality,” Halvorsen says. “Sometimes this is hard because people want results faster, particularly in government where it can be less about money and more about showing results.”

“You have to resist the urge to say, ‘OK, well, I’ll just throw the AI at my current data,’” he says. “That will give you crappy results.”

UP NEXT: The State Department is improving the state of federal AI.
