
Feb 23 2026
Artificial Intelligence

How Federal Agencies Turn Data Readiness Into Real-World AI Results

Officials unlock artificial intelligence value by fixing governance, metadata and pipelines first — building trusted, shareable data foundations across missions.

When Taka Ariga was chief data and chief artificial intelligence officer at the Office of Personnel Management, he introduced agile governance, a model that allows AI projects to move “at the speed of innovation” rather than being stymied by federal bureaucracy.

He created a 10-person governance team representing every part of the organization to oversee data governance, data readiness and AI initiatives — because “if you have a team of 30, aligning calendars is impossible,” he says.

Instead of quarterly or monthly meetings where AI might get five minutes of discussion in a large group, the small team met every two weeks to guide decisions on data quality and reliability for AI projects. Because AI models are updated regularly, for example, agencies can’t spend six months evaluating each new development before making decisions, he says.

“The most important part is that we were empowered to make microdecisions without having a perfect set of information,” says Ariga, now a senior fellow at the Data Foundation. “The frequency of the meetings and the empowerment of that group allowed us to make sometimes imperfect decisions along the way as use cases progressed from feature development to proof of concept and pilots.”

While Silicon Valley embraces a “fail fast” mentality, government can’t fail on behalf of taxpayers. Instead, the OPM governance team focused on learning quickly, making decisions with imperfect information while knowing they could adjust course.

“We can make decisions, knowing nothing we do during development is ultimately catastrophic,” Ariga says. “We can always go back and change direction.”

Are Public Sector Agencies More AI Ready?

Historically, federal agencies have been seen as technology laggards, slowed by red tape and compliance mandates. But when it comes to AI, the public sector may actually be ahead of the curve.

Regulated industries, including the public sector, are better positioned to pursue AI initiatives than most businesses because regulations have forced them into better data management, says IDC Research Vice President Dave McCarthy.

Taka Ariga

McCarthy says his analysis is confirmed by storage vendor executives who have told him that, in contrast, many unregulated businesses aren’t ready for AI projects because they lack proper data hygiene and readiness. Federal agencies have been compelled to practice meticulous data management for decades through mandates like the Federal Records Act, Federal Information Security Modernization Act security requirements and the Federal Data Strategy.

“Some of the stuff that was considered a weight before is actually an accelerator for some of this new technology,” McCarthy says. But that doesn’t mean agencies have solved all their data challenges. When organizations implement AI, it starts to expose weaknesses in their data strategies, but they’re still ahead of organizations starting from scratch, he says.

Ann Dunkin, former Energy Department CIO and now a distinguished professor of the practice and distinguished external fellow at the Georgia Institute of Technology, says data readiness at agencies varies significantly. She agrees that regulated industries as well as regulated agencies are well positioned because “they know what data they have and where it’s located, so you can apply AI to it much more easily.”

But science agencies face vastly different challenges. NASA has decades of data in every format, including paper, while the Energy Department’s research records (as opposed to its regulatory data) are also difficult to access.

“When your data is everywhere, in all sorts of different formats, how do you possibly apply AI?” Dunkin says.  

These varying levels of readiness and challenges across the federal government highlight why agencies must pursue disciplined data management and modernized infrastructure, whether it’s in the cloud, on-premises or via a hybrid approach, so they can position themselves for successful AI implementations.

Making Data AI Ready

A recent study by Forrester shows that only 15% of organizations have achieved a positive bottom-line impact from their AI initiatives in the past 12 months — and the biggest reason is that their data is not ready, says Forrester analyst Indranil Bandyopadhyay.

“Your success with any kind of AI — predictive, generative or agentic — is wholly dependent upon the maturity of your data estate, and how you are able to make your data AI ready,” says Futurum Group analyst Brad Shimmin.

Ariga says there are four tenets to data governance: data inventory and availability, a data reliability assessment, interoperability, and data loss prevention.

Some chief data officers mistakenly think they need a comprehensive data inventory before pursuing AI initiatives, an approach that can paralyze agencies, says Oliver Wise, former chief data officer at the Department of Commerce and now executive director of the Bloomberg Center for Government Excellence at Johns Hopkins University.

Instead, effective data leaders should start by speaking with business and program leaders to identify pressing organizational problems that can be solved by AI, then focus on preparing the specific data needed to address those problems.

“We don’t serve ourselves well as chief data officers by saying all of our data has to be 100% ready before we entertain the idea of AI,” Wise says. “We try to solve the problem first, then we build the data inventory along the way.”

Best Practices To Manage Data

Analysts say cloud providers and vendors offer data governance tools to catalog data, create metadata, enforce policies, and track the lineage or flow of data to ensure security, privacy and compliance. Microsoft Purview, AWS’ Lake Formation, Amazon DataZone and Google Dataplex offer cloud-based data governance, Bandyopadhyay says.

“These tools now use AI to automate things like discovery, classifying sensitive information, even predicting security risks,” he says.
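The catalog, metadata and lineage-tracking capabilities described above can be sketched in miniature. The record shape, tag names and `requires_review` helper below are illustrative assumptions, not any vendor's actual API; the point is that once lineage is recorded as metadata, a sensitivity policy can propagate automatically from source datasets to everything derived from them:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """One hypothetical catalog entry: descriptive metadata plus lineage."""
    name: str
    owner: str
    tags: set[str] = field(default_factory=set)
    upstream: list[str] = field(default_factory=list)  # lineage: names of source datasets

# Illustrative policy list; a real tool would classify content automatically.
SENSITIVE_TAGS = {"pii", "phi", "cui"}

def requires_review(record: DatasetRecord, catalog: dict[str, DatasetRecord]) -> bool:
    """Flag a dataset if it, or anything upstream of it, carries a sensitive tag."""
    if record.tags & SENSITIVE_TAGS:
        return True
    return any(requires_review(catalog[u], catalog)
               for u in record.upstream if u in catalog)

# Hypothetical example: a derived statistics table inherits sensitivity
# from the raw HR data it was built from.
catalog = {
    "hr_raw": DatasetRecord("hr_raw", "OCHCO", tags={"pii"}),
    "hr_stats": DatasetRecord("hr_stats", "OCIO", upstream=["hr_raw"]),
}
print(requires_review(catalog["hr_stats"], catalog))  # → True
```

The derived `hr_stats` table has no sensitive tags of its own, but the lineage link back to `hr_raw` is enough to flag it, which is the kind of policy enforcement the cataloging tools automate at scale.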

Data observability tools are also critical, providing visibility into data quality, helping identify and resolve data issues quickly, and managing resource consumption, Shimmin says.

Companies are increasingly integrating these capabilities into unified platforms, creating more comprehensive data management solutions. Vendors in this space include Alteryx and Informatica, Shimmin says. Purview offers observability tools as well, Bandyopadhyay says.

“The market is rich with such tooling,” Shimmin says. “There are tons of options.”

A key best practice is to develop rich metadata that makes agency data searchable, accessible and machine interpretable for AI, Wise says.

For example, before Dunkin left the Energy Department, her team was developing an AI application to help scientists find research across all of the national labs. They were in the process of creating a metadata-driven “card catalog” that could index siloed research data, so researchers could find relevant work at other facilities and find opportunities for collaboration while still maintaining strict access controls.
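The "card catalog" idea can be sketched as a metadata-only index: the research data itself stays at each lab, while a shared index of descriptive records makes it searchable, with access controls applied at query time. The entry fields, example records and numeric clearance scheme below are hypothetical, not a description of the Energy Department's actual system:

```python
from dataclasses import dataclass

@dataclass
class CardCatalogEntry:
    """Hypothetical index record; only metadata is shared, not the data itself."""
    title: str
    lab: str
    keywords: set[str]
    access_level: int  # illustrative: higher number = more restricted

INDEX = [
    CardCatalogEntry("Fusion plasma diagnostics", "PPPL", {"fusion", "plasma"}, 1),
    CardCatalogEntry("Advanced reactor materials", "ORNL", {"materials", "reactor"}, 3),
    CardCatalogEntry("Plasma-facing materials", "SNL", {"plasma", "materials"}, 2),
]

def search(keyword: str, clearance: int) -> list[str]:
    """Return matching titles, filtered to what the requester is cleared to see."""
    return [e.title for e in INDEX
            if keyword in e.keywords and e.access_level <= clearance]

print(search("plasma", clearance=1))  # → ['Fusion plasma diagnostics']
```

A researcher with a higher clearance searching the same keyword would also see the restricted Sandia entry, which is how a single shared index can surface collaboration opportunities while strict access controls still hold.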

Photography by Stephen Voss