
Jun 16 2025
Artificial Intelligence

Federal AI Optimization Starts With Strategic Partnerships

Industry has the manpower and connections to get resource-strapped agencies’ models off the ground.

Agencies need to recognize that many of the tasks they’re trying to accomplish with artificial intelligence have already been done by industry, and they should tap into that experience.

Most pilots attempting to operationalize AI at scale fail due to automation, performance or integration limitations, but third-party partners such as CDW Government allow agencies to build prototypes with automation capabilities they may not have natively.

Many agencies experienced funding cuts and layoffs during the first four months of the Trump administration and will need to lean on industry partners with the manpower to integrate workflows and the corporate connections to enable AI.

With its massive commercial arm, CDW Government can show agencies what worked and what didn’t work for industry and help them move twice as fast with that business intelligence.


Understanding AI and Its Security Needs

The Air Force recently announced its AI Center of Excellence, which will leverage the expertise of Stanford University’s School of Engineering. Expect this to become a trend, along with agencies working together to accelerate AI innovation, especially after reductions in their developer ranks.

That said, some agencies still need to overcome hurdles before they can embrace the technology.

We’re still seeing misunderstandings in the marketplace about what AI is. For instance, does a camera that can identify people use data analytics or AI? The reality is that AI is simply the management and manipulation of data with models, and agencies need to establish a foundational capability.

Another hurdle: Some agencies still need to establish an AI policy or security parameters. New AI capabilities are coming to market constantly, which creates new security risks that agencies need to account for.

It’s important that agencies don’t cut corners when trying to deploy affordable AI, because those shortcuts will inevitably require more regulation and security controls down the line. Shadow AI, which arises when employees use the technology without agency approval, is a risk agencies should avoid at all costs.

Agencies Need Help Establishing a Foundational AI Capability

Agencies looking for AI best practices should collaborate with industry and universities, and the Air Force is a shining example. Despite already having the Air Force Research Lab and Air Force Academy, the branch forged strategic partnerships with Stanford, the Massachusetts Institute of Technology, Microsoft and other tech companies to get its AI CoE off the ground.

U.S. Special Operations Command has also begun using AI for cognitive mission planning and streamlining command and control, and the Department of Defense employs machine learning to analyze drones and related capabilities through Project Maven.

For the first time, we’re seeing hyperscalers such as Microsoft and Google work together, and with startup Scale AI, to develop the Pentagon’s Thunderforge capability. Microsoft has also developed a next-generation quantum chip, Majorana 1, which, coupled with Google’s AI capabilities, should lead to tremendous innovation.

Whether it’s the realm of public health, public works, law enforcement or homeland security, we’re seeing requests for generative AI alternatives to ChatGPT that government can use. The demand is there; smart partnerships are key.

This article is part of FedTech’s CapITal blog series.
