
Dec 06 2024
Artificial Intelligence

Fighting the AI ‘Octopus’: Agencies Need a Methodology

Artificial intelligence touches every part of an agency, like it or not. CDW’s Mastering Operational AI Transformation program, otherwise known as MOAT, will help.

Between early 2023 and 2024, artificial intelligence evolved from a tool that assists with prompted tasks into an autonomous agent that handles complex tasks, with varied foundational models and tooling at its disposal.

Agents with different “personalities” can now work together as a swarm to solve problems. AI computing has scaled about 1,000 times in the past eight years, underscoring the breakneck velocity of change occurring in the space.

Contrary to what you may have heard, AI isn’t in a flattening hype cycle, and the next wave of foundational models and capabilities is expected to fuse with robotics for next-generation transformation.

Government needs to address this growth thoughtfully. Historically, it has adopted emerging technologies without adapting to or maintaining them, then paid industry to clean up the resulting mess later — at great cost. That can’t happen with a technology that will inevitably touch every part of an agency.

Fortunately, the Mastering Operational AI Transformation methodology exists to help agencies establish a unified approach to how they will use and secure the technology to provide value to employees and citizens.


The AI Octopus Touches Everything in an Agency

CDW developed MOAT in February 2023 in response to ChatGPT, built on GPT-3.5, hitting the market. It’s already used by Colorado’s Department of Motor Vehicles and in heavily regulated industries such as finance, manufacturing and healthcare.

The Department of Energy and the Intelligence Community are parts of government already inundated with AI that would benefit from MOAT most readily.

MOAT starts by educating IT leaders on where AI is headed, then helps them establish governance and policies and identify and align use cases through a steering committee or center of excellence. A COE is a cross-functional group of leaders from across the agency — security, policy, development and legal professionals — all coming together to manage the octopus of everything that AI is touching across the organization.

Leadership must align everyone in the agency with its AI vision to ensure no one intentionally or unwittingly sabotages it, then collectively determine how to achieve it.

MOAT’s reference architecture examines AI readiness holistically across five pillars: people, strategy, data, administration and solutions. The dimensions of each pillar are measured and rated as in place, somewhat in place or not in place, so the entire agency has visibility and can budget appropriately to address gaps.
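The pillar-and-rating scheme described above can be sketched in a few lines. This is purely illustrative — the pillar names come from the article, but the dimensions under each pillar and the scoring rule are assumptions, not CDW’s actual MOAT tooling:

```python
# Illustrative sketch: score AI readiness across the five MOAT pillars,
# rating each dimension as in place, somewhat in place or not in place,
# then surface the gaps an agency would budget against.
RATINGS = {"in place": 2, "somewhat in place": 1, "not in place": 0}

# Hypothetical assessment: the dimensions listed here are invented
# examples, not MOAT's real dimension catalog.
assessment = {
    "people": {"AI literacy training": "somewhat in place"},
    "strategy": {"approved use-case pipeline": "not in place"},
    "data": {"data classification": "in place"},
    "administration": {"acceptable-use policy": "not in place"},
    "solutions": {"model evaluation process": "somewhat in place"},
}

def readiness_gaps(assessment):
    """Return (pillar, dimension) pairs that are not fully in place."""
    return [
        (pillar, dim)
        for pillar, dims in assessment.items()
        for dim, rating in dims.items()
        if RATINGS[rating] < 2
    ]

for pillar, dim in readiness_gaps(assessment):
    print(f"Gap in {pillar}: {dim}")
```

Keeping the ratings coarse (three levels rather than a numeric score) mirrors the article’s point: the goal is agency-wide visibility into where gaps exist, not precision measurement.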

Whether an agency wants to use AI or not, the COE needs to establish policies for either locking the technology down — because employees are most certainly bringing it into the office via their phones — or deciding which AI models to use in an official manner.


Identifying and Aligning AI Use Cases

Employees should be encouraged to share all existing AI use cases they’re aware of, so the COE can address and align them using metadata to assess whether the projected lifecycle warrants adoption, and to determine funding and flag security concerns.
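The triage step above — collecting use cases with metadata, then weighing lifecycle, funding and security — might look like the following. Every field name and the adoption rule are hypothetical illustrations, not a schema from the MOAT program:

```python
# Illustrative sketch: a simple intake record a center of excellence
# might use to triage submitted AI use cases. All fields and the
# adoption threshold are assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass
class UseCase:
    name: str
    owner: str
    expected_lifecycle_months: int  # how long the solution should stay useful
    est_cost_usd: int
    security_flags: list = field(default_factory=list)

    def warrants_adoption(self, min_lifecycle_months=12):
        # Hypothetical rule: adopt only if the use case will outlive the
        # build effort and carries no unresolved security flags.
        return (self.expected_lifecycle_months >= min_lifecycle_months
                and not self.security_flags)

intake = [
    UseCase("FOIA request triage", "records office", 24, 50_000),
    UseCase("resume screening", "HR", 6, 20_000, ["PII exposure"]),
]
approved = [uc.name for uc in intake if uc.warrants_adoption()]
print(approved)
```

A short expected-lifecycle field matters for exactly the reason the next paragraph gives: a model that takes longer to build than it will stay relevant is a poor investment.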

Outstanding AI use cases don’t come only from an agency’s IT arm. In fact, IT teams tend to be drawn to fun, interesting projects at the expense of quick wins that reduce manual workflows, costs and time to completion. And remember: the longer it takes to build an AI model, the more likely that model is to be obsolete upon completion because the technology has already advanced.

AI won’t take jobs from employees who know how to use it, so agencies need to stress the importance of upskilling to workers — even if the workers have to take that retraining upon themselves.

UP NEXT: The best way to prepare for an AI-fueled cyberattack is to practice.

This article is part of FedTech’s CapITal blog series.

