For instance, stackArmor works with agencies to map AI security standards to Federal Risk and Authorization Management Program (FedRAMP) and National Institute of Standards and Technology (NIST) requirements, speeding the process by which companies receive authority to operate. That way, agencies have more AI solutions to choose from.
Industry Partnerships and Federal Funding Are Needed
Under the executive order, the Department of Homeland Security is expected to develop an AI risk management framework.
“You have to understand not just the AI that you build but the AI that you buy that’s embedded in software,” Kent says. “What’s missing is the tactical framework of how all those things exactly knit together.”
Agencies have AI pilots well in hand, but they need industry's help establishing the structure, controls and processes to maintain software as they scale, she adds.
Sector-specific standards also need to be addressed: the entertainment sector undoubtedly requires different AI security standards than the health and national security sectors do, the former federal CIO says.
Kent was pleased to see the executive order task agency deputy secretaries with AI oversight and propose steps to close the AI talent gap in government. The National Science Foundation is required to establish at least four new National AI Research Institutes, and agencies will need to reinvest in workforce training as the technology changes employees’ roles and responsibilities, Kent says.
For example, the Department of Veterans Affairs is using AI to turn doctors' spoken words into notes in patient folders, but that won't eliminate the need for doctors; it will simply free them up for different tasks.
All of this will require a boost to agencies’ IT budgets.
“Make it a priority,” Kent says. “And fund it.”