Jun 10 2020

What Agencies Should Consider Before Deploying AI

Federal agencies should be methodical in their approach to adopting artificial intelligence.

Artificial intelligence is the major buzzword in federal IT these days, the way that cloud once was. It’s easy to see why. There is booming investment in AI in the private sector, and various agencies across the government are experimenting with AI to achieve their missions.

The National Oceanic and Atmospheric Administration is working with Microsoft to use AI and cloud technology to more easily and accurately identify animals and population counts of endangered species. NASA is ramping up the use of AI throughout its operations, from conducting basic financial operations to finding extra radio frequencies aboard the International Space Station. And the Defense Health Agency’s dermatologists are even using AI to better monitor patients’ skin.

AI has also come to the fore in response to the coronavirus pandemic. As experts from Booz Allen Hamilton wrote in FCW:

Governments and organizations have employed AI to track outbreaks and predict where future outbreaks will occur; research and develop treatments; diagnose the disease from CT lung scans; model the protein structures of the SARS-CoV-2 virus that causes COVID-19, which may reveal clues for a vaccine; and better understand the origins and potential future variations of the virus.

Despite the clear utility of AI, agency IT leaders should learn the lessons of the past decade's rush to the cloud. Some agencies hastily moved as many applications as they could to the cloud, only to realize later that their costs went up and that they did not need to put as much data into public clouds as they had.

IT leaders should be methodical and deliberate in how and when they deploy AI. They should work with partners to ensure that the technology will help meet and support mission needs and that they have the appropriate technological foundation to support such AI solutions. 

How to Plan for an AI Deployment

Some agencies are pushing ahead on AI deployments. For example, the U.S. Patent and Trademark Office recently inked a $50 million contract to use machine learning, natural language processing and AI technologies to improve intellectual property registration.

The road to issuing an AI contract can be long, but it should be followed deliberately to get the most value out of the investment. At the outset, IT leaders need to clearly identify their objectives and think through whether and how AI technologies can help.

They can work with trusted partners such as CDW to whiteboard those ideas and then get connected to industry leaders such as Microsoft, Google and IBM. Those discussions can lead to a pilot program and, potentially, a larger AI deployment if the trial run proves successful.

AI is so buzzy and exciting that IT leaders may be tempted to invest in such solutions without thinking through what their intended goals are, or whether AI is truly necessary. In some cases, it will not be necessary, or will be necessary for only a few applications or uses. That is why it is critical to talk through objectives with knowledgeable partners to develop a truly strategic plan.


What Are the Enabling Technologies for AI?

Once it has been determined that AI is the appropriate solution, it is important for agencies to invest in the necessary back-end technologies to support such deployments. 

Some of those solutions are supercomputers, such as the NVIDIA DGX A100, which the Energy Department’s Argonne National Laboratory is using to “help researchers explore treatments and vaccines and study the spread of the virus, enabling scientists to do years’ worth of AI-accelerated work in months or days,” Rick Stevens, an associate lab director at Argonne, says in a statement.

However, a wide array of supporting and interlocking technologies is needed to make AI a reality and to run AI software effectively. They include "data collection, data conditioning, algorithms, computing, robust artificial intelligence, and human-machine teaming," notes a research paper posted to Cornell University's arXiv repository.
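To make those stages concrete, here is a minimal sketch in Python (using pandas and scikit-learn) of how collected data might be conditioned before an algorithm is trained on it. The file name, column names and model choice are hypothetical stand-ins for whatever an agency's actual data and use case require.

# Minimal sketch: data collection -> data conditioning -> algorithm.
# The input file, column names and model below are hypothetical examples.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Data collection: load raw records gathered from an agency system.
raw = pd.read_csv("collected_records.csv")  # hypothetical input file

# Data conditioning: drop incomplete rows and scale numeric features
# so the algorithm receives consistent, well-formed inputs.
clean = raw.dropna(subset=["feature_a", "feature_b", "label"])
features = StandardScaler().fit_transform(clean[["feature_a", "feature_b"]])
labels = clean["label"]

# Algorithms and computing: train and evaluate a simple baseline model.
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)
model = LogisticRegression().fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")

Even a small baseline like this makes clear why the surrounding technologies matter: without reliable collection and conditioning, no amount of computing power will make the algorithm useful.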

The hardware needed to support such solutions doesn't have to be a supercomputer. It could be a single Dell EMC PowerEdge R740xd master/login node and 16 Dell EMC PowerEdge C6420 dense compute servers in four C6000 chassis. Those nodes are then linked via a 1 Gigabit Ethernet connection for access to the outside network and the internet, and a 10 Gigabit Ethernet connection for internal traffic and data movement. Other solutions include IBM Spectrum Storage, IBM Spectrum Computing and IBM enterprise AI servers.

IT leaders should focus on what their goals are and what the correct configuration of technology solutions is to perform the work needed to achieve those goals. 

Such deployments can be expensive, so it is wise for IT leaders to understand their requirements and objectives before diving into AI.

AI is here to stay. But federal agencies need to approach the technology in a disciplined and logical manner.

This article is part of FedTech’s CapITal blog series. Please join the discussion on Twitter by using the #FedIT hashtag.
