
Nov 09 2018
Cloud

Data Migration Process: How Agencies Can Successfully Move Data to Modern Systems

As agencies shift more data and applications from legacy systems to the cloud, here are some tips to keep in mind.

The Trump administration’s new “Cloud Smart” strategy encourages federal agencies to continue moving data to the cloud. The strategy also notes that data migration planning is an essential element of such moves.

“Cloud technology adoption requires that agencies prioritize migration planning, sustainment, and organizational maturity in order to realize the full benefit of these services,” the strategy states.

Yet for data migrations to be successful, agency IT teams need to rigorously plan out such migrations, consider their scope, examine the agency’s existing infrastructure and data, and determine how data will be structured once it’s moved into a more modern architecture. Such processes need to be deliberate and involve strong IT governance as well as technology, experts say.

“You can’t possibly pick up all of your infrastructure, applications and data and move it to the cloud overnight,” says Dave McClure, the principal director of Accenture Federal Services, who leads its CIO leadership agenda. “It’s a multiyear adventure, if you will.”


How Can Agencies Successfully Undertake Data Migrations?

There are numerous questions agency IT leaders should ask themselves and their staff when undertaking a data migration process. As Valeh Nazemoff, executive vice president and co-owner of Acolyst, a business technology performance management consulting firm, notes in FCW, these include asking why the data needs to be migrated, what it’s currently used for and which staff will be involved.

Additionally, Nazemoff says, agencies should consider the timing of a migration, where the data should go once it is moved and how it will be migrated.

McClure, who was formerly the associate administrator of citizen services and innovative technologies at the General Services Administration, agrees, and says most agencies need to consider the target for migration of legacy databases into the cloud and why that is necessary. “What is that going to get us?” he says. “Is it getting us more cost savings? Is it getting us better security because it’s more secure? Is it getting us better performance so that mission side is happier with the process tools and service delivery tools that are available in the cloud environment?”

Once agencies get a handle on the scope of their migrations and why they want to move data, they should collect critical infrastructure and application data. That will help them determine whether data is structured or unstructured and if it is embedded in legacy proprietary systems, McClure says.

“What is the data that I need to move and why? Just having those strategy discussions about that is pretty important,” he says.
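One way to start the inventory McClure describes is a simple classifier over data asset paths that flags what is structured versus unstructured. This is a minimal sketch; the extension lists and example paths are illustrative assumptions, not a catalog of any agency's actual data:

```python
from pathlib import PurePosixPath

# Illustrative extension sets; a real inventory would be far broader.
STRUCTURED = {".csv", ".parquet", ".json", ".xml", ".db"}
UNSTRUCTURED = {".pdf", ".docx", ".msg", ".png", ".txt"}

def classify(path: str) -> str:
    """Tag a data asset as structured, unstructured, or unknown by extension."""
    ext = PurePosixPath(path).suffix.lower()
    if ext in STRUCTURED:
        return "structured"
    if ext in UNSTRUCTURED:
        return "unstructured"
    return "unknown"  # e.g., data embedded in a proprietary legacy system

# Hypothetical legacy assets discovered during the inventory
inventory = ["/legacy/grants/awards.csv",
             "/legacy/scans/form-1023.pdf",
             "/legacy/app/data.dat"]
summary = {p: classify(p) for p in inventory}
```

Assets that land in the "unknown" bucket, typically data locked inside proprietary systems, are exactly the ones that need the strategy discussions McClure describes before any migration begins.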

Agencies should also undertake a deep analysis of the value stream, cost savings and performance improvement that come from moving specific applications to the cloud, he adds. That enables agencies to develop a roadmap of what is going to be moved, when, how and why.


The Types of Data Migration for Agencies to Consider

There are several different kinds of data migrations agencies can undertake. Here is a quick breakdown on a few of the major ones:

  • Cloud Migration: Agencies undertaking cloud migration usually do so because it grants them higher reliability and greater availability of their data, McClure notes. Legacy systems put agencies in danger, according to a recent Accenture survey of 185 federal IT executives; 58 percent of those surveyed say their agencies experienced two to three major disruptions or outages over the past decade, and just 4 percent avoided any discontinuities within that time frame. Agencies also move data to the cloud for greater security protections than exist with legacy systems, in part because cloud service providers continuously update security protections. Compute power is also relatively inexpensive, delivering cost savings, McClure notes. And cloud enables agencies to achieve compliance goals and increase transparency in reporting on their data.

    The biggest fears with cloud migration are data loss and a lack of interoperability among different cloud platforms. “How easily does that data reconcile and become transferrable back to another system? What guarantee do I have that it’s actually my data and that it’s not infiltrated with other data as well?” McClure says. Agencies also need to contend with potential cloud service disruptions and costs associated with new cloud services.

  • Database Migration: With database migrations, agencies often need to overcome the challenge of data stored in proprietary systems with embedded logic and rules around its use. “The movement of a database to a cloud environment may not be as seamless as one expects if you don’t understand how that data and those business rules and logic will behave when they move to an entirely different infrastructure and entirely different way of storing and computing data,” McClure says.

    Agencies should ensure that new databases have a track record of being able to perform just as well or better than what was in place before. In many cases, databases are not transferable to the cloud, McClure warns, meaning agencies need to figure out new database systems or cloud applications that can perform the same functions more efficiently. 

  • Application Migration: The “lift and shift” migration of applications to the cloud from traditional data centers is often difficult, McClure says, and agencies often need to re-architect apps in cloud environments. Accenture recommends to federal clients that they also consider moving nonproduction environments to the cloud because they tend to have less migration complexity and often account for a large chunk of an agency’s IT budget. Doing so gives agency IT teams experience with the cloud without having to move core operating systems or mission apps first.

    Agencies may need to renew old software, McClure says, and in some cases, software may no longer be supported. If that is the case, it may make more sense to look for a Software as a Service solution that can replace existing software.

What Is Data Migration Testing and Why Is It Important?

According to the Software Testing Help blog, data migration testing involves “a verification process of migration of the legacy system to the new system with minimal disruption/downtime, with data integrity and no loss of data, while ensuring that all the specified functional and non-functional aspects of the application are met post-migration.”

Essentially, agencies need to undertake a health assessment on whether apps and data are fit for the cloud or other new environments, McClure says. Agencies need to look at technical issues, as well as those related to migration costs, risks, business values, and complexity of the transfer and re-orchestrating of the data to work in cloud environments.

Such data migration testing is designed to give agencies a clear sense of the relative complexity or simplicity of moving data. McClure says agencies should not begin any data migration until they perform that analysis, which Accenture dubs a Fit for Cloud Assessment. 
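One common integrity check in such testing is comparing record counts and per-record checksums between the legacy and target systems. The sketch below assumes both systems expose records as comparable field/value pairs; the sample records are hypothetical:

```python
import hashlib

def record_checksum(record: dict) -> str:
    """Deterministic checksum of a record's sorted field/value pairs."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_migration(source_rows, target_rows):
    """Compare row counts and checksums, ignoring record order."""
    if len(source_rows) != len(target_rows):
        return False, f"row count mismatch: {len(source_rows)} vs {len(target_rows)}"
    src = sorted(record_checksum(r) for r in source_rows)
    tgt = sorted(record_checksum(r) for r in target_rows)
    mismatches = sum(1 for a, b in zip(src, tgt) if a != b)
    if mismatches:
        return False, f"{mismatches} record(s) differ after migration"
    return True, "counts and checksums match"

# Hypothetical records pulled from the legacy and cloud systems
legacy = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
cloud = [{"id": 2, "name": "Bob"}, {"id": 1, "name": "Alice"}]
ok, msg = verify_migration(legacy, cloud)
```

Checks like this catch silent data loss or corruption, one of the "data integrity and no loss of data" criteria the testing definition above calls out, but they do not replace functional testing of the migrated applications.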


Data Migration Tools Agencies Can Use

There is no shortage of tools for agencies to examine their infrastructure and data. Unfortunately, McClure says, agencies often do not have a complete picture of their infrastructure, apps and data. Once that inventory is done, data migration becomes more doable, according to McClure, and “the roadmap of what is moved when, where, why and how becomes easier to decipher.”

Agencies often have a lot of “dark” data that is unknown to IT teams and department-wide reporting mechanisms. To get a handle on their data, agencies can turn to extraction, transformation and loading (ETL) tools, which are often associated with data warehouses. Such tools are designed to extract, examine and clean data, and learn how it is being used.
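As a rough illustration of the extract-transform-load pattern those tools implement, the sketch below pulls records from a legacy export, cleans them, and loads them into a target database. The field names, values and in-memory stores are hypothetical stand-ins for real systems:

```python
import csv
import io
import sqlite3

# Extract: read raw records from a legacy export (here, an in-memory CSV).
raw = "agency,records,format\nGSA,1200,csv\nDOI,  850 ,CSV\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: strip whitespace, coerce types, normalize casing.
cleaned = [
    {"agency": r["agency"].strip(),
     "records": int(r["records"].strip()),
     "format": r["format"].strip().lower()}
    for r in rows
]

# Load: write the cleaned records into the target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (agency TEXT, records INTEGER, format TEXT)")
conn.executemany("INSERT INTO inventory VALUES (:agency, :records, :format)", cleaned)
total = conn.execute("SELECT SUM(records) FROM inventory").fetchone()[0]
```

Commercial ETL suites add connectors, scheduling and profiling on top of this basic pattern, but the extract, transform and load stages are the same.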

IBM, Microsoft, Oracle and SAP are among the key ETL vendors, McClure says. Other vendors, such as Informatica, Qlik and NetApp, offer similar tools.

As TechTarget notes, there are also several categories of file migration tools, including host-based file-level migration; host-based block-level migration; network-based file-level migration; network-based block-level migration; and array-based block-level migration.

No matter the tool, McClure says, agencies need to understand the data they want to move and why, and then do so in a way that minimizes risk and addresses the most common remediation issues: moving from one database scenario to another, or from one infrastructure or application set to another.

“It requires that careful analysis so that the risk factors are lowered and the chances of success are tremendously improved,” he says.
