By: Eric Plasse and James St. Martin, Technology Modernization Practice
There are many reasons a business with legacy systems may explore moving to a modern system. Advances in technology can enable an enterprise to completely reimagine its business model, accelerate innovation, and drastically improve the customer experience. While the benefits are clear, the path from old to new can be very challenging. For life and annuity insurers, one of the most difficult challenges is data migration. Highly sophisticated and poorly documented legacy systems are the norm for many insurers. The sheer volume of records that need to migrate and the fear of data loss during the process can be overwhelming. Those risks can be managed with a well-constructed data migration plan, allowing a business to reap the full benefits of the new system.
As you construct your plan, here are three steps to ensure a seamless, successful migration:
STEP ONE | DATA MAPPING
Data can be lost when the source and target systems store the same information differently, causing unexpected errors during migration. By adopting a systematic approach to mapping source data, teams can avoid many of these issues. When constructing your data mapping plan, include the steps below:
• Accurately map source data to the target system. This step is the most critical aspect of successful data migration. It’s vital that columns, keys, or values in the source data point to the same columns, keys, or values in the target system.
• Focus on your most sensitive financial data. These values are the most critical in the migration and must be painstakingly accurate. Businesses should devote the majority of their mapping time to this sensitive data.
• Map iteratively. Map priority data first, then deliver to the development team to begin extraction. Next, move on to secondary data and continue in this fashion. Multiple mapping and development teams should be working concurrently.
• Detect gaps and mitigate. In any migration, no matter how well planned, there will almost always be data gaps on both sides: a source data item with no corresponding target item, or a target item with no source. The latter is the more serious scenario when the target requires that item for valuation. Once a gap is recognized, mitigation can take place.
• Identify the required items:
– Target data structures: ensure the receiving (target) system provides its intake data structures.
– Identify sending system data sources and required access early in the process. Ask: where will the source data come from, and how will it be extracted?
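The gap detection described above can be sketched as a simple set comparison between the mapping document and the two systems’ field inventories. This is a minimal illustration, assuming hypothetical field names; a real mapping document would be far larger and maintained iteratively by the mapping teams.

```python
# Hypothetical field inventories for illustration only
source_fields = {"CNTRCT_NO", "ISSUE_DT", "SMOKER_CD", "AGT_CD"}
target_required = {"contract_number", "issue_date", "smoker_class", "valuation_basis"}

# Source-to-target field map built during mapping sessions
field_map = {
    "CNTRCT_NO": "contract_number",
    "ISSUE_DT": "issue_date",
    "SMOKER_CD": "smoker_class",
}

# Gaps on both sides: source with no target, and required target with no source
unmapped_source = source_fields - field_map.keys()
unfilled_target = target_required - set(field_map.values())

print("Source fields with no target:", sorted(unmapped_source))
print("Required target fields with no source:", sorted(unfilled_target))
```

A check like this, run after every mapping iteration, surfaces the more serious gap (a required target item with no source) before it reaches the development team.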
STEP TWO | DATA EXTRACTION
During this step of the process, data is extracted from the source system to be parsed and analyzed before importing into the new repository. As you retrieve data, consider this approach:
• Start data extraction early. Some values can be extracted immediately, while others may be more time-consuming and require translation (smoker codes, coverage/product types, etc.). For example, data such as contract numbers, product information, and client information are common and, so long as the intake structure attributes are available, the source extractions can start. More complex data will require more intensive data mapping, which can run concurrently while preliminary data extractions are coded and tested.
• Use automated testing tools. When available, use automated (preferable) or manual testing tools to achieve a high level of data quality. These tools help reconcile and validate data as it moves from the legacy system to the new system, ensuring it arrives complete and correctly structured.
• Design extraction processes to be flexible. Processes should be capable of extracting blocks of data based on criteria such as product type, contract status, and last activity date. Building a data extraction suite that can surgically gather and provide specific blocks of data will support phased conversion efforts that often come into play as the parties get closer to go-live.
• Design extraction processes to run concurrently. The target system will require many intake structures. Parallel execution of data extractions saves valuable time during the go-live event.
STEP THREE | DATA DELIVERY
After completion of data mapping and extraction, the final files are ready to be delivered to the new system. As you take this last step, implement the following:
• Set up an approved data transfer method. The transferring entity’s technical resources must establish a recommended transfer method, preferably a dedicated point-to-point secure connection that does not expose data to unauthorized access. This conduit must be up and running as soon as possible.
• Avoid data obfuscation (masking) of NPI data. A strong contractual agreement on data security and a secure point-to-point delivery method can eliminate the need to obfuscate, a process that is cumbersome and adds time and cost.
• Deliver early and often. The two parties must agree to a data delivery schedule with documented expectations for improvements in volume and quality of data.
• Load data on the target as soon as possible. Depending on the approach, the receiving entity must be ready to load data once the minimum required data is available.
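One simple way to verify each scheduled delivery is a manifest check: the sender and receiver each compute a record count and checksum over the file and compare them before the load begins. This is an illustrative sketch, not a prescribed method; the payload shown is a made-up example.

```python
import hashlib

# Hypothetical delivered file content (newline-terminated records)
sent_payload = b"A100,TERM,ACTIVE\nA300,TERM,ACTIVE\n"

def manifest(payload: bytes) -> dict:
    """Summarize a delivery as a record count plus a checksum."""
    return {
        "records": payload.count(b"\n"),
        "sha256": hashlib.sha256(payload).hexdigest(),
    }

sent = manifest(sent_payload)
received = manifest(sent_payload)  # in practice, computed independently on the target side

# Halt the load if the delivery does not match what was sent
assert sent == received, "Delivery mismatch - halt the load and investigate"
print("Delivery verified:", sent["records"], "records")
```

Run against every file in the agreed delivery schedule, a check like this gives both parties documented evidence that volume and quality targets are being met before data is loaded.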
Once these steps are complete, a secure, continuous data quality assurance process should run in the target system. Following the completion of all transfers, a final decommission can occur to shut down and/or dispose of the old systems.
The challenges of migrating data from legacy to modern systems can be effectively managed with a well-constructed data migration strategy. Leveraging a proven methodology reduces risk and delivers significant benefits. Organizations contemplating such an effort should work with an expert who has a proven track record of successful data migrations.
Considering an outside resource to help you on your mission to modernization? The team at NEOS has migrated more than 5 million policies with over $75 billion of AUM. We’ve helped dozens of clients achieve their data migration objectives. Let’s get started.