In a recent webinar (Reporting and Data Considerations during Guidewire Core Systems Upgrade), we asked our attendees which core legacy system(s) – Claims, Policy, Billing, or None – they had been able to retire completely. Over 60% of respondents answered “None.” This response is not surprising, as most insurers have complex IT landscapes with multiple legacy systems of each kind.
Guidewire customers realize tremendous value from their Guidewire ClaimCenter®, PolicyCenter, and BillingCenter implementations but still struggle with zombie legacy systems. However, reducing the reliance on legacy systems and simplifying the IT landscape are key tenets of business transformation.
In this three-part blog series, I would like to present a few approaches to legacy data migration for insurers with Guidewire core systems. Of course, the approach employed will depend on the insurer’s business and technical needs and goals. These approaches are not meant to be prescriptive, and a legacy data migration journey should be embarked upon only after close consultation with key business, IT, and partner stakeholders.
In this first part, let’s examine some data migration approaches for our claims management software, ClaimCenter.
Manual Approach: This option is typically viable only for insurers with smaller claims volumes. A manual migration consists of keying open claims into the new system (in this case ClaimCenter) and retaining all closed claims on the legacy system. While relatively inexpensive, this approach carries a risk of keying errors, and both the legacy and new systems must run concurrently for as long as users need to view old claims, or until all legacy-connected systems have been disconnected. Also, prior to retirement, the data in the legacy system will need to be saved into a data repository for safekeeping.
“Big Bang”: A “Big Bang” approach may be best for insurers who need to retire their legacy system as fast as possible and can deploy ClaimCenter in a single phase. It also carries greater risk, as all the data must be cleansed and operationalized for the new system at once. Depending on the quality of the legacy data, preparing it may take more iterations than originally anticipated, resulting in project delays and additional cost.
For customers with very large volumes of historical data, this approach could degrade ClaimCenter performance, in which case a migrate-to-archive approach is recommended. Migrate-to-archive relies on a second instance of ClaimCenter used only for historical claims (those that won’t be loaded into the production ClaimCenter instance): these claims are migrated into the second instance and then archived. The production instance then retrieves them from the archive as needed.
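The routing decision at the heart of migrate-to-archive can be sketched as follows. This is a minimal illustration only: the claim structure, the status values, and the three-year cutoff rule are assumptions for the example, not Guidewire APIs or defaults.

```python
from datetime import date

# Hypothetical rule: open claims (and recently closed ones) load into the
# production ClaimCenter instance; older closed claims go to the second
# instance, from which they are archived. The 3-year cutoff is an
# illustrative assumption.
ARCHIVE_CUTOFF_YEARS = 3

def route_claim(claim: dict, today: date) -> str:
    """Return 'production' or 'archive-instance' for a legacy claim."""
    if claim["status"] == "open":
        return "production"
    age_years = (today - claim["closed_date"]).days / 365.25
    return "production" if age_years < ARCHIVE_CUTOFF_YEARS else "archive-instance"
```

In a real migration this decision would be driven by the insurer’s retention and archiving rules rather than a single age threshold, but the shape of the split – production load versus second-instance-then-archive – is the same.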
Phased: This is the “Big Bang” approach executed in multiple iterations. The project is divided into logical segments, such as by line of business or claims office, and each segment is migrated in a “Big Bang” fashion (either to the production system or using migrate-to-archive), followed in turn by each succeeding segment. A phased approach requires two operational systems, but the duality is compartmentalized by the scope of each phase: if a phase is defined by branch office, for example, only the migrated branches use the new system while the remaining branches continue on the legacy system.
Guidewire DataHub™: The typical use case for DataHub, Guidewire’s operational data store, is an insurer that wishes to consolidate multiple legacy claims systems into ClaimCenter. In a typical migration to ClaimCenter, an intermediate data store is created to consolidate and cleanse the legacy data; DataHub replaces that intermediate store. Out of the box, DataHub’s claims data attributes are minimal, so extensions are required in most cases. Open claims are then typically migrated to ClaimCenter, while closed claims remain in DataHub. When a closed claim needs to be re-opened and transferred to ClaimCenter, DataHub must perform a “conversion of one.” Ensuring data integrity is just as important for closed claims as for open ones, because there is no claim archive to validate against. Thus, additional care is needed to validate claims stored in DataHub and ensure they are sufficiently cleansed to pass the equivalent of ClaimCenter’s open-claim integrity checks; failing those checks could delay activating a claim from DataHub. Data residing in DataHub can also serve other purposes, such as downstream integrations and reporting. It reduces retirement risk by retaining the original source values, which the remaining systems can use with fewer code changes, and it enables faster retirement of the legacy system.
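The “conversion of one” described above – validate a closed claim against open-claim integrity checks, then transform and load it – can be sketched as below. All field names, the validation rule, and the dictionary stand-ins for DataHub and ClaimCenter are illustrative assumptions; none of this is a DataHub or ClaimCenter API.

```python
# Hypothetical minimal field set a claim must carry to pass the equivalent
# of ClaimCenter's open-claim integrity checks (an assumption for this sketch).
REQUIRED_FIELDS = ("claim_number", "loss_date", "claimant", "status")

def convert_one(datahub_claim: dict, claimcenter: dict) -> dict:
    """Reopen one closed claim from a DataHub record into a ClaimCenter stand-in."""
    # Validate first: a claim that fails the checks cannot be activated,
    # so surface the failure explicitly rather than loading bad data.
    missing = [f for f in REQUIRED_FIELDS if not datahub_claim.get(f)]
    if missing:
        raise ValueError(f"claim fails integrity checks; missing: {missing}")
    # Transform to the target model and reopen on load.
    migrated = {**datahub_claim, "status": "open", "source": "DataHub"}
    claimcenter[migrated["claim_number"]] = migrated
    return migrated
```

The point of the sketch is the ordering: validation happens while the claim still lives in DataHub, so an integrity failure delays activation of that one claim rather than corrupting the production system.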
In the next part of this blog series, I will examine data migration approaches for Guidewire PolicyCenter®.