Case Study: Migrate Data From Mainframe To AWS RDS
Background
A large US Government agency faced the challenge of migrating its existing data from a legacy mainframe system to a new platform hosted on Amazon Web Services (AWS). The agency had approximately 80 million customer benefits records that needed to move from an IBM mainframe to an open-source platform while ensuring the accuracy and integrity of every record. The records had to be transformed from a mainframe-based flat file structure into a cloud-native, modern distributed database while preserving the original system's ability to interface seamlessly with the transformed, re-hosted data.
Analysis
STS consulted with the agency and determined that the 80 million customer benefits records needed to be migrated to a new system with zero data loss. Additionally, all of the legacy system's business rules, which were written in COBOL and stored on the mainframe, needed to be extracted and migrated as-is. The agency also needed to search its data more effectively than the current system allowed, even though that system had been built in-house over the previous 40 years. The business rules, written initially on the mainframe and expanded over the years, reflected a significantly different development process than today's, with some individual rules taking six months to write. Furthermore, as the agency used and expanded its existing database, it took a somewhat “creative” approach to data storage, borrowing space from other areas within the IBM mainframe, which made it difficult to track where data was stored.
Solution
STS initially considered a MySQL solution but found that the MySQL platform had structural and performance limitations for this workload. STS instead built an end-to-end ETL process that used the open-source data integration tool Pentaho to extract data from the mainframe and load it into an AWS-hosted Oracle 12c database. With Pentaho, the developers were able to identify the data, validate it, check for potential anomalies on a row-by-row basis, and load it into the new database. The developers were also able to reverse-engineer the underlying business rules, some developed prior to 1980, from the COBOL code. The end-to-end migration was successful, completing the first phase of a much larger and more complex project.
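For illustration only, the sketch below shows the kind of row-by-row validate-and-load step a pipeline like this performs: parse fixed-width mainframe records against a field layout, flag anomalous rows, and insert clean rows into the target database. The record layout, field names, and table schema here are hypothetical, and the snippet uses SQLite as a stand-in target rather than the agency's actual Pentaho transformations or Oracle schema.

```python
# Illustrative sketch only: a row-by-row validate-and-load ETL step.
# The record layout, field names, and target table are hypothetical,
# not the agency's actual schema or Pentaho transformation.
import sqlite3  # stand-in for the Oracle target in this sketch

# Hypothetical fixed-width layout: (field name, start, end) offsets in each record
LAYOUT = [("record_id", 0, 10), ("ssn_last4", 10, 14), ("benefit_amt", 14, 23)]

def parse_record(line: str) -> dict:
    """Slice a fixed-width mainframe record into named fields."""
    return {name: line[start:end].strip() for name, start, end in LAYOUT}

def validate(row: dict) -> list[str]:
    """Return a list of anomalies found in one row (empty list = clean)."""
    errors = []
    if not row["record_id"].isdigit():
        errors.append("non-numeric record_id")
    if not row["benefit_amt"].replace(".", "", 1).isdigit():
        errors.append("malformed benefit amount")
    return errors

def migrate(flat_file: str, conn: sqlite3.Connection) -> tuple[int, int]:
    """Load clean rows; quarantine anomalous ones so nothing is silently dropped."""
    loaded, rejected = 0, 0
    with open(flat_file) as f:
        for line in f:
            row = parse_record(line.rstrip("\n"))
            if validate(row):
                rejected += 1  # in practice, written to a reject file for review
            else:
                conn.execute(
                    "INSERT INTO benefits (record_id, ssn_last4, benefit_amt) VALUES (?, ?, ?)",
                    (row["record_id"], row["ssn_last4"], row["benefit_amt"]),
                )
                loaded += 1
    conn.commit()
    return loaded, rejected
```

Counting loaded and rejected rows and reconciling their sum against the source record count is one way a pipeline can demonstrate a zero-data-loss requirement: every source record ends up either in the target database or in a reject queue for review, never silently dropped.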
Benefit
The outcomes STS delivered offered the agency a range of benefits. With the new AWS platform and the completed data migration, the agency operates more efficiently: the business rules were carried over intact, and the data errors associated with the previous platform have been cleaned up. Validation is now in place to ensure data is stored correctly, and the agency can search its new database far more effectively. Additionally, the look and feel of the existing platform was retained, so users still see the traditional mainframe terminal “Green Screen” interface. The learning curve for users was minimal, while overall functionality, capability, and performance were significantly improved.
Access our Proven Strategies for Agency Legacy Application Migration Ebook to learn about the challenges of migrating a legacy application to the cloud, metrics for success, and more.