The Urban and Public Utilities Division of China Resources Data Science recently completed the microservices transformation of a legacy complex business system in the real estate industry. The project has now been successfully launched, and during the system switchover, the online business data of the original monolithic system was seamlessly and losslessly migrated in batches to the new microservices architecture system, ensuring a smooth business transition. This article shares our thoughts, explorations, and lessons learned during this data migration process, with the hope of providing some experience for those with similar needs.
The monolithic system we transformed supports core business and was built using a commercial open-source framework, having been in operation for 8 years. Over this period, as the business developed, the development team kept piling on code in response to new requirements, inevitably leading to code rot. Coupled with significant staff turnover, this resulted in a loss of architectural and technical documentation, which left the system with complicated, hard-to-understand logic, poor stability, low efficiency, and difficulty in extending functionality.
The team had previously tried to refactor some non-core business using the strangler pattern and to implement new business requirements as microservices using the repairer pattern. However, these approaches were hard to apply to the core business code because of the bloated legacy code, tightly coupled logic, non-reusable modules, and lack of tests. Meanwhile, the legacy system was struggling under a growing business load, and a comprehensive transformation had become urgent. Since the team had extensive business and product expertise to support overall system planning, we ultimately opted for a holistic evolution approach. After the business and product experts, architects, and technical leaders designed the overall service planning blueprint, and the team completed the microservices decomposition and development verification, a comprehensive data migration plan was required before the new services went live, to ensure a smooth business transition and customer experience while maintaining data accuracy and integrity. The next section elaborates on the design and implementation of that data migration plan.
The following illustration shows the main stages included in this data migration process:
- In the plan formulation stage, anticipate the challenges that will arise in the subsequent stages and complete the technology selection based on actual needs.
- In the script development stage, focus on the mapping relationships of the business data and communicate thoroughly with the business and product teams; also pay attention to script execution efficiency and optimize the code logic promptly.
- Comprehensive testing of the migration scripts is essential, so the test environment preparation stage should keep the test data as consistent as possible with production data while handling sensitive data properly.
- After running the migration scripts in the test environment, developers need to verify the correctness of the data conversion mappings, and the business and product teams need to validate that business processes run smoothly (a minimal reconciliation sketch follows this list).
- Given the limited time window for the migration execution, a contingency plan for unexpected events must be prepared and understood by the technical team.
- After the phased rollout, monitor the new system closely and troubleshoot any issues that arise during online operation.
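To make the developer-side verification step concrete, here is a minimal reconciliation sketch in Java. It is illustrative only: it compares row counts between hypothetical legacy tables and their counterparts in the new services, and the table names, JDBC URLs, and credentials are placeholders rather than the project's actual configuration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Map;

/**
 * Minimal reconciliation sketch: after a test-environment migration run,
 * compare row counts between legacy tables and the corresponding tables in
 * the new microservice databases. All names and connection details are
 * hypothetical placeholders.
 */
public class RowCountReconciler {

    // Hypothetical mapping between legacy tables and their new counterparts.
    private static final Map<String, String> TABLE_MAPPING = Map.of(
            "legacy_contract", "contract_service.contract",
            "legacy_bill", "billing_service.bill");

    public static void main(String[] args) throws SQLException {
        try (Connection legacy = DriverManager.getConnection(
                     "jdbc:mysql://legacy-db:3306/legacy", "user", "secret");
             Connection target = DriverManager.getConnection(
                     "jdbc:mysql://new-db:3306/services", "user", "secret")) {

            for (Map.Entry<String, String> e : TABLE_MAPPING.entrySet()) {
                long before = count(legacy, e.getKey());
                long after = count(target, e.getValue());
                System.out.printf("%-20s -> %-30s legacy=%d target=%d %s%n",
                        e.getKey(), e.getValue(), before, after,
                        before == after ? "OK" : "MISMATCH");
            }
        }
    }

    private static long count(Connection conn, String table) throws SQLException {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```

Row counts are only a necessary, not a sufficient, check; field-level spot checks and the business-side process validation described above are still required.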
Considering the current state of the legacy system, the business, and the team, we identified the following three main challenges in this data migration task:
- The online production data to be migrated includes critical business records such as contracts, bills, and payment information. These data have a very low tolerance for errors, so the migration plan must guarantee extremely high accuracy.
- During the microservices transformation, the business experts re-modeled and optimized the business rules and data, so the migration is not a simple database/table/field mapping exercise; it also involves complex operations such as data transformation, cleaning, and completion.
- For business continuity, data migration and verification had to be completed within a downtime window of roughly 3 hours; after working through the cut-over timeline, only about 1 hour was actually allocated to the migration execution itself. The plan therefore needed high execution efficiency, reliable rollback and retry mechanisms, and sufficient simulation runs before going live.
In particular, because the execution window for the migration had to be tightly controlled, we collected and summarized the execution time and data transfer rate of each script during testing and verification, so that slow scripts could be identified and optimized in advance. For this, we customized DataX's performance data and log output module so that the relevant metrics are summarized and printed after each run.
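The customized DataX module itself is not shown here; the sketch below only illustrates the kind of per-script summary described above, with hypothetical script names and figures, and does not reflect DataX's actual plugin or logging APIs.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of the idea behind the performance summary: record elapsed time and
 * migrated record counts per migration script, then print a report sorted by
 * duration so the slowest scripts stand out. Illustrative only.
 */
public class MigrationPerfReport {

    record ScriptRun(String script, long records, long millis) {
        double recordsPerSecond() {
            return millis == 0 ? 0 : records * 1000.0 / millis;
        }
    }

    private final List<ScriptRun> runs = new ArrayList<>();

    public void record(String script, long records, long millis) {
        runs.add(new ScriptRun(script, records, millis));
    }

    public void print() {
        runs.stream()
            .sorted((a, b) -> Long.compare(b.millis(), a.millis()))
            .forEach(r -> System.out.printf("%-30s %10d rows %8d ms %10.0f rows/s%n",
                    r.script(), r.records(), r.millis(), r.recordsPerSecond()));
    }

    public static void main(String[] args) {
        MigrationPerfReport report = new MigrationPerfReport();
        // Hypothetical figures for illustration only.
        report.record("contract_migration.json", 1_200_000, 540_000);
        report.record("bill_migration.json", 3_500_000, 1_260_000);
        report.print();
    }
}
```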
In response to these challenges, we chose the DataX data migration framework and made a series of customizations to it, covering development and deployment needs, validation and testing needs, and business needs. On the implementation side, we used Java with Spring Batch for the migration jobs, wrote the migration script configuration models in JSON, performed complex data operations through custom transformers, and kept everything under Git version control. During the migration we focused on thorough testing, avoiding data conflicts, and preparing for failures. Through these efforts the data migration was completed successfully, ensuring a smooth business transition.
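The article summarizes these implementation choices without showing code. As one illustration, the kind of transformation, cleaning, and completion logic handled by custom transformers can be expressed as a Spring Batch ItemProcessor; the record types, field formats, and status-code rules below are hypothetical and do not represent the project's actual data model.

```java
import org.springframework.batch.item.ItemProcessor;

import java.math.BigDecimal;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

/**
 * Illustrative "transformer" step expressed as a Spring Batch ItemProcessor:
 * it maps a legacy contract row onto the new service model while cleaning and
 * completing fields. All types and rules here are hypothetical examples.
 */
public class LegacyContractProcessor
        implements ItemProcessor<LegacyContractProcessor.LegacyContract,
                                 LegacyContractProcessor.ContractDto> {

    // Assume legacy rows store dates as "yyyyMMdd" strings and amounts in cents.
    private static final DateTimeFormatter LEGACY_DATE =
            DateTimeFormatter.ofPattern("yyyyMMdd");

    public record LegacyContract(String contractNo, String signDate,
                                 Long amountInCents, String status) {}

    public record ContractDto(String contractNo, LocalDate signDate,
                              BigDecimal amount, String status) {}

    @Override
    public ContractDto process(LegacyContract item) {
        // Cleaning: drop records that were soft-deleted in the legacy system.
        if ("DELETED".equalsIgnoreCase(item.status())) {
            return null; // returning null filters the item out of the chunk
        }
        // Completion: default a missing amount to zero rather than failing.
        long cents = item.amountInCents() == null ? 0L : item.amountInCents();

        // Transformation: re-map legacy status codes to the new model.
        String status = switch (item.status()) {
            case "1", "ACTIVE" -> "EFFECTIVE";
            case "2", "CLOSED" -> "TERMINATED";
            default -> "DRAFT";
        };

        return new ContractDto(
                item.contractNo().trim(),
                LocalDate.parse(item.signDate(), LEGACY_DATE),
                BigDecimal.valueOf(cents).movePointLeft(2),
                status);
    }
}
```

Keeping each mapping rule in a small, testable unit like this is what makes it practical to verify the conversion logic thoroughly before the production cut-over.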
The above is a summary of the full article. About the author:
Wang Zhiyong currently serves as the architecture director for urban and public utilities at the China Resources Data Science Enterprise Digitalization Center. He has many years of experience in the software development industry, mainly on large-scale, customer-customized software development projects, and focuses on system architecture design, team capability building, and delivering high-quality products.