Overcoming Performance and Compatibility Gaps Switching from Oracle to Chinese Databases
In this interview, senior database expert Kong Zaihua explains where domestic Chinese databases still trail Oracle in performance, functionality, and usability. He also outlines practical strategies, tools, and migration techniques for assessing compatibility, converting DDL, managing online and offline data transfer, and reducing reliance on stored procedures.
Performance, Functional, and Usability Gaps
Domestic Chinese databases often excel in a single workload (OLTP or OLAP) but lack the balanced HTAP performance of Oracle. Functionally, they typically cover 80–90% of Oracle’s feature set; missing areas include advanced analytics, tooling, and ecosystem integration. Usability suffers from fewer bug‑fix cycles and a smaller community, leading to more stability issues.
Migration and Compatibility Assessment
Key assessment steps:
Identify Oracle‑specific features used in applications (e.g., proprietary PL/SQL packages, optimizer hints, advanced security).
Extract all SQL statements from production and test environments using custom code‑scanning and SQL‑capture tools supplied by third‑party vendors.
Run the extracted SQL against the target domestic database in a validation sandbox to detect syntax or semantic incompatibilities.
Generate a migration report that lists required code rewrites, alternative functions, and estimated effort.
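The validation step above can be sketched as a small harness that replays captured SQL against the target and records which statements fail to parse. This is a minimal sketch, assuming a Python DB-API connection; it uses SQLite as a stand-in target, and the `validate_sql` helper and `EXPLAIN`-based dry run are illustrative assumptions, not any vendor's actual tool.

```python
import sqlite3

def validate_sql(conn, statements):
    """Try each captured statement against the target; collect failures."""
    report = []
    for sql in statements:
        try:
            # Prefixing with EXPLAIN QUERY PLAN checks syntax and name
            # resolution without executing the statement or changing data.
            conn.execute("EXPLAIN QUERY PLAN " + sql)
            report.append({"sql": sql, "status": "ok", "detail": ""})
        except sqlite3.Error as exc:
            report.append({"sql": sql, "status": "incompatible", "detail": str(exc)})
    return report

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance DECIMAL)")
captured = [
    "SELECT id, balance FROM accounts WHERE id = 1",
    "SELECT id, NVL(balance, 0) FROM accounts",  # Oracle's NVL is not portable
]
report = validate_sql(conn, captured)
```

In practice the report would also suggest rewrite patterns (e.g., `NVL` → `COALESCE`) and feed the effort estimate.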
DDL Conversion
Because financial services require near‑continuous availability, the bank prefers online migration when possible. The conversion workflow includes:
Map Oracle data types to the target database (e.g., NUMBER → DECIMAL, VARCHAR2 → VARCHAR).
Translate Oracle object definitions (tables, indexes, constraints, sequences) to the target syntax.
Use log‑based synchronization tools to keep Oracle and the new database in sync until cut‑over.
If existing open‑source tools are insufficient, collaborate with vendors to build bespoke conversion utilities that handle proprietary data‑type mappings and DDL nuances.
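The data-type mapping step can be approximated with a rule table applied to the DDL text. This is a simplified sketch: the `TYPE_MAP` rules and the default precision for bare `NUMBER` are illustrative assumptions, and a real conversion engine would parse the DDL rather than use regular expressions.

```python
import re

# Illustrative Oracle -> target mappings; real targets differ in names and limits.
TYPE_MAP = [
    (re.compile(r"\bNUMBER\((\d+),\s*(\d+)\)", re.I), r"DECIMAL(\1,\2)"),
    (re.compile(r"\bNUMBER\b", re.I), "DECIMAL(38,10)"),   # assumed default precision
    (re.compile(r"\bVARCHAR2\((\d+)\)", re.I), r"VARCHAR(\1)"),
    (re.compile(r"\bDATE\b", re.I), "TIMESTAMP"),           # Oracle DATE carries time
]

def convert_ddl(oracle_ddl: str) -> str:
    """Apply each mapping rule in order; parameterized forms go first."""
    out = oracle_ddl
    for pattern, replacement in TYPE_MAP:
        out = pattern.sub(replacement, out)
    return out

ddl = "CREATE TABLE t (id NUMBER(10,0), name VARCHAR2(64), created DATE)"
converted = convert_ddl(ddl)
```

Rule ordering matters: `NUMBER(p,s)` must be matched before the bare `NUMBER` fallback, which is exactly the kind of nuance bespoke vendor utilities encode.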
Stored Procedures, Triggers, and Application Coupling
The bank recommends minimizing stored procedures and triggers because they tightly couple applications to a specific engine. When such objects exist, the migration process should:
Perform a detailed inventory of all PL/SQL packages, functions, and triggers.
Use automated refactoring tools to translate PL/SQL to portable SQL or to the target database’s procedural language, if available.
If translation is not feasible, redesign the affected functionality to be implemented in the application layer.
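The inventory step typically starts from Oracle's `DBA_OBJECTS` dictionary view. The query below is standard Oracle, while the `summarize` helper and sample rows are illustrative assumptions showing how the fetched rows might be rolled up for the migration report.

```python
from collections import Counter

# Standard Oracle dictionary query for procedural objects (run on the source DB).
INVENTORY_SQL = """
SELECT owner, object_type, COUNT(*) AS cnt
FROM dba_objects
WHERE object_type IN ('PACKAGE', 'PACKAGE BODY', 'PROCEDURE', 'FUNCTION', 'TRIGGER')
GROUP BY owner, object_type
"""

def summarize(rows):
    """rows: (owner, object_type, count) tuples fetched from INVENTORY_SQL."""
    totals = Counter()
    for owner, obj_type, cnt in rows:
        totals[obj_type] += cnt
    return dict(totals)

# Hypothetical result set, as if fetched from the query above.
sample = [("APP", "PROCEDURE", 42), ("APP", "TRIGGER", 7), ("RPT", "PROCEDURE", 3)]
summary = summarize(sample)
```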
Online vs. Offline Data Migration
Two migration strategies are employed:
Online migration: Deploy a change‑data‑capture (CDC) component that reads Oracle redo logs, transforms the data, and applies it to the target database in near real‑time. This approach reduces downtime to the final cut‑over window.
Offline migration: When CDC does not support the target database, a custom bulk‑load utility is built. Large tables are split into chunks (e.g., by primary‑key range) and loaded in parallel to accelerate transfer.
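The offline chunking strategy can be sketched as follows. This is a minimal illustration, assuming an integer primary key; `copy_chunk` is a placeholder that would normally run a ranged `SELECT` against Oracle and bulk-insert into the target.

```python
from concurrent.futures import ThreadPoolExecutor

def pk_chunks(min_id, max_id, chunk_size):
    """Split an inclusive primary-key range into (low, high) chunks."""
    ranges = []
    low = min_id
    while low <= max_id:
        high = min(low + chunk_size - 1, max_id)
        ranges.append((low, high))
        low = high + 1
    return ranges

def copy_chunk(bounds):
    low, high = bounds
    # Real code would SELECT ... WHERE id BETWEEN low AND high from Oracle
    # and bulk-load into the target; here we just report the range size.
    return high - low + 1

chunks = pk_chunks(1, 1_000_000, 250_000)
with ThreadPoolExecutor(max_workers=4) as pool:
    copied = sum(pool.map(copy_chunk, chunks))
```

Chunk size is a tuning knob: larger chunks reduce per-chunk overhead, while smaller chunks give finer-grained retry on failure.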
Tooling Ecosystem
Successful migration hinges on tool maturity:
Code‑scan / SQL‑capture tools: Automatically collect every SQL statement executed in production and test environments.
Compatibility validator: Executes captured SQL against the target database, flags unsupported syntax, and suggests rewrite patterns.
DDL conversion engine: Handles data‑type mapping, object recreation, and generates migration scripts.
CDC / log‑sync component: Streams incremental changes during the migration window.
When off‑the‑shelf tools fall short, the bank co‑develops extensions with vendors, iteratively improving rule sets and automation.
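The apply side of the CDC / log-sync component boils down to replaying ordered change events against the target. This is a minimal sketch under stated assumptions: the event format (`op`/`table`/`key`/`values`) is hypothetical, SQLite stands in for the target, and a production component would add batching, transactions, and conflict handling.

```python
import sqlite3

def apply_change(conn, change):
    """Replay one change event: {'op', 'table', 'key', 'values'}."""
    table, key = change["table"], change["key"]
    if change["op"] == "insert":
        cols = ", ".join(change["values"])
        marks = ", ".join("?" for _ in change["values"])
        conn.execute(f"INSERT INTO {table} ({cols}) VALUES ({marks})",
                     list(change["values"].values()))
    elif change["op"] == "update":
        sets = ", ".join(f"{c} = ?" for c in change["values"])
        conn.execute(f"UPDATE {table} SET {sets} WHERE id = ?",
                     list(change["values"].values()) + [key])
    elif change["op"] == "delete":
        conn.execute(f"DELETE FROM {table} WHERE id = ?", [key])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
events = [  # events must be applied in source-log order
    {"op": "insert", "table": "accounts", "key": 1, "values": {"id": 1, "balance": 100}},
    {"op": "update", "table": "accounts", "key": 1, "values": {"balance": 250}},
]
for e in events:
    apply_change(conn, e)
balance = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
```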
Best‑Practice Recommendations
Prefer simple, standards‑compliant SQL to reduce coupling.
Avoid or replace stored procedures and triggers with application‑level logic.
Validate compatibility early using automated scanning and sandbox testing.
Leverage log‑based synchronization for online migration to meet 24/7 service requirements.
Implement parallel chunk loading for very large tables when offline migration is necessary.
dbaplus Community
Enterprise-level professional community for Database, BigData, and AIOps. Daily original articles, weekly online tech talks, monthly offline salons, and quarterly XCOPS&DAMS conferences—delivered by industry experts.