The data is already there
A North American utility runs 312 Oracle Forms screens against a 4.8 TB Oracle Database that records every field service call going back to 2003. The operations team has been asking for a real-time outage dashboard for nine years. The answer has always been: wait for the modernization.
The wait was unnecessary. The data is sitting in tables. The blocker was never the database — it was the assumption that analytics had to wait for the front end.
Why analytics gets deferred
Modernization projects sequence work in a predictable order: extract the forms, rebuild the front end, replace the database, then expose APIs for downstream consumers. Analytics ends up at the bottom of the list because it’s not on the critical path for the migration.
This sequencing is wrong for most enterprises. The analytics use case is often the highest-ROI work in the entire program, and it can be delivered in parallel with the front-end rebuild rather than after it.
Change Data Capture is the unlock
Change Data Capture (CDC) reads the Oracle Database transaction log and streams every insert, update, and delete to a downstream system in near real time. Tools like Oracle GoldenGate, Debezium, and managed CDC services from the hyperscalers have made this a commodity capability.
The Oracle Forms application doesn’t need to change. The PL/SQL packages don’t need to be rewritten. The CDC stream produces a continuously updated copy of the operational data in a target system optimized for analytics — typically Snowflake, BigQuery, Databricks, or ClickHouse.
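To make the mechanics concrete, here is a minimal sketch of what a CDC sink does with one change event. The envelope shape (op, before, after, source) follows Debezium's convention; the table and column names are hypothetical stand-ins for a field service schema.

```python
import json

# A Debezium-style change event for a hypothetical SVC_CALLS table.
# op codes: c = insert, u = update, d = delete.
SAMPLE_EVENT = json.dumps({
    "op": "u",
    "before": {"CALL_ID": 9142, "STATUS": "OPEN"},
    "after": {"CALL_ID": 9142, "STATUS": "DISPATCHED"},
    "source": {"table": "SVC_CALLS", "ts_ms": 1700000000000},
})

def apply_change(event_json: str, target: dict) -> dict:
    """Replay one change event into a copy of the table keyed by primary
    key -- the same merge logic a warehouse sink runs at scale."""
    event = json.loads(event_json)
    op = event["op"]
    if op in ("c", "u"):
        # Insert or update: upsert the new row image.
        after = event["after"]
        target[after["CALL_ID"]] = after
    elif op == "d":
        # Delete: drop the row by its key.
        target.pop(event["before"]["CALL_ID"], None)
    return target

table = {9142: {"CALL_ID": 9142, "STATUS": "OPEN"}}
apply_change(SAMPLE_EVENT, table)
print(table[9142]["STATUS"])  # DISPATCHED
```

The point of the sketch is the shape of the work: the Forms application keeps writing to Oracle, the log reader emits events like this one, and the sink merges them into the analytical copy.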
Latency in practice
CDC pipelines on Oracle Forms workloads typically deliver end-to-end latency of 2 to 15 seconds. For most operational dashboards, that’s indistinguishable from real time. For genuinely sub-second use cases — fraud detection, algorithmic trading — additional architecture is needed, but those workloads are rare in Oracle Forms environments.
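Latency is worth monitoring rather than assuming. A rough sketch of the lag check we'd run against a sink: compare the source commit timestamp carried on each event (Debezium exposes it as source.ts_ms) with the time the row lands in the warehouse. The alert threshold here is an assumption, not a product limit.

```python
from datetime import datetime, timezone

# Hypothetical alert threshold for an operational dashboard pipeline.
LAG_ALERT_SECONDS = 15

def pipeline_lag_seconds(source_ts_ms: int, landed_at: datetime) -> float:
    """End-to-end lag: warehouse arrival time minus source commit time."""
    committed = datetime.fromtimestamp(source_ts_ms / 1000, tz=timezone.utc)
    return (landed_at - committed).total_seconds()

landed = datetime.fromtimestamp(1700000008, tz=timezone.utc)
lag = pipeline_lag_seconds(1700000000000, landed)
print(f"lag={lag:.0f}s alert={lag > LAG_ALERT_SECONDS}")  # lag=8s alert=False
```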
We’ve delivered CDC pipelines that take 6 to 10 weeks from kickoff to first dashboard. The work is well-understood and largely independent of the front-end migration.
Schema drift is the real problem
The hard part of CDC isn’t the streaming. It’s keeping the analytical schema aligned with an operational schema that wasn’t designed for analytics. Oracle Forms applications typically have hundreds of tables with cryptic column names, embedded business rules in triggers, and historical artifacts from earlier database versions.
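A basic defense is a drift check that runs before the pipeline does. The sketch below assumes we can query the source's live column list (Oracle exposes it in ALL_TAB_COLUMNS) and keep the expected set in version control alongside the pipeline; the column names are illustrative.

```python
# Columns the pipeline was built against, kept under version control.
EXPECTED = {"CALL_ID", "CUST_NO", "STAT_CD", "CRTD_DT"}

def detect_drift(live_columns: set) -> dict:
    """Report columns added or dropped since the pipeline was deployed."""
    return {
        "added": sorted(live_columns - EXPECTED),
        "dropped": sorted(EXPECTED - live_columns),
    }

drift = detect_drift({"CALL_ID", "CUST_NO", "STAT_CD", "CRTD_DT", "PRIORITY_CD"})
print(drift)  # {'added': ['PRIORITY_CD'], 'dropped': []}
```

A non-empty result doesn't stop the stream; it tells the team a DBA changed the operational schema before the dashboards silently break.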
A working analytics layer requires a semantic model — a translation between the operational schema and the questions the business actually asks. dbt has become the standard tool for this. The semantic model is the single most valuable artifact produced during the analytics work, and it carries forward into the eventual front-end migration.
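In practice the semantic model lives in dbt as SQL, but the idea it encodes is simple enough to sketch: a documented translation from cryptic operational columns and coded values to the terms analysts actually use. The column names and status codes below are hypothetical.

```python
# Hypothetical mapping from operational column names to business terms.
SEMANTIC_MAP = {
    "STAT_CD": "call_status",
    "CRTD_DT": "created_at",
    "CUST_NO": "customer_id",
}

# Hypothetical decode table for a coded status column.
STATUS_DECODE = {"O": "open", "D": "dispatched", "C": "closed"}

def to_semantic(row: dict) -> dict:
    """Rename columns and decode coded values into analyst-facing terms."""
    out = {SEMANTIC_MAP.get(col, col.lower()): val for col, val in row.items()}
    if "call_status" in out:
        out["call_status"] = STATUS_DECODE.get(out["call_status"], out["call_status"])
    return out

print(to_semantic({"CUST_NO": 88213, "STAT_CD": "D"}))
# {'customer_id': 88213, 'call_status': 'dispatched'}
```

The dbt version of this is a model per subject area, with the mapping and decode logic expressed in SQL and tested in CI, which is why it survives into the eventual migration.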
Reverse ETL closes the loop
Once analytics data lives in the warehouse, reverse ETL pushes derived insights back into operational systems. A churn score computed in BigQuery becomes a column in the customer screen. A predicted maintenance date becomes a notification in the work order form.
This pattern lets enterprises deliver AI-powered features into their existing Oracle Forms applications without rewriting the forms. The Oracle Forms screen displays the score. Nobody on the user side knows or cares where it was computed.
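The write-back itself is unglamorous. A minimal sketch, with sqlite3 standing in for the Oracle target and assumed table and column names: the warehouse computes the scores, and reverse ETL is one idempotent UPDATE per scored customer.

```python
import sqlite3

# Scores as they'd arrive from the warehouse: (customer_id, churn_score).
scores = [(88213, 0.82), (88214, 0.11)]

# sqlite3 stands in for the operational Oracle database here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (cust_no INTEGER PRIMARY KEY, churn_score REAL)")
conn.executemany("INSERT INTO customers (cust_no) VALUES (?)", [(88213,), (88214,)])

# The reverse ETL step: write each derived score back to the
# operational row the Forms screen reads.
conn.executemany(
    "UPDATE customers SET churn_score = ? WHERE cust_no = ?",
    [(score, cust) for cust, score in scores],
)
conn.commit()

row = conn.execute("SELECT churn_score FROM customers WHERE cust_no = 88213").fetchone()
print(row[0])  # 0.82
```

Because the UPDATE is keyed and idempotent, the sync can rerun on a schedule without any coordination with the Forms application.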
Sequencing the work
The right sequence for most enterprises is: CDC pipeline first, semantic model second, dashboards third, reverse ETL fourth, then front-end modernization. The analytics work pays for itself in 9 to 14 months and creates the data foundation that the modernization eventually inherits.
The bottom line
Real-time analytics doesn’t have to wait for modernization. The data is already in the database. CDC and a good semantic model can deliver real-time dashboards on top of Oracle Forms applications in 6 to 10 weeks, without touching a single .fmb file. The enterprises that decouple analytics from front-end migration capture value years earlier.