Wednesday, August 01, 2012

Ruminating on "Trickle Batch Load".

Traditionally, most batch jobs were run at end-of-day, when the entire day's transaction log was pulled out and processed. In the good old days, when data volumes were low, these batch processes could easily meet their business SLAs. Even if there was a failure, there was sufficient time to correct the data, rerun the process and still meet the SLAs.

But today, the volume of data has grown exponentially. Across many of our customers, we have seen challenges in meeting SLAs due to large data volumes. Business stakeholders have also become more demanding; they want to see sales reports and spot trends early - often multiple times a day. To tackle these challenges, we can turn to the 'trickle batch' load design pattern.

In trickle batch, small delta changes from source systems are sent for processing multiple times a day. Such a design has several advantages -
  • The business gets near real-time access to critical information, e.g. sales per hour.
  • Business SLAs can be met more easily, as EOD processes are no longer a bottleneck.
  • Operational benefits include smaller data transfers over the network and CPU/resource utilization that is spread across the day rather than peaking at EOD.
  • Early detection of issues such as network failures, file corruption, etc.
The typical design strategy used to implement "trickle batch" is to leverage the CDC (Change Data Capture) capabilities of databases and ETL tools.
Today, almost all ETL tools such as IBM InfoSphere DataStage, Informatica, Oracle Data Integrator, etc. have built-in CDC capabilities.
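As a rough illustration, here is a minimal sketch of timestamp-based change capture - the simplest form of CDC - in Python. The orders table, its last_updated column and the watermark handling are assumptions made up for this example; a production deployment would typically rely on the log-based CDC features of the database or ETL tool mentioned above.

```python
import sqlite3

# Hypothetical source table: orders(order_id, amount, last_updated).
# The watermark is the high-water mark left by the previous trickle run.

def extract_delta(conn: sqlite3.Connection, watermark: str):
    """Pull only the rows changed since the last trickle run."""
    cur = conn.execute(
        "SELECT order_id, amount, last_updated "
        "FROM orders WHERE last_updated > ? "
        "ORDER BY last_updated",
        (watermark,),
    )
    return cur.fetchall()

def run_trickle_batch(conn: sqlite3.Connection, watermark: str):
    """One micro-batch: extract the delta and advance the watermark."""
    delta = extract_delta(conn, watermark)
    if delta:
        # Hand the delta to the downstream ODS load and remember the
        # newest timestamp we have processed.
        watermark = max(row[2] for row in delta)
    return delta, watermark
```

A scheduler would invoke run_trickle_batch every few minutes or hours instead of once at end-of-day - that is what turns a monolithic EOD job into a trickle batch.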
Trickle batches typically feed an ODS (Operational Data Store) that can serve transactional reports. Data from the ODS is then fed to a DW (Data Warehouse) appliance for MIS reporting.
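Continuing the same illustrative example, each delta could be upserted into a hypothetical ods_orders table so that intra-day reports (such as sales per hour) stay current, and only the much smaller incremental feed then flows on to the DW appliance. This is a sketch under those assumptions, not how any particular ETL tool implements it.

```python
import sqlite3

def apply_delta_to_ods(ods_conn: sqlite3.Connection, delta):
    """Upsert a trickle-batch delta into the (hypothetical) ODS table."""
    ods_conn.executemany(
        "INSERT INTO ods_orders (order_id, amount, last_updated) "
        "VALUES (?, ?, ?) "
        "ON CONFLICT(order_id) DO UPDATE SET "
        "  amount = excluded.amount, "
        "  last_updated = excluded.last_updated",
        delta,
    )
    ods_conn.commit()
```

The upsert keeps the load idempotent: replaying the same delta after a failed run does not create duplicates in the ODS, which is part of what makes detecting and rerunning a failed trickle batch cheap.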
