An Oracle White Paper
August 2012
Best Practices for Real-time Data Warehousing
Executive Overview
Today’s integration project teams face the daunting challenge that, while data volumes are
exponentially growing, the need for timely and accurate business intelligence is also
constantly increasing. Batches for data warehouse loads used to be scheduled daily to weekly;
today’s businesses demand information that is as fresh as possible. Because the value of
business data decreases as it ages, low latency of data integration is essential for the
business value of the data warehouse. At the same time the concept of “business hours” is
vanishing for a global enterprise, as data warehouses are in use 24 hours a day, 365 days a
year. This means that the traditional nightly batch windows are becoming harder to
accommodate, and interrupting or slowing down sources is not acceptable at any time during
the day. Finally, integration projects have to be completed in shorter release timeframes,
while fully meeting functional, performance, and quality specifications on time and within
budget. These processes must be maintainable over time, and the completed work should be
reusable for further, more cohesive, integration initiatives.
Conventional “Extract, Transform, Load” (ETL) tools closely intermix data transformation
rules with integration process procedures, requiring the development of both data
transformations and data flow. Oracle Data Integrator (ODI) takes a different approach to
integration by clearly separating the declarative rules (the “what”) from the actual
implementation (the “how”). With ODI, declarative rules describing mappings and
transformations are defined graphically, through a drag-and-drop interface, and stored
independently from the implementation. ODI automatically generates the data flow, which
can be fine-tuned if required. This innovative approach for declarative design has also been
applied to ODI's framework for Changed Data Capture (CDC). ODI’s CDC moves only
changed data to the target systems and can be integrated with Oracle GoldenGate, thereby
enabling the kind of real time integration that businesses require.
This technical brief describes several techniques available in ODI to adjust data latency from
scheduled batches to continuous real-time integration.
Introduction
The conventional approach to data integration involves extracting all data from the source
system and then integrating the entire set—possibly using an incremental strategy—in the
target system. This approach, which is suitable in most cases, can be inefficient when the
integration process requires real-time data integration. In such situations, the amount of data
involved makes data integration impossible in the given timeframes.
Basic solutions, such as filtering records according to a timestamp column or “changed” flag,
are possible, but they might require modifications in the applications. In addition, they
usually do not sufficiently ensure that all changes are taken into account.
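As a sketch of why such filter-based solutions fall short, the following Python fragment (illustrative only, not ODI code; the table and column names are hypothetical) extracts changes by comparing a timestamp column against a watermark. Any update that fails to maintain the timestamp column is silently missed, which is exactly the weakness noted above:

```python
import sqlite3

# Minimal sketch of timestamp-based incremental extraction.
# An UPDATE that forgets to touch last_modified is silently lost.
def extract_changes(conn, last_watermark):
    """Return rows changed since the previous load, plus the new watermark."""
    rows = conn.execute(
        "SELECT id, amount, last_modified FROM orders WHERE last_modified > ?",
        (last_watermark,),
    ).fetchall()
    new_watermark = max((r[2] for r in rows), default=last_watermark)
    return rows, new_watermark

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, last_modified TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 10.0, "2012-08-01"), (2, 20.0, "2012-08-03")])
changes, wm = extract_changes(conn, "2012-08-02")
print(len(changes), wm)  # 1 2012-08-03
```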
ODI’s Changed Data Capture identifies and captures data as it is being inserted, updated, or
deleted from datastores, and it makes the changed data available for integration processes.
Real-Time Data Integration Use Cases
Integration teams require real-time data integration with low or no data latency for a number
of use cases. While this whitepaper focuses on data warehousing, it is useful to differentiate
the following areas:
- Real-time data warehousing
Aggregation of analytical data in a data warehouse using continuous or near real-
time loads.
- Operational reporting and dashboards
Selection of operational data into a reporting database for BI tools and dashboards.
- Query Offloading
Replication of high-cost or legacy OLTP servers to secondary systems to ease query
load.
- High Availability / Disaster Recovery
Duplication of database systems in active-active or active-passive scenarios to
improve availability during outages.
- Zero Downtime Migrations
Ability to synchronize data between old and new systems with potentially different
technologies to allow for switch-over and switch-back without downtime.
- Data Federation / Data Services
Provide virtual, canonical views of data distributed over several systems through
federated queries over heterogeneous sources.
Oracle has various solutions for different real-time data integration use cases. Query
offloading, high availability/disaster recovery, and zero-downtime migrations can be handled
through the Oracle GoldenGate product that provides heterogeneous, non-intrusive and
highly performant changed data capture, routing, and delivery. To provide low- to zero-
latency loads, ODI offers several alternatives for real-time data warehousing through CDC
mechanisms, including integration with Oracle GoldenGate. This integration also
provides seamless operational reporting. Data federation and data service use cases are
covered by Oracle Data Service Integrator (ODSI).
Architectures for Loading Data Warehouses
Various architectures for collecting transactional data from operational sources have been
used to populate data warehouses. These techniques vary mostly on the latency of data
integration, from daily batches to continuous real-time integration. The capture of data from
sources is either performed through incremental queries that filter based on a timestamp or
flag, or through a CDC mechanism that detects changes as they happen. Architectures
are further distinguished between pull and push operation, where a pull operation polls in
fixed intervals for new data, while in a push operation data is loaded into the target once a
change appears.
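The pull and push modes of operation can be sketched as follows; an in-memory queue stands in for the CDC infrastructure and `apply` for the target load, so this illustrates the two patterns rather than any ODI API:

```python
import queue, time

changes = queue.Queue()

def pull(apply, interval_s, polls):
    """Pull: poll at fixed intervals and load whatever has accumulated."""
    for _ in range(polls):
        batch = []
        while not changes.empty():
            batch.append(changes.get())
        if batch:
            apply(batch)          # one set-based load per interval
        time.sleep(interval_s)

def push(apply, change):
    """Push: the capture side delivers each change to the target at once."""
    apply([change])

target = []
changes.put({"op": "I", "id": 1})
changes.put({"op": "U", "id": 1})
pull(target.extend, interval_s=0, polls=1)
push(target.extend, {"op": "D", "id": 1})
print(len(target))  # 3
```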
A daily batch mechanism is most suitable if intra-day freshness is not required for the data,
such as longer-term trends or data that is only calculated once daily, for example financial
close information. Batch loads might be performed in a downtime window, if the business
model doesn’t require 24 hour availability of the data warehouse. Different techniques such
as real-time partitioning or trickle-and-flip¹ exist to minimize the impact of a load to a live
data warehouse without downtime.
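The trickle-and-flip pattern cited above can be sketched as follows: changes trickle into a shadow copy of a fact table, which is then atomically swapped ("flipped") into place so queries never observe a partially loaded table. SQLite table renames stand in here for the partition-exchange or synonym-switch mechanisms a real warehouse would use; all names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_live (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE sales_shadow (id INTEGER, amount REAL)")

def trickle(rows):
    """Continuously land incoming changes in the shadow copy."""
    conn.executemany("INSERT INTO sales_shadow VALUES (?, ?)", rows)

def flip():
    """Swap shadow and live in one transaction: the 'flip'."""
    with conn:
        conn.execute("ALTER TABLE sales_live RENAME TO sales_tmp")
        conn.execute("ALTER TABLE sales_shadow RENAME TO sales_live")
        conn.execute("ALTER TABLE sales_tmp RENAME TO sales_shadow")

trickle([(1, 9.5), (2, 12.0)])
flip()
print(conn.execute("SELECT COUNT(*) FROM sales_live").fetchone()[0])  # 2
```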
| | Batch | Mini-Batch | Micro-Batch | Real-Time |
|---|---|---|---|---|
| Description | Data is loaded in full or incrementally using an off-peak window. | Data is loaded incrementally using intra-day loads. | Source changes are captured and accumulated to be loaded in intervals. | Source changes are captured and immediately applied to the DW. |
| Latency | Daily or higher | Hourly or higher | 15 min or higher | Sub-second |
| Capture | Filter Query | Filter Query | CDC | CDC |
| Initialization | Pull | Pull | Push, then Pull | Push |
| Target Load | High impact | Low impact; load frequency is tunable | Low impact; load frequency is tunable | Low impact; load frequency is tunable |
| Source Load | High impact | Queries at peak times necessary | Some to none, depending on CDC technique | Some to none, depending on CDC technique |
¹ See also: Real-Time Data Warehousing: Challenges and Solutions by Justin Langseth
(http://dssresources.com/papers/features/langseth/langseth02082004.html)
Implementing CDC with ODI
Change Data Capture as a concept is natively embedded in ODI. It is controlled by the
modular Knowledge Module concept and supports different methods of CDC. This chapter
describes the details and benefits of the ODI CDC feature.
Modular Framework for Different Load Mechanisms
ODI supports each of the described data warehouse load architectures with its modular
Knowledge Module architecture. Knowledge Modules enable integration designers to
separate the declarative rules of data mapping from their technical implementation by
selecting a best practice mechanism for data integration. Batch and Mini-Batch strategies can
be defined by selecting Load Knowledge Modules (LKM) for the appropriate incremental
load from the sources. Micro-Batch and Real-Time strategies use the Journalizing
Knowledge Modules (JKM) to select a CDC mechanism to immediately access changes in
the data sources. Mapping logic can be left unchanged for switching KM strategies, so that a
change in loading patterns and latency does not require a rewrite of the integration logic.
Methods for Tracking Changes using CDC
ODI has abstracted the concept of CDC into a journalizing framework with a JKM and
journalizing infrastructure at its core. By isolating the physical specifics of the capture
process from the process of detecting changes, it is possible to support a number of different
techniques that are represented by individual JKMs:
Non-invasive CDC through Oracle GoldenGate
Oracle GoldenGate provides a CDC mechanism that can process source changes non-
invasively by processing log files of completed transactions and storing these captured
changes into external Trail Files independent of the database. Changes are then reliably
transferred to a staging database. The JKM uses the metadata managed by ODI to generate
Figure 1: GoldenGate-based CDC (source database log → GoldenGate → J$ journal tables in the staging area → ODI load → target; the staging area also serves real-time reporting)
all Oracle GoldenGate configuration files, and processes all GoldenGate-detected changes in
the staging area. These changes will be loaded into the target data warehouse using ODI’s
declarative transformation mappings. This architecture enables separate real-time reporting
on the normalized staging area tables in addition to loading and transforming the data into
the analytical data warehouse tables.
Database Log Facilities
Some databases provide APIs and utilities to process table changes programmatically. For
example, Oracle provides the Streams interface to process log entries and store them in
separate tables. Such log-based JKMs have better scalability than trigger-based mechanisms,
but still require changes to the source database. ODI also supports log-based CDC on DB2
for iSeries using its journals.
Database Triggers
JKMs based on database triggers define procedures that are executed inside the source
database when a table change occurs. Based on the wide availability of trigger mechanisms in
databases, JKMs based on triggers are available for a wide range of sources such as Oracle,
IBM DB2 for iSeries and UDB, Informix, Microsoft SQL Server, Sybase, and others. The
disadvantage is the limited scalability and performance of trigger procedures, which makes
them suitable only for use cases with light to medium loads.
Figure 2: Streams-based CDC on Oracle (source log → Oracle Streams → J$ journal → ODI load → target)
Figure 3: Trigger-based CDC (a trigger on the source table writes to the J$ journal; ODI loads the target)
Oracle Changed Data Capture Adapters
Oracle also provides separate CDC adapters that cover legacy platforms such as DB2 for
z/OS, VSAM CICS, VSAM Batch, IMS/DB and Adabas. These adapters achieve high
performance by capturing changes directly from the database logs.
Source databases supported for ODI CDC

| Database | Oracle GoldenGate (log-based) | Database Log Facilities (log-based) | Oracle CDC Adapters (log-based) | Trigger-based CDC |
|---|---|---|---|---|
| Oracle | ✓ | ✓ | | ✓ |
| MS SQL Server | ✓ | | ✓ | ✓ |
| Sybase ASE | ✓ | | | ✓ |
| DB2 UDB | ✓ | | | ✓ |
| DB2 for iSeries | ✓² | ✓ | | ✓ |
| DB2 for z/OS | ✓² | | ✓ | |
| Informix, Hypersonic DB | | | | ✓ |
| Teradata, Enscribe, MySQL, SQL/MP, SQL/MX | ✓² | | | |
| VSAM CICS, VSAM Batch, IMS/DB, Adabas | | | ✓ | |
² Requires customization of the Oracle GoldenGate configuration generated by the JKM
Publish-and-Subscribe Model
The ODI journalizing framework uses a publish-and-subscribe model. This model works in
three steps:
1. An identified subscriber, usually an integration process, subscribes to changes that
might occur in a datastore. Multiple subscribers can subscribe to these changes.
2. The Changed Data Capture framework captures changes in the datastore and then
publishes them for the subscriber.
3. The subscriber—an integration process—can process the tracked changes at any
time and consume these events. Once consumed, events are no longer available for
this subscriber.
ODI processes datastore changes in two ways:
- Regularly in batches (pull mode)—for example, processes new orders from the
Web site every five minutes and loads them into the operational datastore (ODS)
- In real time (push mode) as the changes occur—for example, when a product is
changed in the enterprise resource planning (ERP) system, immediately updates the
on-line catalog
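The three steps of the publish-and-subscribe model can be sketched with an in-memory journal. ODI actually implements this with J$ journal tables and views; the class below is purely illustrative:

```python
class Journal:
    def __init__(self):
        self.pending = {}               # subscriber -> list of change events

    def subscribe(self, subscriber):
        """Step 1: a subscriber registers interest in a datastore's changes."""
        self.pending.setdefault(subscriber, [])

    def publish(self, event):
        """Step 2: a captured change is published to every subscriber."""
        for events in self.pending.values():
            events.append(event)

    def consume(self, subscriber):
        """Step 3: consumed events vanish for this subscriber only."""
        events, self.pending[subscriber] = self.pending[subscriber], []
        return events

journal = Journal()
journal.subscribe("ODS_LOAD")
journal.subscribe("DW_LOAD")
journal.publish({"table": "ORDERS", "id": "5A32", "op": "I"})
first = journal.consume("ODS_LOAD")
second = journal.consume("ODS_LOAD")     # empty: already consumed
dw = journal.consume("DW_LOAD")          # still pending for this subscriber
print(len(first), len(second), len(dw))  # 1 0 1
```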
Processing the Changes
ODI employs a powerful declarative design approach, Extract-Load, Transform (E-LT),
which separates the rules from the implementation details. Its out-of-the-box integration
interfaces use and process the tracked changes.
Developers define the declarative rules for the captured changes within the integration processes
in the Designer Navigator of the ODI Studio graphical user interface—without having to code.
In ODI Studio, customers declaratively specify set-based maps between sources and targets, and
then the system automatically generates the data flow from the set-based maps.
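The separation of declarative rules from the generated flow can be illustrated as follows: the mapping is plain data, and a generator (hypothetical, standing in for an ODI Knowledge Module) derives executable SQL from it, so changing the rules never means rewriting the flow. Table, column, and expression names are invented for the example:

```python
mapping = {
    "target": "DW.FACT_ORDERS",
    "source": "STG.ORDERS_CDC",
    "key": ["ORDER_ID"],
    "columns": {
        "ORDER_ID": "SRC.ORDER_ID",
        "AMOUNT_USD": "SRC.AMOUNT * SRC.FX_RATE",
    },
}

def generate_merge(m):
    """Generate a set-based MERGE statement from the declarative mapping."""
    select = ", ".join(f"{expr} AS {col}" for col, expr in m["columns"].items())
    on = " AND ".join(f"T.{k} = S.{k}" for k in m["key"])
    updates = ", ".join(f"T.{c} = S.{c}" for c in m["columns"] if c not in m["key"])
    cols = ", ".join(m["columns"])
    vals = ", ".join(f"S.{c}" for c in m["columns"])
    return (
        f"MERGE INTO {m['target']} T\n"
        f"USING (SELECT {select} FROM {m['source']} SRC) S\n"
        f"ON ({on})\n"
        f"WHEN MATCHED THEN UPDATE SET {updates}\n"
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = generate_merge(mapping)
print(sql)
```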
Figure 4: The ODI Journalizing Framework uses a publish-and-subscribe architecture (a change such as Order #5A32 is captured and published once; Integration Processes 1 and 2 each subscribe to it and consume it into their own targets)
The technical processes required for processing the changes captured are implemented in
ODI’s Knowledge Modules. Knowledge Modules are scripted modules that contain database
and application-specific patterns. The runtime then interprets these modules and optimizes
the instructions for targets.
Ensuring Data Consistency
Changes frequently involve several datastores at one time. For example, when an order is
created, updated, or deleted, it involves both the orders table and the order lines table. When
processing a new order line, the new order to which this line is related must be taken into
account.
ODI provides a mode of tracking changes, called Consistent Set Changed Data Capture, for
this purpose. This mode allows you to process sets of changes that guarantee data
consistency.
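Consistent Set CDC can be sketched as ordering a captured window of changes so that parent rows (orders) are applied before the child rows (order lines) that reference them. The event shape below is illustrative, not ODI's journal format:

```python
events = [
    {"table": "ORDER_LINES", "order_id": 7, "line": 1},
    {"table": "ORDERS", "order_id": 7},
    {"table": "ORDER_LINES", "order_id": 7, "line": 2},
]

# Apply order derived from foreign-key dependencies: parents first.
APPLY_ORDER = {"ORDERS": 0, "ORDER_LINES": 1}

def consistent_window(evts):
    """Order one consistent set so referential integrity holds on apply."""
    return sorted(evts, key=lambda e: APPLY_ORDER[e["table"]])

ordered = consistent_window(events)
print([e["table"] for e in ordered])  # ['ORDERS', 'ORDER_LINES', 'ORDER_LINES']
```

Because `sorted` is stable, the two order lines keep their original capture order after the parent is moved to the front.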
Best Practices using ODI for Real-Time Data Warehousing
There is no one-size-fits-all approach to Real-Time Data Warehousing. Much depends on
the latency requirements, the overall data volume and daily change volume, the load
patterns on sources and targets, and the structure and query requirements of the data
warehouse. As covered in this paper, ODI supports all of these approaches to loading a
data warehouse.
In practice there is one approach that satisfies the majority of real-time data warehousing use
cases: The micro-batch approach using GoldenGate-based CDC with ODI. In this
approach, one or more tables from operational databases are used as sources and are
replicated by GoldenGate into a staging area database. This staging area provides a real-time
copy of the transactional data for real-time reporting using BI tools and dashboards. The
operational sources are not additionally stressed as GoldenGate capture is non-invasive and
performant, and the separate staging area handles operational BI queries without adding load
to the transactional system. ODI performs a load of the changed records to the real-time
data warehouse in frequent periods of 15 minutes or more. This pattern has demonstrated
the best combination of providing fresh, actionable data to the data warehouse without
introducing inconsistencies in aggregates calculated in the data warehouse.
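The periodic load at the heart of this micro-batch pattern can be sketched as a drain-and-apply loop. This is illustrative only: `staging` stands in for the GoldenGate-fed staging tables and `apply` for ODI's set-based load; neither is an ODI or GoldenGate API:

```python
import time

staging = []

def micro_batch_load(apply, interval_s, cycles):
    for _ in range(cycles):
        batch, staging[:] = list(staging), []   # drain the staging area
        if batch:
            apply(batch)                        # one set-based load per interval
        time.sleep(interval_s)

warehouse = []
staging.extend([{"id": 1}, {"id": 2}])
micro_batch_load(warehouse.extend, interval_s=0, cycles=1)
print(len(warehouse))  # 2
```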
Customer Use Case: Overstock.com
Overstock.com is using both GoldenGate and ODI to load customer transaction data into a
real-time data warehouse. GoldenGate is used to capture changes from its retail site, while
ODI is used for complex transformations in its data warehouse.
Overstock.com has seen the benefits of leveraging customer data in real time. When the
company sends out an e-mail campaign, rather than waiting one, two, or three days, it
immediately sees whether consumers are clicking in the right place, if the e-mail is driving
consumers to the site, and if those customers are making purchases. By accessing the data in
real time using Oracle GoldenGate and transforming it using Oracle Data Integrator,
Overstock.com can immediately track customer behavior, profitability of campaigns, and
ROI. The retailer can now analyze customer behavior and purchase history to target
marketing campaigns and service in real time and provide better service to its customers.
Conclusion
Integrating data and applications throughout the enterprise, and presenting a consolidated
view of them, is a complex proposition. Not only are there broad disparities in data
structures and application functionality, but there are also fundamental differences in
integration architectures. Some integration needs are data oriented, especially those involving
large data volumes. Other integration projects lend themselves to an event-oriented
architecture for asynchronous or synchronous integration.
Changes tracked by Changed Data Capture constitute data events. The ability to track these
events and process them regularly in batches or in real time is key to the success of an event-
driven integration architecture. Oracle Data Integrator provides rapid implementation and
maintenance for all types of integration projects.
Figure 5: Micro-Batch Architecture using ODI and GoldenGate (operational source log → GoldenGate → staging area for operational BI → periodic ODI load → real-time data warehouse; the staging area also serves operational BI and real-time reporting)
Best Practices for Real-time Data Warehousing
August 2012
Author: Alex Kotopoulis
Oracle Corporation
World Headquarters
500 Oracle Parkway
Redwood Shores, CA 94065
U.S.A.
Worldwide Inquiries:
Phone: +1.650.506.7000
Fax: +1.650.506.7200
oracle.com
Copyright © 2010, Oracle and/or its affiliates. All rights reserved. This document is provided for information purposes only and
the contents hereof are subject to change without notice. This document is not warranted to be error-free, nor subject to any other
warranties or conditions, whether expressed orally or implied in law, including implied warranties and conditions of merchantability or
fitness for a particular purpose. We specifically disclaim any liability with respect to this document and no contractual obligations are
formed either directly or indirectly by this document. This document may not be reproduced or transmitted in any form or by any
means, electronic or mechanical, for any purpose, without our prior written permission.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective
owners.
Delhi 99530 vip 56974 Genuine Escort Service Call Girls in  KishangarhDelhi 99530 vip 56974 Genuine Escort Service Call Girls in  Kishangarh
Delhi 99530 vip 56974 Genuine Escort Service Call Girls in Kishangarh
 
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptxBPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
 
VidaXL dropshipping via API with DroFx.pptx
VidaXL dropshipping via API with DroFx.pptxVidaXL dropshipping via API with DroFx.pptx
VidaXL dropshipping via API with DroFx.pptx
 
代办国外大学文凭《原版美国UCLA文凭证书》加州大学洛杉矶分校毕业证制作成绩单修改
代办国外大学文凭《原版美国UCLA文凭证书》加州大学洛杉矶分校毕业证制作成绩单修改代办国外大学文凭《原版美国UCLA文凭证书》加州大学洛杉矶分校毕业证制作成绩单修改
代办国外大学文凭《原版美国UCLA文凭证书》加州大学洛杉矶分校毕业证制作成绩单修改
 
VIP High Class Call Girls Jamshedpur Anushka 8250192130 Independent Escort Se...
VIP High Class Call Girls Jamshedpur Anushka 8250192130 Independent Escort Se...VIP High Class Call Girls Jamshedpur Anushka 8250192130 Independent Escort Se...
VIP High Class Call Girls Jamshedpur Anushka 8250192130 Independent Escort Se...
 

best-practices-for-realtime-data-wa-132882.pdf

applied to ODI's framework for Changed Data Capture (CDC).
ODI's CDC moves only changed data to the target systems and can be integrated with Oracle GoldenGate, thereby enabling the real-time integration that businesses require. This technical brief describes several techniques available in ODI to adjust data latency, from scheduled batches to continuous real-time integration.
Introduction

The conventional approach to data integration involves extracting all data from the source system and then integrating the entire set, possibly using an incremental strategy, in the target system. This approach, which is suitable in most cases, becomes inefficient when the integration process requires real-time data: the amount of data involved makes integration impossible within the given timeframes. Basic solutions, such as filtering records according to a timestamp column or a "changed" flag, are possible, but they may require modifications to the applications, and they usually cannot guarantee that all changes are taken into account. ODI's Changed Data Capture identifies and captures data as it is inserted, updated, or deleted in datastores, and makes the changed data available to integration processes.

Real-Time Data Integration Use Cases

Integration teams require real-time data integration with low or no data latency for a number of use cases. While this whitepaper focuses on data warehousing, it is useful to differentiate the following areas:

- Real-time data warehousing: aggregation of analytical data in a data warehouse using continuous or near real-time loads.
- Operational reporting and dashboards: selection of operational data into a reporting database for BI tools and dashboards.
- Query offloading: replication of high-cost or legacy OLTP servers to secondary systems to ease query load.
- High availability / disaster recovery: duplication of database systems in active-active or active-passive scenarios to improve availability during outages.
- Zero-downtime migrations: synchronization of data between old and new systems with potentially different technologies, allowing switch-over and switch-back without downtime.
- Data federation / data services: provision of virtual, canonical views of data distributed over several systems, through federated queries over heterogeneous sources.
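The timestamp-filter technique mentioned in the introduction can be sketched in a few lines. This is an illustrative example, not ODI-generated code: the table, column, and watermark names are invented for the sketch, and an in-memory SQLite database stands in for a source system.

```python
import sqlite3

# Hypothetical source table with a modification timestamp; each extract
# pulls only rows changed since the last recorded watermark.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL,
                             updated_at TEXT);
    INSERT INTO src_orders VALUES
        (1, 10.0, '2012-08-01T09:00:00'),
        (2, 25.0, '2012-08-01T11:30:00'),
        (3, 40.0, '2012-08-01T13:45:00');
""")

def incremental_extract(conn, watermark):
    """Return rows changed since `watermark` plus the new watermark."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM src_orders "
        "WHERE updated_at > ? ORDER BY updated_at", (watermark,)).fetchall()
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

rows, wm = incremental_extract(conn, '2012-08-01T10:00:00')
# rows -> [(2, 25.0, '2012-08-01T11:30:00'), (3, 40.0, '2012-08-01T13:45:00')]
```

Note the limitation the paper points out: this approach cannot see every kind of change (a deleted row, for instance, leaves nothing behind to filter on), which is one motivation for proper CDC.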
Oracle has various solutions for different real-time data integration use cases. Query offloading, high availability/disaster recovery, and zero-downtime migrations can be handled through Oracle GoldenGate, which provides heterogeneous, non-intrusive, and highly performant changed data capture, routing, and delivery. To provide low- to no-latency loads, ODI offers several alternatives for real-time data warehousing through CDC mechanisms, including integration with Oracle GoldenGate; this integration also enables seamless operational reporting. Data federation and data service use cases are covered by Oracle Data Service Integrator (ODSI).

Architectures for Loading Data Warehouses

Various architectures for collecting transactional data from operational sources have been used to populate data warehouses. These techniques differ mostly in the latency of data integration, from daily batches to continuous real-time integration. The capture of data from sources is performed either through incremental queries that filter on a timestamp or flag, or through a CDC mechanism that detects changes as they happen. Architectures are further distinguished by pull versus push operation: a pull operation polls at fixed intervals for new data, while in a push operation data is loaded into the target as soon as a change appears.

A daily batch mechanism is most suitable when intra-day freshness is not required, such as for longer-term trends or data that is only calculated once daily, for example financial close information. Batch loads might be performed in a downtime window if the business model does not require 24-hour availability of the data warehouse. Different techniques, such as real-time partitioning or trickle-and-flip¹, exist to minimize the impact of a load on a live data warehouse without downtime.
|                | Batch | Mini-Batch | Micro-Batch | Real-Time |
|----------------|-------|------------|-------------|-----------|
| Description    | Data is loaded in full or incrementally using an off-peak window. | Data is loaded incrementally using intra-day loads. | Source changes are captured and accumulated to be loaded in intervals. | Source changes are captured and immediately applied to the DW. |
| Latency        | Daily or higher | Hourly or higher | 15 min or higher | Sub-second |
| Capture        | Filter query | Filter query | CDC | CDC |
| Initialization | Pull | Pull | Push, then pull | Push |
| Target load    | High impact | High impact | Low impact, load frequency is tunable | Low impact |
| Source load    | High impact | Queries at peak times necessary | Some to none, depending on CDC technique | Some to none, depending on CDC technique |

¹ See also: Real-Time Data Warehousing: Challenges and Solutions, Justin Langseth (http://dssresources.com/papers/features/langseth/langseth02082004.html)
IMPLEMENTING CDC WITH ODI

Change Data Capture as a concept is natively embedded in ODI. It is controlled by the modular Knowledge Module concept and supports different methods of CDC. This chapter describes the details and benefits of the ODI CDC feature.

Modular Framework for Different Load Mechanisms

ODI supports each of the described data warehouse load architectures with its modular Knowledge Module architecture. Knowledge Modules enable integration designers to separate the declarative rules of data mapping from their technical implementation by selecting a best-practice mechanism for data integration. Batch and mini-batch strategies can be defined by selecting Load Knowledge Modules (LKM) for the appropriate incremental load from the sources. Micro-batch and real-time strategies use Journalizing Knowledge Modules (JKM) to select a CDC mechanism that immediately accesses changes in the data sources. Mapping logic can be left unchanged when switching KM strategies, so that a change in loading patterns and latency does not require a rewrite of the integration logic.

Methods for Tracking Changes Using CDC

ODI has abstracted the concept of CDC into a journalizing framework with a JKM and journalizing infrastructure at its core. By isolating the physical specifics of the capture process from the processing of detected changes, it is possible to support a number of different techniques, each represented by an individual JKM:

Non-invasive CDC through Oracle GoldenGate

Figure 1: GoldenGate-based CDC (the source log is read by GoldenGate, captured changes are delivered to J$ tables in a staging database, and ODI loads them into the target; real-time reporting runs against the staging area)

Oracle GoldenGate provides a CDC mechanism that can process source changes non-invasively by processing log files of completed transactions and storing these captured changes into external trail files, independent of the database. Changes are then reliably transferred to a staging database. The JKM uses the metadata managed by ODI to generate
all Oracle GoldenGate configuration files, and processes all GoldenGate-detected changes in the staging area. These changes are then loaded into the target data warehouse using ODI's declarative transformation mappings. This architecture enables separate real-time reporting on the normalized staging-area tables in addition to loading and transforming the data into the analytical data warehouse tables.

Database Log Facilities

Some databases provide APIs and utilities to process table changes programmatically. For example, Oracle provides the Streams interface to process log entries and store them in separate tables. Such log-based JKMs scale better than trigger-based mechanisms, but still require changes to the source database. ODI also supports log-based CDC on DB2 for iSeries using its journals.

Figure 2: Streams-based CDC on Oracle (log entries are processed by Oracle Streams into a J$ journal table, which ODI loads into the target)

Database Triggers

JKMs based on database triggers define procedures that are executed inside the source database when a table change occurs. Because trigger mechanisms are widely available in databases, trigger-based JKMs exist for a wide range of sources such as Oracle, IBM DB2 for iSeries and UDB, Informix, Microsoft SQL Server, Sybase, and others. The disadvantage is the limited scalability and performance of trigger procedures, making them best suited to use cases with light to medium loads.

Figure 3: Trigger-based CDC (a trigger on the source table writes changes to a J$ journal table, which ODI loads into the target)
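To make the trigger-based journalizing idea concrete, here is a minimal, hypothetical sketch using SQLite triggers. The J$ journal-table naming mirrors the convention shown in the figures, but the schema and trigger bodies are invented for illustration and are far simpler than what a real JKM generates.

```python
import sqlite3

# Toy source table plus a J$ journal table; three triggers record every
# insert, update, and delete as a journal row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE "J$orders" (jrn_id INTEGER PRIMARY KEY AUTOINCREMENT,
                             op TEXT, order_id INTEGER);
    CREATE TRIGGER trg_orders_ins AFTER INSERT ON orders BEGIN
        INSERT INTO "J$orders" (op, order_id) VALUES ('I', NEW.id);
    END;
    CREATE TRIGGER trg_orders_upd AFTER UPDATE ON orders BEGIN
        INSERT INTO "J$orders" (op, order_id) VALUES ('U', NEW.id);
    END;
    CREATE TRIGGER trg_orders_del AFTER DELETE ON orders BEGIN
        INSERT INTO "J$orders" (op, order_id) VALUES ('D', OLD.id);
    END;
""")

conn.execute("INSERT INTO orders VALUES (1, 10.0)")
conn.execute("UPDATE orders SET amount = 12.5 WHERE id = 1")
conn.execute("DELETE FROM orders WHERE id = 1")
journal = conn.execute('SELECT op, order_id FROM "J$orders"').fetchall()
# journal -> [('I', 1), ('U', 1), ('D', 1)]
```

The sketch also shows why the paper flags scalability: every DML statement on the source now pays the cost of an extra journal insert inside the same transaction.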
Oracle Changed Data Capture Adapters

Oracle also provides separate CDC adapters that cover legacy platforms such as DB2 for z/OS, VSAM CICS, VSAM Batch, IMS/DB, and Adabas. These adapters achieve high performance by capturing changes directly from the database logs.

Source databases supported for ODI CDC:

| Database | Log-based CDC: JKM Oracle GoldenGate | Log-based CDC: Database Log Facilities | Log-based CDC: Oracle CDC Adapters | Trigger-based CDC |
|---|---|---|---|---|
| Oracle | ✓ | ✓ (Streams) | | ✓ |
| MS SQL Server | ✓ | | | ✓ |
| Sybase ASE | ✓ | | | ✓ |
| DB2 UDB | ✓ | | | ✓ |
| DB2 for iSeries | ✓² | ✓ (journals) | | ✓ |
| DB2 for z/OS | ✓² | | ✓ | |
| Informix, Hypersonic DB | | | | ✓ |
| Teradata, Enscribe, MySQL, SQL/MP, SQL/MX | ✓² | | | |
| VSAM CICS, VSAM Batch, IMS/DB, Adabas | | | ✓ | |

² Requires customization of the Oracle GoldenGate configuration generated by the JKM.
Publish-and-Subscribe Model

The ODI journalizing framework uses a publish-and-subscribe model, which works in three steps:

1. An identified subscriber, usually an integration process, subscribes to changes that might occur in a datastore. Multiple subscribers can subscribe to the same changes.
2. The Changed Data Capture framework captures changes in the datastore and publishes them for the subscribers.
3. A subscriber can process the tracked changes at any time and consume these events. Once consumed, events are no longer available to that subscriber.

ODI processes datastore changes in two ways:

- Regularly in batches (pull mode): for example, processing new orders from the Web site every five minutes and loading them into the operational data store (ODS).
- In real time (push mode) as the changes occur: for example, when a product is changed in the enterprise resource planning (ERP) system, immediately updating the online catalog.

Processing the Changes

ODI employs a powerful declarative design approach, Extract-Load, Transform (E-LT), which separates the rules from the implementation details. Its out-of-the-box integration interfaces use and process the tracked changes. Developers define the declarative rules for the captured changes within the integration processes in the Designer Navigator of the ODI Studio graphical user interface, without having to code. In ODI Studio, customers declaratively specify set-based maps between sources and targets, and the system automatically generates the data flow from these maps.

Figure 4: The ODI journalizing framework uses a publish-and-subscribe architecture (a change such as Order #5A32 is captured and published once; each subscribed integration process consumes it for its own target)
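The three steps above can be sketched as a toy in-memory model. This illustrates only the publish-and-subscribe semantics (each subscriber sees each published event exactly once); it is not how ODI's actual subscriber infrastructure is implemented.

```python
from collections import defaultdict

class ChangeJournal:
    """Toy publish-and-subscribe journal: every published change is
    delivered to each subscriber exactly once."""
    def __init__(self):
        self.subscribers = set()
        self.pending = defaultdict(list)   # subscriber -> unconsumed events

    def subscribe(self, name):
        self.subscribers.add(name)

    def publish(self, event):
        # Step 2: the captured change is published for every subscriber.
        for name in self.subscribers:
            self.pending[name].append(event)

    def consume(self, name):
        # Step 3: consuming removes the events for this subscriber only.
        events, self.pending[name] = self.pending[name], []
        return events

journal = ChangeJournal()
journal.subscribe("ods_load")
journal.subscribe("dw_load")
journal.publish({"table": "orders", "op": "I", "id": "5A32"})
journal.consume("ods_load")   # -> [{'table': 'orders', 'op': 'I', 'id': '5A32'}]
journal.consume("ods_load")   # -> [] (already consumed by this subscriber)
```

Note that "dw_load" still holds its copy of the event: consumption is tracked per subscriber, exactly as described above.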
The technical processes required to process the captured changes are implemented in ODI's Knowledge Modules. Knowledge Modules are scripted modules that contain database- and application-specific patterns. The runtime then interprets these modules and optimizes the instructions for the targets.

Ensuring Data Consistency

Changes frequently involve several datastores at one time. For example, when an order is created, updated, or deleted, both the orders table and the order lines table are involved. When processing a new order line, the new order to which this line is related must be taken into account. ODI provides a mode of tracking changes, called Consistent Set Changed Data Capture, for this purpose. This mode allows you to process sets of changes that guarantee data consistency.

Best Practices Using ODI for Real-Time Data Warehousing

There is no one-size-fits-all approach to real-time data warehousing. Much depends on the latency requirements, the overall data volume and daily change volume, the load patterns on sources and targets, and the structure and query requirements of the data warehouse. As covered in this paper, ODI supports all approaches to loading a data warehouse. In practice, one approach satisfies the majority of real-time data warehousing use cases: the micro-batch approach using GoldenGate-based CDC with ODI.

In this approach, one or more tables from operational databases are used as sources and are replicated by GoldenGate into a staging-area database. This staging area provides a real-time copy of the transactional data for real-time reporting using BI tools and dashboards. The operational sources are not additionally stressed, because GoldenGate capture is non-invasive and performant, and the separate staging area handles operational BI queries without adding load to the transactional system.
ODI then loads the changed records into the real-time data warehouse at intervals of 15 minutes or more. This pattern has demonstrated the best combination of providing fresh, actionable data to the data warehouse without introducing inconsistencies into aggregates calculated in the data warehouse.
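A minimal sketch of the micro-batch apply step described above, with an invented journal format: changes accumulate in a staging journal and are applied to the warehouse at each load cycle. A real deployment would use GoldenGate-populated staging tables and an ODI load scheduled every 15 minutes or more; here the cycle is run explicitly.

```python
# Hypothetical accumulated changes for one micro-batch cycle: an insert,
# an update of the same row, and a second insert.
staging_journal = [
    {"id": 1, "amount": 10.0, "op": "I"},
    {"id": 1, "amount": 12.5, "op": "U"},
    {"id": 2, "amount": 30.0, "op": "I"},
]
warehouse = {}   # stands in for the target fact table: id -> amount

def apply_micro_batch(journal, target):
    """Apply all accumulated changes in order, then clear the journal
    (consuming the events, as in the publish-and-subscribe model)."""
    for change in journal:
        if change["op"] in ("I", "U"):
            target[change["id"]] = change["amount"]
        elif change["op"] == "D":
            target.pop(change["id"], None)
    journal.clear()

apply_micro_batch(staging_journal, warehouse)
# warehouse -> {1: 12.5, 2: 30.0}; staging_journal is now empty
```

Applying the whole accumulated set in one cycle, rather than row by row as changes arrive, is what keeps aggregates in the warehouse consistent between loads.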
Customer Use Case: Overstock.com

Overstock.com uses both GoldenGate and ODI to load customer transaction data into a real-time data warehouse. GoldenGate captures changes from its retail site, while ODI performs the complex transformations in its data warehouse. Overstock.com has seen the benefits of leveraging customer data in real time: when the company sends out an e-mail campaign, rather than waiting one, two, or three days, it immediately sees whether consumers are clicking in the right place, whether the e-mail is driving consumers to the site, and whether those customers are making purchases. By accessing the data in real time using Oracle GoldenGate and transforming it using Oracle Data Integrator, Overstock.com can immediately track customer behavior, the profitability of campaigns, and ROI. The retailer can now analyze customer behavior and purchase history to target marketing campaigns and service in real time and provide better service to its customers.

Conclusion

Integrating data and applications throughout the enterprise, and presenting a consolidated view of them, is a complex proposition. Not only are there broad disparities in data structures and application functionality, but there are also fundamental differences in integration architectures. Some integration needs are data oriented, especially those involving large data volumes. Other integration projects lend themselves to an event-oriented architecture for asynchronous or synchronous integration. Changes tracked by Changed Data Capture constitute data events. The ability to track these events and process them regularly in batches or in real time is key to the success of an event-driven integration architecture. Oracle Data Integrator provides rapid implementation and maintenance for all types of integration projects.
Figure 5: Micro-Batch Architecture using ODI and GoldenGate (operational source logs are replicated by GoldenGate into a staging area for operational BI and real-time reporting; periodic ODI loads feed the real-time data warehouse)
Best Practices for Real-time Data Warehousing
August 2012
Author: Alex Kotopoulis

Oracle Corporation
World Headquarters
500 Oracle Parkway
Redwood Shores, CA 94065
U.S.A.

Worldwide Inquiries:
Phone: +1.650.506.7000
Fax: +1.650.506.7200
oracle.com

Copyright © 2010, Oracle and/or its affiliates. All rights reserved. This document is provided for information purposes only and the contents hereof are subject to change without notice. This document is not warranted to be error-free, nor subject to any other warranties or conditions, whether expressed orally or implied in law, including implied warranties and conditions of merchantability or fitness for a particular purpose. We specifically disclaim any liability with respect to this document and no contractual obligations are formed either directly or indirectly by this document. This document may not be reproduced or transmitted in any form or by any means, electronic or mechanical, for any purpose, without our prior written permission. Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners. 0510