2. Informix Warehousing Moving Forward
The goal is to provide a comprehensive warehousing platform that is highly competitive in the marketplace:
- Incorporating the best features of XPS and Red Brick into Informix for OLTP/warehousing and mixed workloads
- Using the latest Informix technology: continuous availability and Flexible Grid
- Data Warehouse Accelerator using the latest industry technology
- Integration of IBM's BI software stack
3. Informix Warehouse: Roadmap
[Roadmap diagram spanning releases 11.5xC3, 11.5xC4, 11.5xC5, 11.5xC6, 11.70, 11.70xC2, and 12.10]
- Informix Warehouse feature: SQW (data modeling, ELT/ETL)
- Informix Warehouse with storage optimization/compression
- Cognos integration: native Content Store on Informix
- SQL MERGE
- External tables
- Star-join optimization, multi-index scan
- New fragmentation, fragment-level statistics
- Storage provisioning
- Warehouse Accelerator
- OLAP, query rewrite, hash join
- Enhanced IWA, and more
4. IWA: Roadmap
[Roadmap diagram spanning releases 11.7xC2, 11.7xC3, 11.7xC4, 11.7xC5, 11.7xC7, and 12.1xC1; 2012 IIUG milestones marked]
- IWA first release, on SMP
- SMB: IGWE
- Scale out: IWA on blade server
- Workload analysis tool
- More locales
- Partition refresh
- MACH11 support
- Solaris on Intel
- Union queries, derived tables
- OAT integration
- Support for Informix TimeSeries
- Data currency: trickle feed
- SQL/OLAP for IWA
5. Informix Publications
- Bulletin of the Technical Committee on Data Engineering: March 2012, Vol. 35, No. 1
- Real Time Business Intelligence, September 2, 2011, Seattle, United States
- IBM Data Management Magazine: Supercharging the data warehouse while keeping the costs down
- 2012 Bloor report: IBM Informix in hybrid workload environments
- 2012 Ovum analyst report: Informix Accelerates Analytic Integration into OLTP
- DBTA article: Empowering Business Analysts with Faster Insights
- http://youtu.be/xJd8M-fbMI0
7. What is OLAP?
• On-Line Analytical Processing
• Commonly used in Business
Intelligence (BI) tools
– ranking products, salesmen, items, etc.
– exposing trends in sales from historic data
– testing business scenarios (forecast)
– sales breakdowns or aggregates on multiple dimensions (time, region, demographics, etc.)
8. OLAP Functions in Informix
• Supports a subset of commonly used OLAP functions
• Enables more efficient query processing from BI tools such as Cognos
9. Example query with group by
select customer_num, count(*)
from orders
where customer_num <= 110
group by customer_num;
customer_num (count(*))
101 1
104 4
106 2
110 2
4 row(s) retrieved.
10. Example query with OLAP function
select customer_num, ship_date, ship_charge,
count(*) over (partition by customer_num)
from orders
where customer_num <= 110;
customer_num ship_date ship_charge (count(*))
101 05/26/2008 $15.30 1
104 05/23/2008 $10.80 4
104 07/03/2008 $5.00 4
104 06/01/2008 $10.00 4
104 07/10/2008 $12.20 4
106 05/30/2008 $19.20 2
106 07/03/2008 $12.30 2
110 07/06/2008 $13.80 2
110 07/16/2008 $6.30 2
9 row(s) retrieved.
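The difference between the two queries above can be sketched in Python: GROUP BY collapses each group to one row, while COUNT(*) OVER (PARTITION BY ...) keeps every row and annotates it with its partition's count. This is an illustrative helper, not Informix code:

```python
from collections import Counter

def count_over_partition(rows, key):
    """Mimic COUNT(*) OVER (PARTITION BY key): every input row is kept,
    annotated with the size of its partition (unlike GROUP BY, which
    collapses each group to one row)."""
    counts = Counter(r[key] for r in rows)
    return [{**r, "count": counts[r[key]]} for r in rows]

orders = [
    {"customer_num": 101, "ship_charge": 15.30},
    {"customer_num": 104, "ship_charge": 10.80},
    {"customer_num": 104, "ship_charge": 5.00},
]
result = count_over_partition(orders, "customer_num")
# All 3 rows survive; both rows for customer 104 carry the partition count 2.
```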
12. Ranking Functions
• Partition by clause is optional
• Order by clause is required
• Window frame clause is NOT allowed
• Duplicate value handling differs between rank() and dense_rank()
– all duplicates receive the same rank
– rank() then skips the ranks covered by the duplicates, while dense_rank() continues with the next consecutive rank
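The duplicate-handling rule above can be demonstrated with a small Python sketch (illustrative only, not how the server implements it):

```python
def rank_and_dense_rank(values):
    """Compute RANK() and DENSE_RANK() over values sorted ascending.
    Ties share a rank; RANK() then skips past the duplicates, while
    DENSE_RANK() continues with the next consecutive rank."""
    ordered = sorted(values)
    rank, dense = {}, {}
    for i, v in enumerate(ordered):
        if v not in rank:
            rank[v] = i + 1            # position-based: leaves gaps after ties
            dense[v] = len(dense) + 1  # consecutive: no gaps
    return [(v, rank[v], dense[v]) for v in ordered]

print(rank_and_dense_rank([10, 20, 20, 30]))
# [(10, 1, 1), (20, 2, 2), (20, 2, 2), (30, 4, 3)]
```

Note how 30 gets rank 4 (rank 3 was "used up" by the duplicate 20s) but dense rank 3.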
13. Where do OLAP functions fit?
Execution order: joins, group by, having → OLAP functions → final order by
14. Query Flow
[Diagram: applications/BI tools → Informix → IWA (one coordinator, multiple workers; each worker holds compressed data in memory, with a memory image on disk)]
Step 1. The application submits SQL (database protocol: SQLI or DRDA; network: TCP/IP or SHM).
Step 2. Informix applies its query matching and redirection technology (or executes the query locally).
Step 3. Informix off-loads the SQL to the accelerator (DRDA over TCP/IP).
Step 4. The coordinator and workers execute the query and return results (DRDA over TCP/IP).
Step 5. Informix feeds the results to OLAP iterators, if any exist.
Step 6. Informix returns the results/describe/error to the application (SQLI or DRDA over TCP/IP or SHM).
15. QUERY: DWA executed:(OPTIMIZATION TIMESTAMP: 10-02-2012 20:52:56)
------
select ws_web_page_sk, ws_net_paid, avg(ws_net_paid) over()
from web_sales
where ws_web_page_sk < 10
group by ws_web_page_sk, ws_net_paid
order by ws_web_page_sk, ws_net_paid
Estimated Cost: 1497990
Estimated # of Rows Returned: 286309
Temporary Files Required For: Order By Group By
1) ds2@BVSRDWA:dwa.aqt5cbe4c46-acdc-463a-a9cb-2c3318cc9164: REMOTE DWA PATH
Remote SQL Request:
{QUERY {FROM dwa.aqt5cbe4c46-acdc-463a-a9cb-2c3318cc9164} {WHERE {< COL066 10 } } {SELECT {SYSCAST
COL066 AS INTEGER NULLABLE} {SYSCAST COL083 AS DECIMAL 7 2 NULLABLE} } {GROUP COL066 COL083 } }
Query statistics:
-----------------
Table map :
----------------------------
Internal name Table name
----------------------------
type rows_prod est_rows time est_cost
-------------------------------------------------
dwa 307576 0 00:03.62 0
type it_count time
----------------------------
olap 307576 00:04.48
type rows_sort est_rows rows_cons time est_cost
------------------------------------------------------------
sort 307576 286309 307576 00:06.49 192262
17. Union Query Support
select sum(sales_amt) from SALES
UNION ALL
select sum(returns_amt) from SALES_RETURN;
18. Derived Table Query
select state, totsales, totreturns
from (select state, sum(sale_amt) from sales group by state)
         as stsales(state, totsales),
     (select state, sum(return_amt) from sales_returns group by state)
         as streturns(state, totreturns)
where stsales.state = streturns.state;
19. SELECT d_year,i_brand_id,i_class_id,i_category_id ,i_manufact_id,SUM(sales_cnt) AS sales_cnt ,SUM(sales_amt) AS sales_amt
FROM
(
SELECT d_year ,i_brand_id ,i_class_id ,i_category_id,i_manufact_id,SUM(sales_cnt) AS sales_cnt
,SUM(sales_amt) AS sales_amt
FROM (SELECT d_year ,i_brand_id ,i_class_id ,i_category_id ,i_manufact_id ,cs_quantity AS sales_cnt
,cs_ext_sales_price AS sales_amt
FROM catalog_sales JOIN item ON i_item_sk=cs_item_sk
JOIN date_dim ON d_date_sk=cs_sold_date_sk
WHERE i_category='Music'
UNION
SELECT d_year ,i_brand_id ,i_class_id ,i_category_id ,i_manufact_id
,ss_quantity AS sales_cnt
,ss_ext_sales_price AS sales_amt
FROM store_sales JOIN item ON i_item_sk=ss_item_sk
JOIN date_dim ON d_date_sk=ss_sold_date_sk
WHERE i_category='Books'
UNION
SELECT d_year
,i_brand_id
,i_class_id
,i_category_id
,i_manufact_id
,ws_quantity AS sales_cnt
,ws_ext_sales_price AS sales_amt
FROM web_sales JOIN item ON i_item_sk=ws_item_sk
JOIN date_dim ON d_date_sk=ws_sold_date_sk
WHERE i_category='Sports') sales_detail
GROUP BY d_year, i_brand_id, i_class_id, i_category_id, i_manufact_id) as tmp
GROUP BY d_year, i_brand_id, i_class_id, i_category_id, i_manufact_id
ORDER BY sales_amt, sales_cnt
20. Query statistics:
Table map :
----------------------------
Internal name Table name
----------------------------
t1 (Temp Table For Collection Subquery)
t2 (Temp Table For Collection Subquery)
type rows_prod est_rows time est_cost
-------------------------------------------------
dwa 1410644 0 00:06.38 0
type rows_prod est_rows time est_cost
-------------------------------------------------
dwa 2749278 0 00:12.53 0
type rows_prod rows_cons_1 rows_cons_2 time
------------------------------------------------------
merge 4159922 1410644 2749278 00:19.38
type rows_prod est_rows time est_cost
-------------------------------------------------
dwa 723063 0 00:08.01 0
type rows_prod rows_cons_1 rows_cons_2 time
------------------------------------------------------
merge 4882985 4159922 723063 00:28.11
type rows_sort est_rows rows_cons time
-------------------------------------------------
sort 4867550 0 4882985 01:22.97
type table rows_prod est_rows rows_scan time est_cost
-------------------------------------------------------------------
scan t1 4867550 4320015 4867550 00:04.59 190744
type rows_prod est_rows rows_cons time est_cost
------------------------------------------------------------
group 77949 1769089 4867550 00:52.63 11610464
type table rows_prod est_rows rows_scan time est_cost
-------------------------------------------------------------------
scan t2 77949 1769089 77949 00:00.08 99698
type rows_prod est_rows rows_cons time est_cost
------------------------------------------------------------
group 77949 724460 77949 00:01.20 4647282
type rows_sort est_rows rows_cons time est_cost
------------------------------------------------------------
sort 77949 724460 77949 00:01.77 709964
22. SQL Enhancements
• OLAP Window functions/aggregates
• Multiple distinct aggregates
• Distinct with CASE expression
• NULLS FIRST, NULLS LAST modifier to ORDER BY
23. Support for custom NULL sorting
• Historically, Informix sorted NULLs first and this could not be changed.
• NULLS FIRST and NULLS LAST are now supported as modifiers to the ORDER BY clause.
• Oracle supports both.
• Helps avoid sorting in Cognos; Cognos used this for some reports against Oracle.
SELECT c1, c2, sum(c3)
FROM t1
GROUP BY c1, c2
ORDER BY c2 NULLS LAST, c1 NULLS FIRST;
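The sort semantics can be sketched in Python, with None standing in for NULL (an illustrative helper, not server code):

```python
def order_by(rows, key, nulls="first"):
    """Sort rows ascending by `key`, placing NULL (None) keys first by
    default (the historical Informix behavior) or last when
    nulls="last", mimicking the NULLS FIRST / NULLS LAST modifiers."""
    def sort_key(row):
        value = row[key]
        if value is None:
            return (0,) if nulls == "first" else (2,)
        return (1, value)  # non-NULL values sort among themselves
    return sorted(rows, key=sort_key)

rows = [{"c2": 3}, {"c2": None}, {"c2": 1}]
print([r["c2"] for r in order_by(rows, "c2")])                # [None, 1, 3]
print([r["c2"] for r in order_by(rows, "c2", nulls="last")])  # [1, 3, None]
```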
24. DISTINCT with CASE expression
• Support for a CASE expression as an argument to aggregates was added in 11.70.
• 12.10 adds support for DISTINCT on a CASE expression.
SELECT sum(T983271.set_avgday_sales_rtl_amt) as c2,
count(distinct case when T983271.not_set_cnt > 0
then T983271.store_sk_id end ) as c3
FROM features_upc_tab T983271;
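The semantics of the COUNT(DISTINCT CASE ...) above can be sketched in Python: the CASE yields NULL for non-matching rows, COUNT ignores NULLs, and DISTINCT de-duplicates what remains. The column names mirror the query above; the helper itself is hypothetical:

```python
def count_distinct_case(rows, cond, value):
    """COUNT(DISTINCT CASE WHEN cond THEN value END): the CASE yields
    NULL (None) for non-matching rows, COUNT ignores NULLs, so only
    distinct values from matching rows are counted."""
    return len({value(r) for r in rows if cond(r)})

stores = [
    {"not_set_cnt": 2, "store_sk_id": 7},
    {"not_set_cnt": 0, "store_sk_id": 8},  # filtered out by the CASE
    {"not_set_cnt": 5, "store_sk_id": 7},  # duplicate store, counted once
]
n = count_distinct_case(stores,
                        cond=lambda r: r["not_set_cnt"] > 0,
                        value=lambda r: r["store_sk_id"])
# n == 1
```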
25. Multiple aggregates with DISTINCT
• Long requested by Cognos, Wal-Mart, and others.
• Wal-Mart ran into this during an IWA pilot project.
• Design and part of the code taken from XPS. Long live XPS!
select region, sum(distinct cid), avg(distinct salesdt)
from sales_tab
group by region;
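The point of the feature is that each aggregate de-duplicates its own column independently within each group. A Python sketch of that semantics (illustrative only; column names are toy values, not from the deck):

```python
from collections import defaultdict

def multi_distinct(rows, group, agg_cols):
    """Per group, compute SUM(DISTINCT col) and AVG(DISTINCT col) for
    each listed column: every aggregate de-duplicates its own column
    independently before aggregating."""
    groups = defaultdict(lambda: {c: set() for c in agg_cols})
    for r in rows:
        for c in agg_cols:
            groups[r[group]][c].add(r[c])
    return {g: {c: (sum(vals), sum(vals) / len(vals))
                for c, vals in cols.items()}
            for g, cols in groups.items()}

sales = [
    {"region": "W", "cid": 1, "amt": 10},
    {"region": "W", "cid": 1, "amt": 20},  # cid 1 counted once per region
    {"region": "W", "cid": 2, "amt": 20},  # amt 20 counted once per region
]
out = multi_distinct(sales, "region", ["cid", "amt"])
# out["W"]["cid"] == (3, 1.5); out["W"]["amt"] == (30, 15.0)
```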
26. Informix Ultimate Warehouse Edition
[Architecture diagram: BI applications → Informix Database Server → Informix Warehouse Accelerator, administered from IBM Smart Analytics Studio]
Step 1. Install, configure, and start Informix.
Step 2. Install, configure, and start the Accelerator.
Step 3. Connect Studio to Informix and add the accelerator.
Step 4. Design, validate, and deploy the data mart.
Step 5. Load data to the accelerator.
Ready for queries.
27. Case 1: Partition refresh: updates to existing partitions
[Diagram: OLTP apps issue INSERT/UPDATE/DELETE against the Sales-Mart (sales: partitioned fact table; customer and stores: dimension tables) in the Informix Database Server; modified partitions are refreshed to IWA via a SQL script calling stored procedures. Administered via IBM Smart Analytics Studio, stored procedures, or the command-line tool.]
Step 1. Create the Sales-Mart and load it. Sales is the fact table, range partitioned.
Step 2. Load jobs update the fact table "sales", touching only existing partitions.
Step 3. Identify the partition and execute dropPartMart().
Step 4. For the same partition, execute loadPartMart().
Ready for queries.
28. Case 2: Partition refresh: time-cyclic data management
[Diagram: OLTP apps against the Sales-Mart (sales: partitioned fact table; customer and stores: dimension tables) in the Informix Database Server; the time window moves via DETACH/ATTACH and the affected partition is refreshed to IWA. Administered via IBM Smart Analytics Studio, stored procedures, or the command-line tool.]
Step 1. Create the Sales-Mart and load it. Sales is the fact table, range partitioned. The time window needs to move to the next range.
Step 2. DETACH operation: execute dropPartMart(), then DETACH the partition.
Step 3. ATTACH operation: ATTACH the partition, then execute loadPartMart().
Ready for queries.
29. dropPartMart() procedure
• Takes the accelerator name, data mart name, table name, and partition name.
• The partition name can be the name of the partition or the partition number (sysfragments.partn).
• The partition name or number must be a valid partition of the table.
• Call dropPartMart() first, before doing the DETACH.
30. loadPartMart() procedure
• Takes the accelerator name, data mart name, table name, and partition name.
• The partition name can be the name of the partition or the partition number (sysfragments.partn).
• The partition name or number must be a valid partition of the table.
• ATTACH the partition first, before calling loadPartMart().
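The ordering constraints above (dropPartMart() before DETACH, ATTACH before loadPartMart()) can be sketched as a call sequence. `call` is a hypothetical stand-in for executing the SQL routine or DDL statement:

```python
def time_cyclic_refresh(call, partition):
    """Call sequence for time-cyclic partition maintenance: the partition
    must still be valid for the table when dropPartMart() runs, so it
    precedes the DETACH; likewise the ATTACH must happen before
    loadPartMart() can load the new data."""
    call("dropPartMart", partition)   # 1. remove the partition's data from IWA
    call("DETACH", partition)         # 2. detach the old fragment in Informix
    call("ATTACH", partition)         # 3. attach the new fragment
    call("loadPartMart", partition)   # 4. load the new partition's data to IWA

log = []
time_cyclic_refresh(lambda op, part: log.append(op), "part1")
# log == ["dropPartMart", "DETACH", "ATTACH", "loadPartMart"]
```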
31. Informix Warehouse Accelerator – in 11.70.FC4
[Architecture diagram: BI applications → Informix Database Server → Informix Warehouse Accelerator, administered from IBM Smart Analytics Studio]
Step 1. Install, configure, and start Informix.
Step 2. Install, configure, and start the Accelerator.
Step 3. Connect Studio to Informix and add the accelerator.
Step 4. Design, validate, and deploy the data mart.
Step 5. Load data to the accelerator.
Ready for queries.
32. Background
• Prior to 11.70.FC5, adding an accelerator and creating, deploying, loading, enabling, and disabling data marts, as well as accelerating queries, were officially supported only on a standard server or the primary node of a MACH11/HA environment.
• We estimate that about 50% of Informix customers use HDR secondary servers, and a growing number of customers use MACH11 (SDS secondary) configurations and RSS nodes. MACH11 is the Informix scale-out solution.
• IWA itself supports a scale-out solution (on a cluster) starting with 11.70.FC4.
• Reasons to support MACH11 and IWA together:
– This feature enables partitioning a cluster or HA group between OLTP and BI workloads.
– This feature helps off-load the expensive LOAD functionality to secondary servers.
– We now have customers requesting support for HDR secondary with IWA.
33. Informix Warehouse Accelerator – 11.70.FC5: MACH11 support
[Architecture diagram: BI applications → Informix primary with SDS1, SDS2, HDR secondary, and RSS nodes → Informix Warehouse Accelerator, administered from IBM Smart Analytics Studio]
Step 1. Install, configure, and start Informix.
Step 2. Install, configure, and start the Accelerator.
Step 3. Connect Studio to Informix and add the accelerator.
Step 4. Design, validate, and deploy the data mart from the primary, SDS, HDR, or RSS node.
Step 5. Add IWA to sqlhosts; load data to the accelerator from any node.
Ready for queries.
34. Step 1: Install
• Informix and IWA are installed just like before.
• Informix can be any combination of standard, primary, SDS, HDR secondary, and RSS nodes.
• IWA can be installed on the same computer as any one of the nodes or on a separate computer.
• IWA can also be installed on cluster hardware with multiple worker nodes for scale-out performance.
35. Step 2: Configure
• Informix and IWA are configured just like before.
• Informix can be any combination of standard, primary, SDS, HDR secondary, and RSS nodes.
• IWA can be installed on the same computer as any one of the nodes or on a separate computer.
• IWA can also be installed on cluster hardware with multiple worker nodes for scale-out performance.
• Note: Informix MACH11 technology works with logged and ANSI databases only.
36. Step 2: Configure
• The secondary servers should be updatable secondary servers.
• Set this in $ONCONFIG:
UPDATABLE_SECONDARY 10
37. Step 3: Connect
• You can connect to IWA from any of the Informix servers using the existing method.
– Get the connection details via:
# ondwa getpin
– The output will be, ip address, port, pin for IWA connection.
– Use that information to create the connection.
• After successful connection from Informix to IWA, the
SQLHOSTS will have something like this
FAST group - -
c=1,a=484224232041684420473a283e612f74393e6025757159506a51344a6b4e2f2d2d47455e6b653f2f6c795f287d7b65224d6c3c2f65722e6a2a4245397b3b447d572c3129696b306440
FAST_1 dwsoctcp 172.34.22.188 21022 g=FAST
• To use this connection on any of the Informix nodes, copy these lines AS IS to the SQLHOSTS file of those servers.
• Make sure to copy ALL the lines within the FAST group.
38. Step 3: Connect (continued)
• The name of the IWA will be used as the AQT site name in
systables.sitename. So, it’s important to have the right
site name in SQLHOSTS entry for a successful connection.
• Changing ANY of the details of this SQLHOSTS entry will
result in connection, query matching and acceleration
issues.
39. Step 4: Design, Validate, and Deploy
• The secondary servers should be updatable secondary servers.
• Set this in $ONCONFIG:
UPDATABLE_SECONDARY 10
• The design, validate, and deploy steps are identical to before.
40. Step 5. Running queries
• Once you deploy the data mart from one of the nodes,
the data mart definitions in the catalogs are replicated to
all the systems.
• SQLHOSTS entries should be copied over manually.
• After this, queries can be run as usual. Informix does query matching and off-loading just as it does on the primary.
41. [Architecture diagram: BI applications → Informix Database Server → Informix Warehouse Accelerator, administered from IBM Smart Analytics Studio]
Step 1. Install, configure, and start Informix.
Step 2. Install, configure, and start the Accelerator.
Step 3. Connect Studio to Informix and add the accelerator.
Step 4. Design, validate, and deploy the data mart.
Step 5. Load data to the accelerator.
Ready for queries.
42. [Data mart state-transition diagram]
- Design the data mart by workload analysis or manually
- Deploy → deployed data mart
- Load → data mart in use (supports partition-based refresh and trickle-feed refresh)
- Disable → data mart disabled; Enable → back in use
- Drop → data mart deleted
43. Administration: OpenAdmin Tool (OAT)
• Browser-based administration tool for Informix
• Replaces ISAO Studio for most functions
• Adds workload analysis and data mart deployment
• Adds support for data refresh and setup commands
44. Data Refresh: refreshMart implementation
New stored procedure:
ifx_refreshMart(
'accelerator_name',
'data_mart_name',
'locking_mode',
NULL);
• locking_mode is optional and can be NULL.
• The 4th parameter is not used as of now.
• If used while the new "trickle feed" functionality is active, ifx_refreshMart() will not refresh fact tables for which trickle feed is active.
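A rough sketch of the bookkeeping this implies, assuming a simple changed-partition set (the real metadata lives in tables in the sysadmin database, as described later):

```python
class MartRefreshTracker:
    """Sketch of the bookkeeping behind ifx_refreshMart(): a full load
    sets the reference point, changed partitions are registered as OLTP
    activity happens, and a refresh re-sends only those partitions and
    then becomes the new reference point. Illustrative only."""
    def __init__(self, partitions):
        self.partitions = set(partitions)
        self.changed = set()

    def register_change(self, partition):
        # insert/update/delete or attach/detach touching this partition
        self.changed.add(partition)

    def refresh(self):
        moved = sorted(self.changed)  # only changed data units move
        self.changed.clear()          # the refresh is the new reference point
        return moved

t = MartRefreshTracker(["p1", "p2", "p3"])
t.register_change("p2")
t.register_change("p2")
# t.refresh() == ["p2"]; a second refresh with no new changes moves nothing.
```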
45. Data Refresh: refreshMart
• Granularity is based on table partitions.
• The data mart remains available for query acceleration.
• A single stored procedure call, for ease of use.
• Control of execution remains with the administrator.
• Handles all data changes, including fragment operations.
• Data consistency via the lock mode parameter.
• Prerequisite: the sysadmin database must be accessible to the administrator.
46. Data Refresh: scenario for real-time trickle feed
[Diagram: OLTP apps insert into the Sales-Mart (sales: fact table; customer and stores: dimension tables) in the Informix Database Server; new rows flow to IWA; reports and BI apps query the accelerator. Administered via IBM Smart Analytics Studio, stored procedures, or the command-line tool.]
Step 1. Create the Sales-Mart and load it. Sales is the fact table; customer and stores are dimension tables.
Step 2. Set up trickle feed by calling ifx_setupTrickleFeed().
Step 3. Let the application roll: inserts on the fact table and updates on any dimensions.
Step 4. As the application runs, the reports see new data updated on IWA.
47. Data Refresh: trickle feed (cont.)
[Data-flow diagram: "insert into fact_table ..." fires a trigger on the fact table; data rows from the fact table and from dimension tables 1 and 2 are picked up by a dbscheduler task, which calls ifx_loadPartMart() and ifx_refreshMart() to push the rows into the data mart on the accelerator.]
48. "refreshMart" – a new function for the Informix Warehouse Accelerator
Motivation:
• A data mart is a snapshot; a time-consuming load is required to reflect data changes.
• Manually dropping and (re-)loading individual partitions is cumbersome.
• We want ease of use, with a single function "doing it all".
IWA refreshMart (1)
49. RefreshMart:
• Refreshes only the "data units" that were changed: less data to be moved.
• "Data units" are table data partitions:
– a single data partition for a normal table,
– a single data partition for each fragment of a fragmented table.
• Control the granularity via table fragmentation.
IWA refreshMart (2)
50. Data Mart Meta Info:
• New meta information about data marts is needed to keep track of changes.
• It is stored in new tables in the sysadmin database; the sysadmin database must exist.
• Data marts must be re-created after an upgrade.
• The administrator needs access rights for the sysadmin database:
execute function task('grant admin','<user name>','warehouse');
IWA refreshMart (3)
51. Data Mart Meta Info:
• A full data load is the reference point.
• Changes registered:
– insert, update, delete of data records (but no actual data is logged),
– drop partition, then reload partition,
– detach fragment → drop partition,
– attach fragment → load partition.
• A completed refreshMart is the new reference point.
IWA refreshMart (4)
52. LoadMart:
• Loads the complete data, always and unconditionally.
• Rebuilds the compression dictionary from scratch.
RefreshMart:
• Does not extend or rebuild the compression dictionary.
• New values are placed in "catch-all" containers.
• Periodically do a full data load using ifx_loadMart().
IWA refreshMart (5)
53. RefreshMart and data consistency:
• The optional lock mode specifies the locking of tables in the warehouse database: MART, TABLE, or NONE.
• Works like the lock mode for loadMart.
• If not specified, the lock mode of the last loadMart is in effect.
IWA refreshMart (6)
54. RefreshMart implementation:
New stored procedure:
ifx_refreshMart(
'accelerator_name',
'data_mart_name',
'locking_mode',
NULL);
• locking_mode is optional and can be NULL.
• The 4th parameter is not used as of now.
IWA refreshMart (7)
55. RefreshMart – summary:
• Granularity is based on table partitions.
• The data mart remains available for query acceleration.
• A single stored procedure call, for ease of use.
• Control of execution remains with the administrator.
• Handles all data changes, including fragment operations.
• Data consistency via the lock mode parameter.
• Prerequisite: the sysadmin database must be accessible to the administrator.
IWA refreshMart (8)
57. Data Transfer from Informix to IWA – first time
On Informix:
- Design the data mart (OAT analysis, command-line analysis, or ISAO Studio)
- Deploy the data mart (OAT, stored procedure, or ISAO Studio)
- Load the mart (OAT, stored procedure, or ISAO Studio)
- Optionally lock the table
- Insert table data into an external table
- Send the data over to IWA
On IWA:
- Fact table: split across the workers; dimension tables: copied to each worker
- Compression: frequency partitioning and encoding
- Write the memory image to disk
Ready for queries. (Items shown in red on the slide are new in v12.10.)
58. Distributing data from IDS (fact tables)
[Diagram: each data fragment of the fact table is UNLOADed by IDS stored procedures and copied to a worker process; the coordinator process and the worker processes each hold compressed data.]
A copy of the IDS data is transferred over to the worker process. The worker process holds a subset of the data (compressed) in main memory and is able to execute queries on this subset. The data is evenly distributed (no value-based partitioning) across the CPUs.
59. Distributing data from IDS (dimension tables)
[Diagram: each dimension table is UNLOADed by an IDS stored procedure; the coordinator process and every worker process receive a full copy of all dimension tables.]
All dimension tables are transferred to each worker process.
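The two distribution rules on these slides (fact fragments spread evenly across workers, dimension tables copied to every worker) can be sketched as follows. Round-robin placement is an assumption here; the slides say only that the distribution is even and not value-based:

```python
def distribute(fact_fragments, dim_tables, n_workers):
    """IWA data-distribution sketch: fact-table fragments are spread
    evenly across workers (round-robin here; no value-based
    partitioning), while every dimension table is copied to every
    worker so joins can run locally."""
    workers = [{"fact": [], "dims": list(dim_tables)} for _ in range(n_workers)]
    for i, frag in enumerate(fact_fragments):
        workers[i % n_workers]["fact"].append(frag)
    return workers

w = distribute(["f1", "f2", "f3", "f4"], ["customer", "stores"], 2)
# w[0]["fact"] == ["f1", "f3"]; both workers hold both dimension tables.
```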
60. Trickle Feed Use Case
• The fact table(s) of a data mart are populated only by new data rows: data is inserted, not updated or deleted.
• Dimension tables are updated.
• The latest data is needed for analysis.
• A full refresh can take a long time.
• The partitioning/fragmentation scheme could cause too many partitions to be refreshed.
• Inserted rows shall get loaded to the data mart:
– within a configurable time,
– optionally, data mart dimension tables shall get refreshed.
61. Scenario for real-time trickle feed
[Diagram: OLTP apps insert into the Sales-Mart (sales: fact table; customer and stores: dimension tables) in the Informix Database Server; new rows flow to IWA; reports and BI apps query the accelerator. Use OAT, Studio, or stored procedures.]
Step 1. Create the Sales-Mart and load it. Sales is the fact table; customer and stores are dimension tables.
Step 2. Set up trickle feed by calling ifx_setupTrickleFeed().
Step 3. Let the application roll: inserts on the fact table and updates on any dimensions.
Step 4. As the application runs, the reports see new data updated on IWA.
62. Trickle feed (cont.)
User interface:
ifx_setupTrickleFeed('accelerator_name', 'data_mart_name', buffertime)
• accelerator_name: the name of the accelerator that contains the data mart.
• data_mart_name: the name of the data mart.
• buffertime: an integer that encodes the time interval between refreshes and whether dimension tables are refreshed.
Examples:
execute procedure ifx_setupTrickleFeed('salesacc', 'partsmart', 60);
execute procedure ifx_setupTrickleFeed('salesacc', 'carmart', -300);
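A sketch of how the buffertime argument might be interpreted, judging from the 60 and -300 examples above: the magnitude looks like the refresh interval in seconds, and the sign appears to control whether dimension tables are also refreshed. These sign semantics are an assumption; check the IWA documentation for your release:

```python
def parse_buffertime(buffertime):
    """Interpret the buffertime argument of ifx_setupTrickleFeed().
    ASSUMPTION: the magnitude is the refresh interval in seconds, and a
    negative value (as in the -300 example) additionally requests that
    dimension tables be refreshed. Verify against the official docs."""
    if buffertime == 0:
        raise ValueError("buffertime must be non-zero")
    return {"interval_seconds": abs(buffertime),
            "refresh_dimensions": buffertime < 0}

# parse_buffertime(60)   -> fact-table inserts fed every 60 s, dimensions untouched
# parse_buffertime(-300) -> fed every 300 s, dimension tables refreshed too
```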
64. Trickle feed (cont.)
[Data-flow diagram: "insert into fact_table ..." fires a trigger on the fact table; data rows from the fact table and from dimension tables 1 and 2 are picked up by a dbscheduler task, which calls ifx_loadPartMart() and ifx_refreshMart() to push the rows into the data mart on the accelerator.]
65. Complete view of data mart state transitions
[State diagram]
- Design the data mart by workload analysis or manually
- Deploy → deployed data mart
- Load → data mart in use (supports partition-based refresh and trickle-feed refresh)
- Disable → data mart disabled; Full Load/Enable → back in use
- Drop → data mart deleted
66. Summary
• A full refresh re-creates the dictionary, but can take time.
• Partition-based refresh is very fast and refreshes only the partitions with new data since the last refresh.
• Trickle feed captures the INSERTs on the fact table and refreshes by sending this data to IWA. It can also refresh dimension tables.
• Even when you use partition refresh or trickle feed, do a full refresh periodically, say, daily or weekly.
67. Deep dive into interval and rolling window table partitioning in IBM Informix
Keshava Murthy, IBM, rkeshav@us.ibm.com
Editor's notes
execute function dropPartMart('myAccelerator','myMart','user10','tab22','part1');
execute function loadPartMart('myAccelerator','myMart','user10','tab22','part1');