Dylan's BI Study Notes

My notes about Business Intelligence, Data Warehousing, OLAP, and Master Data Management

Archive for the ‘ODI’ Category

Using Load Plan for managing your ETL task in BI Apps 11.1.1.7.1 (1)

Posted by Dylan Wan on May 14, 2013

One of the major changes introduced in BI Apps 11.1.1.7.1 is the way we manage the ETL task sequence and trim unnecessary tasks.

This functionality was previously accomplished using DAC.  The problem we frequently faced was that the DAC repository and the Informatica (INFA) repository were maintained as two separate repositories.  We had to keep the task names exactly in sync in order to use DAC to manage the execution of the Informatica workflow tasks.

The Load Plan and the Load Plan Generator were designed to address this requirement.

Here is a good article that describes the story.

Load Plan Generator – An Inside Look

Posted in BI Application, Business Intelligence, DAC, Data Warehouse, ETL, Informatica, OBIA, ODI, Oracle, Oracle Data Integrator | Leave a Comment »

Incremental ETL in BI Apps 11.1.1.7.1 (1)

Posted by Dylan Wan on May 10, 2013

The data warehouse needs to be refreshed periodically to make sure it has up-to-date information.  The Incremental Load is the regular ETL process that runs every day.  The frequency can vary, but it is commonly a nightly job.  To ensure that the incremental ETL job can finish within a limited time window, an incremental load typically identifies the changes that happened in the source system and deals only with the delta, avoiding a Full ETL Load.

In BI Apps 11.1.1.7.1, if you are not leveraging GoldenGate and the SDS (Source Dependent Data Store), incremental ETL is accomplished by directly comparing the last update date captured in the source table with the last refresh date.

The ODI-based ETL in BI Apps 11.1.1.7.1 does not use separate ETL tasks for incremental load and full load.  The logic is determined by whether the task has been executed before; in other words, it is based on whether the data has already been refreshed.  This is a productivity gain for BI Apps developers and also makes supporting incremental ETL much clearer.  In the past, you might see both an Incremental Load Informatica workflow and a Full Load workflow.  Every time there was a logic change, both workflows needed to be modified, which was a source of code bugs.  With this better ETL architecture, we eliminate the need to maintain two separate tasks.

Here is a high level summary of how the incremental ETL is accomplished in BI Apps 11.1.1.7.1 ETL:

1. Use the W_ETL_LOAD_DATES table to track every ETL execution.

2. Use the IS_INCREMENTAL variable to hold and pass the state.

3. Refresh the variable in a package step: if there is already an entry in W_ETL_LOAD_DATES, perform an incremental ETL.

4. Use RUN_INCREMENTAL to turn the incremental filter on and off (see the sketch after this list).

5. Optionally, use an ODI variable Evaluate step to branch between the Full and Incremental ETL logic within a package.

6. Table Maintenance: use the ETL load date information to determine the operations, including dropping indexes and truncating the table before a full load, and rebuilding indexes and analyzing the table after the load, etc.
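
To make the pattern more concrete, here is a minimal sketch of the kind of incremental filter this approach enables.  This is only an illustration: the column names, the target table name, and the exact shape of the W_ETL_LOAD_DATES lookup are assumptions for this example, not the shipped BI Apps code, which is generated by the ODI knowledge modules.

-- Sketch only: illustrative names, not actual BI Apps generated code.
-- #IS_INCREMENTAL is assumed to be an ODI variable refreshed from
-- W_ETL_LOAD_DATES in a package step ('Y' when a prior run exists).
SELECT s.*
  FROM source_transactions s
 WHERE '#IS_INCREMENTAL' = 'N'            -- full load: take every row
    OR s.last_update_date >               -- incremental: take the delta only
       (SELECT MAX(d.last_extract_date)
          FROM w_etl_load_dates d
         WHERE d.target_table_name = 'W_SALES_F');

The same extract serves both modes: when the variable is 'N' the filter passes everything, and when it is 'Y' only the rows changed since the last recorded refresh are extracted.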

Posted in BI, BI Application, ETL, ODI, Oracle, Oracle Data Integrator | Leave a Comment »

ODI Shortcut

Posted by Dylan Wan on May 25, 2012

ODI has provided the shortcut feature since the 11.1.1.6.0 release.  Shortcuts work like symbolic links in a Linux file system or shortcuts in Windows: you keep a reference to an object that exists in another folder.

Symbolic links have been used widely in prepackaged apps development to handle multiple releases of the software product internally. Typically, multiple top-level folders are created, one for each release.

Much of the code can be shared from an old release to the next release.  Instead of copying the code, a symbolic link can be created in the new release's folder to point to the code in the old release's folder.

The ODI shortcut feature was created for a similar purpose.

ODI allows you to create shortcuts for these objects under an ODI project:

  • Interface
  • Package
  • Procedure

Shortcuts are for the code that developers create and modify, and they try to maximize code reuse.  You create code in one folder and reference it in another folder; after you change the code in the original folder, the reference in the other folder also picks up the change.  You do not need to make the same change over again for each release.

I also notice that scenarios are not shortcutable.  Scenarios (package scenarios and interface scenarios) are executable objects.

This design means that scenarios should be regenerated after the shortcuts are created.  It also means that the scenarios should be regenerated after the code in the base folder is modified.  It may seem strange, but it is a design that helps you maximize code reuse.  Each interface or package is tied to an ODI model, and a model can be specific to a source apps version.  You may want your code to run against a different model even when the code is a shortcut.

Folders are also not shortcutable. ODI folders are not regular folders in the way we think of file system directories.  Each folder carries a property called “Release Tag”, which can help redirect the model to be used when generating the scenarios.


Posted in Data Warehouse, ODI, Oracle, Oracle Data Integrator, OWB, Sunopsis | Leave a Comment »

Tutorial: Create ODI Repositories

Posted by Dylan Wan on April 14, 2012

This step is not required if you are working on an existing repository.  If you are starting a new development project, or just want to get some experience using ODI, you may need to create the ODI repositories yourself.

A typical ODI deployment includes two repositories: a Master Repository and a Work Repository.

ODI provides two different tools to create them.

Here is a good tutorial page available from oracle.com:

http://st-curriculum.oracle.com/obe/fmw/odi/odi_11g/ODImaster_work_repos/ODImaster_work_repos.htm

The tool does not create database users.  Here is my script for creating the database users:

-- Tablespace for the master repository
create tablespace ODI_MREP
datafile 'C:\APP\DWAN\ORADATA\DWANDB\ODI_MREP.DBF'
size 100M autoextend on next 100M maxsize unlimited;

-- Tablespace for the work repository
create tablespace ODI_WREP
datafile 'C:\APP\DWAN\ORADATA\DWANDB\ODI_WREP.DBF'
size 200M autoextend on next 100M maxsize unlimited;

-- Schema owner for the master repository
create user ODI_MREP
identified by ODI_MREP
default tablespace ODI_MREP
temporary tablespace TEMP
quota unlimited on ODI_MREP;

-- Schema owner for the work repository
create user ODI_WREP
identified by ODI_WREP
default tablespace ODI_WREP
temporary tablespace TEMP
quota unlimited on ODI_WREP;

-- Privileges needed by both repository schemas
grant connect, resource, SELECT_CATALOG_ROLE to ODI_MREP, ODI_WREP;

If you are using RCU, you will not be able to create the Work Repository and the Master Repository separately.  In my opinion, RCU is a poor tool that lacks many basic features, so I stay away from it.

Posted in Business Intelligence, ODI, Oracle Data Integrator | Leave a Comment »

ODI Topology

Posted by Dylan Wan on April 13, 2012

ODI has a very clever design that isolates deployment objects from design objects.

The Designer interfaces with the deployment information (topology) only via logical schemas.  The project objects get the dependency only indirectly, via the model.  The “context” determines the physical resources and allows you to switch among the development, testing, and production environments.

From a prebuilt packaged apps perspective, the code is shipped against a set of logical schemas.    The logical schemas represent the assumptions about the various source and target systems.

The concept of “context” is what enables support for an unlimited number of source instances with a single set of code, as the sketch below illustrates.
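
Here is a rough sketch of the idea, using made-up schema and table names rather than any shipped topology.  The interface is designed once against a logical schema, and the context chosen at execution time decides which physical schema the generated SQL actually references.

-- Illustrative sketch only; schema and table names are assumptions.
-- Executing an interface in a DEV context, where the logical schema
-- maps to the DWH_DEV physical schema, produces something like:
INSERT INTO DWH_DEV.W_ORDERS_F (ORDER_ID, AMOUNT)
SELECT ORDER_ID, AMOUNT FROM DWH_DEV_STG.ORDERS;

-- Executing the same interface in a PROD context, where the logical
-- schema maps to DWH_PROD, produces:
INSERT INTO DWH_PROD.W_ORDERS_F (ORDER_ID, AMOUNT)
SELECT ORDER_ID, AMOUNT FROM DWH_PROD_STG.ORDERS;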

Posted in BI, Business Intelligence, ETL, ODI, Oracle Data Integrator | Tagged: , | Leave a Comment »

User Defined Function (UDF) in ODI

Posted by Dylan Wan on March 21, 2011

The User Defined Function (UDF) is a very powerful feature of ODI.

One feature that is absent from other ETL tools is a way to address the need to support different database platforms.  I won’t blame those ETL tools, since they are not really designed for prepackaged BI Apps.

Who would need to switch database platforms like that?

If your data warehouse is deployed on Oracle, you can use Oracle SQL.  If you are using Teradata, you can use Teradata SQL.  If you know that your PeopleSoft is running on DB2, you can write DB2 SQL.  In a custom data warehouse ETL environment, switching database platforms is an uncommon, one-time-only task.  You do not need to switch among different database platforms within your code.

Prepackaged BI Apps ETL developers, however, face different challenges.  You do not know which database platform the source apps are running on.  Also, you want to give customers a choice of database platforms on which to deploy the data warehouse.

The ODI UDF comes in very handy.  You can create a UDF to use in your SQL, and you can have multiple implementations of the UDF for different database platforms.  For example, you can create your own function such as TODAY() and use it in your SQL, with an implementation that uses GetDate() for MS SQL Server and SYSDATE for Oracle, as sketched below.
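
As an illustration (the TODAY() example comes from the text above, but the orders table and columns are made up for this sketch), the expression is written once in the ODI interface, and ODI substitutes the technology-specific implementation when it generates the code:

-- Expression written once in the ODI interface, using the user function:
SELECT order_id, TODAY() AS load_date FROM orders;

-- What ODI would generate for an Oracle target
-- (Oracle implementation of TODAY() defined as SYSDATE):
SELECT order_id, SYSDATE AS load_date FROM orders;

-- What ODI would generate for a Microsoft SQL Server target
-- (SQL Server implementation of TODAY() defined as GetDate()):
SELECT order_id, GetDate() AS load_date FROM orders;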

The User Defined Function is not a new idea, and you may see something similar in other tools.  However, the ability to use a UDF in SQL, with multiple implementations under different technologies, is something I have only seen in ODI.

I won’t be surprised to see “me too” products in the near future.

More and more companies are moving to use prepackaged BI apps.

Posted in BI, BI Application, BI Work, Business Intelligence, Data Warehouse, ETL, Informatica, OBIA, ODI, Oracle, Oracle Data Integrator | 1 Comment »

Knowledge Module in Oracle Data Integrator

Posted by Dylan Wan on March 30, 2007

A Knowledge Module determines how the data will be loaded from the source to the target database. Each Knowledge Module is a text-based template that is used to generate the code. It includes the skeleton, which can be in any language or technology. For example, if I select the “Oracle SQL*Loader” LKM, ODI will generate the loader scripts. If I choose the “External Table” LKM, it generates the insert and select statements using the external table feature, along the lines of the sketch below.
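
Here is a rough sketch (with made-up object, directory, and file names, not actual Knowledge Module output) of the style of code an external-table LKM produces: the flat file is exposed as an Oracle external table, and the load becomes a plain insert-select.

-- Sketch only: illustrative names, not generated KM code.
-- Expose the source flat file as an Oracle external table ...
CREATE TABLE C$_ORDERS
( ORDER_ID    NUMBER,
  ORDER_DATE  DATE,
  AMOUNT      NUMBER )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY SRC_FILES
  ACCESS PARAMETERS ( RECORDS DELIMITED BY NEWLINE
                      FIELDS TERMINATED BY ',' )
  LOCATION ('orders.csv') );

-- ... then the load is a simple insert-select into the staging table.
INSERT INTO W_ORDERS_DS (ORDER_ID, ORDER_DATE, AMOUNT)
SELECT ORDER_ID, ORDER_DATE, AMOUNT
  FROM C$_ORDERS;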

The Knowledge Module isolates the logical mapping from the underlying technology, so the mapping can be used across different platforms and DBMS systems. At design time, I can specify which Knowledge Module I want to use through the drag-and-drop interface. At run time, the tool uses the Knowledge Module I specified, dynamically generates the code, and processes the job. I feel that it is very powerful and is the key component that lets ODI support heterogeneous environments.

We can add our own Knowledge Modules as reusable components, which also creates opportunities for partnerships.

Posted in ETL, ODI, Oracle, Oracle Data Integrator, Sunopsis | 2 Comments »

Download Oracle Data Integrator (Sunopsis) from Oracle Technology Network!

Posted by Dylan Wan on January 20, 2007

Sunopsis has now been renamed Oracle Data Integrator, and an evaluation version of the software is available for download from the Oracle Technology Network.

Posted in BI, Business Intelligence, Data Warehouse, ETL, OBIEE, ODI, Oracle, Oracle BI Suite EE, Oracle Data Integrator, OWB, Sunopsis | 2 Comments »