Posted by Dylan Wan on November 10, 2015
ODI Topology allows you to separate the physical connection from the logical data source by defining a physical schema and a logical schema.
This separation may seem redundant during development. However, it is a very useful feature for supporting the Test to Production (T2P) process. Read the rest of this entry »
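The idea can be sketched in a few lines: ETL code refers only to a logical schema, and a context decides which physical schema it resolves to. This is a minimal illustration of the concept, not ODI's actual repository API; the schema and host names are made up.

```python
# Sketch of the logical-vs-physical schema separation in an ODI-style topology.
# All names below are illustrative, not ODI's actual API or repository objects.

PHYSICAL_SCHEMAS = {
    # (logical_schema, context) -> physical connection details
    ("DW_SCHEMA", "DEV"):  {"host": "dev-db.example.com",  "schema": "DW_DEV"},
    ("DW_SCHEMA", "PROD"): {"host": "prod-db.example.com", "schema": "DW_PROD"},
}

def resolve(logical_schema: str, context: str) -> dict:
    """Resolve a logical schema to its physical schema for a given context."""
    return PHYSICAL_SCHEMAS[(logical_schema, context)]

# ETL code references only the logical schema; moving from Test to
# Production is just a matter of switching the context.
print(resolve("DW_SCHEMA", "PROD")["schema"])  # DW_PROD
```

Because the mapping lives in one place, promoting code from Test to Production requires no change to the ETL logic itself.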
Posted in BI Application | Tagged: BIAPPS, DAC | Leave a Comment »
Posted by Dylan Wan on September 15, 2015
OTBI Enterprise is the BI cloud service, a SaaS deployment of OBIA. It uses a data warehouse based architecture, and the ETL processes are handled within the cloud. The data is first loaded, in its original format, from either on-premise or cloud sources using various means into the SDS schema; then the regular ETL processes that were designed for the on-premise deployment populate the data warehouse. All of these processes are intended to be transparent to the deploying companies.
As of September 2015, OTBI-E for HCM is the only module that is available. Read the rest of this entry »
Posted in Business Intelligence | Tagged: BI, BI Cloud Service, BIAPPS, BICS, Cloud, HCM Analytics, SaaS | Leave a Comment »
Posted by Dylan Wan on April 3, 2015
1. Market Segmentation in the academic world
Market Segmentation is part of the marketing process. It is described in Philip Kotler’s book as part of the step of defining the market strategy. The idea is to segment the consumer market by some variables and to divide the market into different segments. Selecting the segments for your products is an outcome of the marketing strategy. If you have multiple product lines, this definition process can be done for each product line, if not at the individual product level.
For example, my product X is targeted at consumers who are women without kids, living in a city, and earning more than $90,000.
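The example above can be expressed as a simple filter over customer records. This is just an illustrative sketch of what "selecting a target segment" means mechanically; the data and field names are made up.

```python
# Illustrative only: selecting a target segment from customer records
# using the example variables from the text (gender, kids, urban, income).
customers = [
    {"name": "A", "gender": "F", "has_kids": False, "urban": True,  "income": 120000},
    {"name": "B", "gender": "F", "has_kids": True,  "urban": True,  "income": 95000},
    {"name": "C", "gender": "M", "has_kids": False, "urban": True,  "income": 100000},
]

def in_target_segment(c):
    """Segment for product X: women, no kids, urban, income > $90,000."""
    return (c["gender"] == "F" and not c["has_kids"]
            and c["urban"] and c["income"] > 90000)

target = [c["name"] for c in customers if in_target_segment(c)]
print(target)  # ['A']
```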
This web page includes a very good and concise summary: Market Segmentation and Selection of Target Segments. It reminded me of what I learned in my Marketing course 20 years ago.
2. Market Segmentation as a product feature
Once the variables and the segments are defined as part of the marketing strategy, Read the rest of this entry »
Posted in Business Intelligence | Tagged: BIAPPS, CRM Analytics, Data Mining, Marketing Analytics | Leave a Comment »
Posted by Dylan Wan on April 2, 2015
Here are the various ways that we can use Data Mining inside BI.
We can build Advanced Analytics applications.
The scoring function can be called within an opaque view or with the EVALUATE function.
The opaque view method may provide better flexibility, since multiple columns can be exposed.
Here is an old Oracle white paper about how to use EVALUATE in BI Server: white paper
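As a sketch of the EVALUATE approach: the BI Server passes the expression through to the database, substituting %1, %2, … with the listed columns, so an Oracle Data Mining model's PREDICTION function can be exposed as a logical column. The model and column names here are hypothetical.

```sql
-- Hypothetical example: call an ODM classification model via EVALUATE.
-- %1 and %2 are replaced by the columns listed after the expression.
EVALUATE('PREDICTION(churn_model USING %1 AS age, %2 AS income)',
         "Customer"."Age", "Customer"."Income")
```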
Posted in Business Intelligence | Tagged: BI, BIAPPS, Data Mining, OBIEE, ODM | Leave a Comment »
Posted by Dylan Wan on March 23, 2015
BI Apps data warehouse design is based on the assumption that the data warehouse schema design is independent of the OLTP system.
The staging schema is a universal staging schema, and the data warehouse is a universal data warehouse.
The assumption is that no matter what source system you are using, the business questions the BI consumers have are similar. You want to see revenue, cost, and profitability; regardless of whether you are using Oracle E-Business Suite, PeopleSoft, or J.D. Edwards, revenue is still revenue and cost is still cost. The definitions of these metrics won’t vary much across the OLTP systems you are using.
It is the BI Apps development team’s job to find where these numbers should be extracted from the source systems and to define the “map” between the source system and the target data warehouse system.
This universal data warehouse assumption gives BI Apps a unique advantage. It allows the team to develop each source adaptor independently from the data warehouse design, and thus independently from the BI metadata.
Viewed another way, the application logic built in the BI tool and the knowledge of designing the data warehouse schema can be reused over time and across many projects.
Adding support for the first adaptor, for E-Business Suite, may take some time to analyze the analytic requirements. Adding support for the PeopleSoft and JDE adaptors then becomes a job of providing the source-to-target logical mapping and creating the ETL jobs that load the universal staging tables.
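The adaptor idea can be sketched as a per-source mapping into a common staging shape. This is only an illustration of the concept; the column names below are made up and do not reflect the actual BI Apps staging schema.

```python
# Illustrative sketch of the "universal staging" idea: each source adaptor
# supplies only a source-to-target column mapping. Names are hypothetical.
ADAPTOR_MAPPINGS = {
    "EBS":  {"EXTENDED_AMOUNT": "NET_AMT", "TRX_DATE": "INVOICED_ON_DT"},
    "PSFT": {"NET_EXTENDED_AMT": "NET_AMT", "INVOICE_DT": "INVOICED_ON_DT"},
}

def to_staging_row(source: str, row: dict) -> dict:
    """Map a source row into the universal staging shape."""
    mapping = ADAPTOR_MAPPINGS[source]
    return {target: row[src] for src, target in mapping.items()}

ebs_row = {"EXTENDED_AMOUNT": 100.0, "TRX_DATE": "2015-03-01"}
print(to_staging_row("EBS", ebs_row))
# {'NET_AMT': 100.0, 'INVOICED_ON_DT': '2015-03-01'}
```

Everything downstream of the staging tables (warehouse load, BI metadata) stays unchanged when a new source adaptor is added; only the mapping is new.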
Posted in Business Intelligence | Tagged: BI, BIAPPS, Data Warehouse | 1 Comment »
Posted by Dylan Wan on September 28, 2014
The importance of inventory management can best be described via working capital management.
Cash is the lifeblood of a company.
Cash conversion cycle is measured as Read the rest of this entry »
Posted in Business Intelligence | Tagged: Account Receivables Analytics, BIAPPS, DSO, Inventory Analytics, Supply Chain Analytics | Leave a Comment »
Posted by Dylan Wan on September 28, 2014
The key questions are about inventory and orders.
- What is our inventory status, by product and by location?
- Do we have enough inventory on hand and in process to meet customer demand?
- How can we maximize customer satisfaction while minimizing the cost and size of inventory on hand? Both overstocking and understocking are bad.
- Do we have enough finished goods to fulfill our backlog orders? How much raw material, work in process, and finished goods do we have on hand?
- How well is our order fulfillment process performing?
- What are the top orders in backlog? How can we prioritize strategic customers?
- How can we ensure fulfillment on time?
- Which products have the longest fulfillment cycles?
- Where are the bottlenecks in order management process?
- What is our order management cycle time? How long does it take to book an order?
- Do we know why customers cancel orders and return products?
- How much revenue was lost due to order cancellations or returns?
Typical dimensions: By Product, By Location, By Customer
Different roles: Director of Sales Operations, Inventory Managers, Order Management Managers
Integration with other systems:
- Customers and pipeline information from a CRM application
- Product information from ERP
- Workforce information from Human Resources
- Align the training plans to skill gaps in critical product area
- Correlate staffing levels with order volume and product plan
Posted in BI Application | Tagged: BIAPPS | Leave a Comment »
Posted by Dylan Wan on May 10, 2013
The data warehouse needs to be refreshed periodically to make sure that it has up-to-date information. The Incremental Load is the regular ETL process, and its frequency can vary, but it is commonly run as a nightly job. To ensure that the incremental ETL job can finish within a limited time window, an incremental ETL load typically identifies the changes that happened in the source system and processes only the delta, avoiding a Full ETL Load.
In BI Apps 11.1.1.7.1, if you are not leveraging Golden Gate and the SDS (Source Dependent Data Store), the incremental ETL is accomplished by directly comparing the last update date captured in the source table with the last refresh date.
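The change-capture comparison can be sketched as follows. This is only an illustration of the idea; the column name and row shape are made up, and the real implementation is a filter in the ETL extract SQL.

```python
from datetime import datetime

# Sketch of delta extraction: keep only rows whose source LAST_UPDATE_DATE
# is newer than the warehouse's last refresh date. Names are illustrative.
def incremental_extract(source_rows, last_refresh_date):
    return [r for r in source_rows if r["LAST_UPDATE_DATE"] > last_refresh_date]

rows = [
    {"ID": 1, "LAST_UPDATE_DATE": datetime(2013, 5, 1)},
    {"ID": 2, "LAST_UPDATE_DATE": datetime(2013, 5, 9)},
]
delta = incremental_extract(rows, last_refresh_date=datetime(2013, 5, 5))
print([r["ID"] for r in delta])  # [2]
```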
The BI Apps 11.1.1.7.1 ODI-based ETL does not use separate ETL tasks for the incremental load and the full load. The logic is determined by whether the task has been executed before; in other words, it is based on whether the data has already been refreshed. This was a productivity gain for BI Apps developers and also made supporting incremental ETL clearer. In the past, you might see an Incremental Load Informatica workflow and a separate Full Load workflow. Every time there was a logic change, both workflows needed to be modified, and this was a source of code bugs. By using a better ETL architecture, we eliminated the need to maintain two separate tasks.
Here is a high-level summary of how the incremental ETL is accomplished in the BI Apps 11.1.1.7.1 ETL:
1. Use the W_ETL_LOAD_DATES table to track all ETL executions
2. Use the IS_INCREMENTAL variable to hold and pass the state
3. Refresh the variable in a package step: if there is an entry in W_ETL_LOAD_DATES, perform an incremental ETL
4. Use the RUN_INCREMENTAL variable to turn the incremental filter on and off
5. Optionally, use an ODI Evaluate Variable step to branch the full and incremental ETL logic within a package
6. Table maintenance: use the ETL load date information to determine the operations, including dropping indexes and truncating tables before a full load, rebuilding indexes after the load, analyzing tables after the load, etc.
Posted in BI, BI Application, ETL, ODI, Oracle, Oracle Data Integrator | Tagged: BIAPPS, ETL, OBIA, ODI | Leave a Comment »