
DYNAMIC INTEGRATIONS FOR MULTIPLE HYPERION PLANNING APPLICATIONS

Ricardo Giampaoli – TeraCorp

Rodrigo Radtke de Souza - Dell

Giampaoli, Ricardo

● Master's degree in Business Administration and IT Management

● Founder and President of TeraCorp Consulting

● Fifteen years working in IT, the last six as an EPM solution architect

● Focused on Oracle Essbase, Planning, OBIEE, and data mart design using ODI or OWB

● EPM training instructor

● Essbase Certified Specialist

● Blogger @ devepm.com

Radtke, Rodrigo

● Degree in Computer Engineering

● Software Developer Advisor at Dell

● Ten years working in IT, the last five as an ETL architect

● Oracle Database Certified Expert

● Java SCJP and SCWCD certifications

● ODI/ETL Expert

● Blogger @ devepm.com

About the Speakers

TeraCorp is a company specializing in products and services focused on EPM.

TeraCorp's mission is to create innovative solutions that help people, businesses, and partners exceed their goals and reach their full potential.

Learn more @ www.teracorp.com.br/en

About TeraCorp

Advanced Knowledge of ODI

Average Knowledge of Hyperion Planning

Good Knowledge of SQL

Prerequisites

The Journey to Dynamic Integration

The Beginning

We Have a Problem

Gathering the Data

Loading Dynamically

Putting Everything Together

Conclusion

Q&A

Agenda

Currently there are five Planning applications: three regional and two worldwide

Replacing one worldwide Forecasting application would require rewriting many interfaces

● The new application's metadata depends on the other applications

● The cost of changing the current ODI architecture justified the creation of a dynamic framework

Motivation

The Journey to Dynamic Integration

Default ODI development objects:

● One logical schema for each physical schema (Planning application)

● One model for each logical schema

● Each model contains one data store for each dimension

● One interface to load each data store (dimension)

The Beginning

● Logical and physical schema configuration

● Reversing the dimension to create the model

● Interface creation using the model

Default Planning Metadata Loading Process

The Beginning

Data store → Planning IKM → resulting SQL → Planning

● The ETL mapping is done in the target data store

● The IKM uses the ODI API to load from the source data stores to Planning

● The ODI API translates the target data store mapping into a SQL command

● The metadata is loaded into Planning

Default Planning Metadata Loading Overview

A Planning instance can have any number of Applications

One application contains N dimensions

Each dimension requires at least one interface to load metadata

Any special requirement besides loading requires an extra interface

The Beginning

ODI → Dimensions → Planning

Application 1

● Account: Interface 1 (Load), Interface 2 (Delete)

● Entity: Interface 3 (Load), Interface 4 (Delete), Interface 5 (Attributes)

● Products: Interface 6 (Load)

● Dimension N: Interface N

Application N: the same pattern repeats

Default Planning/ODI Development → Actions to Achieve a Smart/Flexible Solution

Problem/Solution

We Have a Problem

Problems:

● One interface per dimension/application

● The ODI IKM works with only one application/dimension at a time

● Metadata comes from multiple sources with different data formats

● Any operation besides moving metadata requires a new interface

● The load reads the full source tables every time

Solutions:

● A generic process to load any number of applications and dimensions

● An ODI IKM with dynamic target application and dimension data stores

● A generic inbound table for metadata

● Generic components to handle different metadata situations

● Load only the metadata that has changed

Have easy access to the source and target data

● Create inbound and extract tables

Create a delta between the data coming from the system of record and the Planning applications

● Create a tie-out process

Classify the metadata into categories before the load

● Handle special metadata load situations

Preparing to Load

Gathering the Data

Inbound/Extract Process

Gathering the Data

Sources (DRM, Oracle DB, flat files, any source system) feed the Inbound table; the Extract table holds Account, Entity, and all other dimensions pulled from Planning.

Inbound process:

● Loads all metadata from any external source

● Formats all data as Planning requires

● Acts as an abstraction data layer

Extract process:

● Extracts the existing application metadata

● Queries the Planning repository tables

● Used to create the tie-out delta

Inbound and Extract Tables

Gathering the Data

● Common columns in all dimensions/applications

● Columns from the three main types of dimensions

● Columns exclusive to the Account and Entity dimensions

● Unique columns for each dimension

● Unique columns for the Inbound/Extract tables

The Inbound/Extract table is a merge of all possible column combinations. For the Account dimension it carries: Parent, Alias: Default, Operation, Valid For Consolidations, Data Storage, Two Pass Calculation, Description, Formula, UDA, Smart List, Data Type, Aggregation (Plan1, Plan2, Plan3, Wrkforce, Capex), Plan Type (Plan1, Plan2, Plan3, Wrkforce, Capex), Account Type, Time Balance, Skip Value, Exchange Rate Type, Variance Reporting, Source Plan Type, and Base Currency, plus the control columns App_Name, Dim_Type, Hier_Name, Generation, Has_Children, and Position.
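As a rough illustration, a sketch of this generic table follows. The METADATA_INBOUND name comes from the process description later in this deck, but the column names and data types shown here are assumptions, and only a representative subset of the merged columns appears.

```sql
-- Minimal sketch of the generic inbound table; the extract table
-- (METADATA_EXTRACT) would share the same layout.
CREATE TABLE METADATA_INBOUND (
  APP_NAME      VARCHAR2(80),  -- control column: target application
  DIM_TYPE      VARCHAR2(30),  -- control column: dimension type
  HIER_NAME     VARCHAR2(80),  -- control column: dimension/hierarchy
  MEMBER        VARCHAR2(80),
  PARENT        VARCHAR2(80),
  ALIAS_DEFAULT VARCHAR2(80),
  OPERATION     VARCHAR2(30),
  DATA_STORAGE  VARCHAR2(30),
  -- ...one column per property in the merged list above...
  GENERATION    NUMBER,
  HAS_CHILDREN  NUMBER(1),
  POSITION      NUMBER
);
```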

● The HSP_OBJECT table has the metadata information for all the objects in a Planning application

● All joins pass through the HSP_OBJECT table

● All joins must be LEFT JOINs to prevent data loss

● Use CONNECT BY PRIOR and START WITH to select a dimension to extract (see the query sketch after the table list below)

Extracting Planning Metadata

HSP_OBJECT

HSP_OBJECT_TYPE

HSP_MEMBER

HSP_ALIAS

HSP_MEMBER_FORMULA

HSP_DIMENSION

HSP_STRINGS

HSP_ACCOUNT

HSP_PLAN_TYPE

HSP_ENTITY

HSP_ENUMERATION

HSP_MEMBER_TO_ATTRIBUTE

HSP_UDA

Gathering the Data
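A minimal sketch of such an extraction query, assuming the standard HSP_OBJECT columns (OBJECT_ID, PARENT_ID, OBJECT_NAME, POSITION, HAS_CHILDREN); the LEFT JOINs to the other HSP_ tables are omitted for brevity:

```sql
-- Walk one dimension's hierarchy; :DIM_NAME selects the dimension.
SELECT o.OBJECT_NAME AS member,
       (SELECT p.OBJECT_NAME
          FROM HSP_OBJECT p
         WHERE p.OBJECT_ID = o.PARENT_ID) AS parent,
       LEVEL          AS generation,
       o.POSITION     AS member_position,
       o.HAS_CHILDREN AS has_children
  FROM HSP_OBJECT o
 START WITH o.OBJECT_NAME = :DIM_NAME
CONNECT BY PRIOR o.OBJECT_ID = o.PARENT_ID;
```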

ODI commands on source/target are used to loop:

● The main scenario loops over all Planning applications

● The extract scenario loops over all dimensions and inserts into the Extract table

Extract Planning Process

Gathering the Data

Main scenario (Command on Source → Command on Target) calls the extract scenario (Command on Source → Command on Target); a sketch of the driving query follows.
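A minimal sketch of a Command on Source query that could drive this loop; METADATA_PARAMETERS and its columns are hypothetical names for the configuration table mentioned later in the deck, not names taken from it:

```sql
-- Command on Source: one row per application/dimension to process.
-- The Command on Target would call the extract scenario once per row
-- (for example, via the OdiStartScen tool).
SELECT APP_NAME, DIM_NAME
  FROM METADATA_PARAMETERS      -- hypothetical configuration table
 WHERE ACTIVE_FLAG = 'Y'
 ORDER BY APP_NAME, DIM_NAME;
```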

● The source table contains the metadata to be loaded

● The target table contains the existing Planning metadata

● The tie-out table merges both tables to create a delta, with a condition column that is used to classify the metadata

Tieout Table

Gathering the Data

Source table + target table → tie-out table

● The join matches by the Planning unique key: Parent, Member, and Hierarchy. The application name is added to the key when handling multiple applications

● A FULL OUTER JOIN gets all the data that exists on either side, even when the rows do not match

● A CASE statement classifies the rows:

● Data that exists only in the source

● Data that exists only in the application

● Data where all columns match

● Data that does not match

Tie out Query

Gathering the Data
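A minimal sketch of such a tie-out query, using the METADATA_INBOUND and METADATA_EXTRACT names from the process description later in the deck; the condition codes and the reduced column comparison are illustrative assumptions:

```sql
-- Classify every row by comparing source and application metadata.
SELECT NVL(s.MEMBER, t.MEMBER) AS member,
       NVL(s.PARENT, t.PARENT) AS parent,
       CASE
         WHEN t.MEMBER IS NULL THEN 'EXISTS_ONLY_IN_SOURCE'
         WHEN s.MEMBER IS NULL THEN 'EXISTS_ONLY_IN_APP'
         WHEN s.DATA_STORAGE  = t.DATA_STORAGE    -- repeat for every
          AND s.ALIAS_DEFAULT = t.ALIAS_DEFAULT   -- compared column
           THEN 'MATCH'
         ELSE 'NO_MATCH'
       END AS condition
  FROM METADATA_INBOUND s
  FULL OUTER JOIN METADATA_EXTRACT t
    ON  s.APP_NAME  = t.APP_NAME   -- application name for multi-app runs
    AND s.HIER_NAME = t.HIER_NAME  -- Planning unique key: hierarchy,
    AND s.PARENT    = t.PARENT     -- parent and member
    AND s.MEMBER    = t.MEMBER;
```

Because the parent is part of the join key, a moved member surfaces as two rows (one per condition), which is exactly what the moved-member handling on the next slide merges.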

Moved members

● Merges pairs of rows that have the same member name, where one row has the "Exists only in Source" condition and the other has the "Exists only in the Application" condition (see the sketch after this list)

Changed attribute member

● A two-step operation (delete, then insert) instead of the normal move-member operation. Searches for all dimension members that have a moved attribute associated with them and changes their condition to NO_MATCH

Reorder sibling members

● Searches for all existing siblings of the new/changed member and marks them as NO_MATCH, indicating that they should be loaded together

Deleted shared members

● Removed completely from the Planning application

Special Conditions

Gathering the Data
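A minimal sketch of the moved-member merge, reusing the illustrative condition codes from the tie-out sketch above; the MOVED code and the two-statement approach are assumptions about one way to implement it:

```sql
-- Flag the source-side row of each moved member...
UPDATE METADATA_TIEOUT t
   SET t.CONDITION = 'MOVED'
 WHERE t.CONDITION = 'EXISTS_ONLY_IN_SOURCE'
   AND EXISTS (SELECT 1 FROM METADATA_TIEOUT d
                WHERE d.APP_NAME  = t.APP_NAME
                  AND d.HIER_NAME = t.HIER_NAME
                  AND d.MEMBER    = t.MEMBER
                  AND d.CONDITION = 'EXISTS_ONLY_IN_APP');

-- ...then discard the application-side row, so the member is simply
-- reloaded under its new parent.
DELETE FROM METADATA_TIEOUT t
 WHERE t.CONDITION = 'EXISTS_ONLY_IN_APP'
   AND EXISTS (SELECT 1 FROM METADATA_TIEOUT d
                WHERE d.APP_NAME  = t.APP_NAME
                  AND d.HIER_NAME = t.HIER_NAME
                  AND d.MEMBER    = t.MEMBER
                  AND d.CONDITION = 'MOVED');
```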

● The ODI IKM uses the ODI API to define what will be loaded into Planning

● The mapping is done on the target data store using the source data store definitions

● The ODI API generates SQL code based on the mapping, the data stores, and the filters

● The SQL represents the metadata that will be loaded into Planning

Default Load Process

Loading Dynamically

Dynamic Load Process

Loading Dynamically

● What needs to change in the Planning IKM:

● Alter the "Prepare for Loading" step

● Change the getInfo method calls to getOption

● Create options to externalize these properties to the interface

● In the interface, use the created options to pass the application name and dimension to the KM

Dynamic Load Process

● What needs to change in the IKM:

● The data store/mapping information is replaced by a SQL repository query, using Command on Source and Command on Target

● The query must generate the same SQL as the default IKM

Command on Source / Command on Target: dimension data store information

Loading Dynamically
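The deck does not show this query, so the sketch below is only one plausible shape for it, not the presenters' confirmed implementation: it pulls the dimension data store's column list from the ODI work repository (SNP_TABLE and SNP_COL are standard ODI repository tables; exact column names vary by ODI version).

```sql
-- Fetch the target data store's columns dynamically instead of
-- hard-coding a mapping per dimension; verify against your ODI
-- repository schema before use.
SELECT c.COL_NAME
  FROM SNP_TABLE t
  JOIN SNP_COL   c
    ON c.I_TABLE = t.I_TABLE
 WHERE t.TABLE_NAME = :DIM_NAME   -- e.g. 'Account'
 ORDER BY c.POS;
```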

Dimension data store columns, by dimension type (Account, Entity, user-defined such as Products, attribute such as Prod_Attrib):

● Same for all dimensions: Parent, Alias: Default, Operation

● Same for the three main dimension types (Account, Entity, user-defined): Valid For Consolidations, Data Storage, Two Pass Calculation, Description, Formula, UDA, Smart List, Data Type, Aggregation (Plan1, Plan2, Plan3, Wrkforce, Capex)

● Account and Entity only: Plan Type (Plan1, Plan2, Plan3, Wrkforce, Capex)

● Account only: Account Type, Time Balance, Skip Value, Exchange Rate Type, Variance Reporting, Source Plan Type

● Entity only: Base Currency

● Unique per dimension: attribute association columns (Attribute 1, Attribute 2)

The Inbound/Extract table merges all these possible combinations, plus its own unique columns.

Delete attribute members

● Filters attribute members whose CONDITION status is "No Match" or "Deleted Attribute", using the load operation "Delete Idescendants"

Delete shared members

● Filters shared members whose CONDITION status is "Deleted Share", using the load operation "Delete Idescendants"

Load Planning members

● Filters all members whose CONDITION status is NOT "Match", "Deleted Share", or "Deleted Attribute", using the load operation "Update" (the three filters are sketched below)

Metadata Load

Loading Dynamically
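A minimal sketch of the three filters, reusing the illustrative condition codes from the tie-out sketches above; each SELECT would feed one Planning load with the corresponding load operation:

```sql
-- 1) Attribute members to delete (operation "Delete Idescendants").
SELECT * FROM METADATA_TIEOUT
 WHERE DIM_TYPE  = 'ATTRIBUTE'   -- assumed dimension-type code
   AND CONDITION IN ('NO_MATCH', 'DELETED_ATTRIBUTE');

-- 2) Shared members to delete (operation "Delete Idescendants").
SELECT * FROM METADATA_TIEOUT
 WHERE CONDITION = 'DELETED_SHARE';

-- 3) Everything else that changed (operation "Update").
SELECT * FROM METADATA_TIEOUT
 WHERE CONDITION NOT IN ('MATCH', 'DELETED_SHARE', 'DELETED_ATTRIBUTE');
```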

Dynamic Integration Scenario

● Receives the application and dimension as parameters

● Extracts all application metadata to the METADATA_EXTRACT table

● Populates the METADATA_INBOUND table with the source metadata information

● Creates METADATA_TIEOUT with the metadata conditions

● Loads only the delta information to Planning

Putting Everything Together

Old Structure → New Structure

● Metadata load process: all metadata is loaded every time the process runs → only the new or changed metadata is loaded

● Execution time: eight hours to finish the process for all applications → one hour for all applications, depending on the amount of changed data

● Total number of ODI objects: 19 packages, 51 interfaces, 32 procedures → 8 packages, 6 interfaces, 9 procedures

● Development cost: great effort replicating similar interfaces for new applications and dimensions → only the process parameters need configuring to include a new application or dimension

● Maintenance cost: multiple change/error points to verify each time a change is needed → centralized code for all applications, with only one place to change if required

● Scalability: new interfaces must be created to supply new applications → no new interfaces; the code automatically identifies a new application from a configuration table

Process Final Results

Conclusion

Include a new application

● One procedure to load metadata from the source to the generic inbound table

● A few inserts into the parameter table (see the sketch below)

● Minimal development time: four days

Complete change in the entity structure

● No development effort

● Only one execution of the metadata process to reorganize the entire structure

Dynamic Structure Maintenance Impact

Conclusion
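For illustration only, since the deck does not show the parameter table's layout: registering a new application could amount to inserts like these, using the same hypothetical METADATA_PARAMETERS names as the loop sketch earlier:

```sql
-- Register the new application's dimensions so the dynamic process
-- picks them up on the next run (illustrative names throughout).
INSERT INTO METADATA_PARAMETERS (APP_NAME, DIM_NAME, ACTIVE_FLAG)
  VALUES ('NEWAPP', 'Account', 'Y');
INSERT INTO METADATA_PARAMETERS (APP_NAME, DIM_NAME, ACTIVE_FLAG)
  VALUES ('NEWAPP', 'Entity', 'Y');
```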

Integrate ASO cubes into the generic metadata architecture

Use the Essbase API to get metadata information about ASO cubes

Maybe at KScope14?

Future Projects

Conclusion

Questions?

Questions

Ricardo Giampaoli – TeraCorp / Rodrigo Radtke de Souza – Dell

Thank you!

