Rules Files from Basic to Advanced
Glenn Schwartzberg ♠
About interRel
• 2008 & 2009 Oracle Titan Award winner - EPM Solution of the Year
• 2008 Oracle EPM Excellence Award
• 2009 Oracle EPM/BI Innovation Award
• One of the fastest growing companies in the world (Inc. Magazine, '08 & '09)
• Two of the three Hyperion Oracle ACE Directors in the world
• Founding Hyperion Platinum Partner; now Oracle Certified Partner
• Focused exclusively on Oracle Hyperion EPM software
– Consulting
– Training
– Infrastructure and Installation
– Support
– Software sales
• 5 Hyperion books available:
– Essbase (7): Complete Guide
– Essbase System 9: Complete Guide
– Essbase System 9: End User Guide
– Smart View 11: End User Guide
– Essbase 11: Admin Guide
– eBooks available on Amazon Kindle
• Just out!
– Hyperion Planning for End Users
• Coming Soon
– Hyperion Planning for Admins (days now…)
– Hyperion Financial Management (Q1 2010)
• To order, check out www.lulu.com
Disclaimer

These slides represent plagiarism and opinions of the presenter and do not constitute official positions of Oracle or any other organization. This material has not been peer reviewed and is presented here in spite of the better judgment of the presenter.
A Quick Poll
• Who here (you can only vote once):
– Is new to rules files
– Has limited exposure to rules files
– Is experienced with rules files
– Should be up here teaching the session
– Came in to take a nap
– Wants a dollar (too bad, you already voted)
Agenda
• Overview
• Dimension Build Basics
• Data Load Basics
• Unique Situations
Rules File Basics
• Can be used to build dimensions and/or load data
• Can use multiple types of input:
– Text files
– Excel files
• Excel 2003 format and earlier, as of v11.1.1.3
– SQL statements
– Error files from other load rules
• Rows starting with \\ are ignored:
\\ Member Mfg_Ops_Supp Not Found In Database
EMEA,IE_CORP_USD,2010,9,3301,4320,100000,23063
• When reloading an error file, remember not to reuse the same error file name
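A rejected-records reload might look like this in MaxL (application, rules file, and file names here are hypothetical); note the second import writes to a different error file so the first one is not overwritten mid-load:

```maxl
/* first load: rejected rows are written to load1.err */
import database Sample.Basic data
    from data_file 'actuals.txt'
    using server rules_file 'LdData'
    on error write to 'load1.err';

/* reload just the rejects; use a DIFFERENT error file name */
import database Sample.Basic data
    from data_file 'load1.err'
    using server rules_file 'LdData'
    on error write to 'load2.err';
```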
Rules File Basics (Continued)
• Basic ETL is possible:
– Splitting fields
– Adding text
• Prefixes
• Suffixes
• Whole columns
– Replacing values
– Joining fields (additive or merging)
– Moving fields
– A null field can cause promotion for dimension builds (if selected)
– Selection/rejection of rows is possible
Dimension Build Methods
• Common
– Generation
– Level
– Parent/Child (best for shared rollups)
• Uncommon
– As Child Of
– As Sibling with Matching String
– As Sibling of Lowest Level
Common Examples
• Generation
– Top down
– Dimension is Gen 1
• Level
– Bottom up
– Leaf is Level 0
• Parent/Child
– Parent exists before child is added
– Additional occurrences create shared members
As Child Of
• Parent must already exist
• Note: the heading for the column is the dimension name
• Good for rejections and unclassified new members
As Sibling with Matching String
• Tries to match as best it can
• Starting with the left characters - equal or greater than
• If no match, adds to the bottom of the dimension
• Put in the dimension name as the field header
Example of Matching String
Before
After
Good when there is logic in numbers like product codes and accounts
As Sibling of Lowest Level
• Adds to the first hierarchy's level 0 member
• Put the dimension name as the column heading
• Good for simple hierarchies
Shared Members
• Easiest to add with the Parent/Child format
– Secondary instances of a member get added as shared members
• Can also create shares with Level or Generation builds
– Use DupGen with a Generation build
– Use Level with a Level build
• Examples of Generation and level build
Shared Members
• Generation
• Level
Shared Members
• Can't move shared rollups with a single rule; it takes two rules:
– Specify "Remove Unspecified" in the first rule
– Load a dummy file with a single row using the first rule
– Specify "Merge" in the second rule (or reuse the first rule)
– Load the complete hierarchy with the second rule
• Load both with the same MaxL statement so you don't lose data
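In MaxL, both builds can go into one import dimensions statement (comma-separated sources), which keeps the remove and the rebuild in a single restructure so the data survives; all names here are hypothetical:

```maxl
/* one statement, two builds: remove-unspecified first, full rebuild second */
import database Sample.Basic dimensions
    from data_file 'remove.txt'   using server rules_file 'DimRmv',
    from data_file 'fullhier.txt' using server rules_file 'DimBld'
    on error append to 'dimbuild.err';
```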
Example
• Create a file with a single row
• Set Member Update to "Remove Unspecified"

Example Page 2
• Start the load with the remove file
After

Example Page 3
• Load the full reload file
After

Important: remember to do the remove and rebuild(s) in a single MaxL statement or you will lose all the data in the cube
Data Loading
Much simpler than dimension building
Most Efficient Data Loading (BSO)
• How to get the most efficient format:
– Load a single row into your database from Excel
– Export the database in column format
– Look at the file
– Build your load rule in the same order as the file (if possible)
– Make sure to sort the file from left to right
– If you change the dimensions (dense/sparse or order), you need to rebuild the rules file
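One way to get that reference file is a column-format export in MaxL (database and file names hypothetical):

```maxl
/* export level-0 data in column format; the column order in exp.txt
   shows the most efficient field order for the load rule */
export database Sample.Basic level0 data in columns to data_file 'exp.txt';
```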
Data Load Efficiency (BSO), Continued
• Reduce the amount of data you are loading:
– Remove extra columns from your file
– Convert zeros to #Mi
– Use a header whenever possible instead of creating a text column
– Minimize replacements and selection/rejection criteria (do them in the source if possible)
ASO Load Process
• Behind the scenes, the ASO data load process is a bit different from loading block storage databases
• Data sources include text files or relational databases
• Can load one or more data sources
• Can load with or without rules files
• If multiple sources are used, choose:
– Overwrite existing values
– Add to existing values
– Subtract from existing values
• Because the data files are potentially very large, a temporary load buffer is used
– Loading a single load file does not involve the buffer
Prepare for the Data Load
• Don't include fields that are applicable only to BSO (they will be ignored)
• Load level zero data only
• If #Missing is specified, the cell will be removed from the database
• Don't worry about sorting*
– The load buffer will sort and accumulate values
• Currency name and currency category are not supported

*Presorting multiple data sources will improve performance even though the buffer always sorts: a sort-sort-merge-sort is faster than a nonsort-nonsort-merge-sort. Not as dramatic a difference as with BSO, but still important on very large loads and time-critical incremental loads.
Use Load Buffer with Multiple Files
• Aggregate Use Last option
• Resource Usage
Load Data with MaxL – Multiple Data Sources
• Using the load buffer during incremental data loads improves performance
• Initialize the load buffer to accumulate the data:
alter database AsoSamp.Sample initialize load_buffer with buffer_id 1;
• Read the data sources into the load buffer:
import database AsoSamp.Sample data from server data_file 'file_1' to load_buffer with buffer_id 1 on error abort;
import database AsoSamp.Sample data from server data_file 'file_2' to load_buffer with buffer_id 1 on error abort;
import database AsoSamp.Sample data from server data_file 'file_3' to load_buffer with buffer_id 1 on error abort;
• Load data from the buffer into the database:
import database AsoSamp.Sample data from load_buffer with buffer_id 1;
Load Data with MaxL – Multiple Data Sources
• Import statements do not need to be consecutive
• As long as the buffer exists, the database is locked from queries, aggregations, and data loads by other means
Concurrent Loads
• Multiple load buffers can exist on an ASO database
• Load data simultaneously using multiple data buffers
• Commit multiple data load buffers in the same operation– Faster than committing each buffer by itself
• Must use separate sessions in MaxL
Concurrent Data Loads
• MaxL Session 1:
alter database AsoSamp.Sample initialize load_buffer with buffer_id 1 resource_usage 0.5;
import database AsoSamp.Sample data from data_file "dataload1.txt" to load_buffer with buffer_id 1 on error abort;
• MaxL Session 2:
alter database AsoSamp.Sample initialize load_buffer with buffer_id 2 resource_usage 0.5;
import database AsoSamp.Sample data from data_file "dataload2.txt" to load_buffer with buffer_id 2 on error abort;
• When the data is fully loaded, one commit statement:
import database AsoSamp.Sample data from load_buffer with buffer_id 1, 2;
Slice Loading = Incremental Loads
• Enables "trickle feed" functionality
• Issue: users perform retrievals while the database is being loaded, and ASO loads can take a while
• Incremental loading creates subcubes, or slices, alongside the primary slice of the database
• Dynamic aggregations are performed across the necessary slices to provide query results
• Different materialized views might exist within a slice as compared to the primary slice of the database
Incremental Data Loads
• Incremental data load time is proportional to the size of the incremental data
• Options:
– Merge all incremental slices into the main database slice, or
– Merge incremental slices into a single data slice, leaving the main database slice unchanged
• Load data into multiple data load buffers at the same time
Incremental Loads
Merging Slices
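The two merge options map onto MaxL alter database statements; a sketch against the AsoSamp sample database:

```maxl
/* merge every incremental slice into the main database slice */
alter database AsoSamp.Sample merge all data;

/* or: merge the incremental slices into a single slice,
   leaving the main slice untouched */
alter database AsoSamp.Sample merge incremental data;
```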
Partial Data Clear
• New feature in 11.x
• Why?
– Need to clear and modify selected portions of the cube
– Example: refresh actuals while budget stays static
• Physical clear completely removes cells
– Takes longer to clear
– Faster retrievals after aggregation
• Logical clear removes cells by creating compensating cells in a new slice
– Faster to clear
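Both clears are issued with the MaxL "clear data in region" grammar and an MDX set; the member names below are hypothetical:

```maxl
/* physical clear: the cells are actually removed */
alter database AsoSamp.Sample clear data in region
    'CrossJoin({[Actual]}, {[Jan]})' physical;

/* logical clear (the default): a compensating slice is written instead */
alter database AsoSamp.Sample clear data in region '{[Budget]}';
```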
• Unique situations
Blank Values
• By default, Essbase will replace a blank value with the prior row's value (not necessarily a good thing)
• To replace blanks with a default value instead:
– Add a text column with a unique value or phrase (I like the tilde, "~")
– Concatenate it with your column
– Replace "~" with the default value, selecting "Replace whole word" (catches the rows that were blank)
– Replace "~" with nothing, without checking "Whole word" (cleans up the rest)
Blank Value Example
Flipping the Sign on Data Loads
• Three ways (there are more):
– Use a UDA and specify it in the load rule
• I've been told it does not work on ASO
– Scale the column (multiply by -1)
– Keep the natural sign and use a member formula to reverse it
Rejection/Selection Criteria
• Select or reject rows based on criteria
• Uses And/Or logic for a single column
– Example: Field 2 = Jan or Feb or Mar
Global Selection/Rejection
• Across columns, the logic can be set to "And" or "Or"
– "Select if Field 1 = Actual and Field 5 contains 2010" only selects rows where both conditions are true
– "Reject if Field = Budget or Field 5 < 2010" rejects if either is true
• Dimension building and data loading can be set differently for the global option
Sub Variables in Load Rules
• Where can they be used?
– As a column header
– As a data source for SQL
– Inside the SQL
Suppose we create the following variables
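The variables themselves are created in MaxL; the names and values below are hypothetical stand-ins for the ones shown on the slide:

```maxl
/* server-wide variables, visible to every application */
alter system add variable CurrMonth 'Jan';
alter system add variable CurrYear 'FY10';

/* or scope one to a single database */
alter database Sample.Basic add variable CurrVersion 'Final';
```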
As Column Header
Remember to include the “&”
As a File Header
As Data Source
Makes it easy to change environments
Result
In the SQL
Result
More SQL
• If you use the "Select", "From" and "Where" boxes, don't include the keywords
• To sort or group, you need a Where clause
– Try "1=1 group by x,y,z order by 1,2,3,4"
• Even better: put everything in the Select box
– Note: no beginning "Select", but include one after each union
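As a sketch, the rules file "Select" box for a union query might hold something like this (table and column names are hypothetical); note there is no leading SELECT, but the second half of the union spells it out:

```sql
-- everything below goes in the "Select" box; "From"/"Where" stay empty
  entity, account, period, SUM(amount)
FROM actuals_fy10
GROUP BY entity, account, period
UNION ALL
SELECT entity, account, period, SUM(amount)
FROM actuals_fy11
GROUP BY entity, account, period
```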
More Load Rule Tricks
• Using header names as column headers
• Why?
– Use one load rule for different load files
– Simplifies the process; less to code
Load Rule - LaodAll
Dynamically add DTS
DTS member    Gen Name
H-T-D History
Q-T-D Quarter
Y-T-D Year
M-T-D Month
S-T-D Season
W-T-D Week
P-T-D Period
D-T-D Day
Other properties you can set
Additional Topics
• Freeform data loading
• Time-based loading
• Attribute dimension loading
• Setting properties
• Removing UDAs
• Join / Create Join / Create Text
• Studio rules
– Can't edit dimension builds
– Can edit data loads
– Use Deploy MaxL
• SQL vs. flat file loads (pros and cons)
• One file to do both dimension and data loading
Questions
????????
[email protected]
http://glennschwartzbergs-essbase-blog.blogspot.com/