
xxxxxxxxxxxxx

Name : xxxxxxxxxxxx
ID : 2005HZxxxxxxx
Organization : xxxxxxxxxxx Company
Location : Bangalore

Introduction

The LaQuinta Enterprise Reporting Architecture is a strategic and tactical BI platform for the analysis and reporting of LaQuinta enterprise data. The following business groups are the major users:

• OPERATIONS
• MARKETING
• SALES
• FINANCE
• REVENUE MANAGEMENT
• CALL CENTER
• INFORMATION TECHNOLOGY

Architecture

The Enterprise Architecture is based on a 4-Tier structure with "LaQuinta Data Warehouse 2.0" as the focal point.

The four tiers consist of:

• Legacy and Transaction Source Systems
• Data Staging
• LaQuinta Data Warehouse
• Cognos Presentation

Fact Tables

LAQ DW 2.0 adds five new fact tables and provides more dimension data in the existing fact table. The following fact tables are used in the LaQuinta DW 2.0 project:

• Reservation Fact (exists in DW 1.7)
• Group and Tour Fact
• Central Reservation Office Fact
• Denials Fact
• Call Center Fact
• Budget Fact

Key Features of Version 2.0

• Subject orientation - Multiple subjects such as Folio, Call Center, Budgets, Sales, HR, etc.

• Conformed dimensions - Dimensions that can be shared across subject areas

• Multiple Aggregations - Need based aggregation for specific summary level reporting

• Standard naming conventions across the board
• Optimized data structures using standard data warehousing principles

Architecture Goals

• Improved visibility of Enterprise BI
– Subject orientation - Add new areas for analysis and reporting
– Enable user teams to create strategic and tactical dashboards
– Make available ONE Enterprise Metadata dictionary (Business/Functional/Operational)
– Consistency in handling time, booking, stay, etc. across the board
– Fix existing problems (data gaps, Cognos issues)

• Remove inefficiencies: data structures, ETL, reports, cubes
– Standardize DW practices (features, design, naming, reference data…)
– Lighter, simpler Cognos presentation layer
– Scalable, flexible structure and processes
– Ease of configuration and maintenance

• Build upon the existing foundation

Why Version 2.0?

LaQuinta DW was the existing data warehouse application, built by another vendor. As the business evolved, many new features were needed; the existing system could not scale to accommodate the changes and also contained issues and bugs. To accommodate these needs, we planned and developed a new release called LaQuinta DW 2.0. LaQuinta uses an application called MQC (Mercury Quality Center) to log issues, bugs and suggestions for all its projects.

Project Schedule from MS Project Plan

Month      Week        Task
June       1           Submit the Outline
June       2, 3        Functional and Technical Requirement Gathering
June       4           Design Review
July       1, 2        UI Design Review and Assisting with the Architectural Design
July       3           Designing the Database
July       4           Integrate with Company Standards and Codes
August     1, 2, 3, 4  Developing the ETL Operations
September  1, 2        Unit, Peer and Integration Testing
September  3           Production Deployment
September  4           Production Support and Bug Fixes (regular activity)
October    1           Upload and submit the Final Report

ETL Tool Used in the Warehouse

The ETL tool used in this application is SQL Server 2000 DTS packages. The following is the package information used in the LaQuinta Data Warehouse 2.0 project with respect to the Reservation Fact.

Reservation Fact

The transactional data from Night Vision Corporate (NVCorp) is extracted by the master DTS script on the Data Warehouse server at 07:00 AM EST every day and loaded into the Reservation Fact. The data flows into NVCorp from the Property Management System (PMS) at night; this process is called the Night Audit.

The Night Audit process captures all the transactions that happen in a PMS and consolidates the data in NVCorp. The following sections give detailed information about the sub-packages executed by the master DTS script. The DTS package used for loading the data into the Reservation Fact is LOAD DATA WAREHOUSE.

[Use Case Diagram for Reservation Fact (ud Master Daily DataWarehouse Load): the DW Timer actor triggers loading static data to staging and dimension tables, loading daily folios and folio charges to staging, loading daily folios to the FACT table, processing daily consumed room night and revenue transactions, loading the Booking Pace and Booking Horizon facts, and executing the Data Audit Tool.]

Reservation Fact

The following packages are used with respect to the Reservation Fact:

• Daily Data Warehouse Dimension Load
• Daily Data Warehouse Staging Load
• Transactional Staging Feed
• Transactional Fact Feed

Reservation Fact

Daily Data Warehouse Dimension Load

This is a sub-package of the master DTS package that runs on a daily basis as part of loading the LAQ daily data warehouse fact table.

This process will load the following dimensions:

• ROOMTYPE
• COMPANY
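
As an illustration of what a daily dimension load step might look like, the sketch below upserts the ROOMTYPE dimension from a staging table. The table and column names (STG_ROOM_TYPE, DIM_ROOM_TYPE, and so on) are assumptions for illustration only, not the project's actual schema.

```sql
-- Hypothetical sketch of a daily dimension load step (names are assumptions,
-- not the actual DW 2.0 schema). Update rows that already exist, then insert new ones.
UPDATE d
SET    d.ROOM_TYPE_DESC = s.ROOM_TYPE_DESC,
       d.UPDATED_DATE   = GETDATE()
FROM   DIM_ROOM_TYPE d
       INNER JOIN STG_ROOM_TYPE s
               ON d.ROOM_TYPE_CODE = s.ROOM_TYPE_CODE
WHERE  d.ROOM_TYPE_DESC <> s.ROOM_TYPE_DESC

INSERT INTO DIM_ROOM_TYPE (ROOM_TYPE_CODE, ROOM_TYPE_DESC, CREATED_DATE)
SELECT s.ROOM_TYPE_CODE, s.ROOM_TYPE_DESC, GETDATE()
FROM   STG_ROOM_TYPE s
WHERE  NOT EXISTS (SELECT 1
                   FROM   DIM_ROOM_TYPE d
                   WHERE  d.ROOM_TYPE_CODE = s.ROOM_TYPE_CODE)
```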

Reservation Fact

Daily Data Warehouse Staging Load

Each day the master data warehouse script calls this sub-package, which loads the data into the staging environment.

The name of this DTS Package is Load Static Data to STAGING and it is located on the DW server.

Generally, the following items are moved into the staging tables:

• Room Type
• CRS Numbers
• Company Data
• Aptech Properties
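
A minimal sketch of one such staging refresh, assuming a hypothetical source table in NVCorp and a staging table named STG_ROOM_TYPE (both names are assumptions):

```sql
-- Hypothetical refresh of one staging table (all names are assumptions).
-- Staging is emptied and reloaded in full before the dimension load runs.
TRUNCATE TABLE STG_ROOM_TYPE

INSERT INTO STG_ROOM_TYPE (ROOM_TYPE_CODE, ROOM_TYPE_DESC, EXTRACT_DATE)
SELECT RT_CODE,
       RT_DESCRIPTION,
       GETDATE()
FROM   NVCORP.dbo.ROOM_TYPE    -- assumed name for the source table in NVCorp
```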

Reservation Fact

Daily Data Warehouse Staging Feeds

This process loads the following items into the staging tables:

• Daily Folio Charges
• Cash Sale
• Foliolink Reservation Rate
• Zero Fact

Reservation Fact

Daily Transactional Fact Feeds

This process loads all the reservation details from the staging tables into the Reservation Fact table. The following items are loaded into the fact table:

• Daily Folio Charges
• Daily Consumed Room Nights
• Daily Consumed Revenue
• Daily Cancels
• Booking Fact
• Booking Horizon
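
A hedged sketch of how one such staging-to-fact move might look for daily folio charges; every table and key name below is illustrative, since the project's actual schema is not given in this document.

```sql
-- Hypothetical load of daily folio charges from staging into the Reservation Fact.
-- Surrogate keys are resolved by joining the staging rows to the dimension tables.
-- Staging is assumed to hold only the current day's Night Audit extract.
INSERT INTO RESERVATION_FACT
       (STAY_DATE_KEY, PROPERTY_KEY, ROOM_TYPE_KEY, FOLIO_NUMBER,
        ROOM_NIGHTS, ROOM_REVENUE)
SELECT d.DATE_KEY,
       p.PROPERTY_KEY,
       rt.ROOM_TYPE_KEY,
       s.FOLIO_NUMBER,
       s.ROOM_NIGHTS,
       s.ROOM_REVENUE
FROM   STG_DAILY_FOLIO_CHARGES s
       INNER JOIN DIM_DATE d       ON d.CALENDAR_DATE   = s.STAY_DATE
       INNER JOIN DIM_PROPERTY p   ON p.PROPERTY_CODE   = s.PROPERTY_CODE
       INNER JOIN DIM_ROOM_TYPE rt ON rt.ROOM_TYPE_CODE = s.ROOM_TYPE_CODE
```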

Reservation Fact

Load Data Warehouse Master Script

• This is the MASTER DTS package that gets executed once a day to load the Reservation Fact. It also loads the Booking Horizon and Booking Pace fact tables daily.

• It is a scheduled job on the DW server, set to run each day starting at 7:00 AM Eastern Standard Time. It will only start processing if the Night Audit posting is at least 99% complete for that day. The job it polls to check for completion is a DTS package called "Load Booking Data"; if that job has not completed by 7:00 AM, the load will not execute.
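
The completion check could be sketched roughly as below. The NIGHT_AUDIT_STATUS control table and its columns are assumptions used only to illustrate the idea of guarding the load; the real master package polls the "Load Booking Data" DTS job instead.

```sql
-- Hypothetical guard at the top of the master load (control table name is an assumption).
DECLARE @PctComplete NUMERIC(5, 2)

SELECT @PctComplete = 100.0 * SUM(CASE WHEN NIGHT_AUDIT_POSTED = 'Y' THEN 1 ELSE 0 END)
                            / NULLIF(COUNT(*), 0)
FROM   NIGHT_AUDIT_STATUS
WHERE  BUSINESS_DATE = CONVERT(CHAR(10), DATEADD(dd, -1, GETDATE()), 120)

IF ISNULL(@PctComplete, 0) < 99.0
BEGIN
    -- Below the 99% threshold: skip today's warehouse load.
    RAISERROR ('Night Audit posting below 99%% complete - skipping warehouse load.', 16, 1)
    RETURN
END

-- ...otherwise continue with the dimension, staging and fact loads...
```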

Reservation Fact

Data Audit Tool

This is a daily process that collects occupancy and revenue numbers from the NVCorp, APTECH and data warehouse tables and compares them by property and by stay date. The occupancy and revenue differences between NVCorp and APTECH, and between NVCorp and the data warehouse, are calculated by stay date. The comparison is stored in a table called DATA_AUDIT_TOOL in the staging database on the DW box.
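
A simplified sketch of one of the two comparisons (NVCorp versus the data warehouse) is shown below. Only DATA_AUDIT_TOOL is named in this document; the two summary tables and all column names are assumptions.

```sql
-- Hypothetical daily audit insert: compare occupancy and revenue by property and stay date.
-- NVCORP_REV_SUMMARY and DW_REV_SUMMARY are assumed pre-aggregated sources.
INSERT INTO DATA_AUDIT_TOOL
       (PROPERTY_CODE, STAY_DATE, NVCORP_REVENUE, DW_REVENUE, REVENUE_DIFF,
        NVCORP_OCCUPANCY, DW_OCCUPANCY, OCCUPANCY_DIFF, AUDIT_DATE)
SELECT nv.PROPERTY_CODE,
       nv.STAY_DATE,
       nv.ROOM_REVENUE,
       dw.ROOM_REVENUE,
       nv.ROOM_REVENUE   - dw.ROOM_REVENUE,
       nv.ROOMS_OCCUPIED,
       dw.ROOMS_OCCUPIED,
       nv.ROOMS_OCCUPIED - dw.ROOMS_OCCUPIED,
       GETDATE()
FROM   NVCORP_REV_SUMMARY nv
       INNER JOIN DW_REV_SUMMARY dw
               ON dw.PROPERTY_CODE = nv.PROPERTY_CODE
              AND dw.STAY_DATE     = nv.STAY_DATE
```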

Group and Tour Fact

The Group and Tour Booking fact is another fact table, used to store only the group bookings made through the call center by the VAS agents.

[Use Case Diagram for GNT Fact (ud Daily Group Booking Load): the DW Timer actor triggers extracting the Group Link data and updating the DW fact table.]

Group and Tour Fact

Extract Group Link Data

In NV, there are three files of importance: Group Master, Group Header and Group Link. Every time a property creates a group, it uses the Group Master first. Group information is available only on a daily basis, not by the hour. The hierarchy is Group Master --> Group Header --> Group Link.

Group Link is similar to a folio link and contains information such as:
• GROUPLINK NUMBER
• SALES PERSON

To reduce the authorized count on a group booking, the associated folio link records have to be cancelled first.

The folio number of the group reservation is designated as the group master. To identify the group booking details, we use the foliolink table in the NVCORP database, identifying the group by the master number and group link number.
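
As an illustration only, that lookup might be expressed roughly as follows. The foliolink table and the master/group link numbers come from the text above; the column names are assumptions.

```sql
-- Hypothetical lookup of the reservations that belong to one group booking.
DECLARE @GroupMasterFolio INT, @GroupLinkNumber INT
SELECT  @GroupMasterFolio = 1234567,   -- folio number designated as the group master (illustrative)
        @GroupLinkNumber  = 89         -- group link number (illustrative)

SELECT fl.FOLIO_NUMBER,
       fl.GROUPLINK_NUMBER,
       fl.SALES_PERSON
FROM   foliolink fl                    -- table in the NVCORP database
WHERE  fl.MASTER_NUMBER    = @GroupMasterFolio
  AND  fl.GROUPLINK_NUMBER = @GroupLinkNumber
```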

CRO Booking and Denials Fact

CRO Booking

The Pegasus BDE file is used to pull daily gross, net and cancellation figures from the daily extract. The extract is done by filtering on Reservation Center ID. There are currently five reservation center IDs that are used to filter call center reservation data.

They are: 'RETRC', 'NBCRC', 'LQCRC', 'VAOPS', 'NETRZ'.
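
A hedged sketch of that filtering step; the five reservation center IDs are from the text above, while the staging table and column names are assumptions.

```sql
-- Hypothetical daily extract of CRO bookings from the Pegasus BDE staging table,
-- keeping only the five call-center reservation center IDs.
SELECT RESERVATION_CENTER_ID,
       CONFIRMATION_NUMBER,
       GROSS_BOOKINGS,
       NET_BOOKINGS,
       CANCELLATIONS
FROM   STG_PEGASUS_BDE
WHERE  RESERVATION_CENTER_ID IN ('RETRC', 'NBCRC', 'LQCRC', 'VAOPS', 'NETRZ')
```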

Denials

Each day, as part of the Pegasus feeds, we receive the yield records, better known as denials data. LaQuinta receives both system-generated and user-generated denials; the following can be reasons for a denial:

• Closed To Arrival
• Location Does Not Meet Needs
• Minimum Length Of Stay Restriction
• No Availability - Hotel Is Sold Out

[Use Case Diagram for CRO Booking Fact (ud Daily Call Center Data Load): the DW Timer actor triggers loading the Call Center fact data, loading the Call Center dimensions, loading the agent-operator mapping, and generating the daily VAS data feed files.]

Call Center Fact

The Call Center Fact is loaded with call center details such as the agent operator who made the reservation, their skill sets, denials information, and the call center location.

The Daily VAS extract files include all CRO feeds such as:

• DNIS Reference
• DNIS Activity
• Skillset Reference
• Skillset Activity
• Agent Operator – VAS Agent Mapping

Budget Fact

This ETL process is normally meant to be run on a yearly basis. However, it has been designed to run any time on demand. It is capable of detecting and updating changes any time during the year. The data pertains to the budget values (room revenue, room occupied percentage and rooms available) for all properties for all 365 days of the year.

Budget data is extracted from the Aptech datamart database and loaded into the LAQUINTA_DW data warehouse. The budget data is available only at property level, in the TransBudget table of the Aptech datamart.
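
The extract could look roughly like the sketch below. TransBudget and LAQUINTA_DW are named above; the three-part source name, the BUDGET_FACT table and every column name are assumptions for illustration.

```sql
-- Hypothetical yearly/on-demand budget extract from the Aptech datamart.
-- Budget values exist only at property level, one row per property per day.
INSERT INTO BUDGET_FACT (PROPERTY_KEY, BUDGET_DATE_KEY, ROOM_REVENUE,
                         ROOMS_OCCUPIED_PCT, ROOMS_AVAILABLE)
SELECT p.PROPERTY_KEY,
       d.DATE_KEY,
       tb.ROOM_REVENUE,
       tb.ROOMS_OCCUPIED_PCT,
       tb.ROOMS_AVAILABLE
FROM   APTECH_DM.dbo.TransBudget tb     -- assumed three-part name for the Aptech datamart
       INNER JOIN DIM_PROPERTY p ON p.PROPERTY_CODE = tb.PROPERTY_CODE
       INNER JOIN DIM_DATE d     ON d.CALENDAR_DATE = tb.BUDGET_DATE
WHERE  tb.BUDGET_YEAR = YEAR(GETDATE())
```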

[Use Case Diagram for Budget Fact (ud Daily Budget Load): the DW Timer actor triggers extracting the budget data, loading the budget dimensions, and loading the budget data into the warehouse.]

UAT Environment

How was data initialization done to start the project?

The following steps were used to initialize all the dimension tables used in the LaQuinta DW 2.0 Project.

• Truncate all new tables.
• Populate dimension data for each new dimension table from the existing dimension tables after applying the required transformation logic.
• Run the fact load process for loading the new fact tables using data extracted from the source.
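
A condensed sketch of that initialization sequence for a single dimension table is shown below; the table names and the transformation applied are illustrative only.

```sql
-- Hypothetical one-time initialization for one new DW 2.0 dimension (names are assumptions).
-- Step 1: truncate the new table.
TRUNCATE TABLE DIM_ROOM_TYPE_V2

-- Step 2: populate it from the existing dimension table, applying transformation logic.
INSERT INTO DIM_ROOM_TYPE_V2 (ROOM_TYPE_CODE, ROOM_TYPE_DESC, CREATED_DATE)
SELECT LTRIM(RTRIM(ROOM_TYPE_CODE)),
       UPPER(ROOM_TYPE_DESC),            -- example transformation only
       GETDATE()
FROM   LAQUINTA_DW.dbo.DIM_ROOM_TYPE     -- existing dimension (database and table names assumed)

-- Step 3: the fact load process (master DTS package) is then run against source data
-- to populate the new fact tables.
```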

UAT Environment

Method used for Data Validation

The following methods are used to do data validation on both the ETL and Cognos sides.

• Cube/Report Validation
– Run queries against the data warehouse
– Values from cubes/reports should match the query results

• ETL Validation
– Run queries against the source data
– Run queries against the data warehouse
– These two query results must match.
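
For example, an ETL validation of daily room revenue by property and stay date might compare a source aggregate against the warehouse aggregate, as sketched below with illustrative table and column names:

```sql
-- Hypothetical ETL validation: the two result sets must match row for row.
-- Against the source (NVCorp):
SELECT PROPERTY_CODE, STAY_DATE, SUM(ROOM_REVENUE) AS ROOM_REVENUE
FROM   NVCORP.dbo.FOLIO_CHARGES          -- assumed source table name
GROUP  BY PROPERTY_CODE, STAY_DATE

-- Against the data warehouse:
SELECT p.PROPERTY_CODE, d.CALENDAR_DATE AS STAY_DATE, SUM(f.ROOM_REVENUE) AS ROOM_REVENUE
FROM   RESERVATION_FACT f
       INNER JOIN DIM_PROPERTY p ON p.PROPERTY_KEY = f.PROPERTY_KEY
       INNER JOIN DIM_DATE d     ON d.DATE_KEY     = f.STAY_DATE_KEY
GROUP  BY p.PROPERTY_CODE, d.CALENDAR_DATE
```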

UAT Deployment Diagram

[The current DW 1.7 environment on 10.240.200.105 (LaQuinta_DW, Staging_DW, 72 reports - 47 DW and 25 Aptech - and 6 cubes) runs in parallel with the DW 2.0 environment on server 10.240.200.21 (LaQuinta_DWH, Staging_DWH), both fed from the NVCorp, BDE and VAS sources; staging and dimension delta loads, DW loads and cube spins are scheduled between 7 AM and 3 PM.]

• Run two ETL processes from NVCorp.
• Two sets of cubes and reports for UAT comparison the next day at 8 AM CST.
• Copy production cubes to the TEST portal and rename them.


ETL Test Results from MQC

As noted earlier, LaQuinta uses MQC (Mercury Quality Center) to log issues, bugs and suggestions for all its projects. During UAT, we ran the test cases in this issue tracking tool to log the test results, which were captured in a test results graph.

Target Deployment Diagram

[dd LAQ Deployment Model: the Springfield LAQ site hosts the primary portal server SPRPQCOGW01 (10.240.200.95 - Cognos Server 8.2, production portal, published cubes and reports), the secondary portal server SPRPQAPT01 (10.240.200.106 - Cognos Server 8.2, PowerPlay Enterprise Server 8.0, PowerPlay IIS server and web client), the Aptech cube builder SPRPQAPT02 (10.240.200.107 - LAQ cube model, PowerPlay Transformer/Impromptu, SQL client tools), the NVCorp DB server SPRPQNVDB01 (10.240.200.104) and the LAQ data warehouse DB server SPRPQDMDB01 (10.240.200.105 - SQL Server 2000, Trillium 7.5, STAGING, and the LAQ_DW and LAQ_DWH components including the Reservation_Folio, Group_and_Tour, CRO_Booking and Revenue datamarts). Supporting nodes include the CRS (Pegasus) FTP server, the FMI and Medallia servers, the VAS secure FTP server and VAS ACD machine, PeopleSoft web and app servers with FSPROD databases (primary and failover), a custom Cognos cube processor (10.240.200.144), a SiteScope monitoring server, a Teradata warehouse 8.0 database, a SAN, and a cold standby machine for NVCORP/APTECH/DW. Data moves between the components over ODBC, FTP file feeds, NVCorp transaction log shipping, daily Aptech backup/restore, and daily cube uploads.]

Summary

With the implementation of DW 2.0, we have been able to provide the LaQuinta business users with the benefits of in-depth data analysis and projections, while bringing additional business components into the data warehouse model. We were also able to streamline the existing version to meet data warehousing standards.

Along with the above inclusions, we were able to provide the customer with performance enhancements across the complete data warehouse model as a value-add, and also fixed a few known issues and bugs.

Future Work

Once DW 2.0 is released into production, we will move into the production support phase, where the entire data warehouse environment will be monitored closely until the system stabilizes and any issues that crop up during the initial stages are fixed.

Once the data warehouse system is stable, we will start on the next level of enhancements as part of DW 2.1.

Thank You

