PMLC Project Management Life Cycle
The George Washington University eExpense System Implementation Project Test Plan & Procedures
Prepared By: Jeff Pearson
Version: 1
Date: August 13, 2012
Project Owners: Deb Dickenson, Comptroller’s Office; Donna Ginter, Procurement
Project Manager: Adam Donaldson, Business Management & Analysis Group (BMAG) – Office of Finance
STATUS: Draft
Routing for Document Approval: [ ] Approved [ ] Unapproved
PMLC Test Plan & Procedures eExpense System Implementation Project
Confidential Not for release outside of The George Washington University without written permission
Page 2
Table of Contents
Revision History ....................................................................................................................................................4
1 Introduction ..................................................................................................................................................5
1.1 Executive Summary...............................................................................................................................5
1.2 Purpose of Test Plan .............................................................................................................................5
1.3 Solution Description / Project Deliverables ..........................................................................................5
1.4 Scope of Testing ....................................................................................................................................6
1.4.1 Types of Testing Not Performed: ....................................................................................................6
1.5 Testing Risks, Assumptions & Constraints ............................................................................................7
1.5.1 Test Plan Risks and Risk Mitigation Strategy...................................................................................7
1.5.2 Test Plan Assumptions: .................................................................................................................10
1.5.3 Test Plan Constraints:....................................................................................................................10
2 Test Strategy ...............................................................................................................................................10
2.1 Reliance on Support Resources and Highly Knowledgeable End Users:.............................................10
2.2 Approach for Testing Concur Functionality / Configuration...............................................................11
2.3 Approach for Oracle Financials (EAS) Development / System Changes .............................................11
3 Test Preparation & Planning .......................................................................................................................11
3.1 Test Schedule & Milestones................................................................................................................11
3.2 Test Timeline.......................................................................................................................................12
3.3 Testing Timeline by Test Organization / Test Cycle ............................................................................13
3.4 UAT Testing Timeline Constraints .......................................................................................................14
3.5 Testing Resources Required................................................................................................................15
3.6 Test Scenario Preparation...................................................................................................................15
3.6.1 Test Scenario Development ..........................................................................................................15
3.6.2 Mapping Scenarios to Business Requirements: ............................................................................16
3.6.3 Test Automation............................................................................................................................16
3.7 Test Environment................................................................................................................................16
3.7.1 Facilities.........................................................................................................................................16
3.7.2 EAS Environment (Related Applications / Test Objects) ...............................................................16
3.7.3 Related Applications / Test Objects Excluded from current testing .............................................17
3.7.4 Hardware & Software....................................................................................................................17
3.7.5 Sensitive Data................................................................................................................................18
4 Test Execution.............................................................................................................................................18
4.1 Testing Cycles / Workflow...................................................................................................................18
4.2 Testing by Test Cycle by Iteration .......................................................................................................19
4.3 Exclusions from Testing / Concur Functionality Not Implemented ....................................................22
4.4 Traceability Matrix ..............................................................................................................................22
4.5 Test Entrance Criteria .........................................................................................................................22
4.5.1 UAT Entrance Criteria....................................................................................................................23
4.6 Test Readiness Review........................................................................................................................23
4.7 Test Results Documentation ...............................................................................................................23
4.8 UAT Tool Test Results Categorization .................................................................................................23
5 Test Review & Reporting.............................................................................................................................24
5.1 Test Exit Criteria ..................................................................................................................................24
5.1.1 End‐to‐End Exit Criteria.................................................................................................................24
5.1.2 UAT Exit Criteria ............................................................................................................................24
5.2 Test Results Review.............................................................................................................................24
6 Test Scenarios .............................................................................................................................................24
7 Approval......................................................................................................................................................25
8 Appendix A – Concur / eExpense Process System Flow Diagrams .............................................................26
9 Appendix B – Business Requirements from Project Proposal ....................................................................27
1 Introduction
1.1 Executive Summary This document describes the testing strategy and the planned testing approach for the University’s eExpense System Implementation project. The eExpense solution will integrate seamlessly with the Travel Portal solution, which collects data at the time of travel booking and throughout the user’s travel, pre‐populating expense reports. Non‐travelers who are required to submit expense reports will also be able to navigate the system easily. eExpense allows for closer‐to‐paperless end‐to‐end processing, offering a fax‐ or email‐to‐report option for those users without access to electronic versions of receipts. eExpense will also integrate PCard reconciliation and reimbursement into one process for the end user, and streamline backend auditing functionality.
1.2 Purpose of Test Plan The purpose of this document is to summarize the test strategy as it relates to the eExpense System Implementation project. This test plan provides managers and test personnel with the necessary approach to validate that each process performs correctly and that the requirements of the system have been satisfied. This test plan will provide the following:
• Detail the approach and strategy for testing the solution
• Describe the planning, test case preparation, and scheduling, including resource requirements
• Explain the execution, results documentation, and review of the testing
• Provide the test cases which will be executed for this testing effort
1.3 Solution Description / Project Deliverables A number of project deliverables were identified during an initial review of the EAS code and environment. The deliverables can be grouped into the following four categories:
Travel Portal: As part of a separate but related project effort, the Procurement department is charged with implementing a centralized, easy‐to‐use, University‐specific gateway for travel, accommodations, and ground transportation reservations. It proposes to do so by contracting with American Express Business Travel, using Concur/Cliqbook technology as the online booking tool (OBT). GW travelers will utilize PCards to purchase travel through this system. This project launched in March 2012 and is the predecessor to this eExpense System Implementation project.
Expense Reporting: Concur Expense allows for a seamless integration with the Travel Portal solution (see above), with data collected at the time of travel booking. Additionally, expenses from vendors that provide e‐receipt feeds of expense activity (including hotels, airlines, rail, and car rental companies) will electronically auto‐populate the user’s expense reports. Non‐travelers who are required to submit expense reports can also easily navigate the system. Concur Expense allows for closer‐to‐paperless end‐to‐end processing, offering a fax‐ or email‐to‐report option for those users without access to electronic versions of receipts.
Procurement Card (PCard) Processing: The Expense Reporting solution described above will provide a mechanism for users to access expense data from the University’s Corporate Procurement (PCard) Card and travel; users will also be able to enter data for other expenses and upload of supporting documentation. This will require integration of a PCard flat file feed from JPMorgan to capture line‐item expenses charged by PCard users (directly to Concur). The eExpense solution will eliminate the
1.5.2 Test Plan Assumptions:
1.5.2.1 Reliance upon Software Vendor’s Quality Assurance: Concur Travel and Expense is a Software as a Service (SaaS) solution with a large customer base. The vendor’s Quality Assurance team tests all product functions of the software prior to release. Software bugs or defects introduced by the vendor would impact and be detected by the entire customer base for Concur. Additionally, there will be no University‐specific customizations to the Concur product. Therefore, the University will not test core Concur functionality but instead will test to ensure the Concur functionality as configured for the University’s instance meets business requirements.
1.5.2.2 Approach to developing Test Scripts: The following describes the approach to be taken regarding developing keystroke level test scripts, training materials, and the high level test scenarios to be utilized during UAT:
1.5.2.2.1 Detailed Keystroke Level Test Scripts: The UAT participants will include representatives from organizations across the University and are assumed to have no prior experience or familiarity with Concur. Based upon the project schedule, end user training will not be delivered to the UAT participants prior to the completion of the UAT testing phase. Therefore, we will be relying upon the following approaches:
1.5.2.2.2 Leverage End User Training Materials: GW‐developed training materials will be available in time for UAT. Testers will follow the instructions in the training materials, along with separate sheets of individualized directions for keying test scenario data into Concur.
1.5.2.2.3 High Level Test Scenarios: A significant amount of testing will involve testing integrations and data flows from Concur into EAS. The testing of EAS (Oracle Financials) functionality, including transactions flowing through the EAS AP, Grants, and General Ledger modules, will be performed by support resources and highly knowledgeable end users. Therefore, the use of keystroke‐level test scripts will not be required. Instead, the testers will execute high level scenarios including a description of the testing to be performed. The results will then be recorded in the University’s UAT tool with a brief description of the results.
1.5.3 Test Plan Constraints:
1.5.3.1 Competing priorities conflicting with project activities: Financial audits and production support / system development will continue throughout the eExpense project timeline. Competing support and enhancement requests will be aggressively managed to ensure the support teams from DIT and FSS are able to focus on project deliverables.
2 Test Strategy
2.1 Reliance on Support Resources and Highly Knowledgeable End Users:
As an opportunity to make the testing process more efficient, the project team will focus on engaging testing participants identified as highly knowledgeable end users and system support resources. Previous testing efforts included many casual end users, on the belief that the more eyes on and hands using the new system, the better. However, lessons learned from previous projects concluded that including casual users required significant support and resources without realizing any significant benefits.
3.3 Testing Timeline by Test Organization / Test Cycle The table below presents the planned schedule of when test participants from each identified organization are expected to test (by week) and the testing to be performed by each organization:
Testing Timeline by Organization / Test Cycle

Test phases by week: Weeks 1-3 = FSS Integration Testing; Weeks 4-6 = End-to-End Testing; Weeks 7-9 = UAT Testing. Week dates: Week 1 (2 days) 7/12-7/13; Week 2 7/16-7/20; Week 3 (3 days) 7/23-7/25; Week 4 7/30-8/3; Week 5 8/6-8/10; Week 6 8/13-8/16; Week 7 8/20-8/24; Week 8 8/27-8/31; Week 9 9/3-9/7.

Organizations (abbreviations used below): Enterprise Information Systems (DIT); Financial System Support (FSS); Financial Reporting (FR), Note (a); University Accounting Services (UAS), Note (a); Accounts Payable (AP), Note (a); Procurement (PROC), Note (a); Finance Directors / Other Representatives (REP); Grants & Contracts Accounting Services (GCAS); Office of the VP of Research (OVPR).

| Test Cycle                               | W1      | W2      | W3      | W4            | W5            | W6            | W7            | W8            | W9            |
|------------------------------------------|---------|---------|---------|---------------|---------------|---------------|---------------|---------------|---------------|
| 14.10 Concur Integrations                | FSS/DIT | FSS/DIT | FSS/DIT | FSS           | FSS           | FSS           | -             | -             | -             |
| 14.20 Concur Admin & Config              | FSS/DIT | FSS/DIT | FSS/DIT | FSS, AP, PROC | FSS, AP, PROC | FSS, AP, PROC | FSS, AP, PROC | FSS, AP, PROC | FSS, AP, PROC |
| 14.25 Concur User Profile Administration | -       | -       | -       | PROC          | PROC          | PROC          | REP           | REP           | REP           |
| 14.30 Concur Expense - Entry             | -       | -       | -       | FSS           | FSS           | FSS           | REP           | REP           | REP           |
| 14.40 Concur Expense - Approvals         | -       | -       | -       | FSS           | FSS           | FSS           | REP, OVPR     | REP, OVPR     | REP, OVPR     |
| 14.50 Reporting - Concur / Cognos        | -       | -       | -       | FSS           | FSS           | FSS           | REP, OVPR     | REP, OVPR     | REP, OVPR     |
| 14.60 Concur Expense - Audit             | -       | -       | -       | FSS           | FSS           | FSS/AP        | AP            | AP            | AP            |
| 14.70 Accounts Payable Processing        | -       | -       | -       | FSS           | FSS/AP        | FSS/AP        | AP            | AP            | AP            |
| 14.80 Grants Accounting Processing       | -       | -       | -       | FSS           | FSS           | FSS           | -             | GCAS, OVPR    | GCAS, OVPR    |
| 14.90 General Ledger Processing          | -       | -       | -       | FSS           | FSS           | FSS, FR, UAS  | -             | FR, UAS       | FR, UAS       |
Note (a): Overlapping of End‐to‐End and UAT Phases for AP and UAS / Financial Reporting: The End‐to‐End and UAT testing procedures include running processes related to the interfaces and integrations between EAS and Concur. A significant amount of test data will be generated during End‐to‐End testing by support resources from FSS and DIT. For the sake of efficiency, and as a learning opportunity, FSS will coordinate the integration‐process testing during End‐to‐End testing with the support resources in Finance who will ultimately be responsible for those processes once the application is in production. Additionally, test data generated during the UAT sessions will be tested by the support resources throughout the UAT period.
3.7.5 Sensitive Data No concerns regarding sensitive data need to be considered during this project.
4 Test Execution
4.1 Testing Cycles / Workflow All test cycles will be executed within the EASTST environment. The integrations between EAS and Concur will primarily be via .csv file transfers between the two applications. Note that funds checking (real time validation against EAS data to determine funding for Grants transactions) will be performed via web service instead of via .csv file. The following describes the iterations / phases of the testing:
Unit / Integration Testing – includes primarily IT developers and analysts and FSS analysts performing unit testing (testing individual objects) and integration testing (testing related groups of objects together). The Concur test environment will be pointed to the EAS test (EASTST) environment.
End‐to‐End – More robust testing will be performed during the End‐to‐End testing and will be performed primarily by IT and FSS resources. The FSS testing efforts will focus on executing the necessary test scenarios and procedures.
UAT – The UAT test cycle involves User Acceptance Testing (UAT), which is testing performed by end users to verify that the project deliverables meet the users’ requirements and have not caused unintended adverse effects.
Regression Testing – Any issues identified during UAT will be addressed during the four‐week period after UAT testing is complete (between September 7th and October 5th). Regression testing is not truly a separate test phase or cycle; it is testing performed after all system changes have been made to ensure that the application’s existing functionality is unaffected by the changes and to confirm that the modifications made as a result of the project are working as required.
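Since the EAS–Concur integrations described in this section are primarily .csv file transfers, a simple pre‐load sanity check of an extract file can catch malformed records before they reach the EASTST environment. The sketch below is illustrative only; the column names are hypothetical, and the actual extract layout is defined by the project’s interface specifications:

```python
import csv
import io

# Hypothetical columns for a Concur-to-EAS expense extract; the real
# layout is defined by the project's interface specifications.
REQUIRED_COLUMNS = ["report_id", "employee_id", "expense_type", "amount", "currency"]

def validate_extract(csv_text: str):
    """Return a list of (row_number, problem) tuples for a .csv extract."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
    if missing:
        return [(0, "missing columns: " + ", ".join(missing))]
    for line_no, row in enumerate(reader, start=2):  # line 1 is the header
        if not row["report_id"]:
            problems.append((line_no, "empty report_id"))
        try:
            float(row["amount"])
        except (TypeError, ValueError):
            problems.append((line_no, "non-numeric amount: %r" % row["amount"]))
    return problems
```

An empty result means the file passed the basic checks; any problems found would be recorded in the Issues Log before the extract is corrected and re‐sent.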
Testing Limitations
Concur Functionality Only Available In Production (No Available Development / Test Environments)
Certain Concur functionality that will be available when Concur goes live in production will not be available in the test Concur environments and cannot be tested prior to the project launch. The functionality is expected to be heavily utilized by the University community and will be part of the core baseline Concur functionality.
Although the functionality cannot be tested, it is part of the baseline product used by every Concur customer. Therefore, the assumption is that the vendor has adequately tested the baseline functionality and that any bugs would have been discovered by other Concur production customers and resolved by Concur. The functions that cannot be tested include:
Concur Travel / Travel Reservations: Due to limitations with the vendor’s support environments, Concur Travel functionality, including making a travel reservation and creating a travel itinerary, is not available in a support (development or test) environment and will not be tested as part of this project.
Mobile Device / Emailing Receipts: Due to limitations with the vendor’s support environments, the mobile functionality, including emailing receipts from mobile devices, will not be available in the test environments. This functionality will only be available in the production environment.
• The eExpense Project Test Plan (this document) must be complete and reviewed by appropriate project stakeholders,
• All programs, interfaces, and reports must be completed, unit tested, and available for testing in the EASTST environment,
• Related test environments for integrated applications must be in place and stable, including all integration / data flows between all related application test instances (e.g., the test instances of Banner),
• IT and FSS stakeholders must agree that the test environments are properly configured, tested, stable, and ready for testing,
• Necessary and appropriate test data must be available,
• Agreements on defect‐fix and re‐test turnaround times must be established between the technical support and test teams,
• A detailed Issues Log and resolution process must be in place and communicated to the team, and
• The test scenarios in 4.2 Testing by Test Cycle by Iteration must be completed, reviewed, and approved by IT and FSS.
4.5.1 UAT Entrance Criteria
• All code related to the particular functions under test has been frozen.
• All UAT test cases have been documented, reviewed, and validated.
• All required data entry, database population, and system configuration has been completed.
• Test facilities are available.
4.6 Test Readiness Review Test Readiness Reviews will be held prior to each major testing phase/cycle. The test plan, detailed list of scenarios, and schedule of UAT sessions will be shared with the representative stakeholders. The stakeholders will review the artifacts to compare against their requirements and verify the test coverage. The stakeholders will be walked through the test artifacts and necessary corrections and changes will be made. A review session will be held with the stakeholders to obtain the necessary signoffs.
The plan will also be forwarded for a cursory review by the project’s sponsors and stakeholders. All entrance criteria for this phase will be met prior to the beginning of testing.
4.7 Test Results Documentation The University’s UAT Testing tool is the standard application for tracking user acceptance testing and will be used for tracking the status of the UAT testing. Test results will be tracked and reported using the tools available in the UAT Testing Tool along with spreadsheet data exports of the raw data.
4.8 UAT Tool Test Results Categorization The following table presents the standard categories for reporting the results of the testing within the UAT tool:
5 Test Review & Reporting
5.1 Test Exit Criteria
5.1.1 End‐to‐End Exit Criteria
• All End‐to‐End test scenarios have been executed.
• All test results have been reviewed and approved by the Test Manager.
• All required benchmarks have been met per the Project Proposal document.
• All High & Critical issues discovered during End‐to‐End testing have been fixed.
• All remaining Medium & Low issues discovered during End‐to‐End testing have been identified and logged, and an estimated resolution date has been provided and agreed to by GWU.
5.1.2 UAT Exit Criteria
• All User Acceptance Testing test scenarios have been executed.
• All High & Critical issues discovered during UAT have been fixed and Regression Testing has been performed.
• All test results have been reviewed and approved by the project stakeholders.
• All remaining Medium & Low issues discovered during UAT have been identified and logged, and an estimated resolution date has been provided and agreed to by GWU.
5.2 Test Results Review The final Testing and Readiness Gate Review will be conducted at the conclusion of the Rework/Regression Test Phase where the team will meet to review the status to date. In addition to the other criteria considered during the PMLC Gate Review, a decision will be made whether the test exit criteria from 5.1.1 End‐to‐End Exit Criteria and 5.1.2 UAT Exit Criteria have been met and whether the testing has been satisfactorily completed.
6 Test Scenarios The Test Scenarios will be tracked in GWU's User Acceptance Testing (UAT) Tool. The test scenarios will originally be compiled in a spreadsheet and stored in the following directory: G:\GROUPS\FSS Projects\2012 eExpense Concur Project\4 Test with the file name GWU Concur E2E UAT Test Scenarios MMDDYY.xls.
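The MMDDYY date stamp in the file name above can be produced programmatically when new versions of the scenario spreadsheet are saved. The helper below is a hypothetical sketch, not part of the project tooling:

```python
from datetime import date

def scenario_filename(d: date) -> str:
    """Build the test-scenario spreadsheet name using the MMDDYY convention."""
    return f"GWU Concur E2E UAT Test Scenarios {d.strftime('%m%d%y')}.xls"

# Example: scenario_filename(date(2012, 8, 13))
# -> "GWU Concur E2E UAT Test Scenarios 081312.xls"
```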
7 Approval By signing, the individuals listed below have approved this test plan.
Name: Deb Dickenson
Title: Comptroller
Role: Project Owner
Date:

Name: Donna Ginter
Title: Director, Procurement
Role: Project Owner
Date:

Name: Adam Donaldson
Title: Manager, Technology Sector BMAG
Role: Project Manager
Date:

Name: Jeff Pearson
Title: Director, Financial Systems Support
Role: Project Test Manager
Date: