©2010 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice.
A practitioner's approach to measuring the success of a centralized, independent test center
Srinivas Yeluripaty, Senior Project Manager, Infosys Technologies Ltd.
Jim Foloky, Director, US Business Testing Services, MetLife
Knowledge Attendees Will Take Away
Importance of a Test Center of Excellence (TCoE)
Deploying it successfully – Key Considerations
Quantifying the success/value of Test Center of Excellence
Strategies to link Business/IT and QA goals
Effective utilization of a test management tool to deploy your key metrics
Flow
Testing challenges and how a TCoE can address them
Test Center of Excellence (TCoE) deployment strategy
TCoE success measurement framework
Metrics program deployment strategy, integrated with TCoE process deployment strategy
How HP Quality Center helps in deploying metrics program
Results
Testing Challenges – How a TCoE Addresses Them
[Diagram: pre-centralization, each project (Proj A through Proj Z) pairs its own test team with its own development team. Post-centralization, development teams are served by LOB-aligned QA groups (LOB-1 QA, LOB-2 QA, LOB-3 QA) under centralized governance, with uniform processes, metrics, shared tools/infrastructure, knowledge/test-process advisory, and automation and SOA testing competencies.]
Key Characteristics of the Centralized QA State (Post-Centralization)
• Centralized structure for the QA team serving all projects, with uniform processes
• QA governance team defining and driving the testing strategy
• QA operational efficiency achieved through metrics-based management
• Well-defined QA organizational structure with a common pool of resources and a structured career path
• Training and knowledge management for capability uplift of the QA team
• Regression testing and automation competency centers for quality and productivity improvements
Test Center of Excellence Deployment Strategy
Create – Transition & Build Centralized QA (Q1–Q2 2008)
Standardize – Process Standardization Initiatives (Q3–Q4 2008 to Q1 2009)
Improve – Productivity Improvement Initiatives (Q1–Q4 2009)
[Timeline diagram: maturity increases across the three phases]
Create
• Establish TCoE Vision, Strategy, Governance structure (align teams based on LOB/Product Areas)
• Form testing team (recruitment), transition applications to centralized QA, implement change management
• Measure transition success, ensure business as usual (no releases are impacted)
Standardize
• Assess current capability of the team, processes
• Define & implement standardized and repeatable processes that can establish predictable quality
• Train & uplift people capability
• Implement metrics based measurement
Improve
• Baseline current capabilities using metrics
• Benchmark against Best in Class
• Implement continuous improvement program
• Share best practices, lessons learned, etc.
Measurement Framework
• How can I measure the 'Value of Testing'?
• Is reduction in cost by XX% sufficient to prove the success of TCoE?
• When do we say the TCoE deployment is successful?
What Are the TCoE Success Criteria?
Define measurement framework key drivers (goal oriented): strategy drivers and operational drivers
Key features of the metrics framework: Integral, Uniform, Transparent, Simple
How to implement? Assess, Pilot, Adopt, Improve
• Align the organization's objectives and goals to TCoE objectives and goals
• Build strategies and create a measurement framework to measure progress
• Efficiently measure the day-to-day operations of QA through operational metrics
• The metrics framework should be simple and easy to understand
• It must be uniform across the QA groups
• The QA processes must be well integrated with metrics
• The data and reporting should be transparent
• Assess the reporting needs of the various stakeholders (business, IT, management, QA, etc.); assess the current state
• Pilot metrics deployment by implementing a few metrics in a selected few projects
• Adopt the framework across the QA function
• Benchmark current capability & improve
Measurement Framework
Objectives → Goals → Strategies → Metrics

"Cheaper" – Reduce QA Cost
• Goals: Reduce QA cost by 25%; reduce estimation variance to +/- 8%
• Strategies: Improve productivity, estimation accuracy, reuse, automation, etc.
• Metrics: % of QA Cost; Blended Rate; Effort Adherence, etc.

"Better" – Improve Quality
• Goals: Near-zero defect delivery; reduce defect rejection rate to 10%
• Strategies: Measure DRE and implement defect RCA; improve KM process
• Metrics: DRE (Defect Removal Efficiency); DRR (Defect Rejection Rate)

"Faster" – Reduce Cycle Time
• Goals: Improve automation coverage in 45% of apps; faster delivery through 14-hr test cycle time
• Strategies: Implement ROI-driven automation strategy; leverage Global Delivery Model
• Metrics: Schedule adherence; % automation; % reduction in test cycle

"Focused" – Flexible QA Specialists
• Goals: Build 20% SME knowledge; ramp up team within 2 weeks of lead time
• Strategies: Implement core-flex model; partner with vendor
• Metrics: % core; on-boarding time
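The ±8% estimation-variance goal above reduces to plain arithmetic. A minimal illustrative sketch (the function names and sample figures are mine, not from the deck):

```python
def estimation_variance(estimated_hours: float, actual_hours: float) -> float:
    """Effort deviation as a signed percentage of the estimate."""
    return (actual_hours - estimated_hours) / estimated_hours * 100.0

def within_goal(variance_pct: float, tolerance_pct: float = 8.0) -> bool:
    """Check a deviation against the +/-8% estimation-variance goal."""
    return abs(variance_pct) <= tolerance_pct

# Illustrative release: 500 hours estimated, 530 hours actually spent
v = estimation_variance(500, 530)
print(f"{v:+.1f}% deviation, within goal: {within_goal(v)}")  # +6.0% deviation, within goal: True
```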
Metrics Program Deployment Strategy – Integrated with TCoE Process Deployment Strategy
[Timeline diagram, Q3 2008 through Q4 2009, with three workstreams:]
• QA Thought Leadership (TCoE process & governance): Quality Center standardization; build regression test beds; estimation model; defined RCA/defect-prevention process; defect prediction model; re-use initiative; risk management model; uniform configuration management; communication management; roles & responsibilities; onsite-offshore optimization
• Competency Management (& KM): define operating model; comprehensive training programs; product KM documents
• Performance Management (Metrics): metrics program Phase 1 and Phase 2; metrics portal prototype; quarterly/monthly AD updates; reporting structure implemented

• Metrics program aligned with the Test Center of Excellence deployment
• Quality Center deployment and standardization is key to the success of the metrics program implementation
• Metrics program is divided into two phases:
– Phase 1: strategy metrics, key operational metrics
– Phase 2: benchmarking performance, implementation of remaining operational metrics
Metrics Categorized Based on the Key Drivers

Strategy Metrics
• Reduced Cost
– Resource Distribution: geographical distribution of the QA team
– Blended Rate: unit cost (average)
– Application Spread: measures % of application coverage by QA
– % of QA Cost: measures cost effectiveness
• Improved Quality
– QA Effectiveness: measures improvement in QA effectiveness (pre- vs. post-centralization)

Operational Metrics – Quality & Productivity (Q&P) Metrics
• Reduced Cost
– Effort Deviation: measures accuracy of the estimation process
– Productivity (test planning and test execution): measures efficiency of the testing team
– % of Test Case Reusability: measures the test case reusability factor
• Improved Quality
– % of Entry and Exit Criteria Achieved (*): measures % of test criteria met; how effectively the process performs against the standards
– DRE: measures effectiveness of the testing process
– Defect Rejection Rate: measures the testing group's understanding of requirements (domain) and technology
– Test Case Effectiveness: measures test coverage
– Requirement Stability Index (*): measures accuracy of requirement definition
– Code Drops Efficiency (*): measures efficiency of delivered code drops
– Smoke Testing Success Rate (*): measures quality of code drops
• Increased Agility (Time to Market)
– Schedule Adherence: measures ability to deliver on time
– % of Automation (regression testing): measures overall automation coverage

(*) Metrics identified to track quality gateways
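The two headline quality metrics above, DRE and DRR, reduce to simple ratios. A minimal sketch (the counts are illustrative, not MetLife data):

```python
def defect_removal_efficiency(found_in_qa: int, found_after_release: int) -> float:
    """DRE: share of all defects caught before release, as a percentage."""
    total = found_in_qa + found_after_release
    return found_in_qa / total * 100.0 if total else 0.0

def defect_rejection_rate(rejected: int, raised: int) -> float:
    """DRR: share of raised defects rejected as invalid or duplicate."""
    return rejected / raised * 100.0 if raised else 0.0

# Illustrative quarter: 285 defects caught in QA, 15 leaked past release;
# 30 of the 300 defects raised were rejected
print(f"DRE: {defect_removal_efficiency(285, 15):.1f}%")  # DRE: 95.0%
print(f"DRR: {defect_rejection_rate(30, 300):.1f}%")      # DRR: 10.0%
```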
Integration of Metrics with Project Life Cycle Stages
[Diagram mapping metrics across Requirements → Design → Build → Test → UAT & Deploy:]
• Requirements: Requirements Stability Index
• All life cycle stages: Entry & Exit Criteria Achieved, Schedule Adherence, Effort Deviation
• Build & Test phases: Smoke Testing Success Rate, Code Drops Efficiency
• Test phase: % of Test Case Reusability, Test Case Effectiveness, Defect Rejection Rate, Productivity
• Test, UAT and Production: Defect Removal Efficiency
Communication Model
Defining a Uniform and Transparent Metrics Data Source Is Key
Metrics data sources feeding metrics generation:
• Defect data (HP Quality Center)
• Financial data (Business Management Office – AD)
• Effort data (Infosys DART, FTE effort tracker)
• Industry benchmarks or best-in-class benchmarks
• QA process data (effort, schedule, entry/exit criteria, etc.)
1. QA communication to senior management of AD – quarterly; AD directors, VPs and managers; chair: QA Director
• Communicate QA updates, latest initiatives and metrics from QA, and take feedback from AD
• Helps in reaching the entire AD management in one common forum
2. QA managers' meeting with AD – monthly; AD directors, managers/teams; chair: QA managers
• Monthly QA updates and metrics
• Discuss initiatives, DRE, defect prevention strategies, etc.
• Financial updates
3. Monthly metrics review – monthly; offshore and onshore management; chair: metrics team
• Review metrics and perform a trend analysis
• Measure key improvements
HP Quality Center (QC) Standardization Helping Metrics Deployment Succeed
• The strategy of adopting HP Quality Center as the test management tool and standardizing it across testing projects has proven to accelerate deployment of the metrics program
• HP QC helped in:
– Building consistency: implementing uniform defect management
– Creating transparency: making data collection and reporting transparent
– Aiding simplification: simplifying metrics data collection using automated queries
– Integrating metrics with process deployment
• HP QC is effectively used in generating the metrics below:
– Defect removal efficiency
– Defect rejection rate
– QA effectiveness
– Test case effectiveness
– Test case reusability %
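The automated queries themselves aren't shown in the deck. As one hedged sketch of the downstream arithmetic, assuming defect records exported from Quality Center to CSV (the column names here are my assumptions, not QC's actual schema):

```python
import csv
import io

# Illustrative defect export; a real file would come from Quality Center
export = io.StringIO(
    "defect_id,status,detected_by\n"
    "1,Closed,test_case\n"
    "2,Rejected,test_case\n"
    "3,Closed,ad_hoc\n"
    "4,Closed,test_case\n"
)
rows = list(csv.DictReader(export))

raised = len(rows)
rejected = sum(r["status"] == "Rejected" for r in rows)
by_test_case = sum(r["detected_by"] == "test_case" for r in rows)

print(f"Defect rejection rate: {rejected / raised:.0%}")        # Defect rejection rate: 25%
print(f"Test case effectiveness: {by_test_case / raised:.0%}")  # Test case effectiveness: 75%
```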
Reuse Tracking with Quality Center
[Diagram: business/functional/technical requirements drive test scenarios and test cases; reusable test cases are identified, prepared, and uploaded to a Reusable Suite in Quality Center, alongside release-specific test cases in the Test Plan release folder; each release pulls test cases into the Test Lab from both the release version and the "Reusable" version.]
Result: improved reuse from 0% to 10%
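The 0%-to-10% figure is simply reused cases over total executed cases. A minimal illustrative sketch (the names and numbers are mine):

```python
def reuse_pct(reused_cases: int, total_cases: int) -> float:
    """Share of executed test cases drawn from the reusable suite."""
    return reused_cases / total_cases * 100.0 if total_cases else 0.0

# Illustrative release: 120 of 1,200 executed cases came from the reusable suite
print(f"{reuse_pct(120, 1200):.0f}% reuse")  # 10% reuse
```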
Results
Doing more things, with less cost, with improved quality:
• 2009 productivity improvement (projects vs. effort): 34% growth in projects and 26% growth in application coverage, with a 14% reduction in effort
• Productivity improvement (overall test cases): 38,549 test cases in 2008; in 2009, 13,441 additional test cases delivered beyond the 66,536 expected at 2008 productivity
• DRE (yearly average): IT DRE improved from 75.5% (2007) to 80.8% (2008) to 90.6% (2009); pre-production DRE improved from 95.0% to 96.8% to 97.8%
• Defect rejection rate: 22% (2007), 21% (2008), 9% (2009), considering code defects
• % of QC cost over IT cost: 14% (2007), 12% (2008), 11% (2009) – a 20% improvement
• Onsite-offshore consultants (end of year): 21% (2008) down to 10% (2009)
Q&A
“The contents of this document are proprietary and confidential to Infosys Technologies Ltd. and may
not be disclosed in whole or in part at any time, to any third party without the prior written consent of
Infosys Technologies Ltd.”
“© 2010 Infosys Technologies Ltd. All rights reserved. Copyright in the whole and any part of this
document belongs to Infosys Technologies Ltd. This work may not be used, sold, transferred, adapted,
abridged, copied or reproduced in whole or in part, in any manner or form, or in any media, without the
prior written consent of Infosys Technologies Ltd.”
About the Authors
A senior project manager with Infosys Technologies Ltd. and a testing centralization strategist, Srinivas Yeluripaty has more than 11 years of experience in verification & validation services. Srini is PMP certified and twice certified as a Six Sigma Black Belt for testing productivity and business-results improvement. In the last five years he has been involved in Test Center of Excellence strategy development and implementation, and in consulting on building independent, centralized QA functions – setting goals, direction and execution for large US corporations in the banking, financial services and insurance testing space.
E-Mail: [email protected]
A QA director with MetLife, Jim Foloky has more than 19 years of IT experience within the insurance industry. He has held positions as a business analyst and application development manager, and has handled multiple initiatives including development, conversion and migration. He took on responsibility for a centralized, independent test center around 20 months ago and helped successfully implement a centralized testing team and measurement framework that has proven beneficial to both IT and the business. He led the initiative by establishing vision, goals and direction, defining metrics, continuously tracking performance and achieving results.
E-Mail: [email protected]
Thank You
To learn more on this topic, and to connect with your peers after
the conference, visit the HP Software Solutions Community:
www.hp.com/go/swcommunity