©2015 InfoStretch Corporation. All rights reserved.
Liana Gevorgyan | May 6, 2015
Measuring Quality: QA Metrics and Trends in Practice
US - UK - India
SECTION 1) TECHNOLOGY IN LIFE
Bugs Are Costly
1999
Mars Climate Orbiter Crash
Instead of using the metric units specified for navigation, the contractor carried out measurements in imperial units, and the spacecraft crashed into Mars.
COST
$135 Million
1996
ARIANE Failure
The Ariane 5 rocket exploded 36.7 seconds after takeoff. The rocket was much faster than previous models, and a bug in guidance software reused from those models went unnoticed until launch.
COST
>$370 Million
2003
EDS Fails Child Support
EDS created an IT system for the UK Child Support Agency that had many software incompatibility errors.
COST
$1.1 Billion
2013
NASDAQ Trading Shutdown
On August 22, 2013, the NASDAQ stock market shut down trading for three hours because of a computer error.
COST
$2 Billion
1985-1987
Therac-25 Medical Accelerator
A software failure caused incorrect X-ray dosages, hundreds or thousands of times greater than normal, resulting in death or serious injury.
COST
5 Human Lives
Technology In Our Daily Life
Average usage of electronic systems in developed countries:
• One PC or desktop in each home
• 80% of people use mobile phones
• 40% of people have cars with various electronic systems
• People travel by train or plane on average once a year
• Dozens of other embedded systems in our homes
• Dozens of software programs in our workplaces and service systems
The quality of all these systems is equal to the quality of life!
SECTION 2) DEFINING THE “WHAT”
Known QA Metrics & Trends
Defining “What”
• Metrics and Trends
• Measure to Understand
• Understand to Control
• Control to Improve
Several Known QA Metrics and Trends
• Manual and automation time ratio during the regression cycle
• Script maintenance time during a delivery iteration
• Daily manual test case execution
• Automation effectiveness for issue identification
• Issues found per area during regression
• Areas impacted after new feature integration
• Issue identification behavior after major refactoring
• Software process timetable metrics
• Delivery process productivity metrics
• Software system availability metrics
• Test case coverage
• Automation coverage
• Issues defined based on gap analysis
• Ambiguities per requirement
• Identified issues by criticality
• Identified issues by area separation
• Issue resolution turnaround time
• Backlog growth speed
• Release patching tendency and costs
• Customer escalations by Blocker/Critical/Major issues per release
• QA engineer performance
• Continuous integration efficiency
Metrics Classification
PRODUCT METRICS
PROCESS METRICS
QA METRICS
4/27/2015
7
Metrics Examples by Classification
PROCESS METRICS
• Delivery process productivity metrics
• Continuous integration efficiency
• Release patching tendency and costs
• Backlog growth speed
• QA engineer performance
• Software process timetable metrics

PRODUCT METRICS
• Software system stability metrics
• Identified issues by criticality
• Identified issues by area separation
• Customer escalations by Blocker/Critical/Major issues per release
• Ambiguities per requirement
• Backlog growth speed
Sample Metrics Visual
[Chart: Automation coverage - Automated UI and BE 55%, Automated UI 20%, In Progress 15%, Pending Automation 7%, Not Feasible 3%]
[Chart: Bugs by severity - Blocker 3, Critical 5, High 12, Medium 34, Low 45]
Visual Depiction Of Sample Trends
[Chart: Issue escalations by criticality - monthly trend across Blocker/Critical/High/Medium/Low]
[Chart: Rejected bugs % per week, over 6 weeks]
Expectations
• Smooth releases
• Predefined risks with mitigation plans
• Positive feedback and appreciation
• Top-notch and innovative products
Real Life
• Delivery is not always ideal
• We are all familiar with patching a release
• Lack of process tracking data for analysis
• Experimental delivery models are not exactly best-practice models
SECTION 3) DELIVERY PROCESSES & METRICS
Waterfall & Agile
Waterfall Process
[Diagram: Waterfall (V-model) - Requirements paired with Validation, Architecture with Verification, Module Design with Verification, Implementation with System Test, Operations and Maintenance with Revalidation]
Agile Process
[Diagram: Agile/Scrum flow - the Product Backlog (client-prioritized product features) feeds the Sprint Backlog (features assigned to the sprint, estimated by the team, team commitment); time-boxed develop/test work with continuous testing/validation/verification produces working code ready for deployment]
Agile Process Metrics
SCRUM TEAM SPRINT METRICS
• Scrum team's understanding of sprint scope and goal
• Scrum team's adherence to Scrum rules & engineering practices
• Scrum team's communication
• Retrospective process improvement
• Team enthusiasm
• Quality delivered to customer
• Team velocity
• Technical debt management
• Actual stories completed vs. planned
Processes Are Not Always Best Practices
• A unique way of doing Agile
• Transition from Waterfall to Agile
• Transition from Agile to Kanban
Metrics Set Definition for Your Project
• Process
• Technology
• Iterations
• Project/Team Size
• Goal
SECTION 4) WEIGHT-BASED ANALYSIS FOR QA METRICS & MEASUREMENTS
Mapping With Graph Theory
Metrics and Trends for Your Project
You are watching a set of metrics and trends - are they the right ones?
The trends are in an acceptable range, but the product's quality is not improving?
You try to improve one metric and another goes down?
How do you analyze and fix it?
Mapping QA Metrics Into Graph Theory
• Process metrics > A = Metric 1, B = Metric 2, ...
• Product metrics > C = Metric 3, D = Metric 4, ...
• Actions/data sets that affect the metrics > A1, A2, ...
• Edges > dependencies of metrics on specific actions
Preconditions & Definitions for the Metrics & Actions Mapped Model
� Node’s initial weight is predefined and has value from 1-10
� Edge’s weight is predefined and has value from 1-10
� Connections between Nodes is defined based on dependencies of Metrics
from each other and from Actions
� All Actions have fixed 1 weight
Initial Metrics Model & Dependencies
Assume the current metric set is:
• 2 process metrics > M1, M2
• 2 product metrics > M3, M4

where:
• M1 has a dependency on M3
• M1 has a dependency on M4
• M2 has a dependency on M3

There are 3 actions or data sets that affect some of the metrics: A1, A2, A3,
where:
• M1 has a dependency on A1 and A2
• M4 has a dependency on A3

Initial priority based on best practices:
W(M1) = 5
W(M2) = 4
W(M3) = 3
W(M4) = 2
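As a rough illustration only (not part of the original deck), the model above can be captured in a few lines of Python; the names node_weights, edges and ACTION_WEIGHT are invented for this sketch.

ACTION_WEIGHT = 1  # precondition: all actions have a fixed weight of 1

# Metric nodes with their best-practice initial weights (1-10 scale),
# plus the three action/data-set nodes.
node_weights = {
    "M1": 5, "M2": 4, "M3": 3, "M4": 2,
    "A1": ACTION_WEIGHT, "A2": ACTION_WEIGHT, "A3": ACTION_WEIGHT,
}

# Undirected dependency edges taken from the model above.
edges = [
    ("M1", "M3"), ("M1", "M4"), ("M2", "M3"),   # metric-to-metric dependencies
    ("M1", "A1"), ("M1", "A2"), ("M4", "A3"),   # metric-to-action dependencies
]

# Sanity check against the precondition that node weights stay in the 1-10 range.
assert all(1 <= w <= 10 for w in node_weights.values())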
Metrics Visualization via Graph
[Graph: metric nodes M1 (weight 5), M2 (4), M3 (3), M4 (2) and action nodes A1, A2, A3, connected according to the dependencies above]
Weight Assignment On Undirected Graph
[Graph: the same nodes, now with a weight assigned to each edge; these edge weights feed the priority calculations below]
Calculation Formula for a Metric's New Priority
The priority of a node is calculated the following way:

P(M) = W(M) × ΣW(E)

where
• W(M) is the node weight assigned by the user
• ΣW(E) is the cumulative weight of the node's edges
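A minimal sketch of that formula in Python; the function name and the list-based edge representation are assumptions made for this example, not something defined in the deck.

def node_priority(node_weight, incident_edge_weights):
    # New priority = node weight multiplied by the cumulative weight of its edges.
    return node_weight * sum(incident_edge_weights)

# Check against the single-metric example on the next slide:
# W(M2) = 4 with edges M2-A1 = 1, M2-A2 = 1, M2-M3 = 2 gives 4 * (1 + 1 + 2) = 16.
assert node_priority(4, [1, 1, 2]) == 16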
New Priority Calculations For One Metric
[Graph: subgraph around M2 - M2 (weight 4) connected to M3 with edge weight 2, to A1 with edge weight 1, and to A2 with edge weight 1]
Initial priority: M2 = W(M2) = 4
New priority: M2 = W(M2) × (W(M2-A1) + W(M2-A2) + W(M2-M3)) = 4 × (1 + 1 + 2) = 4 × 4 = 16
New Priority Calculations For Graph
Node   W   Incident edge weights   New priority
M1     5   3, 5                    5 × 8  = 40
M2     4   2, 1, 1                 4 × 4  = 16
M3     3   3, 2, 6                 3 × 11 = 33
M4     2   5                       2 × 5  = 10

Metrics by initial priority: M1, M2, M3, M4
Metrics by new priority: M1, M3, M2, M4
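The same table can be reproduced with a few lines of Python; this is only a sketch, and only the cumulative edge weight per node matters for the result (the dictionary names are illustrative).

# Node weights and incident edge weights recovered from the table above.
weights = {"M1": 5, "M2": 4, "M3": 3, "M4": 2}
incident_edges = {"M1": [3, 5], "M2": [2, 1, 1], "M3": [3, 2, 6], "M4": [5]}

# New priority = node weight * sum of incident edge weights.
priorities = {m: w * sum(incident_edges[m]) for m, w in weights.items()}
# -> {'M1': 40, 'M2': 16, 'M3': 33, 'M4': 10}

# Sorting by new priority gives the project-dependent order M1, M3, M2, M4.
new_order = sorted(priorities, key=priorities.get, reverse=True)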
Metrics Priorities: Current Vs. Calculated
INITIAL PRIORITY (based on best practices): M1, M2, M3, M4
NEW PRIORITY (project-dependent, calculated): M1, M3, M2, M4
SECTION 5) METRICS WEIGHT-BASED ANALYSIS IN PRACTICE
Defining “How”
Metrics Definition For Test Project
• Process – Agile with area ownership
• Technology – SaaS-based enterprise web & mobile app
• Iteration – 2 weeks
• Project size – 5 Scrum teams
• Goal – customer satisfaction; no Blocker or Critical issues escalated by the customer
Key Metrics and Dependencies
Metrics
• M1 – Customer escalations per defect severity – product metric
• M2 – Opened valid defects per area – product metric
• M3 – Rejected defects – process metric
• M4 – Test case coverage – process metric
• M5 – Automation coverage – process metric
• M6 – Defect fixes per criticality – product metric

Actions and data sets
• A1 – Customer types per investment and escalations per severity
• A2 – Most buggy areas
Metrics Initial Priority: Weight Assignment and Dependency Analysis
Metric                                           Predefined node weight
M1 – Customer escalations per defect severity    8
M2 – Opened valid defects per area               5
M3 – Rejected defects                            4
M4 – Test case coverage                          3
M5 – Automation coverage                         2
M6 – Defect fixes per criticality per team       6

Metrics by initial priority: M1, M6, M2, M3, M4, M5

Edge weights per node (dependencies on other metrics and on actions A1, A2):
M1 = 8: 2, 6, 4, 5, 2
M2 = 5: 2, 1, 3
M3 = 4: 1, 3
M4 = 3: 6, 3, 3, 2
M5 = 2: 2
M6 = 6: 4, 2
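The "metrics by initial priority" ordering above is simply the metrics sorted by their predefined node weights; a short Python sketch (variable names are illustrative, not from the deck):

# Predefined node weights for the test project's metrics (from the table above).
node_weights = {"M1": 8, "M2": 5, "M3": 4, "M4": 3, "M5": 2, "M6": 6}

# Initial priority: metrics ordered by predefined weight, highest first.
initial_priority = sorted(node_weights, key=node_weights.get, reverse=True)
# -> ['M1', 'M6', 'M2', 'M3', 'M4', 'M5']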
Graph Creation
[Graph: metric nodes M1-M6 with their predefined weights and action nodes A1, A2, with a weight on every edge as listed in the table above]
Calculations and Metrics Prioritization
Node     Incident edge weights   Calculated priority
M1 = 8   2, 6, 4, 5, 2           152
M2 = 5   2, 1, 3                 30
M3 = 4   1, 3                    16
M4 = 3   6, 3, 3, 2              42
M5 = 2   2                       4
M6 = 6   4, 2                    36

Initial priority: M1, M6, M2, M3, M4, M5
Calculated priority: M1, M3, M6, M2, M4, M5
Key Metric Changes & Improvement Plans
Metrics by calculated priority:
• M1 – Customer escalations per defect severity
• M3 – Rejected defects
• M6 – Defect fixes per criticality per team
• M2 – Opened valid defects per area
• M4 – Test case coverage
• M5 – Automation coverage

Improvement plans:
• Group defects by severity and by customer investment to understand the real picture; 1,000 minor issues can cost more than one high-severity issue.
• Conduct training to lower defect rejection, so developers do not spend extra time analyzing invalid issues.
• Make sure defect fixes proceed in parallel with new feature development in each sprint.
• Continuously update test cases after each new issue to keep coverage good.
• Automate as much as possible to cut costs and increase coverage.
Monitoring of Trend-Based Priority Metrics Based on Process Changes
[Chart: monthly trend, January-April - M1: 60, 70, 68, 75; M3: 20, 25, 29, 34; M6: 70, 80, 82, 78]
Let the Challenge Begin & Have Fun
Thank You
Global Footprint
About Us
A leading provider of next-gen mobile application lifecycle services, ranging from design and development to testing and sustenance.

Locations
Corporate HQ: Silicon Valley
Offices: Conshohocken (PA), Ahmedabad (India), Pune (India), London (UK)
InfoStretch Corporation
References
• Narsingh Deo, Graph Theory with Applications to Engineering and Computer Science, Prentice Hall, 1974.
• A. A. Shariff K, M. A. Hussain, and S. Kumar, "Leveraging unstructured data into intelligent information – analysis and evaluation," Int. Conf. on Information and Network Technology, IPCSIT, vol. 4, IACSIT Press, Singapore, pp. 153-157, 2011.
• http://en.wikipedia.org/wiki/List_of_software_bugs
• http://www.starshipmodeler.com/real/vh_ari52.htm
• http://news.nationalgeographic.com/news/2011/11/pictures/111123-mars-nasa-rover-curiosity-russia-phobos-lost-curse-space-pictures/
• http://www.bloomberg.com/news/articles/2013-08-22/nasdaq-shuts-trading-for-three-hours-in-latest-computer-error
Q & A
Liana Gevorgyan
Sr. QA Manager
InfoStretch Corporation Inc.
www.linkedin.com/in/lianag/en
www.infostretch.com