W13 Session 10/26/2016 3:00:00 PM
Dell's Journey to Achieve TMMi Level 3 Certification
Presented by:
Mark Keating
Dell
Brought to you by:
350 Corporate Way, Suite 400, Orange Park, FL 32073 · 888-268-8770 · 904-278-0524 · [email protected] · http://www.starcanada.techwell.com/
Mark Keating, Dell
As test architect for commercial sales and enterprise solutions, Mark Keating guides the definition and implementation of the test processes and test strategies for many of Dell's servers, systems management, and solutions products. A ten-year veteran of Dell, his roles as director, lead engineer, and developer have spanned the server/storage/software development spectrum.
10/17/2016
1
Dell - Internal Use - Confidential
PGCO Validation
TMMi Overview
Mark Keating, Senior Principal Engineer / Strategist
September 2016
Key Challenges
Product hardware and software development and maintenance quality is a significant concern for all organizations; if we get it wrong, it affects our ability to deliver value and customer satisfaction
To discuss and ultimately improve quality, we have to have a common definition, but there are numerous competing definitions
Key Challenges – Cont.
Demonstrating our organization's effectiveness and capability to our stakeholders
Ensuring Consistency and Predictability
How we benchmark our teams' capability against industry-leading targets
Identifying key performance indicators to measure success
The Search
ISO/IEC/IEEE 29119 Software Testing is an internationally agreed set of standards for software testing that can be used within any software development life cycle or organization. It is based on five standards.
Capability Maturity Model Integration (CMMI) is a process improvement training and appraisal program. Administered by the CMMI Institute, a subsidiary of ISACA, it was developed at Carnegie Mellon University (CMU). It is required by many DoD and U.S. government contracts, especially in software development.
What is TMMi
TMMi is Testing Maturity Model integration, developed by the TMMi Foundation
Contains guidelines and a framework for test process improvement
Has five maturity levels for process evaluation in systems and software engineering
Addresses cornerstones of structured testing such as lifecycles, techniques, infrastructure, and organization
By far the most popular model in the industry for benchmarking testing processes and practices
What Value Does TMMi Provide
TMMi is a model of testing best practices and testing processes, which represents an important set of tools for improving the quality of the delivered products
Increased testing maturity can improve an organization’s bottom line by improving customer satisfaction, increasing development productivity,
speeding delivery rates and reducing costs
What is TMMi
Testing Maturity Model Integration (TMMi) is a model of testing best practices that can help organizations determine whether their testing processes are complete and whether they are effective
Organizations evolve through stages of the model from one that is ad hoc and unmanaged to levels that are managed, defined,
measured, and optimized
How Does TMMi Work
TMMi approaches test improvement by providing a reference model against which strengths and best practices can be identified, and weaknesses and inefficiencies can be addressed
TMMi seeks to help organizations improve the whole testing process through a holistic approach to quality assurance
TMMi maturity staircase (reconstructed from the original diagram; process maturity increases from level 1 to level 5, moving from ad hoc, informal, and unpredictable testing toward continuous process improvement):
1. Initial – Testing is chaotic; undefined process
2. Managed – Test Policy and Strategy; Test Planning; Test Monitoring and Control; Test Design and Execution; Test Environment (stable process, can repeat tasks)
3. Defined – Test Organization; Test Training Program; Test Life Cycle and Integration; Non-Functional Testing; Peer Reviews (process standardization)
4. Measured – Test Measurement; Product Quality Evaluation; Advanced Peer Reviews
5. Optimized – Test Process Optimization; Quality Control (continuous process improvement)
Mapping the Model to Business Goals
The original chart rates each improvement area's benefit to Cost, Quality, and Speed to Market (legend: Maximum / Largely / Partially / No Benefit); the per-cell ratings are graphical and not reproduced here.
Managed – Test Policy and Strategy; Test Planning; Test Monitoring; Test Design and Execution
Defined – Test Organization; Test Training Program; Test Life Cycle and Integration; Non-Functional Testing; Peer Reviews
Measured – Test Measurement; Product Quality Evaluation; Advanced Peer Review
Optimized – Test Process Optimization; Quality Control
TMMi Model Rating
Process areas are assigned the appropriate maturity ratings based on the following guidelines:
Fully Achieved
• Convincing evidence of process compliance
• Systematic and widespread implementation of the process
• No obvious weakness in distribution, application, and results of this process
• Process achievement is between 85% and 100%
Largely Achieved
• Significant evidence of process compliance
• Systematic and widespread implementation of the process
• Minor weakness in distribution, application, and results of this process
• Process achievement is between 50% and 85%
Partially Achieved
• Some evidence of the process found
• Process exhibits significant weaknesses; can be incomplete, not widespread, or inconsistent in application or results
• Process achievement is between 15% and 50%
Not Achieved
• Little or no evidence of the process
• Process achievement is between 0% and 15%
Not Rated
• Any supporting goal is Not Rated
Not Applicable
• The process area is considered not to be in the scope of the assessment, or not applicable to the organizational unit, by the Lead Assessor
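The percentage bands above lend themselves to a direct implementation. A minimal sketch (the function name and structure are my own, not part of the TMMi material), assuming the lower bound of each band is inclusive:

```python
def tmmi_rating(achievement_pct: float) -> str:
    """Map a process-area achievement percentage (0-100) to its TMMi rating band.

    Bands follow the guidelines above; the lower bound of each band is
    assumed inclusive (e.g. 85% counts as Fully Achieved).
    """
    if not 0 <= achievement_pct <= 100:
        raise ValueError("achievement percentage must be between 0 and 100")
    if achievement_pct >= 85:
        return "Fully Achieved"
    if achievement_pct >= 50:
        return "Largely Achieved"
    if achievement_pct >= 15:
        return "Partially Achieved"
    return "Not Achieved"

print(tmmi_rating(90))  # Fully Achieved
print(tmmi_rating(40))  # Partially Achieved
```

Note that "Not Rated" and "Not Applicable" are assessor decisions, not percentage bands, so they fall outside a purely numeric mapping.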
The Audit
Interview Participants
Stakeholders spanning the organization, including Senior Management, Development Managers, Validation Leads, Validation Managers, Validation Architects, Validation Engineers, Automation Team, Program Management, and Product Marketing
31 interviews across the organization
Information Sources
Interview sessions
Artifacts, including project-specific ones (Test Plans, Metrics, Weekly Status Reports, etc.) and generic references (PG Validation Handbook, Metrics Document, Automation, Templates, etc.)
Benchmark Result – Level 2
(Ratings: Fully Achieved / Largely Achieved / Partially Achieved / Not Achieved / Not Rated / Not Applicable)
Strongest process area: Test Environment
Weakest process area: Test Monitoring & Control
Benchmark Result – Level 3
(Ratings: Fully Achieved / Largely Achieved / Partially Achieved / Not Achieved / Not Rated / Not Applicable)
Strongest process area: Test Lifecycle & Integration
Weakest process area: Peer Review
Key Gaps
(The original matrix marked each gap's impact on Cost, Quality, and Schedule; those marks are graphical and not reproduced here.)
Test Policy & Strategy
• Adherence to defined processes is inconsistent across programs
Test Planning
• Scientific techniques not leveraged for test estimation
• Risk management not robust enough
Test Monitoring & Control
• Conformance to entry criteria not consistent
• Lack of rigor in adherence to milestones
• Lack of suspension and resumption criteria
Test Environment
• Timely availability of infrastructure and hardware to support testing is a regular bottleneck
Test Design & Execution
• No bi-directional traceability between requirements and defects
• Test design techniques not leveraged for test case documentation
Test Organization
• Lack of a function to objectively evaluate adherence to the defined test process
Test Lifecycle & Integration
• Exceptions to process adherence not tracked
• Project-specific tailoring guidelines to processes not captured
Non-Functional Testing
• Inconsistent capturing of non-functional requirements
• Existing non-functional testing lacks a structured approach
Test Training Program
• Training needs for FTEs are observed to be ad hoc
• Skill baseline unavailable
• Training and learning plans aligned to project-specific needs not available
Peer Review
• Lack of a defined approach for peer reviews
• Review outcomes not captured and tracked consistently
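The bi-directional traceability gap called out under Test Design & Execution can be illustrated with a small data structure that links requirements to test cases and test cases to defects, so coverage and impact can be queried in either direction. This is a hypothetical sketch (all class, method, and ID names are my own, not Dell's tooling):

```python
from collections import defaultdict

class TraceabilityMatrix:
    """Sketch of bi-directional traceability: requirements <-> tests <-> defects."""

    def __init__(self):
        self.req_to_tests = defaultdict(set)    # requirement -> covering test cases
        self.test_to_reqs = defaultdict(set)    # test case -> covered requirements
        self.test_to_defects = defaultdict(set) # test case -> defects it found

    def link(self, requirement, test_case):
        self.req_to_tests[requirement].add(test_case)
        self.test_to_reqs[test_case].add(requirement)

    def record_defect(self, test_case, defect):
        self.test_to_defects[test_case].add(defect)

    def defects_for_requirement(self, requirement):
        """Forward trace: requirement -> defects, via its covering test cases."""
        return {d for t in self.req_to_tests[requirement]
                  for d in self.test_to_defects[t]}

    def requirements_for_defect(self, defect):
        """Backward trace: defect -> affected requirements."""
        return {r for t, ds in self.test_to_defects.items() if defect in ds
                  for r in self.test_to_reqs[t]}

# Usage (IDs are illustrative)
tm = TraceabilityMatrix()
tm.link("REQ-1", "TC-10")
tm.link("REQ-2", "TC-10")
tm.record_defect("TC-10", "DEF-7")
print(sorted(tm.defects_for_requirement("REQ-1")))   # ['DEF-7']
print(sorted(tm.requirements_for_defect("DEF-7")))   # ['REQ-1', 'REQ-2']
```

In practice this mapping lives in a test management tool rather than code, but the queryability in both directions is the point of the gap finding.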
Identified Areas Of Concern
Continuous test process improvement for the defined lifecycle is not evidenced
A consolidated view of key and standard KPIs to track overall health and progress at the organization level is missing (Executive-Level Dashboard)
Project-level metrics are not aligned to business objectives
Lack of a centralized tool management strategy (a single tool for end-to-end test lifecycle management across all programs is missing)
Lack of a function to objectively evaluate adherence to the defined test process
Evidence of key-person dependency seen to a great extent
Projects/programs appear to be self-sufficient and operating in silos
Identified Areas Of Concern
Awareness levels of processes/standards are not consistent across various levels/roles
High level of dependency on external teams is hurting the validation function due to inconsistent adherence to milestones
BA-QA synergy is missing, as development acts as a liaison for any requirement clarification
Testing effort not utilized to the fullest possible extent due to involvement in non-core testing activities (manual effort in tool handling)
Need for improved governance and oversight for release/sustaining programs (management focus sometimes tends to be more on new product development)
Organizational Strengths
Well defined phase gates (Entry and Exit criteria) across all phases of the testing lifecycle
Test environment needs are well documented at the project/program level
Testing processes to be followed by validation organization are clearly defined and documented
Availability of standard tools for specific areas of testing
Dedicated Automation Leadership Team to review and monitor automation strategy across the organization
Robust "feed-forward" mechanism through Post Project Assessment (PPA) to incorporate lessons learned and best practices into future projects
Consistent adherence to PCRs to manage changes across projects
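The phase-gate discipline noted above (well-defined entry and exit criteria across the testing lifecycle) amounts to a simple check: a phase may start or close only when every criterion for that gate is satisfied. A minimal sketch, with hypothetical phase and criterion names (not Dell's actual gate definitions):

```python
# Hypothetical gate definitions: each phase lists its entry and exit criteria.
PHASE_GATES = {
    "System Test": {
        "entry": ["build_deployed", "smoke_tests_passed", "test_plan_approved"],
        "exit":  ["all_planned_tests_run", "no_open_sev1_defects"],
    },
}

def gate_check(phase, gate, status):
    """Return (ok, unmet) for a gate, given a criterion -> bool status map.

    ok is True only when every listed criterion is satisfied; unmet lists
    the criteria still blocking the gate.
    """
    criteria = PHASE_GATES[phase][gate]
    unmet = [c for c in criteria if not status.get(c, False)]
    return (not unmet, unmet)

ok, unmet = gate_check("System Test", "entry", {
    "build_deployed": True,
    "smoke_tests_passed": True,
    "test_plan_approved": False,
})
print(ok, unmet)  # False ['test_plan_approved']
```

The value of the gate is less in the check itself than in making the criteria explicit and auditable, which is what the benchmark credited the organization for.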
Achieving TMMi Level 3
Keys to Success – Business Level
Senior Leadership Buy-In
Baseline the Organization's Capability Against the Model
Setting Organizational Expectations
Timeframe required to Achieve TMMi
Organizational Change Control Board
Training programs that are visual, online and group focused
Keys to Success - Team Level
Single point of contact for program management of the effort is essential
Creation of a Technical Working Group (Change Control Board) of individuals from around the globe whose primary responsibility included being organizational champions and evangelists for TMMi to ensure adoption
Responsible for process creation, review and implementation as well as documentation and training
Identification of program TMMi leads who monitored and trained specific teams on implementing TMMi
Utilized beta programs to validate process functionality in preparation for final audits
Required training extended teams on the TMMi process for areas that required engagement, i.e., Program Management and Development Managers
Reaching the Goal
CY2014 – Implementation of New Test Methodologies
o 11 areas investigated and targeted for improvement
o 30 documents created to improve areas of coverage, traceability, skill growth, risk management, and effort estimation
CY2015 – Implementation of New Test Methodologies
o Implement TMMi Level 3
CY2016 – Implementation of New Test Methodologies
o Achieved industry accreditation – TMMi Level 3
Q&A