Page 1: Quality Management Peer Review

University of Southern California

Center for Systems and Software Engineering

Quality Management Peer Review

Supannika Koolmanojwong

CSCI577

Page 2: Quality Management Peer Review


Outline

• Quality Management
  – In CMMI 1.3
  – In ISO 15288
  – In CSCI577ab

Page 3: Quality Management Peer Review

Objectives of QM

• To ensure a high-quality process in order to deliver high-quality products

Page 4: Quality Management Peer Review

Quality Management in CMMI 1.3


Process Areas

• Causal Analysis and Resolution (CAR)
• Configuration Management (CM)
• Decision Analysis and Resolution (DAR)
• Integrated Project Management (IPM)
• Measurement and Analysis (MA)
• Organizational Performance Management (OPM)
• Organizational Process Definition (OPD)
• Organizational Process Focus (OPF)
• Organizational Process Performance (OPP)
• Organizational Training (OT)
• Process and Product Quality Assurance (PPQA)
• Product Integration (PI)
• Project Monitoring and Control (PMC)
• Project Planning (PP)
• Quantitative Project Management (QPM)
• Requirements Development (RD)
• Requirements Management (REQM)
• Risk Management (RSKM)
• Supplier Agreement Management (SAM)
• Technical Solution (TS)
• Validation (VAL)
• Verification (VER)

Page 5: Quality Management Peer Review


PPQA - Process and Product Quality Assurance

Page 6: Quality Management Peer Review

PPQA - Process and Product Quality Assurance

Page 7: Quality Management Peer Review

PPQA for Agile development

Page 8: Quality Management Peer Review

CM – Configuration Management

Page 9: Quality Management Peer Review

CM – Configuration Management

Page 10: Quality Management Peer Review

CM – Configuration Management

Page 11: Quality Management Peer Review

MA – Measurement and Analysis

Page 12: Quality Management Peer Review

VER - Verification

Page 13: Quality Management Peer Review

VER - Verification

Page 14: Quality Management Peer Review

VAL - Validation

Page 15: Quality Management Peer Review

VAL - Validation

Page 16: Quality Management Peer Review

Quality Management in ISO 15288

Activities
a) Plan quality management.
   1. Establish quality management policies.
   2. Establish organization quality management objectives.
   3. Define responsibilities and authority for the implementation of quality management.
b) Assess quality management.
   1. Assess customer satisfaction and report.
   2. Conduct periodic reviews of project quality plans.
   3. Monitor the status of quality improvements on products and services.
c) Perform quality management corrective action.
   1. Plan corrective actions when quality management goals are not achieved.
   2. Implement corrective actions and communicate results through the organization.

Page 17: Quality Management Peer Review

Configuration Management in ISO 15288

Activities
a) Plan configuration management.
   1. Define a configuration management strategy.
   2. Identify items that are subject to configuration control.
b) Perform configuration management.
   1. Maintain information on configurations with an appropriate level of integrity and security.
   2. Ensure that changes to configuration baselines are properly identified, recorded, evaluated, approved, incorporated, and verified (see the sketch below).
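To illustrate the "record, evaluate, approve" bookkeeping for baseline changes, here is a minimal change-record sketch in Java. The class, field names, and workflow states are assumptions for illustration only; ISO 15288 does not prescribe this structure.

import java.time.LocalDate;

/** Illustrative record of one change against a configuration baseline. */
public class BaselineChangeRecord {

    /** Workflow states taken from the activity wording above (assumed ordering). */
    public enum Status { RECORDED, EVALUATED, APPROVED, INCORPORATED, VERIFIED }

    private final String configurationItem; // e.g. a document or code module under configuration control
    private final String description;
    private final LocalDate dateRecorded;
    private Status status = Status.RECORDED;

    public BaselineChangeRecord(String configurationItem, String description) {
        this.configurationItem = configurationItem;
        this.description = description;
        this.dateRecorded = LocalDate.now();
    }

    /** Advances the change through the control workflow one step at a time. */
    public void advanceTo(Status next) {
        if (next.ordinal() != status.ordinal() + 1) {
            throw new IllegalStateException("Cannot skip from " + status + " to " + next);
        }
        status = next;
    }

    @Override
    public String toString() {
        return configurationItem + " (" + dateRecorded + "): " + description + " [" + status + "]";
    }
}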

Page 18: Quality Management Peer Review

Quality Management in 577ab

• IIV&V
• Configuration Management
• Defect Reporting and Tracking
• Testing
• Buddy Review
• Architecture Review Board
• Core Capability Drive-through
• Design Code Review
• Document template
• Sample artifacts

Page 19: Quality Management Peer Review

Quality Guidelines

• Design Guidelines
  – Describe design guidelines on how to improve or maintain modularity, reuse, and maintainability
  – Describe how the design will map to the implementation
• Coding Guidelines
  – Describe how to document the code in such a way that it can easily be communicated to others

Page 20: Quality Management Peer Review

Coding Guidelines

• C: http://www.gnu.org/prep/standards/standards.html
• C++: http://geosoft.no/development/cppstyle.html
• Java: http://geosoft.no/development/javastyle.html
• Visual Basic: http://msdn.microsoft.com/en-us/library/h63fsef3.aspx
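To make the intent concrete, here is a small Java class written in the spirit of such style guides (descriptive names, documentation comments, consistent bracing, one statement per line). It is an illustrative sketch, not an excerpt from any of the linked guidelines, and the DefectCounter class itself is hypothetical.

/**
 * Tracks how many defects have been reported and closed for a work product.
 * Illustrative of typical style-guide conventions: documentation comments on
 * public members, descriptive camelCase names, and consistent bracing.
 */
public class DefectCounter {

    private int openDefects;
    private int closedDefects;

    /** Records a newly reported defect. */
    public void reportDefect() {
        openDefects++;
    }

    /** Marks one open defect as fixed, if any are open. */
    public void closeDefect() {
        if (openDefects > 0) {
            openDefects--;
            closedDefects++;
        }
    }

    /** Returns the number of defects that are still open. */
    public int getOpenDefects() {
        return openDefects;
    }

    /** Returns the number of defects that have been closed. */
    public int getClosedDefects() {
        return closedDefects;
    }
}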

Page 21: Quality Management Peer Review

Quality Guidelines

• Version Control and History
  – Chronological log of the changes introduced to this unit
• Implementation Considerations
  – Detailed design and implementation for as-built considerations
• Unit Verification
  – Unit / integration test (see the test sketch below)
  – Code walkthrough / review / inspection
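As one concrete form of unit verification, here is a minimal unit test for the hypothetical DefectCounter class sketched earlier. It assumes the JUnit 5 API and is illustrative only, not a required course artifact.

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

/** Minimal unit-test sketch for the hypothetical DefectCounter class. */
class DefectCounterTest {

    @Test
    void closingADefectReducesTheOpenCount() {
        DefectCounter counter = new DefectCounter();
        counter.reportDefect();
        counter.reportDefect();
        counter.closeDefect();
        // Two defects reported and one closed: exactly one should remain open.
        assertEquals(1, counter.getOpenDefects());
        assertEquals(1, counter.getClosedDefects());
    }
}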

Page 22: Quality Management Peer Review

Quality Assessment Methods

• Methods, tools, techniques, and processes that can identify the problems
  – Detect and report the problem
  – Measure the quality of the software system
• Three methods of early defect identification
  – Peer review, IIV&V, Automated Analysis

Page 23: Quality Management Peer Review

Peer Review

• Reviews performed by peers in the development team
  – Can range from Fagan inspections to simple buddy checks
  – Peer Review Items
  – Participants / Roles
  – Schedule

Page 24: Quality Management Peer Review

Defect Classification

• Severity: Major / Minor
• Type: Missing / Wrong / Extra
• Category
  – Logic, Syntax, Clarity, Performance, Interface, …
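A minimal sketch of how this classification could be represented in code; the type and field names are illustrative, not part of any course-provided tooling.

/** Illustrative representation of the defect classification above. */
public class ClassifiedDefect {

    public enum Severity { MAJOR, MINOR }
    public enum Type { MISSING, WRONG, EXTRA }
    public enum Category { LOGIC, SYNTAX, CLARITY, PERFORMANCE, INTERFACE, OTHER }

    private final Severity severity;
    private final Type type;
    private final Category category;

    public ClassifiedDefect(Severity severity, Type type, Category category) {
        this.severity = severity;
        this.type = type;
        this.category = category;
    }

    @Override
    public String toString() {
        return severity + " / " + type + " / " + category;
    }
}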

Page 25: Quality Management Peer Review

Defect terminologies

• Activity: This is the actual activity that was being performed at the time the defect was discovered.

• Trigger: The environment or condition that had to exist for the defect to surface. What is needed to reproduce the defect?

• Impact: For in-process defects, select the impact which you judge the defect would have had upon the customer if it had escaped to the field. 

Page 26: Quality Management Peer Review

Orthogonal Defect Classification

Target: Operational Concept / Requirements
  Defect Types: Correctness, Completeness, Consistency, Clarity / Testability, Traceability
Target: Design / Code
  Defect Types: Assign/Init, Checking, Alg/Method, Func/Class/Object, Timing/Serial, Interface/O-O Messages, Relationship

Triggers: 1. Design Conformance, 2. Logic/Flow, 3. Backward Compatibility, 4. Lateral Compatibility, 5. Concurrency, 6. Internal Document, 7. Language Dependency, 8. Side Effect, 9. Rare Situations, 10. Simple Path, 11. Complex Path, 12. Coverage, 13. Variation, 14. Sequencing, 15. Interaction, 16. Workload/Stress, 17. Recovery/Exception, 18. Startup/Restart, 19. Hardware Configuration, 20. Software Configuration, 21. Blocked Test (previously Normal Mode)

Impact: 1. Installability, 2. Serviceability, 3. Standards, 4. Integrity/Security, 5. Migration, 6. Reliability, 7. Performance, 8. Documentation, 9. Requirements, 10. Maintenance, 11. Usability, 12. Accessibility, 13. Capability

Severity: Major, Minor
Qualifier: Missing, Wrong/Incorrect, Extra

Source: http://www.research.ibm.com/softeng/ODC/DETODC.HTM

Page 27: Quality Management Peer Review


Opener Section (These attributes are usually available when the defect is opened.)

Defect Removal Activities: Design Review, Code Inspection, Unit Test, Function Test, System Test

Triggers: 1. Design Conformance, 2. Logic/Flow, 3. Backward Compatibility, 4. Lateral Compatibility, 5. Concurrency, 6. Internal Document, 7. Language Dependency, 8. Side Effect, 9. Rare Situations, 10. Simple Path, 11. Complex Path, 12. Coverage, 13. Variation, 14. Sequencing, 15. Interaction, 16. Workload/Stress, 17. Recovery/Exception, 18. Startup/Restart, 19. Hardware Configuration, 20. Software Configuration, 21. Blocked Test (previously Normal Mode)

Impact: 1. Installability, 2. Serviceability, 3. Standards, 4. Integrity/Security, 5. Migration, 6. Reliability, 7. Performance, 8. Documentation, 9. Requirements, 10. Maintenance, 11. Usability, 12. Accessibility, 13. Capability

Page 28: Quality Management Peer Review

Closer Section (These attributes are usually available when the defect is fixed.)

Target: Design / Code
Defect Type: 1. Assign/Init, 2. Checking, 3. Alg/Method, 4. Func/Class/Object, 5. Timing/Serial, 6. Interface/O-O Messages, 7. Relationship
Qualifier: Missing, Incorrect, Extraneous
Age: Base, New, Rewritten, ReFixed
Source: Developed In-House, Reused From Library, Outsourced, Ported
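Taken together, the opener and closer sections define the attributes recorded for each defect. Below is a minimal sketch of such a record; the class and field names are illustrative and the values in comments are only examples, not a prescribed schema (the authoritative definitions are in the IBM ODC reference cited above).

/** Illustrative ODC-style defect record combining opener and closer attributes. */
public class OdcDefectRecord {

    // Opener section: usually available when the defect is opened.
    private final String activity;   // e.g. "Code Inspection"
    private final String trigger;    // e.g. "Logic/Flow"
    private final String impact;     // e.g. "Reliability"

    // Closer section: usually available when the defect is fixed.
    private final String target;     // e.g. "Design/Code"
    private final String defectType; // e.g. "Checking"
    private final String qualifier;  // e.g. "Missing"
    private final String age;        // e.g. "Base"
    private final String source;     // e.g. "Developed In-House"

    public OdcDefectRecord(String activity, String trigger, String impact,
                           String target, String defectType, String qualifier,
                           String age, String source) {
        this.activity = activity;
        this.trigger = trigger;
        this.impact = impact;
        this.target = target;
        this.defectType = defectType;
        this.qualifier = qualifier;
        this.age = age;
        this.source = source;
    }

    @Override
    public String toString() {
        return activity + " / " + trigger + " / " + impact + " -> "
                + target + " / " + defectType + " / " + qualifier
                + " (" + age + ", " + source + ")";
    }
}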

Page 29: Quality Management Peer Review

Requirement Defect Type (1/2)

• A requirements defect is an error in the definition of system functionality.
• Correctness: Wrongly stated requirements. Examples:
  – An incorrect equation, parameter value, or unit specification
  – A requirement that is not feasible with respect to cost, schedule, and technology
• Completeness: Necessary information is missing. Examples:
  – Missing attributes, assumptions, and constraints of the software system
  – No priority assigned for requirements and constraints
  – Requirements are not stated for each iteration or delivery

Page 30: Quality Management Peer Review

Requirement Defect Type (2/2)

• Consistency: A requirement that is inconsistent or mismatched with other requirements. Examples:
  – Requirements conflict with each other
  – Requirements are not consistent with the actual operational environment
  – Verification methods (e.g., test, demonstration, analysis, or inspection) have not been stated
• Traceability: A requirement that is not traceable to or mismatched with the user needs, project goals, or organization goals

Page 31: Quality Management Peer Review

Unavoidable defects

• Changes arising because of
  – The dynamics of learning
  – Exploration in IKIWIKI situations
  – Code or screen content reorganization taken on as an "afterthought"
  – Replacement of stubs or placeholders in code

Page 32: Quality Management Peer Review

Avoidable defects

• Changes in analysis, design, code, or documentation arising from human error
• Can be avoided through better analysis, design, and training

Page 33: Quality Management Peer Review

Product Assurance

• Requirements Verification
• IIV&V
• Automated Analysis

Page 34: Quality Management Peer Review

Configuration Management

• Configuration items and rationale
• Identification systems
• Storage of configuration items
• Configuration controls
• Status and accounting
• Baselining events

Page 35: Quality Management Peer Review

Defect and Change Management

• Processes
• Change Control Board

Page 36: Quality Management Peer Review

COQUALMO

• Cost, Schedule and Quality

Page 37: Quality Management Peer Review

Current COQUALMO System (diagram): COCOMO II and COQUALMO take as inputs the software size estimate; software platform, project, product, and personnel attributes; and the defect removal profile levels (automation, reviews, testing). COCOMO II produces the software development effort, cost, and schedule estimate, while COQUALMO's Defect Introduction and Defect Removal models produce the number of residual defects and the defect density per unit of size.
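To show how these inputs and outputs fit together arithmetically, here is a deliberately simplified sketch of introduction-and-removal bookkeeping. All numbers are made-up placeholders, not the calibrated COQUALMO baselines; the real model distinguishes requirements, design, and code defects and uses calibrated introduction rates and removal fractions for each profile level.

/** Uncalibrated, illustrative sketch of COQUALMO-style defect bookkeeping. */
public class CoqualmoSketch {

    public static void main(String[] args) {
        double sizeKsloc = 10.0;           // software size estimate in KSLOC (assumed input)
        double introducedPerKsloc = 60.0;  // placeholder defect introduction rate, not a calibrated value

        // Placeholder removal fractions for the three defect removal profiles.
        double automatedAnalysis = 0.10;
        double peerReviews = 0.40;
        double executionTestingAndTools = 0.50;

        double introduced = sizeKsloc * introducedPerKsloc;

        // Each profile removes its fraction of the defects that remain after the others.
        double residual = introduced
                * (1 - automatedAnalysis)
                * (1 - peerReviews)
                * (1 - executionTestingAndTools);
        double defectDensity = residual / sizeKsloc;

        System.out.printf("Residual defects: %.1f%n", residual);
        System.out.printf("Defect density: %.1f defects/KSLOC%n", defectDensity);
    }
}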

Page 38: Quality Management Peer Review

Coding defects introduction range

Page 39: Quality Management Peer Review

Defect Removal Profiles

Page 40: Quality Management Peer Review

Portion of COQUALMO Rating Scale

Automated Analysis
  – Very Low: Simple compiler syntax checking
  – Low: Basic compiler capabilities
  – Nominal: Compiler extensions; basic requirements and design consistency checking
  – High: Intermediate-level module checking; simple requirements/design consistency checking
  – Very High: More elaborate requirements/design checking; basic distributed-processing analysis
  – Extra High: Formalized specification and verification; advanced distributed-processing analysis

Peer Reviews
  – Very Low: No peer review
  – Low: Ad-hoc informal walk-through
  – Nominal: Well-defined preparation, review, minimal follow-up
  – High: Formal review roles, well-trained people, and basic checklists
  – Very High: Root cause analysis, formal follow-up, use of historical data
  – Extra High: Extensive review checklists, statistical control

Execution Testing and Tools
  – Very Low: No testing
  – Low: Ad-hoc test and debug
  – Nominal: Basic tests; test criteria based on checklists
  – High: Well-defined test sequences and a basic test coverage tool system
  – Very High: More advanced test tools and preparation; distributed monitoring
  – Extra High: Highly advanced tools, model-based testing

COCOMO II p.263
