James Nowotarski 7 November 2006 SE 325/425 Principles and Practices of Software Engineering Autumn 2006

James Nowotarski

7 November 2006

SE 325/425
Principles and Practices of Software Engineering

Autumn 2006

2

Topic Duration

Recap 20 minutes

Capability maturity model (CMM) 60 minutes

*** Break

Current event reports 20 minutes

CMM (continued) 60 minutes

Today’s Agenda

3

Design

Communication: project initiation, requirements

Modeling: analysis, design

Construction: code, test

Deployment: delivery, support

Planning & Managing

Primary deliverables. Design model:
• Data/Class
• Architecture
• Interfaces
• Components

4

Key principles of architectural design

Abstraction Modularity Reuse

5

Two qualitative criteria:

Cohesion: a measure of the relative functional strength of a module. [Diagram: Func A-1, A-2, A-3 grouped in one module and Func B-1, B-2, B-3 in another: high cohesion (good)]

Coupling: a measure of the relative interdependence among modules. [Diagram: many links crossing module boundaries: high coupling (bad)]
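The two criteria can be sketched in Python (hypothetical payroll names, not from the slides): the class keeps related salary computations together (high cohesion), while the check printer depends only on a plain number rather than on the class's internals (low coupling).

```python
# High cohesion: every method works on the same payroll concern.
class SalaryCalculator:
    def __init__(self, tax_rate):
        self.tax_rate = tax_rate

    def gross(self, hours, rate):
        return hours * rate

    def tax(self, gross):
        return gross * self.tax_rate

    def net(self, hours, rate):
        g = self.gross(hours, rate)
        return g - self.tax(g)

# Low coupling: the printer depends only on a narrow interface (a number),
# not on SalaryCalculator's internals.
def print_check(name, net_amount):
    return f"Pay {name}: ${net_amount:.2f}"

calc = SalaryCalculator(tax_rate=0.2)
print(print_check("Ada", calc.net(hours=40, rate=50.0)))  # Pay Ada: $1600.00
```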

6

Architecture

arch·i·tec·ture n. An architecture depicts the overall structure of a man-made complex system

7

Software architecture

Applications and Data

Middleware

Hardware/Network

System Software

• Presentation layer
• Application logic
• Data management
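A minimal sketch of the three layers in Python (illustrative names, assuming the usual discipline that each layer calls only the layer directly below it):

```python
# Layered style: each layer calls only the layer directly below it.
class DataLayer:                      # data management
    def __init__(self):
        self._rows = {1: "Ada"}
    def fetch(self, key):
        return self._rows.get(key)

class LogicLayer:                     # application logic
    def __init__(self, data):
        self.data = data
    def greeting(self, key):
        name = self.data.fetch(key)
        return f"Hello, {name}!" if name else "Unknown user"

class PresentationLayer:              # presentation
    def __init__(self, logic):
        self.logic = logic
    def render(self, key):
        return f"<p>{self.logic.greeting(key)}</p>"

ui = PresentationLayer(LogicLayer(DataLayer()))
print(ui.render(1))  # <p>Hello, Ada!</p>
```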

8

Taxonomy of architectural styles

Data-centered
Data flow (aka pipes and filters)
Call and return
Object-oriented architectures
Layered systems
Online transaction processing
Process control

9

Data centered

10

Examples:
UNIX shell commands
Compilers: lexical analysis -> parsing -> semantic analysis -> code generation
Batch processing

[Diagram: a chain of filters connected by pipes]

Data flow
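The pipes-and-filters style can be sketched with Python generators, where each filter consumes an upstream iterable (the "pipe") and yields transformed items downstream (illustrative filters, roughly analogous to a shell pipeline such as `cat file | tr A-Z a-z | grep . | nl`):

```python
# Each filter is a generator: it consumes an upstream iterable and
# yields transformed items downstream.
def source(lines):
    for line in lines:
        yield line

def lowercase(stream):          # filter 1
    for line in stream:
        yield line.lower()

def drop_blank(stream):         # filter 2
    for line in stream:
        if line.strip():
            yield line

def numbered(stream):           # filter 3
    for i, line in enumerate(stream, 1):
        yield f"{i}: {line}"

# Compose the pipeline by nesting the filters.
pipeline = numbered(drop_blank(lowercase(source(["Hello", "", "World"]))))
print(list(pipeline))  # ['1: hello', '2: world']
```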

11

Call and return

PROCESS_PAYROLL
  for each employee
    get_data(:employee_data)
    calc_salary(employee_data:salary)
    calc_tax(salary:tax)
    print_check(employee_data, salary, tax)

[Diagram: PROCESS_PAYROLL calls GET_DATA (returns employee_data), CALC_SALARY (takes employee_data, returns salary), CALC_TAX (takes salary, returns tax), and PRINT_CHECK (takes employee_data, salary, tax)]
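The PROCESS_PAYROLL pseudocode can be sketched as runnable Python; the stub employee data and flat tax rate are assumptions for illustration only.

```python
# Call-and-return: a main routine invokes subroutines, passing data down
# and receiving results back, as in the PROCESS_PAYROLL pseudocode.
EMPLOYEES = [("Ada", 40, 50.0), ("Grace", 35, 60.0)]  # stub data

def get_data(record):
    name, hours, rate = record
    return {"name": name, "hours": hours, "rate": rate}

def calc_salary(employee_data):
    return employee_data["hours"] * employee_data["rate"]

def calc_tax(salary):
    return salary * 0.2  # flat rate, purely illustrative

def print_check(employee_data, salary, tax):
    return f"{employee_data['name']}: gross {salary:.2f}, tax {tax:.2f}"

def process_payroll():
    checks = []
    for record in EMPLOYEES:
        employee_data = get_data(record)
        salary = calc_salary(employee_data)
        tax = calc_tax(salary)
        checks.append(print_check(employee_data, salary, tax))
    return checks

print(process_payroll())
```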

12

Deriving a software architecture: Structured approach

Derive architectural context diagram (ACD)

Refine the DFD

Map DFD to program structure:
Transform mapping
Transaction mapping

13

Architecture context diagram (ACD)

[Diagram: ACD for the SafeHome product, showing the target system (security function) connected by "uses" and "peers" relationships to the homeowner, an Internet-based system, the surveillance function, the control panel, and sensors]

14

Transform mapping

[Diagram: a data flow model (processes a through j) converted by "transform" mapping into a program structure (modules x1 through x4)]

15

Transaction Mapping

[Diagram: a data flow model with a transaction center t (processes a through n) mapped onto a program structure (modules x1 through x4, with dispatcher x3.1 selecting among the action paths)]

16

The art of coordinating software changes to minimize confusion

SCM activities:
Identification
Change control
Version control
Configuration auditing
Reporting

Software configuration management (SCM)

17

[Diagram: software engineering tasks produce SCIs, which pass through formal technical reviews; approved SCIs come under SCM controls and are stored in the project database as baselined SCIs, from which they are extracted and modified]

18

Req No.  Description                                             Traces To
U2       Users shall be able to process retirement claims        S10, S11, S12
U3       Users shall be able to process survivor claims          S13
S10      The system shall accept retirement data                 U2
S11      The system shall calculate the amount of retirement     U2
S12      The system shall calculate point-to-point travel time   U2
S13      The system shall calculate the amount of survivor       U3
         annuity

Entities  U2  U3  S10  S11  S12  S13
U2                 X    X    X
U3                                X
S10        X
S11        X
S12        X
S13            X

Traceability matrix
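Building such a matrix mechanically from the trace links is straightforward; the following Python sketch uses the table's own data (the rendering format is an assumption):

```python
# Build the traceability matrix from the trace links in the table above.
traces = {
    "U2": ["S10", "S11", "S12"],
    "U3": ["S13"],
    "S10": ["U2"],
    "S11": ["U2"],
    "S12": ["U2"],
    "S13": ["U3"],
}
entities = ["U2", "U3", "S10", "S11", "S12", "S13"]

def matrix(traces, entities):
    # One row per entity; "X" marks a trace link, "." marks none.
    rows = []
    for r in entities:
        row = ["X" if c in traces.get(r, []) else "." for c in entities]
        rows.append((r, row))
    return rows

for name, row in matrix(traces, entities):
    print(f"{name:>4} " + " ".join(row))
```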

19

An alternate and probably more common representation.

Traceability matrix

20

Control the Change
1. Need for change is recognized
2. Change request is submitted as a “request for change” (RFC)
3. Developer evaluates
4. Change report is generated
5. Change control authority makes a decision to either proceed or deny the request
6. Request is queued for action; an ECO (Engineering Change Order) is generated
7. Individuals assigned to configuration objects
8. Objects checked out and change made
9. Change audited
10. Objects checked in
11. Baseline established; SQA activities performed
12. Rebuild & distribute
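The flow above can be sketched as a small state machine; the state and action names are illustrative, not from any real change-control tool.

```python
# A minimal sketch of the change-control flow as a state machine.
TRANSITIONS = {
    "submitted": {"evaluate": "evaluated"},
    "evaluated": {"approve": "queued", "deny": "denied"},
    "queued": {"assign": "in_progress"},     # ECO generated here
    "in_progress": {"check_in": "audited"},
    "audited": {"baseline": "baselined"},
}

def advance(state, action):
    # Reject any action not allowed in the current state.
    try:
        return TRANSITIONS[state][action]
    except KeyError:
        raise ValueError(f"illegal action {action!r} in state {state!r}")

state = "submitted"
for action in ["evaluate", "approve", "assign", "check_in", "baseline"]:
    state = advance(state, action)
print(state)  # baselined
```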

21

Sample RFC form from: http://www.nws.noaa.gov/oso/oso1/oso11/oso112/drg/drgrc.htm

22

Check-in/Check-out

Most version control tools in widespread use employ the checkout-edit-checkin model to manage the evolution of version-controlled files in a repository or codebase.

http://www.cmcrossroads.com/bradapp/acme/branching/branch-intro.html

23

Serial development with exclusive checkouts.

In a strictly sequential development model, when a developer checks out a file, the file is write-locked:

No one may check out the file while another developer has it checked out. Instead, they must wait for the file to be checked in (which releases the write-lock).

This is considered a pessimistic concurrency scheme, which forces all work to take place on a single line of development.

http://www.cmcrossroads.com/bradapp/acme/branching/branch-intro.html
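A minimal sketch of this pessimistic locking scheme in Python (a hypothetical in-memory repository, not any real tool's API):

```python
# Pessimistic concurrency: a checkout write-locks the file; a second
# checkout fails until the first developer checks the file back in.
class Repository:
    def __init__(self):
        self.locks = {}  # filename -> developer holding the write-lock

    def checkout(self, filename, developer):
        holder = self.locks.get(filename)
        if holder is not None:
            raise RuntimeError(f"{filename} is locked by {holder}")
        self.locks[filename] = developer

    def checkin(self, filename, developer):
        if self.locks.get(filename) != developer:
            raise RuntimeError(f"{developer} does not hold the lock")
        del self.locks[filename]  # releases the write-lock

repo = Repository()
repo.checkout("payroll.c", "alice")
try:
    repo.checkout("payroll.c", "bob")     # blocked: bob must wait
except RuntimeError as e:
    print(e)
repo.checkin("payroll.c", "alice")
repo.checkout("payroll.c", "bob")         # now succeeds
```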

24

Concurrent development using branches

Branching is a common mechanism to support concurrent software development.

Allows development to take place along more than one path for a particular file or directory.

When the revision on the new branch is modified and checked in, the two lines of development will have different contents and will evolve separately.

http://www.cmcrossroads.com/bradapp/acme/branching/branch-intro.html

25

Merging is the means by which one development line synchronizes its contents with another development line.

The contents of the latest version on a child branch are reconciled against the contents of the latest version on the parent branch (preferably using a 2-way or 3-way file differencing or comparison tool).

http://www.cmcrossroads.com/bradapp/acme/branching/branch-intro.html

Synchronizing using merges
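The reconciliation idea can be illustrated with a toy line-by-line 3-way merge (assumes equal line counts; real differencing tools also handle insertions, deletions, and conflict markers):

```python
# Minimal 3-way merge: compare each line of the common base against the
# parent and child branch versions and keep the changed side.
def merge3(base, parent, child):
    merged, conflicts = [], []
    for i, (b, p, c) in enumerate(zip(base, parent, child)):
        if p == c:
            merged.append(p)      # identical on both lines of development
        elif b == p:
            merged.append(c)      # only the child branch changed this line
        elif b == c:
            merged.append(p)      # only the parent branch changed this line
        else:
            merged.append(p)      # both changed it: record a conflict
            conflicts.append(i)
    return merged, conflicts

base   = ["alpha", "beta", "gamma"]
parent = ["alpha", "BETA", "gamma"]   # parent branch edited line 2
child  = ["alpha", "beta", "GAMMA"]   # child branch edited line 3
print(merge3(base, parent, child))    # (['alpha', 'BETA', 'GAMMA'], [])
```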

26

Topic Duration

Recap 20 minutes

Capability maturity model (CMM) 60 minutes

*** Break

Current event reports 20 minutes

CMM (continued) 60 minutes

Today’s Agenda

27

Software process assessment and improvement

[Diagram: the software process is examined by software process assessment; assessment leads to capability determination (identifies capabilities and risk) and to software process improvement (identifies modifications to the process); capability determination motivates improvement]

28

Sources of improvement ideas

29

Software Process Improvement Models

A number of models enable software development organizations to compare their practices to a set of “best practices”

IT-specific models:
ISO 15504
ISO 9000-3
TickIT
Capability Maturity Model Integration (CMMI)

General models:
Total Quality Management (TQM)
Six Sigma

30

Capability Maturity Model Integration (CMMI)

“the de facto process improvement framework for software developers”

- Gartner Group

31

What is CMMI

CMMI = Capability Maturity Model Integration

Developed in 1991 by the Software Engineering Institute (SEI) to assess the software engineering capability of government contractors

A framework for software process improvement (SPI) that has gained wide acceptance in the industry

A roadmap of effective practices that build on one another in a logical progression: a coherent, ordered set of incremental improvements

32

What is SEI

SEI = Software Engineering Institute
Federally funded research & development center
Sponsored by the Department of Defense
Affiliated with Carnegie Mellon University in Pittsburgh
Established in 1984
Research and publications oriented
Mission is to improve the state of the practice of software engineering

33

Brief History - CMMI

1989 - Publication of Managing the Software Process by Watts Humphrey

1991 - Capability Maturity Model for Software (CMM) v1.0 released by Software Engineering Institute (SEI)

1993 - CMM v1.1 released
1994 - Systems engineering (SE) CMM released
2001 - CMM Integration (CMMI)-SE/SW v1.0 released
2002 - CMMI-SE/SW/IPPD/SS v1.1 released
2006 - CMMI-Dev v1.2 released

34

A proliferation of models: different capability maturity models

Software CMM (SW)
Systems Engineering CMM (SE)
Integrated Product and Process Development CMM (IPPD)
Supplier Sourcing (SS)
Software Acquisition (ACQ)
Services (SVC)
Team Software Process
Personal Software Process
People CMM (P-CMM)

35

Brief History - CMMI

1989 - Publication of Managing the Software Process by Watts Humphrey

1991 - Capability Maturity Model for Software (CMM) v1.0 released

1993 - CMM v1.1 released
1994 - Systems engineering (SE) CMM released
2001 - CMM Integration (CMMI)-SE/SW v1.0 released
2002 - CMMI-SE/SW/IPPD/SS v1.1 released
2006 - CMMI-Dev v1.2 released

36

Why CMMI?

Benefits:
Practical
Structured
Proven reputation

Quantitative benefits (median):
cost: 34%
productivity: 61%
time to market: 50%
post-release defects: 48%
customer satisfaction: 14%
return on investment: 4:1

37

CMMI Maturity Levels

Initial (1): process poorly controlled and unpredictable
Managed (2): process characterized for projects and is often reactive
Defined (3): process characterized for the organization and is proactive
Quantitatively Managed (4): process measured and controlled
Optimized (5): process improvement (“nirvana”)

38

Process areas (PAs)

Maturity levels contain process areas

39

CMMI Process Areas

Level 2 - Managed:
Requirements Management
Project Planning
Project Monitoring & Control
Supplier Agreement Management
Measurement & Analysis
Product & Process Quality Assurance
Configuration Management

Level 3 - Defined:
Requirements Development
Technical Solution
Product Integration
Verification
Validation
Organization Process Focus
Organization Process Definition
Organizational Training
Integrated Project Management
Risk Management
Decision Analysis & Resolution

Level 4 - Quantitatively Managed:
Organizational Process Performance
Quantitative Project Management

Level 5 - Optimized:
Causal Analysis & Resolution
Organizational Innovation & Deployment

Process Areas

40

Process areas (PAs)

Maturity levels contain process areas
Process area categories contain process areas
Process areas contain specific goals
Specific practices achieve specific goals

41

Process areas (PAs)

Process area: “A cluster of related practices in an area that, when performed collectively, satisfy a set of goals considered important for making significant improvement in that area.”

Specific goals: what must be achieved to satisfy the process area

Specific practices: refine a goal into a set of process-related activities

42

Process areas (PAs)

Maturity level: Level 2 - Managed
Process area category: Project management
Process area: Project planning
Specific goal: Establish estimates
Specific practice: Determine estimates of effort and cost

43

Level 1: Initial

Instability
Dependence on “heroes”
Inability to meet targets
Key process areas: none

44

Class Activity

Summarize and explain to the rest of the class: The 22 key process areas

45

Appraisal process

The appraisal process uses the Standard CMMI Appraisal Method for Process Improvement (SCAMPI), which is based on the CMMI reference model

46

CMMI Appraisal Method

1. Team selection
2. Maturity questionnaire
3. Response analysis
4. On-site visit: interviews & document reviews
5. Findings based on the CMMI
6. PA profile

47

Appraisal Process

For internal purposes:
Performed in open, collaborative environment
Focuses on improving the organization’s software process

For external credential:
Performed in a more audit-oriented environment
Focuses on identifying risks associated with a contractor
Team’s recommendation will help select contractors or set fees

48

CMMI Issues in the Real-World

“Level envy”

Areas not addressed:
Business strategy and linkage to IT
Operations, help desk, support
Management of the IT human resource
Application portfolio
Tools

Many question whether it is worth the effort to pursue levels 4 and 5

49

Process Maturity Profile

[Bar chart, % of organizations at each maturity level: Initial 19.3%, Repeatable 43.2%, Defined 23.4%, Managed 7.3%, Optimized 6.8%]

1998 thru August 2002

Based on assessments from 1998-2002 of 1124 organizations

50

Process Maturity Profile, April 2002-June 2006

51

Time to Move Up

[Box plot: # of months to move to the next level, with medians of roughly 23 months (level 1 to 2), 22 (2 to 3), 28 (3 to 4), and 17 (4 to 5); recommended time between appraisals is 18-30 months]

52

CMMI Market Pressure

Marketing tool to win clients, who are based predominantly in the US and Europe

Clients using Indian service providers should have certain key processes in place:
service level agreements
identifying business requirements
scoping requirements
managing changes

Many, if not most, of the publicly acknowledged Level 5 CMM-certified organizations are in India

53

CMMI-based Software Process Improvement (SPI)

Time and cost often exceed expectations:
18-24 months to advance 1 level
Can cost $2K per software engineer per year
1-2% full-time resources (e.g., 5-10 in a 500-person organization)
2-4% of rest of organization’s time

Key success factors:
Senior management is engaged
Participation and buy-in at all levels, including middle management and technical staff
Clearly stated, well understood SPI goals
Clear assignment of responsibility
SEPG staffed by highly respected people

54

For more information

http://www.sei.cmu.edu/cmmi/cmmi.html

55

See course home page for light reading

Current event reports:
Karas
Modi
Paternostro

Final exam review

For November 14

56

Extra slides

57

[Diagram: Technology, Process, People]

The focus of SE 425 is the process component of software engineering

Core Concepts

[Diagram: Technology, Process, People]

… for the delivery of technology-enabled business solutions

58

Software Process Improvement Models

International collaborative effort (including SEI)

Sparked by an investigative study sponsored by the U.K. Ministry of Defence (MOD)

Objective: to develop a standard in the area of software process assessment:
establish a common framework for expressing the process capability ratings resulting from a 15504-conformant assessment
provide a migration path for existing assessment models and methods wishing to become 15504-conformant

ISO 15504

