
The CIPP Organizational Process Model - An Application to Military Test Range Management

Joe Stufflebeam, Ph.D., White Sands Missile Range, NM

Daniel Stufflebeam, Ph.D., Western Michigan University

(Distinguished University Professor - Retired)

International Test & Evaluation Association, 33rd Annual Intl Test and Evaluation Symposium

Reston, VA, 05 October 2016

DISTRIBUTION A: Approved for public release; distribution is unlimited.

OPSEC review conducted on 27 SEP 2016

Need Statement

• Define and develop an efficient and effective modern implementation of the “Mission Execution Process”

The Universal Documentation System (UDS) Mission Execution Process

(Flow diagram: the Range Customer's requirements enter the Mission Planning Tool & TRMS; Remote Control / Instruments support setup, execution, and teardown; Data Reduction Tools return mission results to the Range Customer. Stages: Requirements Generation; Mission Planning, Budgeting & Scheduling; Mission Setup; Mission Execution; Quick-Look, Data Collection, Teardown; Data Reduction, Mission Results.)

The current mission execution process is driven by the RCC-defined Universal Documentation System (UDS).

Goal: Maintain the functionality of the UDS process, but implement it more efficiently and effectively.

Universal Documentation System (Defines documentation for requirements & resource commitments)

Program Introduction (PI): The initial planning document submitted by a potential user to the support range immediately upon identification of general program requirements and schedules.

Statement of Capability (SC): The range’s response to the PI. It provides the user with a preliminary cost estimate, acceptance of the program and/or prerequisites for support.


Operations Requirements (OR): The OR is a mission-oriented document that describes the specific requirements for each mission, special test, or series of tests in detail. It is prepared by the user.

Program Requirements Document (PRD): The PRD, normally used for complex or long lead-time programs, contains detailed program support requirements identified by the user.

Operations Directive (OD): The OD is the range’s response to the OR and is a detailed plan for implementation of support functions for a specific test or series of tests.

Program Support Plan (PSP): The PSP is the range’s response to the PRD and contains information relating to support commitments, including any alternatives.
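Taken together, these document pairs form a simple request/response protocol between customer and range. As a minimal illustrative sketch (not part of the UDS specification itself), the pairing can be written as a lookup table in Python:

    # Hypothetical sketch: each UDS customer submission mapped to the range's
    # corresponding response document, per the definitions above.
    UDS_RESPONSES = {
        "PI":  "SC",   # Program Introduction -> Statement of Capability
        "OR":  "OD",   # Operations Requirements -> Operations Directive
        "PRD": "PSP",  # Program Requirements Document -> Program Support Plan
    }

    def range_response(customer_doc: str) -> str:
        """Return the range document that answers a given customer document."""
        return UDS_RESPONSES[customer_doc]

    assert range_response("OR") == "OD"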

Requirements for an Enhanced Process

• Efficiency (Faster, Cheaper)

– Shorter time to test while maintaining information exchange rigor

– Improve mission throughput

– Quicker turn-around

– More timely delivery of data products

– Reduce operational costs and institutional costs

– Eliminate efforts and data products that have little or no value

• Effectiveness (Better)

– Maximize the quality of data products

– Create new data products that add value to the customer

– Ensure data products are tied to customer requirements

– Ensure customer understands their requirements vs. expected data products


How to Meet the Need?

• Implement a mission execution process and toolset that facilitates convergence to & sustainment of quality & efficiency

• Implement a toolset that is designed to:

• Meet stated requirements

• Improve management insight into details of the overall process

• Improve communication across mission execution components

• Provide meaningful and consistent feedback through integrated evaluation & assessment

• Improve the transition and integration of development efforts and other investments

• Increase standardization and automation; eliminate developmental stovepipes

Mission Execution Process

(Process flow diagram: Capture, Validate & Prioritize Requirements → Planning, Budgeting, Scheduling & Design → Mission Execution → Analysis, Outcomes, Results, Data Products; supported by Operational SOPs & Work Instructions.)

Generalized Range Mission Execution Process

• How do we support this process?

• How do we evaluate the performance of this process?

• How do we manage this process?

• How do we improve this process?

(Diagram labels: the business process runs from Customer Request to Delivered Products.)

Leveraging Well-Established Process Models

Lean Six Sigma: Origins in Manufacturing

• Tailored for building large numbers of the same thing

• Focused on eliminating waste and variation

Context-Input-Process-Product (CIPP): Origins in Government & Education

• The CIPP Evaluation Model is a systematic approach to evaluation

• Designed to assist in the planning and execution of programs

• Facilitates accountability within an organization


Context-Input-Process-Product (CIPP) Evaluation Model

• The CIPP model maps to the four major components of a standard process:

– Requirements generation, planning & design, execution, and results

• It utilizes both formative and summative evaluation
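The slide's mapping of process components to CIPP evaluation types can be summarized as a small lookup table; this sketch is illustrative only, with phase names taken from the generalized process described earlier:

    # CIPP evaluation type applied to each major component of a standard process.
    CIPP_MAPPING = {
        "Requirements Generation": "Context Evaluation",  # assess needs & goals
        "Planning & Design":       "Input Evaluation",    # assess options & planning
        "Execution":               "Process Evaluation",  # assess implementation
        "Results":                 "Product Evaluation",  # assess results
    }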

We Desire a Basic Feedforward / Feedback Process Control Model

Test Range

(Diagram: Customer Requirements drive Mission Execution within the Test Range, producing Delivered Products. A Feedforward Process defines steady-state performance; a Feedback Process implements corrective responses to unexpected results; Management Interaction connects the two.)

Feedforward Process: Defines planned performance in the steady state.

Feedback Process: Defines the response to unexpected or transient events.

Defining an Overall Range Operations Process Model

(Test Range diagram: the Range Operations Process runs from Customer Request to Delivered Products, surrounded by Organizational Management, the CIPP Evaluation Process, and Process Support Elements. A Business Process Analysis & Information Database Dashboard collects Identified Needs, Performance Metrics, Distilled Requirements, and Evaluation Reports.)

Organizational Management (Key Management Functions): Strategic Planning; Tasking; Policies & Procedures; Accountability; Budgeting & Funding; Staffing & Training; Communications; Quality; Change; Compliance

Range Operations Process (Personnel, Operational SOPs, Instrumentation): Capture, Validate & Prioritize Requirements → Planning, Budgeting, Scheduling & Design → Mission Execution → Analysis, Outcomes, Results, Data Products

CIPP Evaluation Process (Evaluation Plan): Context Evaluation [Assess Needs & Goals]; Input Evaluation [Assess Options & Planning]; Process Evaluation [Assess Implementation]; Product Evaluation [Assess Results]; driving Innovation (Organizational, Process & Resource Improvement Efforts)

Process Support Elements (Personnel, Tools & Infrastructure), Key Support Functions: Development; Sustainment; Maintenance; Acquisition; Training; Safety/Security/Environmental; Cyber / Network; with Agile New Capability Transition Testbeds (Testing & Integration of Improvements)

Developed by Dr. Daniel & Dr. Joe Stufflebeam, 19 Sep 2014

Re-Arranging the Process Model for Mission Execution to Show the Feedforward & Feedback Components

(Diagram: Customer Request → Mission Execution → Delivered Products. CIPP Evaluation & Metrics Capture feeds a Database used by Management and Process Support, which return Improved Capabilities to Mission Execution.)

Control Theory

(Block diagram: Command Input passes through FeedForward Control and a FeedBack Control loop around the Controlled Process to produce the Position Output.)

Comparing Proven Process Models

CIPP Model: Requirements Management → Controlled Process → Results, with Support Elements (Personnel, Operational SOPs, Instrumentation, etc.) feeding the process and CIPP Evaluation returning Evaluation Reports & Metrics to close the loop (the role the Encoder plays in the control-theory diagram).

Lean Six Sigma (“DMAIC”): Define Goals → Measure → Analyze → Improve → Control, wrapped around the Controlled Process and its Results.

Realization of the Model - The Mission Execution Toolset

• Visualization Layer

– Mission Planning Tool

– Real-Time Visualization

– Post Mission Review

• Automation Layer

– Built into instrumentation and remote control system software

• Evaluation Layer

– Evaluation manual

– Checklists and forms

• Continuous Process Improvement Layer

– Database and dashboards
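A minimal structural sketch of the four layers listed above; the class and method names are illustrative assumptions, not the toolset's actual interfaces:

    class VisualizationLayer:
        """Mission planning, real-time visualization, and post-mission review."""
        def render(self, mission_state): ...

    class AutomationLayer:
        """Hooks built into instrumentation and remote control system software."""
        def apply_settings(self, mission_doc): ...

    class EvaluationLayer:
        """CIPP-based evaluation manual, checklists, and forms."""
        def evaluate(self, phase, mission_doc): ...

    class ContinuousProcessImprovementLayer:
        """Database and dashboards for captured performance metrics."""
        def record(self, metrics): ...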

Implementation of the Model - The Mission Execution Toolset

Continuous Process Improvement

(Diagram: from the Range Customer's OR Document, the mission flows through Requirements Generation; Mission Planning & Scheduling; Mission Setup; Mission Execution; Quick-Look, Post Mission Teardown; and Data Reduction, Mission Results, and back to the Range Customer. Context, Input, Process, and Product Evaluations span the corresponding stages; Performance Metrics and Evaluation Checklists & Reports are captured throughout; the Mission Execution, Visualization, Automation, and Continuous Process Improvement Layers are marked. Stage tools: Mission Planning Tool & Scheduling, Control Consoles / Field Sensors, and Data Reduction.)

A single MissionDoc.xml accumulates a section at each stage, with the option to load from a prior mission to start:

– Requirements Section: trajectory info, measurements, locations, etc.

– Planning Section: sensor locations, sensor settings, file names & paths, network config., etc.

– Scheduling Section: setup date, launch time, roadblocks, etc.

– Setup Section: FRC check results, logging of changes, etc.

– Mission Section: operational status, tracking status, seeing conditions, weather, etc.

– Post Mission Section: data capture status, download status, trigger status, etc.

– Data Reduction Section: analysis results, unit performance, SoS performance, etc.

Each completed Mission Scenario feeds the Mission History Database and a Consolidated MissionDoc Database, which in turn feed the Performance Metrics Dashboard, QA Analysis, and the Process Improvement Documentation Repository.
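That accumulation can be sketched with only the Python standard library; the element names follow the slide, but the schema itself is an assumption for illustration, not the toolset's actual format:

    import xml.etree.ElementTree as ET

    # Sections added as the mission advances through the pipeline.
    PHASE_SECTIONS = ["Requirements", "Planning", "Scheduling", "Setup",
                      "Mission", "PostMission", "DataReduction"]

    def advance(doc: ET.Element, phase: str) -> ET.Element:
        """Append the section recorded at the end of the given phase."""
        return ET.SubElement(doc, phase + "Section")

    doc = ET.Element("MissionDoc")
    for phase in PHASE_SECTIONS:
        advance(doc, phase)

    print(ET.tostring(doc, encoding="unicode"))
    # <MissionDoc><RequirementsSection /><PlanningSection />...</MissionDoc>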

Mission Execution Toolset – Visualization Tools

(Diagram: the same mission execution pipeline and MissionDoc.xml accumulation as the previous slide, here overlaid with the visualization tools: Visualization with Pre-Mission Plug-ins around the Mission Planning Tool & TRMS, Visualization with Real-Time Plug-ins around the RCF Control Consoles / Instruments, and Visualization with Data Reduction Plug-ins around ODMS / TrackEye.)

Pre-Mission – Mission Planning Tool

Real-Time Visualization Tool

Post Test - Target Miss Distance Tool

Moving Forward

• Develop a mission process evaluation manual and checklists, built on the CIPP model

• Build automation capabilities into software systems

• Develop a database and metrics dashboard to house performance information

• Develop a management strategy and an implementation plan for defining, executing and monitoring process improvement efforts

Final Thoughts

• The UDS process:

– Requires a significant amount of time and resources in its current implementation.

– Current tools are inefficient and don’t facilitate automation. As a result, the process has room for improvement.

• The current state of affairs at the range dictates a more efficient and effective process.

• The CIPP process model

– Provides the required avenue for an efficient and effective realization of the mission execution process.

– Assists in the planning and execution of programs

– Facilitates accountability within an organization

Questions?

Joe Stufflebeam, Ph.D., TRAX International

White Sands Missile Range, NM

CIPP Evaluation

• Context evaluations assess needs, problems, assets, and opportunities, plus relevant contextual conditions and dynamics.

• Input evaluations assess a program’s strategy, action plan, staffing arrangement, and budget for feasibility and potential cost-effectiveness to meet targeted needs and achieve goals. An input evaluation may be comparative as in identifying and assessing optional ways to achieve a program’s goals, or non-comparative in assessing a single plan and its components.

• Process evaluations monitor, document, assess, and report on the implementation of program plans. Such evaluations provide feedback throughout a program’s implementation and later report on the extent to which the program was carried out as intended and required.

• Product evaluations identify and assess a program’s costs and outcomes — intended and unintended, short term and long term. These evaluations provide feedback during a program’s implementation on the extent to which program goals are being addressed and achieved; at the program’s end, impact evaluations identify and assess the program’s full range of accomplishments. The key questions are: Did the program achieve its goals? Did it successfully address the targeted needs and problems? What were any unexpected outcomes, both positive and negative? Were the program’s outcomes worth their cost?

CIPP Evaluation

• Formative Evaluation: An evaluation that proactively assesses a program from start to finish. It regularly issues feedback to assist the formulation of goals and priorities, provide direction for planning by assessing alternative courses of action and draft plans, guide program management by assessing and reporting on implementation of plans and interim results, and supply a record of collected formative information and how it was used.

• Summative Evaluation: A comprehensive evaluation of a program after it has been completed. It draws together and supplements previous evaluative information to provide an overall judgment of the program’s value. Such evaluations help interested audiences decide whether a program—refined through development and formative evaluation—achieved its goals, met targeted needs, constitutes a significant contribution in the program’s substantive area, and was worth what it cost.

CIPP Evaluation: Evaluation Roles by Type of Evaluation (Context, Input, Process, Product)

Formative (proactive application of descriptive and judgmental information to assist decision making, program implementation, quality assurance, and accountability):

– Context: Guidance for identifying needed interventions, choosing goals, and setting priorities by assessing and reporting on needs, problems, risks, assets, and opportunities.

– Input: Guidance for choosing a program strategy (and possibly an outside contractor) and settling on a sound implementation plan and budget by assessing and reporting on alternative strategies and resource allocation plans, and subsequently closely examining and judging the operational plan and budget.

– Process: Guidance for executing the operational plan by monitoring, documenting, judging, and repeatedly reporting on program activities and expenditures.

– Product: Guidance for continuing, modifying, certifying, or terminating the program by identifying, assessing, and reporting on intermediate and longer-term outcomes, including side effects.

Summative (retroactive use of descriptive and judgmental information to sum up the program’s value, e.g., its quality, efficiency, cost, practicality, safety, impact, and significance):

– Context: Judging goals and priorities by comparing them to assessed needs, problems, risks, assets, and opportunities.

– Input: Judging the implementation plan and budget by comparing them to targeted needs, problems, and risks; contrasting the plan and budget with critical competitors; and assessing their compatibility with the implementation environment and compliance with relevant codes, regulations, and laws.

– Process: Judging program execution by fully describing and assessing the actual process and costs, comparing the planned and actual processes and costs, and assessing compliance with relevant codes, regulations, and laws.

– Product: Judging the program’s success by comparing its outcomes and side effects to targeted goals, needs, problems, and risks; examining its cost-effectiveness; and, as feasible, contrasting its costs and outcomes with competitive programs; also interpreting results against the effort’s outlay of resources and the extent to which the operational plan was both sound and effectively executed.

A Simplified View of a Process Model for Mission Execution (WSMR)

(Diagram: Customer Request → Mission Execution Process → Delivered Products, resting on Personnel, Operational SOPs, & Instrumentation. Performance Evaluation & Metrics Capture feeds the Process Analysis & Performance Information System; Management directs Improvement through the Process Support Elements (Personnel, Tools & Infrastructure), which return Improved Capabilities to the process.)

A Quick Digression to Look at the Effectiveness of the Feedforward-Feedback Process Model

Precision Control of a Defined Process “G(s)” (Machine tool, missile tracker, etc.)

Confirmation of the Feedforward-Feedback Control Architecture

Error definition:

E_0(s) = R(s) - Y(s)

With acceleration and velocity feedforward acting on the command R(s), PI feedback acting on the error, and rate (derivative) feedback acting on the output Y(s), the loop equation is

E_0(s) = R(s) - (s^2 K_{aff} + s K_{vff}) G(s) R(s) - (K_p + K_i/s) G(s) E_0(s) + s K_d G(s) Y(s)

Substituting Y(s) = R(s) - E_0(s) and collecting the E_0(s) terms:

E_0(s) [1 + (K_p + K_i/s + s K_d) G(s)] = R(s) [1 - (s^2 K_{aff} + s K_{vff} - s K_d) G(s)]

For a servo-type plant G(s) = K_1 / [s (s + K_2)] this reduces to

E_0(s) = \frac{(1 - K_1 K_{aff}) s^3 + (K_2 + K_1 K_d - K_1 K_{vff}) s^2}{s^3 + (K_2 + K_1 K_d) s^2 + K_1 K_p s + K_1 K_i} R(s)

Choosing the feedforward gains

K_{aff} = \frac{1}{K_1}, \qquad K_{vff} = \frac{K_2 + K_1 K_d}{K_1}

drives the numerator identically to zero.

Steady State Errors:

\lim_{s \to 0} s E_0(s) = \lim_{s \to 0} s \, \frac{(1 - K_1 K_{aff}) s^3 + (K_2 + K_1 K_d - K_1 K_{vff}) s^2}{s^3 + (K_2 + K_1 K_d) s^2 + K_1 K_p s + K_1 K_i} R(s) = 0

so with these gains the steady-state error vanishes.
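As a quick symbolic cross-check of the reconstruction above (using sympy; the symbols mirror the gains in the derivation):

    import sympy as sp

    s, K1, K2, Kp, Ki, Kd = sp.symbols("s K1 K2 Kp Ki Kd", positive=True)

    # Feedforward gains chosen per the derivation above.
    Kaff = 1 / K1
    Kvff = (K2 + K1 * Kd) / K1

    # Numerator of E0(s): vanishes identically with these gains, so the
    # final value theorem gives zero steady-state error.
    num = (1 - K1 * Kaff) * s**3 + (K2 + K1 * Kd - K1 * Kvff) * s**2
    print(sp.simplify(num))  # -> 0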

Confirmation of a Generalized Feedforward-Feedback System Process Configuration

E_0(s) = R(s) - Y(s)

E_0(s) = R(s) - FF(s) G(s) R(s) - FB(s) G(s) E_0(s)

E_0(s) = [1 - FF(s) G(s)] R(s) - FB(s) G(s) E_0(s)

E_0(s) [1 + FB(s) G(s)] = [1 - FF(s) G(s)] R(s)

E_0(s) = \frac{1 - FF(s) G(s)}{1 + FB(s) G(s)} R(s)

Analyzing the process error via the final value theorem:

\lim_{s \to 0} s E_0(s) = \lim_{s \to 0} s \, \frac{1 - FF(s) G(s)}{1 + FB(s) G(s)} R(s)

To drive steady-state errors to zero, determine FF(s) such that

1 - FF(s) G(s) = 0, \quad \text{i.e.,} \quad FF(s) = \frac{1}{G(s)}

If FF(s) = 1/G(s), then steady-state errors go to zero.
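The same check in symbolic form for the generalized configuration (sympy, with G and FB left as arbitrary transfer functions):

    import sympy as sp

    s = sp.symbols("s")
    G = sp.Function("G")(s)    # controlled process
    FB = sp.Function("FB")(s)  # feedback compensator
    FF = 1 / G                 # feedforward chosen as the plant inverse

    # Error transfer function E0/R: identically zero when FF = 1/G,
    # so the final value theorem gives zero steady-state error.
    E_over_R = (1 - FF * G) / (1 + FB * G)
    print(sp.simplify(E_over_R))  # -> 0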

