
Evaluate SE Methods, Processes and Tools Technical Task Plan

USC Workshop, Los Angeles, CA, 29 January 2009

Agenda

• Overview and changes in the MPT task
• Near term activities
  – Sponsor environment
• Revised schedule
• Workshop activities

SOW Language

Look at current SE methods, processes, and tools (MPTs) as they are applied across the DoD acquisition life cycle, focusing on three different development environments: individual weapons systems, SoS, and network-centric systems. Research will be targeted at improving current/identifying new SE MPTs that will better support the practice of SE in these three environments. Specifically, this task will:
1. Define critical attributes of current SE MPTs across the weapons system, SoS, and network-centric services environments;
2. Identify strengths and weaknesses for these current MPTs and any shortcomings in their application across DoD;
3. Recommend, in priority order, MPTs for further study to innovate or create improved or new MPTs to eliminate identified shortcomings;
4. Upon selection by the government of MPTs recommended in sub-task 3 for further study, perform research to innovate or create improved or new MPTs to eliminate identified shortcomings, thereby advancing the state of practice of SE within the community; and
5. For the improvements delivered in sub-task 4 above, propose a methodology for validating the programs.

MPT Task Overview

[Process-flow diagram covering subtasks 3.3.1–3.3.6: MPT sources (DoD guidebooks, DoD programs/reviews, service repositories, defense industry, commercial industry) and the eWorkshop feed the pool of identified raw MPTs. Selection criteria, established and validated with the sponsor, produce a queue of selected and prioritized MPTs (rated H/M/L). Detailed attributes are then completed to yield fully described MPTs to evaluate, evaluation criteria are applied, and MPT analysis produces reports and recommendations to users: recommended MPTs, improvements needed, overall gap analysis, and research areas. Evaluated MPTs are captured in BPCh, cumulative MPT coverage is tracked, and the cycle repeats.]
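The flow in the diagram reads as a pipeline with a prioritized queue between selection and description. A minimal Python sketch of that flow, assuming placeholder names (MPTCandidate, Priority, select, evaluate) that are not part of the task plan:

    from dataclasses import dataclass, field
    from enum import Enum

    class Priority(Enum):
        HIGH = "H"
        MEDIUM = "M"
        LOW = "L"

    @dataclass
    class MPTCandidate:
        """One identified method, process, or tool (hypothetical structure)."""
        name: str
        source: str                    # e.g. DoD guidebook, service repository, industry
        priority: Priority = Priority.MEDIUM
        attributes: dict = field(default_factory=dict)   # detailed attributes once described
        evaluation: dict = field(default_factory=dict)   # results of applying evaluation criteria

    def select(candidates, selection_criteria):
        """Apply selection criteria and return the queue of selected, prioritized MPTs."""
        queue = [c for c in candidates if selection_criteria(c)]
        return sorted(queue, key=lambda c: "HML".index(c.priority.value))

    def evaluate(queue, describe, evaluation_criteria):
        """Complete detailed attributes, apply evaluation criteria, collect evaluated MPTs."""
        evaluated = []
        for mpt in queue:
            mpt.attributes = describe(mpt)              # fully describe the MPT
            mpt.evaluation = evaluation_criteria(mpt)   # feeds reports and recommendations
            evaluated.append(mpt)
        return evaluated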

Systems Engineering and Level-of-Effort Contracts

Dennis Barnabe, SERC PM, 21 Nov 2008

SERC Kickoff Meeting

Slippery Slope Logic

• Mission-card

• Agility

• Prototype/Discovery

• LOE Contract

Relationships to Agility

[Chart relating agility (High to Low) to project and contract type. Attribute scales shown: Requirements Detail (L–H), Mission Satisfaction (Local–Broad), Maintainability (L–H), Redundancy Risk (H–L), Scalability (L–H), Complexity/Size (L–H), Integration/Interoperability (L–H). Project Type axis: Prototype, QRC + O&M, LRIP + O&M, Prod + O&M. Contract Type axis: LOE/T&M/TTO through Delivery/Turnkey.]

SE “Equalizer”

[Equalizer diagram: slider settings for Requirements, Configuration Management, Technical Reviews, Technical Documentation, Testing, and Life Cycle Planning, tuned for a Prototype or Discovery effort.]

SE “Equalizer”

[Equalizer diagram: the same SE dimensions tuned for a QRC effort.]

SE “Equalizer”

[Equalizer diagram: the same SE dimensions tuned for development with an eye toward sustained Ops.]
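One way to read the “equalizer” is as a profile of how much formality each SE dimension receives for a given project type. A toy sketch of that idea; the numeric settings below are illustrative assumptions, not values from the slides:

    # Hypothetical "equalizer" profiles: relative emphasis (0 = minimal, 5 = full formality)
    # for each SE dimension, varying by project type. Values are illustrative only.
    SE_DIMENSIONS = ["Requirements", "Configuration Management", "Technical Reviews",
                     "Technical Documentation", "Testing", "Life Cycle Planning"]

    EQUALIZER_PROFILES = {
        "Prototype/Discovery": dict(zip(SE_DIMENSIONS, [1, 1, 1, 1, 2, 1])),
        "QRC":                 dict(zip(SE_DIMENSIONS, [2, 3, 2, 2, 3, 2])),
        "Sustained Ops":       dict(zip(SE_DIMENSIONS, [4, 5, 4, 4, 5, 5])),
    }

    def show_profile(project_type):
        """Print a text 'equalizer' for the chosen project type."""
        for dim, level in EQUALIZER_PROFILES[project_type].items():
            print(f"{dim:25s} {'#' * level}")

    show_profile("QRC")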

(Usual) LOE SE Implications

• Requirements lacking
• Limited (if any) ‘formal’ Reviews
  – No coordination/insight among related efforts
  – Interface and duplication risks
  – No ability to assess technical health
    • Standards application, etc.
• No formal ‘transition’ planning
  – What if it works?
• Build to Cost
  – No actual cost estimate of satisfying mission need
  – If successful, Operations cuts into Development
    • Deemed ‘tech transfer issue’
• Schedule lacking
  – Inability to coordinate among other efforts
• “Success” defaults to ‘what is delivered’

MPT Task Overview: Changes Required

[Same process-flow diagram as the MPT Task Overview above, shown again to frame the required changes.]

Changes to identification process (1)

• Guidance:
  – Focus on IC environment (context) changes strategy to initially leverage BPCh Content Provider Network (CPN)
  – Requires different candidate MPT collection strategy based on IC context and requirements
• New Strategy:
  – Extend context attributes of current MPT description to support definition of IC environment
  – Define and validate IC environment and requirements using a revised MPT description template with extended context attributes
  – Compare to other environments (contexts) and, where similarities are found, mine environment for MPTs (see the sketch below)
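A minimal sketch of the environment-comparison step in the last bullet, assuming environments are recorded as attribute/value dictionaries; the attribute names, values, and similarity threshold are placeholders, not sponsor data:

    def shared_attribute_ratio(env_a: dict, env_b: dict) -> float:
        """Fraction of attributes present in both environments that carry the same value."""
        common = set(env_a) & set(env_b)
        if not common:
            return 0.0
        matches = sum(1 for attr in common if env_a[attr] == env_b[attr])
        return matches / len(common)

    # Illustrative attribute values only, not validated sponsor data.
    sponsor_env = {"development_cycle": "short", "deployment": "evolutionary",
                   "interdependency": "high", "quality_at_deployment": "lowered"}
    candidate_env = {"development_cycle": "short", "deployment": "incremental",
                     "interdependency": "high", "quality_at_deployment": "lowered"}

    if shared_attribute_ratio(sponsor_env, candidate_env) >= 0.7:   # threshold is arbitrary
        print("Comparable environment: mine it for candidate MPTs")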

Changes to identification process (2)

• Tactics:
  – Develop initial set of context attributes and values that characterize the NSA environment based on current understanding
  – Revise current MPT template to include extended attribute list, MPT requirements, information summary, selection recommendation and support rationale (a sketch of such a template follows this list)
  – Validate attributes and practice criteria in requirements interviews with NSA personnel (critical)
  – Use template for 3-pronged MPT identification efforts:
    1. Review sources provided in SOW and BPCh CPN
    2. Review open literature and web-based sources
    3. Capture current applicable NSA MPTs
    (1 and 2 can begin when the initial template is available; 3 depends on sponsor participation and ability to coordinate schedules/access)
  – Adapt selection criteria and process to using the new template
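The revised one-page template could be represented roughly as below; the field names follow the items listed in the tactics (extended attribute list, MPT requirements, information summary, selection recommendation, rationale), but the structure itself is an assumption, not a fielded artifact:

    from dataclasses import dataclass, field

    @dataclass
    class MPTCandidateTemplate:
        """Hypothetical one-page MPT candidate template with extended context attributes."""
        name: str
        kind: str                                  # method, process, or tool
        source: str                                # SOW source, BPCh CPN, literature, NSA practice
        context_attributes: dict = field(default_factory=dict)   # extended attribute list with values
        mpt_requirements: list = field(default_factory=list)     # requirements the MPT addresses
        information_summary: str = ""
        selection_recommendation: str = ""         # e.g. "select for evaluation" / "defer"
        rationale: str = ""

    candidate = MPTCandidateTemplate(
        name="Example requirements-elicitation method",   # illustrative entry only
        kind="method",
        source="open literature",
        context_attributes={"development_cycle": "short", "interdependency": "high"},
    )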

MPT Task Changes

[Revised process-flow diagram: establish preliminary sponsor environment and needs; develop the extended MPT template (extended environmental attributes); validate environment and needs with the sponsor (conduct interviews); identify and mine comparative environments for MPTs (review sources provided in SOW, review literature and web, access experts through the team); produce initial MPT candidate templates; refine templates and select MPTs for evaluation, yielding the queue of selected and prioritized MPTs (rated H/M/L).]

MPT Identification/selection Activities (1)

• Establish Preliminary Sponsor Environment and Needs
  – Develop initial MPT evaluation/characterization template
  – Revise and extend proposed attribute set
    • Extend context attributes
    • One-page template for candidate identification
• Validate Environment and Needs
  – Interview sponsor personnel
    • Describe the type of people to interview
    • Develop interview structure based on the template
    • Revise preliminary attributes, values and needs and capture new ideas
  – Revise/extend template as needed
  – Revise evaluation criteria based on needs assessment

Italics indicate tasks of the workshop

MPT Identification/selection Activities (2)

• Gather MPT candidates from broader community based on environmental description
  – Identify best approach for this
  – First target is INCOSE Workshop next week
• Identify comparable environments
  – Through literature, web and expert inputs, identify development/acquisition/deployment environments that share attribute values with the validated sponsor environment
• Mine comparable environments for candidate MPTs
  – Review SOW-specified sources
  – Review comparable environments, as they are identified, for candidate MPTs

MPT Identification/selection Activities (3)

• Select MPTs for evaluation
  – Review candidate templates
  – Refine and extend template descriptions for promising candidates
  – Select evaluation candidates

Initial environment description

• The NSA environment can be described as:
  – A short development cycle to meet quick-response needs, with lowered quality requirements at initial deployment
  – An evolutionary deployment strategy that may begin with limited deployment at relatively low quality and evolve into broader deployment at higher quality
  – A high level of interdependency with existing products
    • “Mashing” and expanding of results from other projects to create new results
    • Providing new results for further processing by others
    • Modifying existing capabilities to meet rapidly changing constraints and/or availability of different data
    • High level of glueware

Original MPT Attributes

MPT Attributes – what changes are needed?

Proposed Revised Schedule

MPT Activities during workshop

• First Session
  – Clean up environment description (Rich, Ken)
    • Less geek language, more general description
    • Is there a taxonomy to help with completeness?
    • Hopefully discuss with customer
  – Determine and define attribute changes (including values) (Forrest)
  – Develop the MPT mining template (Paul?)
• Second Session
  – Brainstorm MPT mining activities (Paul)
    • Opportunities, “helper” groups (INCOSE, Redstone SE group, etc.)
    • Methodology
  – Build necessary instruments for INCOSE (Forrest)
• Possible extra session after the SERC reception tonight?
  – Possibly at Radisson or near airport (depending on majority)

Backup

Systems Engineering and Level-of-Effort Contracts

Dennis Barnabe, SERC PM, 21 Nov 2008

SERC Kickoff Meeting

Slippery Slope Logic

• Mission-card

• Agility

• Prototype/Discovery

• LOE Contract

Relationships to Agility

[Chart relating agility (High to Low) to project and contract type. Attribute scales shown: Requirements Detail (L–H), Mission Satisfaction (Local–Broad), Maintainability (L–H), Redundancy Risk (H–L), Scalability (L–H), Complexity/Size (L–H), Integration/Interoperability (L–H). Project Type axis: Prototype, QRC + O&M, LRIP + O&M, Prod + O&M. Contract Type axis: LOE/T&M/TTO through Delivery/Turnkey.]

(Usual) LOE SE Implications

• Requirements lacking
• Limited (if any) ‘formal’ Reviews
  – No coordination/insight among related efforts
  – Interface and duplication risks
  – No ability to assess technical health
    • Standards application, etc.
• No formal ‘transition’ planning
  – What if it works?
• Build to Cost
  – No actual cost estimate of satisfying mission need
  – If successful, Operations cuts into Development
    • Deemed ‘tech transfer issue’
• Schedule lacking
  – Inability to coordinate among other efforts
• “Success” defaults to ‘what is delivered’

Tailoring vice Avoidance

[Iterative acquisition diagram: Discovery ("Deploy?"), QRC, and Development feeding an Operational Baseline, with Deployments on 90-day cycles.]

Right Tool for the Job

• LOE has its niche
• SE (& Acquisition) approach must evolve as the Objective changes:
  – Prototype/Discovery
  – QRC
  – Limited Ops
  – Full Ops
  – Production

SE “Equalizer”

[Equalizer diagram: slider settings for Requirements, Configuration Management, Technical Reviews, Technical Documentation, Testing, and Life Cycle Planning, tuned for a Prototype or Discovery effort.]

SE “Equalizer”

[Equalizer diagram: the same SE dimensions tuned for a QRC effort.]

SE “Equalizer”

[Equalizer diagram: the same SE dimensions tuned for development with an eye toward sustained Ops.]

Possible LOE SE Leverage Points

• Ensure Standard Inclusions
  – On contract
  – Adherence
• ‘Formal’ Gates for phase transitions (sketched after this list)
  – Prototype/PofC → QRC → Limited Ops → Sustained Ops
• Evolve SE Processes appropriately for the given Phase
  – TTOs must be written to support
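A toy sketch of the phase-gate idea above, with the phase order taken from the slide and wholly illustrative gate criteria:

    # Ordered phases from the slide; the gate criteria below are illustrative assumptions.
    PHASES = ["Prototype/PofC", "QRC", "Limited Ops", "Sustained Ops"]

    GATE_CRITERIA = {
        "QRC":           ["baseline requirements captured", "configuration items identified"],
        "Limited Ops":   ["technical review held", "transition plan drafted"],
        "Sustained Ops": ["test results accepted", "life cycle plan approved"],
    }

    def ready_to_advance(current_phase: str, completed: set) -> bool:
        """Check whether the gate into the next phase is satisfied."""
        next_phase = PHASES[PHASES.index(current_phase) + 1]
        return all(item in completed for item in GATE_CRITERIA[next_phase])

    print(ready_to_advance("QRC", {"technical review held", "transition plan drafted"}))  # True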

Possible new contextual attributes

• Brainstormed attribute list, with values where available – to be refined! (a data-structure sketch follows the list)
  – Criticality for meeting requirements (QRC-high, QRC-low, high, medium, low)
  – Volatility/evolution of requirements (high (>1%/month), normal (0.01–1%/month), low (<0.01%/month))
  – Level of quality required at deployment (functional, reliable, critical)
  – Level of security required at deployment (SCI, Classified, Unclassified)
  – Dependence on other systems for critical data and functionality (very high, high, medium, low, none)
    • Need to coordinate among other efforts
    • Assessability of technical health (health of data sources required?)
  – Length/stability of life cycle
    • Stability of life cycle definition (phases)
    • Evolution/stability of required ceremony in response to system life cycle needs – how do I prepare enough ceremony up front to be able to make adjustments easily when system maturity/deployment change – nondeterministic?
  – Breadth of applicability
  – Uniqueness of application (are 3 people already doing this and you don’t know it?)
  – Scalability – in function and number of copies deployed
  – Level of transition planning required
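A rough encoding of the brainstormed attributes that carry value scales, usable as a starting point for the extended template's attribute list; only the attribute names and values shown above come from the slide, while the structure and the validation helper are assumptions:

    # Attribute -> allowed values, taken from the brainstormed list; key names are illustrative.
    CONTEXT_ATTRIBUTE_VALUES = {
        "criticality_for_meeting_requirements": ["QRC-high", "QRC-low", "high", "medium", "low"],
        "requirements_volatility": ["high (>1%/month)", "normal (0.01-1%/month)", "low (<0.01%/month)"],
        "quality_required_at_deployment": ["functional", "reliable", "critical"],
        "security_required_at_deployment": ["SCI", "Classified", "Unclassified"],
        "dependence_on_other_systems": ["very high", "high", "medium", "low", "none"],
    }

    def validate_environment(description: dict) -> list:
        """Return any attribute values that fall outside the agreed scales."""
        return [(attr, val) for attr, val in description.items()
                if attr in CONTEXT_ATTRIBUTE_VALUES and val not in CONTEXT_ATTRIBUTE_VALUES[attr]]

    # Illustrative check only.
    print(validate_environment({"quality_required_at_deployment": "reliable",
                                "security_required_at_deployment": "Top Secret"}))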

