U.S. SPACE PROGRAM MISSION ASSURANCE IMPROVEMENT WORKSHOP
HARRIS CORPORATION | MELBOURNE, FL | MAY 3 - 5, 2016
Industry-Government Collaboration
that is Delivering Results
Eli Minson, Principal Mission Assurance Engineer, Ball Aerospace & Technologies Corp.
CQSDI, March 7-8, 2016
Setting the Stage
• Have you ever found yourself asking…
– How do I plan for quality deployment in the face of new and evolving
technologies?
– What is the best approach for Supplier Risk Evaluation and Control?
– What are the best practices for Counterfeit Parts Avoidance and achieving
Resilience to Cyber Attacks?
– How do my aerospace industry peers evaluate Heritage Hardware and
Software for potential reuse?
– What are customers' expectations for Design Reviews and other Critical
Gated Events?
• Your options, given that requirements may be silent, might include:
– Search the internet
– Consult professional organizations or societies
– Ask your peer at Company XYZ
• These may work, but there is a better way…
MAIW Objective and Mission
• Objective
– Facilitate mission assurance (MA) collaboration across the US Space
Program community in industry and government
– Identify and generate products that address multi-agency MA challenges in a
cohesive and consistent manner, focusing on near-Earth, unmanned space
programs
• Mission Statement
– Use an issues-based approach to enhance government/industry mission
assurance processes, supporting disciplines, and associated products,
collaboratively with industry and government MA teams at an enterprise level
• Value
– Deliver actionable mission assurance products
– Enhance experience and development of the participating engineers (industry & government)
– Develop communities of practice with an understanding of prime and government policies, practices, and expectations
2015 MAIW hosted by Lockheed Martin Corporation
2015 MAIW Participants | May 5 - 7, 2015
Harris
Honeywell
IDA
Intelsat
JPL
Lockheed Martin
MacAulay-Brown
Micropac Industries
Missile Defense Agency
Aeroflex
The Aerospace Corp.
Air Force – SMC
BAE
Ball Aerospace
Boeing
DCMA
General Dynamics
Flight Microwave
NASA
Northrop Grumman
Naval Research Lab
NRO
Orbital ATK
Raytheon
SSL
University of Maryland
XILINX
2016 Management Structure
2016 Topic Teams:
• Supply Chain Escapes Lessons Learned
• ASIC/FPGA Malicious Attack: Production Assurance
• Additive Manufacturing: Considerations for Successful Implementation of Flight Hardware
• Value Proposition for Mission Assurance
• Process Control for High Impact Defect Mitigation

Steering Committee
Todd Nygren (Aerospace), Chair
Anne Ramsey (Harris), Co-Chair
Eli Minson (Ball)
Robert Adkisson (Boeing)
John Kowalchik (LMSSC)
Craig Wesser (NG)
Robin Gillman (NG Alt)
Dave Swanson (Orbital ATK)
Mark Baldwin (Raytheon)
Brian Kosinski (SSL)
Patrick Martin (NASA)
Richard Fink (NRO)
Frederick Kelso (MDA)
Dave Davis (SMC)
Wayne Blackwood (NOAA)
Brian Reilly (DCMA)
Rich Haas (Aerospace), Advisor

Senior Government Leaders
Hal Bell (NASA)
Brig Gen Anthony Cotton (NRO)
Mike Wadzinski (MDA)
Tom Fitzgerald (SMC)
Mark Paese (NOAA)
Dr. Wanda Austin (Aerospace)

Program Committee
Jackie Wyrwitzke (Aerospace), Chair
C. J. Land (Harris), Co-Chair
Ed Jopson (NG)
Gail Johnson-Roth (Aerospace)
Mike Tolmasoff (Boeing)
Cheryl Sakaizawa (Aerospace), Coordinator

Arrangements Committee
Holly Stewart (Aerospace), Lead
C.J. Land (Harris), Co-Lead
Trina Kilpatrick (Aerospace), Coordinator
Cheryl Sakaizawa (Aerospace), Support
2016 MAIW Topics and Goals
• Process Control for High Impact Defect Mitigation: Produce a guidebook of actionable recommendations and best practices for mitigating unintended consequences of seemingly inconsequential changes to space hardware manufacturing processes
• Supply Chain Escapes Lessons Learned: Assemble lessons learned and best practices for common supply chain escapes encountered when producing hardware, including background on why these are important and the impact they can have on NSS missions
• Programmable Device Malicious Attack Production Assurance: Collect available methodologies and techniques, identify supportive tools to vet untrusted PLDs, collect methodologies for mitigating exposure to malicious attacks in end-to-end physical PLD production, and specify techniques to vet untrusted microcircuit production
• Additive Manufacturing Mission Assurance Considerations: Summarize current guidance and activities related to mission assurance considerations for use of additively manufactured components
• Value Proposition for Mission Assurance: Gain insight into the various MA perspectives and the value/effort function from the customer and contractor viewpoints
MAIW Teams Have Created 39 Products
2008
Test Like You Fly – Concepts and Reference Info
Critical Clearances – Criteria and Control
Processes
Acquisition Program Critical Gated Events
Heritage Hardware and Software Reuse
Assessments
Verification Processes for Late Changes
2009
Test Like You Fly – Test Program Checklist
Fault Management Verification – Reference Guide
Failure Modes & Effects Analysis – Systems Eng Guide
Test Equipment / Procedure Readiness Assessments
Design Assurance – Risk Based Implementation Guide
2010
Flight Unit Qualification – Guidelines
Objective Criteria for Heritage Hardware Reuse
Mission Assurance Program Framework
Modeling and Simulation
Test beds & Simulators
2011
Guidelines for Failure Review Board
Space Segment Software Readiness Assessment
Space Segment Information Assurance Guidance for
Mission Success
Mission Assurance Guidelines for Class A-D Missions
Supplier Risk Evaluation and Control
2012
Space Mission Resilience to Cyber Attacks
Guidance for Efficient Resolution of Post-contract
award MA Requirement Issues
MA Approach within a Mission Class
Development vs. Production Programs
Electrical Design Worst-Case Circuit Analysis: Guidelines and Draft Standard
2013
Preventive Measures Based on Hybrid and Integrated
Circuit Lessons
Architectures for Lithium Ion Based Power Subsystems
Mission Assurance Practices for Satellite Operations
Electrical Design Worst-Case Circuit Analysis: Guidelines
and Draft Standard
Key Considerations for Mission Success for Class C/D
Mission
2014
RF Breakdown Prevention in Spacecraft
Components
Guidelines for Hosted Payloads Integration
Counterfeit Parts Prevention Strategy Guide
Technical Risk Identification at Program Inception
Root Cause Investigation Best Practice Guide
2015
Design Review Improvement Recommendations
Process Approach to Determining Quality Inspection
Deployment
Standard/Handbook for RF Ionization Breakdown
Prevention in Spacecraft Components
White paper on Multicarrier excitation of Multipactor
breakdown
Countermeasures to Mitigate Malicious Attack Risks to
ASIC and FPGA Circuit Development
10 MAIW Products Went on to Conferences

Year | MAIW Product | Conference
2008 | Test Like You Fly – Concepts and Reference Info | GSAW and every Aerospace Testing Seminar since 2009
2008 | Critical Clearances – Criteria and Control | 2009 Aerospace Testing Seminar
2008 | Verification Processes for Late Changes* | AIAA and ESA conferences
2011 | Space Segment Information Assurance Guidance for Mission Success | MITRE invitational workshops on cyber security and resiliency; basis for GSAW Working Groups
2012 | Space Mission Resilience to Cyber Attacks | MITRE invitational workshops on cyber security and resiliency; basis for GSAW Working Groups
2012/2013 | Electrical Design Worst-Case Circuit Analysis: Guidelines and Draft Standard | Related papers at 2014 Space Power Workshop
2013 | Architectures for Lithium Ion Based Power Subsystems | Moderated an energy storage workshop on lithium ion safety at Space Power Workshop, May 7, 2014
2013 | Mission Assurance Practices for Satellite Operations | AIAA SpaceOps 2014, May 7, 2014
2013 | Key Considerations for Mission Success for Class C/D Mission | Accepted at a conference that was deferred due to the Government shutdown
2014 | Design Integration of Hosted Payloads – Do No Harm Analysis | Abstract submitted to Reliability and Maintainability Symposium
* Incorporated into AIAA-S117–2010 Space Systems Verification Program and Management Process
High Level MAIW Process
• Select Topic Areas (May)
• Assemble Teams & Generate Draft Products (Aug – Feb)
• Invite Additional Expert Review (Mar – Apr)
• Workshop Event to Adjudicate Comments (May)
• Approval and Publication (Jun – Oct)
How to Engage in the MAIW
• How can the larger US Space community participate?
– Nominate topics for future MAIW topic teams to the Steering Committee
– Nominate subject matter experts (SMEs) to review upcoming topics to the
Steering Committee
– Review and adopt best practices from released MAIW products
– To request copies of products, use the following link:
• Interested parties may gain information from ([email protected])
Mr. Todd Nygren
General Manager, Systems Engineering Division
MAIW Steering Committee co-chair
The Aerospace Corporation, (310) 336-3528, [email protected]
Ms. Jacqueline Wyrwitzke
Principal Director, Mission Assurance Subdivision
MAIW Program Committee co-chair
The Aerospace Corporation, (310) 336-3418, [email protected]
Process Approach to Determining
Quality Inspection Deployment
Eli Minson, Ball Aerospace
Frank Pastizzo, SSL
Eric Richter, The Aerospace Corporation
Training Package
CQSDI, March 7-8, 2016
Agenda
• Motivation and Team Charter
• Product Overview
• Examples
“Without data you’re just another person with an opinion”
W. Edwards Deming
Quality Deployment Team Membership
Core Team
Kathy Augason, Lockheed Martin
Kevin Craig, SSL
Ken Dodson, SSL
Frank Fieldson, Harris
Edward Gaitley, The Aerospace Corporation
Anthony Gritsavage, NASA
Michael Kelly, NASA
Neil Limpanukorn, SSL
Michael Phelan, DCMA
Robert Pollard, Ball Aerospace
Thomas J. Reinsel, Raytheon
Ric Alvarez, Northrop Grumman
Dave Newton, Northrop Grumman
Ethan Nguyen, Raytheon

SME Team
Art McClellan, The Aerospace Corporation
Eli Minson, Ball Aerospace
Frank Pastizzo, SSL
Eric Richter, The Aerospace Corporation
Jack Harrington, Boeing
Jeanne Kerr, Lockheed Martin
Dan Gresham, Orbital
Dave Martin, Raytheon
Brian Reilly, DCMA
Daniel Hyatt, MDA

Bold – co-leads
Where Can You Find the Full Document?
Motivation for Topic
• Definition: Inspection is the verification of a requirement independent of the
method(s) used
• Problem Statement
– How can the appropriate level of inspection practices be identified, reviewed, and updated
to keep up with technology changes and/or trend results?
– The space industry doesn't have a process to determine whether the level of inspection is appropriate.
– Inspection practices have not kept up with new technologies.
• Examples
– An SMT line adds an AOI system. What impact has this had on the quality inspector role?
– Are mate/de-mate checks required if any damage is identified during test and the
technicians are appropriately trained for performing mates?
– An x-ray inspection system is introduced for non-destructive testing (NDT) by
manufacturing. Is there a continued need for the quality inspector?
– Dimensional Inspection has a high percentage of “no defect found” dispositions. What is
the value in continuing to perform the dimensional inspection vs. performing a fit check?
– Are solder inspections still needed if the PWA undergoes flying head probe testing as part of
in-circuit test verification?
– New torque tools automatically identify torques remotely. Is a separate torque inspection
required in these cases?
Proposed Solution
• We are attempting to answer the question: How can appropriate level(s) of
inspection practice(s) be identified, reviewed, and updated to keep up with
technology changes, trend results, facility changes, etc.?
– We typically have data, but perhaps not the right data to make decisions
– Are we capturing the "right" data and/or metrics to make the desired decisions?
• Answer lies in evaluating process performance data against risk to
evaluate current vs. alternative approaches, including new technology
introduction
• Intended content of product
– Create a process/tool to make decisions on the appropriate level of quality inspections in a
data-driven, risk-acceptable way
– Provide a set of guidelines for determining appropriate inspection methods including
surveillance
– List barriers and potential solutions to implement the process
– Align process to AS9100C
– Include section on applicability to Suppliers
What is a "GOOD" Inspection?
• Aligns execution
uncertainty to demands
of end user / customer
– Risk
– Product
– Personnel requirements
– Schedule
– Cost
• Repeatable
• Reproducible
• Ensures performance of
desired end product
requirement
Triggers for Starting the Process
• Introduction of new inspection or manufacturing technologies
– New manufacturing techniques inducing new failure modes
– New manufacturing techniques changing historical assumptions (homogeneity
of materials, input material requirements, etc.)
– New inspection techniques can detect to a lower level than the manufacturing
system can control (3D X-ray)
• Observed lack of findings or failures during historical inspections or
reduction in manufacturing mishaps
– Improved personnel and/or tool controls result in no findings from inspection
– Findings during inspection do not result in product or program changes
• "Use as is" dispositions
• Significant process change or facility change
– Can induce unexpected failures due to unanticipated input / personnel changes
• Change in management or customer requirements
– Unrelated failures can drive change in requirements
• Cost or schedule drivers
Tool Decision Tree
Applicability
Internal process changes
Subcontractor process changes
Decision Tree: Start
1.0 Start: The decision to review the inspection function with the expectation of reducing inspection may
result from various changes to the production line:
• Introduction of manufacturing equipment which increased the quality of the product
• Introduction of new inspection equipment, perhaps automated
• Directives from management or the stakeholder resulting from data analysis or change in
requirements
Question to Drive Entry
Does the process meet
the demands of the usage
environment?
Entry Criteria:
1. Establish baseline before embarking on a change
2. Place inspection at the highest upstream point possible
Decision Tree: Non-Critical Process
2.0 Critical
process?
Critical processes:
• Failure would seriously endanger the safety of personnel, or produce product that
could seriously degrade the mission or result in mission failure
• Require more study before changing their inspection functions
• Are often identified by the contractor in conjunction with the customer
2.1 Conduct
effectiveness
study
Non-critical Process
• Less scrutiny is required
• Inspection function is evaluated for effectiveness through data and observation
(review of inspector efficiency, escapes, differences between inspectors, etc.)
2.2 Improve
inspection or
reduce /
eliminate
Simple modifications to the inspection process may be undertaken to reduce
inspection, perhaps through sampling, or elimination if it is obvious that inspection is
no longer needed
2.3 End: Monitor with data collection to assure the proper decision was made
Decision Tree: Process Capability
3.0 Process capable?
Entry Criteria:
• New manufacturing process has been introduced
• Critical Process
Before considering any changes to the inspection function: Demonstrate the process is capable
3.1 Produce capable
process
Capable process is 1) Repeatable and 2) Produces the desired product
Validation of a capable process:
• Conduct a PFMEA
• Monitor the process for stability
• Conduct a pilot study
• Monitor the process for changes in the nonconformity rates
Tool Inputs: 1-5
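Monitoring nonconformity rates for stability (the last two bullets above) can be sketched numerically. The following is a minimal p-chart sketch in Python; the function names and the 3-sigma limit convention are illustrative, not part of the MAIW product.

```python
import math

def p_chart_limits(defectives, sample_sizes):
    """3-sigma control limits for the fraction nonconforming (p-chart).

    defectives[i] nonconforming units were found in a sample of
    sample_sizes[i] units.  Returns the center line p-bar and a
    per-sample (LCL, UCL) pair.
    """
    p_bar = sum(defectives) / sum(sample_sizes)  # center line
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma))
    return p_bar, limits

def stable(defectives, sample_sizes):
    """True if every sample's fraction nonconforming is inside its limits."""
    _, limits = p_chart_limits(defectives, sample_sizes)
    return all(lcl <= d / n <= ucl
               for d, n, (lcl, ucl) in zip(defectives, sample_sizes, limits))
```

For example, five samples of 100 units with 1-3 defects each plot as stable; if one sample instead had 30 defects, the process would flag as unstable and would not yet be a candidate for reduced inspection.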
Decision Tree: Effective and/or New Technology
4.0 Inspection
effective?
Effective inspection: Successful in finding errors (Inspection, Manufacturing, etc.)
Determined by: Reviewing nonconformity and inspector escape data
Manufacturing process change: Is the inspection function still effective? Changes in the
manufacturing process may require new inspection tools or techniques
New Technology Introduction: Does the historical inspection method meet the demands of the new
production technology?
4.1 Perform RCCA: Inspection function not effective; perform root cause corrective action
Utilize tools / approaches to identify required changes:
• Gage R&R
• Lessons learned
• Inspection escape review
4.2 Use new technology?
If new technology is to be introduced, then a longer-term cost and risk study should be performed
and evaluated in step 5.0
4.3 Modify
conditions to
create effective
inspection
Entry Criteria: no significant technology changes
Improve through:
• Training
• Inspection system improvement
• Sharing of best practices and lessons learned
Tool Inputs: 6 - 8
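A Gage R&R study (listed above) separates measurement variation into repeatability and reproducibility. Below is a simplified variance-components sketch for illustration only; a full AIAG-style study adds correction factors and separates part variation, so treat the numbers as indicative.

```python
from statistics import mean, pvariance

def gage_rr_percent(data):
    """Simplified %GRR from {operator: {part: [repeat measurements]}}.

    Repeatability: pooled variance of repeated trials within each
    (operator, part) cell.  Reproducibility: variance between operator
    averages.  %GRR compares combined gage variation to total variation.
    """
    cells = [trials for parts in data.values() for trials in parts.values()]
    repeatability = mean(pvariance(c) for c in cells)
    op_means = [mean([m for c in parts.values() for m in c])
                for parts in data.values()]
    reproducibility = pvariance(op_means)
    total = pvariance([m for c in cells for m in c])
    return 100.0 * ((repeatability + reproducibility) / total) ** 0.5
```

By the usual rule of thumb, %GRR under roughly 10% indicates an acceptable measurement system, while over roughly 30% means the inspection data cannot support a reduction decision.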
Decision Tree: Implementation Decision
5.0 Cost study: p < k1/k2?
Entry Criteria: Manufacturing process is capable; inspection is effective
Utilize cost analysis techniques to identify the crossover point of inspection vs. repair
Deming rule (example of one method):
k1 = cost to inspect one unit; k2 = cost to repair a defect that escapes downstream; p = incoming fraction defective
p < k1/k2: inspection costs more than the repairs it prevents (reduce inspection)
p = k1/k2: decision driven by outside forces
p > k1/k2: better to inspect than repair
5.1 Reduce inspection: Probability of error sufficiently low; reduce inspection
Review data (internal, customer, field, etc.) to ensure the correct decision was made
5.2 Maintain inspection: Probability of error unacceptably high; reduction and/or change not advised
5.3 End: Conclusion of the inspection evaluation
Tool Inputs: 9 - 10
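The cost gate above can be written out directly. A minimal sketch of the Deming kp rule as commonly stated (function name and return strings are illustrative):

```python
def deming_inspection_decision(p, k1, k2):
    """Deming kp rule for all-or-none inspection.

    p  : average incoming fraction defective
    k1 : cost to inspect one unit
    k2 : downstream cost when a defective unit escapes inspection
    """
    break_even = k1 / k2
    if p < break_even:
        # Expected escape cost p*k2 is below the inspection cost k1.
        return "reduce/skip inspection"
    if p > break_even:
        # Escapes cost more than inspection: keep inspecting.
        return "inspect"
    return "borderline: decision driven by outside forces"
```

For example, with $1 to inspect a unit and $200 to repair an escape, the break-even defect rate is 0.5%: a process running at 0.1% defective would justify reduced inspection, while one at 2% would not.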
Tool Design

Analyses (fixed by tool; answers produce weighted results):
1. Do the results of a PFMEA show potential for improved quality?
2. Is the process qualified and capable?
3. Does the first article indicate less inspection is required?
4. Does the current process have a low level of nonconformities?
5. Does the proposed process output rate affect inspection capabilities?
6. Was a gage R&R performed with personnel performing the inspection function?
7. Will the improved inspection process increase the ability to find nonconformities?
8. Will the process change reduce inspection escapes?
9. Has a cost analysis been performed (p < k1/k2, see Appendix B)?
10. Will the customer allow the change?

Justification weight (user modifiable):
• Manufacturing Process Change
• Inspection Process Change
• Management or Customer Input

Return (user modifiable):
1. Does not justify removal of inspection process
2. Additional data required before decision can be made
3. Data justifies capabilities study for process modification
4. Justifies modification of inspection process
5. Justifies removal of inspection process

Investment (user modifiable):
1. Low Effort (easy or completed, limited personnel, <3 months)
2. Between Low and Medium
3. Medium Effort (hurdles, somewhat difficult, >6 months)
4. Between Medium and High
5. High Effort (complex, many people, >1 yr)
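The weighted scoring implied above can be reconstructed as a short sketch. The mapping of each selection to a 0-4 value (list rank minus one) and the normalization by the maximum value are inferred from the worked example later in the deck (weighted ROI 2.75 = 69%, weighted investment 2.4 = 60%); function and variable names are ours, not the tool's.

```python
def weighted_score(weights, values, max_value=4):
    """Weighted sum of per-study values plus its fraction of maximum."""
    if abs(sum(weights) - 1.0) > 1e-9:   # built-in error check
        raise ValueError("weights must total 100%")
    score = sum(w * v for w, v in zip(weights, values))
    return score, score / max_value

# Weights and selections from the worked example: ROI mostly
# "reduce inspectors 75%" (= 3); investment "high cost/effort" (= 4), etc.
weights    = [0.10, 0.10, 0.05, 0.10, 0.05, 0.10, 0.10, 0.10, 0.20, 0.10]
roi_values = [3, 3, 3, 3, 0, 3, 4, 3, 2, 3]
inv_values = [0, 4, 3, 1, 3, 4, 4, 4, 2, 0]
```

Under these assumptions, `weighted_score(weights, roi_values)` reproduces the example's weighted ROI of 2.75 (69%) and `weighted_score(weights, inv_values)` its weighted investment of 2.4 (60%), to floating-point precision.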
Studies Included in Tool and Weights

Process Step | Tool Line | Study Name | Study Type | Manufacturing Process Change | Inspection Process Change | Management or Stakeholder Input Change
3 | 1 | Process Failure Modes and Effects Analysis (PFMEA) | Manufacturing | 10% | 10% | 10%
3 | 2 | Process Qualification & Capability | Manufacturing | 10% | 3% | 3%
3 | 3 | Pilot | Manufacturing | 5% | 3% | 3%
3 | 4 | Nonconformity Rate | Manufacturing | 10% | 3% | 3%
3 | 5 | Production Rate | Manufacturing | 5% | 3% | 3%
4 | 6 | Gage R&R | Inspection | 10% | 15% | 8%
4 | 7 | Lessons Learned | Inspection | 10% | 10% | 10%
4 | 8 | Inspection Escapes | Inspection | 10% | 20% | 20%
5 | 9 | Deming's Cost Analysis | Cost or Risk | 20% | 30% | 30%
5 | 10 | Stakeholder's Reaction to Change | Stakeholder Input | 10% | 3% | 10%

Team-agreed targets; not independently verified.
Example of Tool Output

Process Being Reviewed: Printed wiring board, paste application process
Scope: Flight hardware
Nature of change: Stencil approach replaced by solder paste printing machine (change due to introduction of new manufacturing equipment)
Conclusion: Upgrading the stencil process improves quality and inspection could be reduced; implementation is expensive, though.
Quadrant: STRATEGIC (Pick Chart of Return on Investment vs. Investment)

# | Study to be conducted | Justification | Weight | ROI rank (savings) | Weighted ROI | Investment rank (cost) | Weighted investment
1 | Do the results of a PFMEA show potential for reducing inspectors? Decision making should be based on risk priority numbers or an equivalent approach. | This activity was completed in September 2014. Considerable increase in quality with less rework than use of stencil. A reduced inspection staff is likely. | 10% | Reduce inspectors 75% | 0.3 | No cost/effort | 0
2 | Is the new process qualified and capable? Based on what is known about the new process, can inspectors be reduced? Conclusions should be based on data. | This effort is underway. | 10% | Reduce inspectors 75% | 0.3 | High cost/effort | 0.4
3 | Does the pilot or proof of concept indicate inspectors can be reduced? | This effort is underway. Suspect it will suggest less inspection. | 5% | Reduce inspectors 75% | 0.15 | Medium-high cost/effort | 0.15
4 | Will the new process significantly reduce nonconformities and thus reduce the number of inspectors needed? | Manufacturer data indicates significant improvement over the stencil approach. We should conduct studies on our particular type of hardware. | 10% | Reduce inspectors 75% | 0.3 | Low cost/effort | 0.1
5 | Will the new process total output rate or output variability affect the number of inspectors needed? Are any inspector bottlenecks predicted? | Studies indicate a doubling in output rate. Inspection per unit may not change significantly. Will need to change product flow to reduce inspection bottlenecks. | 5% | Do not reduce inspectors | 0 | Medium-high cost/effort | 0.15
6 | Will the process change inspector reproducibility and repeatability, or the equipment precision-to-tolerance ratio, causing a change in inspector numbers? (See Gage R&R definition.) | This effort needs to be conducted. Suspect inspection variability would go down as fewer items need to be checked. | 10% | Reduce inspectors 75% | 0.3 | High cost/effort | 0.4
7 | Are the historical reasons for the inspector staffing levels still valid? Do lessons learned indicate a change in inspector levels is warranted? | This effort needs to be conducted, but historical reasons are no longer applicable in most cases. | 10% | Remove inspectors | 0.4 | High cost/effort | 0.4
8 | Will the new process help reduce inspector escapes and alter the number of inspectors required? | This needs to be determined. | 10% | Reduce inspectors 75% | 0.3 | High cost/effort | 0.4
9 | Does the cost or risk to repair defects escaped from inspection justify reducing inspectors? (See Appendix B for the Deming rule discussion.) | This needs to be determined. Suspect the process will not affect this metric. | 20% | Reduce inspectors 50% | 0.4 | Medium cost/effort | 0.4
10 | Will the stakeholder allow the change? Take into account any stakeholder-mandated inspections. | Held customer meeting June 2014. Customer is on board with the change. | 10% | Reduce inspectors 75% | 0.3 | No cost/effort | 0

Total weighting sum: 100%. Weighted ROI: 2.75 (69%). Weighted investment: 2.4 (60%). The weighted score falls in the STRATEGIC quadrant.

Notes:
• Weight cells shaded in red indicate a change from the pre-set weight and may indicate results could be incorrect.
• If an analysis is not to be done, the suggested ranks are ROI = "Do not reduce inspectors" and Investment = "High cost/effort".
Tool Input: What Process is being evaluated?
Populated by tool output
Tool Input: Analysis and Value Capture
Study Description
and proposed
questions study
should address
User Entered
description of analysis
performed and
synopsis of results
ROI Selection
5 Potential values
Effort Selection
5 Potential values
Weights used by tool (if user-modified, shown in red)
Tool Input: Scoring
Ensuring weights total 100%: built-in error checking
ROI and Investment scoring automated based on inputs
Automated plotting of proposed process change
occurs as values are entered
• Perform sensitivity analyses
• Save and re-evaluate as other process changes
impacting inputs are performed
• Rank multiple efforts on same plot to evaluate
budgetary investments
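The plotted point's quadrant can be sketched as a simple threshold lookup. The 50% split and the assignment of the Backlog and Forget It quadrants follow the conventional PICK-chart layout and are our assumption; only the Strategic placement of the worked example (return 69%, investment 60%) is shown in the deck.

```python
def pick_quadrant(return_pct, investment_pct):
    """Classify a (return, investment) point, both on a 0.0-1.0 scale."""
    high_return = return_pct > 0.5
    high_investment = investment_pct > 0.5
    if high_return and not high_investment:
        return "JUST DO IT"       # big payoff, cheap to implement
    if high_return and high_investment:
        return "STRATEGIC"        # big payoff, major investment
    if not high_return and not high_investment:
        return "BACKLOG"          # small payoff, cheap: revisit later
    return "FORGET IT"            # small payoff, expensive
```

Under these assumptions, the worked example at (0.69, 0.60) lands in the STRATEGIC quadrant.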
Target Audience and Intended Product Use
• Target Audience
– Quality organizations tasked with verification of requirements by
inspection
– Manufacturing organizations pursuing new technology
– Customers and management seeking ways to reduce unnecessary
costs
• How Used
– Best applied when change to process is first considered
– Useful when many trades are possible
• Provides best indication of tradeoffs resulting from a proposed
process change
Example: ICT via Flying Head Probe
Manufacturing Process Change
Inspection Process Change
Management or Customer Input
• Shift inspection of PWA from manual
inspection to flying head automated
probe
– Introduction of new inspection
technology
– Reduced false errors from manual
inspection
– Time study of the same board shows
significant time reduction
– Output of machine lists part non-
conformities
– Manual inspection covers 10-20% of parts not covered by the machine
– Verified with customer and other
stakeholders
Example: ICT via Flying Probe
Critical Process: Yes
Capable Process: Yes
Effective and Efficient: Yes
Cost, Risk, Customer: Yes
In-Circuit Test via Flying Head Probe: Analyses Performed
Critical Process
• Reviewed historical inspection process output
• Reviewed customer requirements
• Identified potential tool suppliers
• Performed risk analysis against existing processes
• Study of cost vs. CAPEX vs. inspection performance completed
Process Capability
• Reviewed supplier tool sets
• Performed bench test using EDU boards
• Verified results against existing inspection method
• Identified process accuracy and repeatability issues
• Compared results to risk and cost analyses
In-Circuit Test via Flying Head Probe: Analyses Performed
Effective Inspection
• Test board coverage and issues reviewed
• Identified requirements against typical part usage
• Identified part types and applications where ICT not able to capture all issues
ROI
• Performed study for purchasing unit vs. outsourcing
• Identified multiple suppliers and reviewed capabilities against requirements
Analysis Results into Tool

Analysis Category | Entries in tool | Manufacturing Process Change | Inspection Process Change | Management or Customer Input
Manufacturing | Lines 1-5 | 40% | 22% | 22%
Inspection | Lines 6-8 | 30% | 45% | 38%
Cost and Customer | Lines 9-10 | 30% | 33% | 40%

Result quadrant: Strategic

[Figure: Pick Chart of Return (low to high) vs. Investment (low to high), with Backlog, Just Do It, Strategic, and Forget It quadrants]
Additional Examples in Product
Torque Witness by Inspection Personnel
Forget It Just Do It
Evaluating whether or not to eliminate
Inspection witness of 'Torque" operations
Test to flight (class 2) electrical mates
Elimination of a secondary inspection (by
QA) for test to flight connector mates
Additional Examples in Product
• Backlog: Receiving Inspection of subcontracted products (QSI-1002), evaluating reduction in duplicative inspection efforts upon receipt for items that are Final Source Inspected

Examples of Each Potential Outcome
[Figure: Pick Chart of Return (low to high) vs. Investment (low to high), with Backlog, Just Do It, Strategic, and Forget It quadrants]
Don't Lose Sight of the Forest for the Trees
• Evaluate low-level process changes against mission risk and success
– Understand the impact on Ps and/or whether the risk is acceptable
• Control uncertainty at the mission level through a build-up of risk inputs
– Inspection and new-technology build-ups through the entire product structure
can result in unquantified residual risk