Validating Dependability and Safety Requirements
Stephen B. Driskell, NASA IV&V Facility
Man-Tak Shing, Naval Postgraduate School
Third NASA IV&V Workshop on Validation and Verification, September 16, 2009
Transcript
Page 1

Validating Dependability and Safety Requirements

Stephen B. Driskell, NASA IV&V Facility

Man-Tak Shing, Naval Postgraduate School

Page 2

Outline

- Dependability Requirements
- Defining System Qualities
  - Utility Tree
  - Quality Attribute Scenarios
- Validating Dependability Requirements
  - Expected Outcomes
  - Dependability Modeling and Validation Example – JWST
- Validation Metric Framework for System Software Safety Requirements
- Conclusions

Page 3

Acknowledgements

Dr. Butch S. Caffell, Visionary
Marcus S. Fisher, Chief Engineer

Dependability Tiger Team members: Dr. Man-Tak Shing, Dr. James Bret Michael, Kenneth A. Costello, Jeffrey R. Northey, Stephen B. Driskell

SRMV Example Contributors: Dr. Khalid Latif, Jacob Cox, Dr. Karl Frank


Page 4

Dependability Requirements (1/3)

Defined by a set of system quality attributes:
- Availability
- Reliability
- Safety
- Security (confidentiality and integrity)
- Maintainability

Page 5

Dependability Requirements (2/3)

They are modifiers of the functional requirements:
- Cross-cutting: they affect many different functions of a system
- Architectural drivers: they have substantial influence on the choice among architectural alternatives, and they form the basis for prioritizing the design goals that guide design tradeoffs

Page 6

Dependability Requirements (3/3)

Attributes are system characteristics defined in the context of the operating environment:
- Some can be measured directly (e.g., performance, availability)
- Others (e.g., safety and security) are expressed in terms of quantifiable proxy attributes

All attributes must be traced from requirements to model to implementation in order to accomplish IV&V analysis and testing.

Page 7

Two types of Quality Attribute Requirements

Requirements that define quality attributes and how and when to measure them

E.g.: The system shall measure and report ‘reservation completion time’, starting from the display of the first flight query screen and ending with the display of the screen giving the reservation confirmation number

Requirements that specify what values of the quality attribute measure indicate sufficient quality

E.g.: The ‘reservation completion time’ for an experienced tester on a lightly loaded system making a Type 3 reservation shall be less than two minutes

Page 8

Defining Dependability Requirements via Utility Trees

A top-down vehicle for characterizing the quality attribute requirements

Structure: quality goals → quality attributes → scenarios, annotated with prioritization and difficulty ratings.
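As an illustration only, a utility tree maps naturally onto a small recursive record type. This is a hypothetical Python sketch (the presentation contains no code), and the node names and H/M/L ratings below are invented, not taken from any NASA project:

```python
from dataclasses import dataclass, field

@dataclass
class UtilityNode:
    name: str                # a quality goal, attribute, or concrete scenario
    importance: str = ""     # stakeholder priority, e.g. "H", "M", "L"
    difficulty: str = ""     # architect's difficulty estimate, e.g. "H", "M", "L"
    children: list = field(default_factory=list)

# Root -> quality goals -> quality attribute scenarios (leaves carry ratings)
utility = UtilityNode("Utility", children=[
    UtilityNode("Availability", children=[
        UtilityNode("Recover from loss of downlink within one pass",
                    importance="H", difficulty="M"),
    ]),
    UtilityNode("Safety", children=[
        UtilityNode("Transition to safe state on a Level 1 fault",
                    importance="H", difficulty="H"),
    ]),
])
```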

Page 9

Quality Attribute Scenario (1/4)

Textual description of how a system responds to a stimulus

Made up of:
- A stimulus
- A stimulus source
- An artifact being stimulated
- An environment in which the stimulus occurs
- A response to the stimulus
- A response measure (to quantitatively define a satisfactory response)
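To make the six parts concrete, here is a minimal sketch of a scenario as a record type, in hypothetical Python (the presentation itself shows no code); the configurability example on the next slide would populate each field:

```python
from dataclasses import dataclass

@dataclass
class QualityAttributeScenario:
    stimulus: str          # the condition or event arriving at the system
    source: str            # who or what generated the stimulus
    artifact: str          # the part of the system being stimulated
    environment: str       # conditions under which the stimulus occurs
    response: str          # the activity the system performs
    response_measure: str  # quantitative definition of a satisfactory response
```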

Page 10

Quality Attribute Scenario (2/4)

Example – Configurability
- Stimulus: request to support a new type of sensor
- Stimulus source: the customer
- Artifact: the system and the customer support organization
- Environment: after the software has been installed and activated
- Response: the customer support engineer reconfigures the system to support the new sensor
- Response measure: no new source code, no extraordinary downtime, and commencing operation within one calendar week

Page 11

Quality Attribute Scenario (3/4)

Quality measures: the amount of new source code, the amount of downtime, and the amount of calendar time to bring a new sensor online.

Quality requirements:
- Zero new source code
- No extra downtime (need to define the meaning of "extra")
- Less than one calendar week to bring a new sensor online

Page 12

Quality Attribute Scenario (4/4)

Implications:
- Zero new source code => a limit on the range of new sensors
- No extra downtime => reconfiguration without shutting down the system, performed by an expert from the installation organization (instead of the customer organization)

Page 13

Quality Attribute Scenario Types

- Normal operations: these are the most obvious scenarios
- System-as-object scenarios: treat the system as a passive object being manipulated by an actor (e.g., programmer, installer, administrator)
- Growth scenarios: deal with likely or plausible future changes to the requirements (e.g., a 50% increase in a capacity requirement); they help future-proof the system under development
- Exploratory scenarios: improbable scenarios, useful for stimulating thinking about the implicit assumptions underpinning the architecture (e.g., loss of power from an uninterruptible power supply)

Page 14

Quality Attribute Scenario vs. Use Case Scenarios

Use case scenarios focus on the functional behaviors (normal operation and exceptions) of the system

Quality attribute scenarios focus on satisfaction of a response measure

Need to establish trace links between quality attribute scenarios and the use cases/use case scenarios they correspond to

Page 15

Validating Dependability Requirements (1/2)

Sufficiency of the requirements:
- Do the dependability requirements adequately describe the desired attributes of a system that meets the operational needs?
- Are the dependability requirements correctly prioritized?
- Are the dependability requirements verifiable, or are they traceable to derived system requirements that are verifiable?

Page 16

Validating Dependability Requirements (2/2)

Expected outcomes:
- Evidence of correct IV&V understanding and prioritization of the quality attributes via the SRMs
- Evidence of properly specified system capabilities to achieve the required quality attributes
- Dependability and safety requirements clearly defined in terms of dependability/safety cases
- Evidence formally captured in Validation Reports

Page 17

Dependability Modeling Example - JWST (1/2)

- Focus on availability and safety
- NASA IV&V uses capabilities and limitations, along with Portfolio Based Risk Assessment, to prioritize behaviors for model development
- Availability
  - Ensures that the modeled system has preventative behaviors to avoid the "undesired behaviors" (Butch Q2)
  - Models the responsive behaviors to handle "adverse conditions" (Butch Q3)
- Safety
  - Models the safety criteria by generating model elements that focus on fault detection, fault response, and fault recovery
  - Focuses on fault response in terms of fail-safing for high-severity (Level 1) faults or fault isolation for low-severity (Level 3) faults

Page 18

Dependability Modeling Example - JWST (2/2)

Fault management elements include use cases and activity diagrams for:
- Perform Onboard Fault Management (POBFM)
- Perform Ground Based Fault Management for Observatory Faults (PGBFMO)
- Perform Ground Based Fault Management for Ground Faults (PGBFMG)
- Recover from Loss of Uplink
- Recover from Loss of Downlink

Page 19

[Figure – Use Case, JWST SRM: Mission Level UC diagram. Actors Ground and Observatory are associated, via «include» relationships, with fault management and recovery use cases (Perform Onboard Fault Management; Perform Ground Based Fault Management for Observatory Faults; Perform Ground Based Fault Management for Ground Faults; Recover from Loss of Uplink; Recover from Loss of Downlink) and with mission operations use cases such as Launch System, Perform Launch Operations, Perform Deployment & Trajectory Correction, Deploy Sunshield / Solar Array & Radiator Shield / HGA / OTE, Perform Cruise & Commissioning, Perform WFS&C Commissioning, Perform Calibration & Final Checkout, Establish Contact with Observatory, Perform Real Time Operations, Perform SSR Playback, Upload SCS / OP / Memory File, Evaluate SOH, Recondition Battery, Perform Science Operations Planning, Perform Event Driven Ops, Perform Trajectory Correction Maneuvers, and Perform Post-Pass Operations.]

Page 20

Activity Diagram, JWST – NIRSpec DS test limitations related to the command processing behavior "Availability"

(TIM-2419) Test script CX6 is not testing the invalid parameter values for the dwell command. This may result in unexpected results from the command execution, which in turn may provide an incorrect set of telemetry or, in the worst case, crash the NIRSpec DS, impacting the JWST science mission.

Page 21

Dependability Validation Example – JWST (1/4)

Dependability criteria: Recoverability, Robustness, Consistency, Correctness

Safety criteria: Fault Avoidance, Fault Warning, Fault Correction, Fault Tolerance, Fail Operational, Fail Safe

Page 22

Dependability Validation Example – JWST (2/4)

- Recoverability validation: by correlating the requirements that allow ground controllers or onboard systems to recover from a failure; most JWST observatory recovery is performed by ground commands
- Robustness validation: by correlating the safety behaviors to the requirements that allow the onboard systems to detect and respond to faults
- Consistency validation: by ensuring that the multiple requirements for a given behavior are consistent within the SRD (System Requirements Document), and that the constraints for a given behavior correlate with the requirements for preventative and responsive behaviors
- Correctness validation: by ensuring that the requirements fully specify the behaviors modeled in the SRM

Page 23

Dependability Validation Example – JWST (3/4)

- Fault Avoidance: the SRM includes the "preventative behaviors"; the requirements correlated to these behaviors allow JWST SRMV to validate fault avoidance
- Fault Warning: the SRM includes the desired behaviors for the "detection of faults"; the requirements correlated to these behaviors allow JWST SRMV to validate fault warning
- Fault Correction: the SRM includes the "responsive behaviors" that model how the JWST observatory will self-correct, such as resetting a faulty component in order to continue science operations; the requirements correlated to these behaviors allow JWST SRMV to validate fault correction

Page 24

Dependability Validation Example – JWST (4/4)

- Fault Tolerance: the SRM includes the "responsive behaviors" that model how JWST will select alternative paths in response to a fault or adverse condition; the requirements correlated to these behaviors allow JWST SRMV to validate fault tolerance
- Fail Operational: the SRM includes the "responsive behaviors" that model how JWST will respond to a single fault (such as a Level 3 fault with a science instrument) and still stay operational, continuing to operate with the remaining instruments; the requirements correlated to these behaviors allow JWST SRMV to validate fail-operational behavior
- Fail Safe: the SRM includes the "responsive behaviors" that model how JWST will respond to a Level 1 or Level 2 fault and transition to a safe state; the requirements correlated to these behaviors allow JWST SRMV to validate fail-safe behavior

Page 25

Validating Software Safety Requirements

Cannot be validated in the traditional way of matching stakeholder requirements and expectations ("the system behaves safely") to system behavior.

Need to develop a safety case:
- Sufficiency of hazard identification
- Adequacy of hazard analysis to identify software's role in causing these hazards
- Traceability from the derived software safety requirements to the identified system hazards
- Indication of the completeness of the set of software safety requirements

Page 26

A Metric Framework for Software Safety Requirements Validation

Utilize the Goal-Question-Metric (GQM) approach:
- Measurement goals
- Questions to characterize the object of measurement
- Metrics to answer the questions in a quantitative way

Augment with Goal Structuring Notation (GSN) to highlight the context, justification, assumptions, strategies, and solutions.

Page 27

Validation Metric Framework (cont'd)

[GSN diagram, reconstructed as text:]
- FG: Validate system safety through safety metrics
- Contexts:
  - C1: Safety engineering team point of view
  - C2: Validating software safety by proxy
  - C3: Products of the software safety process
- Strategies:
  - S1: Measure hazard identification sufficiency
  - S2: Measure hazard analysis sufficiency
  - S3: Measure software safety requirement traceability
- Goals:
  - G1: The number of software hazards identified is sufficient
  - G2: Depth of analysis is measured
  - G3: Software safety requirements are measured
  - G4: Software hazards are sufficiently mitigated
  - G5: Traceability of software safety requirements is maintained

Page 28

Validation Metric Framework (cont'd)

[GSN diagram, continued:]
- G1 → Q1: Are a sufficient number of software hazards identified? → M1: Percent Software Hazards
- G2 → Q2: Are second- or third-order software causal factors appropriately identified? → M2: Software Hazard Analysis Depth
- C4: Risk rating obtained through the Software Hazard Criticality Matrix

Page 29

Validation Metric Framework (cont’d)

[GSN diagram, continued:]
- G3 → Q3: Are there a sufficient number of safety requirements? → M3: Percent Software Safety Requirements
- G4 → Q4: Are all hazards addressed through requirements? → M4: % High Risk Software Hazards with Safety Requirements; M5: % Medium Risk Software Hazards with Safety Requirements; M6: % Moderate Risk Software Hazards with Safety Requirements
- G5 → Q5: Are all Software Safety Requirements traceable? → M7: Percent Software Safety Requirements Traceable to Hazards
- C5: Risk context as per the SHCM

Page 30

Sample Application (1/9)

Demonstrate the application of the framework with a fictitious safety-critical, software-intensive Rapid Action Surface-to-Air Missile (RASAM) system

EPSH = 49.1%, σ = 9.1%

System     | # Sys Hazards | # S/W Hazards | PSH
RASAM v0.5 | 145           | 70            | 48.3%
ISAM       | 110           | 50            | 45.5%
SAMDS      | 95            | 34            | 35.8%
AAMS       | 165           | 105           | 63.6%
RAAMS      | 198           | 104           | 52.5%
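The quoted baseline can be reproduced directly from the table; a minimal sketch in hypothetical Python (assuming σ is the population standard deviation, which matches the quoted 9.1%):

```python
import statistics

# PSH (percent software hazards) for the five historical systems above
psh_baseline = {
    "RASAM v0.5": 70 / 145,
    "ISAM":       50 / 110,
    "SAMDS":      34 / 95,
    "AAMS":       105 / 165,
    "RAAMS":      104 / 198,
}

epsh = statistics.mean(psh_baseline.values())     # expected PSH, ~49.1%
sigma = statistics.pstdev(psh_baseline.values())  # population std dev, ~9.1%
print(f"EPSH = {epsh:.1%}, sigma = {sigma:.1%}")
```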

Page 31

Sample Application (2/9)

M1: % S/W Hazards

Sample 4 shows a reduction in PSH, with |PSH – EPSH| = |37.8 – 49.1| = 11.3% > σ.

The results of M1 indicate that further investigation is needed to determine the sufficiency of hazard identification.

Sample | # Sys Hazards | # S/W Hazards | PSH
1      | 140           | 20            | 14.3%
2      | 164           | 45            | 27.4%
3      | 171           | 67            | 39.2%
4      | 180           | 68            | 37.8%
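Continuing the sketch above, the M1 check amounts to flagging a sample whose PSH deviates from the historical baseline by more than one standard deviation:

```python
EPSH, SIGMA = 0.491, 0.091         # baseline from the five reference systems
sys_hazards, sw_hazards = 180, 68  # sample 4
psh = sw_hazards / sys_hazards     # ~37.8%
if abs(psh - EPSH) > SIGMA:        # |37.8% - 49.1%| = 11.3% > 9.1%
    print("M1: hazard identification may be insufficient; investigate")
```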

Page 32

Sample Application (3/9)

M2: Software Hazard Analysis Depth (SHAD)

Hazard    | Analysis Depth | Hazard Analysis Space (HAS)
H1 (High) | 3              | HAS_H = # High Risk Software Hazards * 3 = 6 * 3 = 18
H2        | 2              |
H3        | 3              |
H4        | 3              |
H5        | 2              |
H6        | 1              |
M1 (Med)  | 2              | HAS_M = # Medium Risk Software Hazards * 3 = 5 * 3 = 15
M2        | 3              |
M3        | 2              |
M4        | 2              |
M5        | 3              |

Page 33

Sample Application (4/9)

Sample 1 shows:
HAA_H = ∑HAA = 3 + 2 + 3 + 3 + 2 + 1 = 14
HAA_MED = ∑HAA = 2 + 3 + 2 + 2 + 3 = 12
C_H = (HAA_H / HAS_H) * 100% = (14/18) * 100% = 78%
C_MED = (HAA_MED / HAS_MED) * 100% = (12/15) * 100% = 80%
SHAD = (C_H + C_MED) / 2 = (78% + 80%) / 2 = 79%

Sample | # High Risk | # Med Risk | CH  | CM  | SHAD
1      | 6           | 5          | 78% | 80% | 79%
2      | 10          | 12         | 73% | 92% | 82.5%
3      | 15          | 20         | 67% | 83% | 75%
4      | 18          | 21         | 89% | 87% | 88%
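The SHAD computation is mechanical enough to state as code; a minimal sketch in hypothetical Python, using the per-hazard analysis depths from the previous slide:

```python
# Per-hazard analysis depth achieved (max 3 per hazard), sample 1
high_depths = [3, 2, 3, 3, 2, 1]   # H1..H6
med_depths  = [2, 3, 2, 2, 3]      # M1..M5

def coverage(depths, max_depth=3):
    """HAA / HAS: achieved analysis depth over the full analysis space."""
    return sum(depths) / (len(depths) * max_depth)

c_h = coverage(high_depths)        # 14/18, ~78%
c_m = coverage(med_depths)         # 12/15, 80%
shad = (c_h + c_m) / 2             # ~79%
print(f"CH = {c_h:.0%}, CM = {c_m:.0%}, SHAD = {shad:.0%}")
```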

Page 34

Sample Application (5/9)

M3: Percent Software Safety Requirements
EPSSR = 38.2%, σ = 6.5%

System     | # S/W Requirements | # S/W Safety Requirements | PSSR
RASAM v0.5 | 94                 | 25                        | 26.6%
ISAM       | 78                 | 31                        | 39.7%
SAMDS      | 109                | 43                        | 39.4%
AAMS       | 120                | 56                        | 46.7%
RAAMS      | 132                | 51                        | 38.6%

Page 35

Sample Application (6/9)

M3: PSSR
Sample 4 shows |PSSR – EPSSR| = |41.9 – 38.2| = 3.7% < σ.

The results indicate that a sufficient number of software safety requirements are being developed, instilling confidence that the safety requirements are indeed valid.

Sample | # S/W Reqmt | # S/W Safety Reqmt | PSSR
1      | 83          | 15                 | 18%
2      | 94          | 37                 | 39.4%
3      | 101         | 40                 | 39.6%
4      | 105         | 44                 | 41.9%

Page 36

Sample Application (7/9)

M4, M5 and M6 – Percent software hazards with safety requirements

Sample # | #SHHR-SR | #SHHR | #SHMR-SR | #SHMR | #SHMoR-SR | #SHMoR
1        | 2        | 4     | 5        | 14    | 7         | 8
2        | 6        | 9     | 12       | 18    | 14        | 14
3        | 10       | 11    | 23       | 24    | 19        | 20
4        | 12       | 12    | 24       | 24    | 20        | 22
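Reading the columns as counts of software hazards that have safety requirements (the "-SR" columns) over the total software hazards at each risk level (high, medium, moderate), which is an assumption about the abbreviations, the three metrics reduce to simple ratios; a hypothetical Python sketch:

```python
# (with_safety_reqs, total) per risk level: high, medium, moderate
samples = {
    1: ((2, 4),   (5, 14),  (7, 8)),
    2: ((6, 9),   (12, 18), (14, 14)),
    3: ((10, 11), (23, 24), (19, 20)),
    4: ((12, 12), (24, 24), (20, 22)),
}
for s, levels in samples.items():
    m4, m5, m6 = (with_sr / total for with_sr, total in levels)
    print(f"sample {s}: M4 = {m4:.0%}, M5 = {m5:.0%}, M6 = {m6:.0%}")
```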

Page 37

Sample Application (8/9)

M4, M5 and M6:
- All high- and medium-risk software hazards have associated software safety requirements.
- Some moderate-risk software hazards are not mitigated through software safety requirements.
- Further investigation is required to determine either the validity of the software hazard or the validity of the set of software safety requirements.

Page 38

Sample Application (9/9)

M7: Percent software safety requirements traceable to hazards

Sample # | # SSRTR | # SSR | M7
1        | 15      | 15    | 100%
2        | 36      | 37    | 97.3%
3        | 38      | 40    | 95%
4        | 44      | 44    | 100%

All software safety requirements are traceable to software hazards at the end of the design phase. This strengthens the case that the derived software safety requirements are valid.

Page 39

Conclusion (1/3)

- Dependability requirements are modifiers of the functional requirements, addressed by Butch's Q2 and Q3
- Focused model analysis enables IV&V improvements to dependability and safety
- Additional model elaboration and approach refinement improves the dependability and safety of the missions analyzed

Page 40

Conclusion (2/3)

- System software safety requirements cannot be validated directly; their sufficiency must be validated via proxies
- NASA safety standards recommend a "What, Why, How" ConOps approach for establishing goals and delivering products
- Validation metrics help identify potential problems early in the development lifecycle

Page 41

Conclusion (3/3)

- Additional dependability studies will continue to improve the contributions of requirements to implementation for NASA IV&V
- Development of a master glossary and dictionary for IV&V would support the approach

Page 42

Questions?

Page 43

Dependability Criteria

Availability: the probability that a system is operating correctly and is ready to perform its desired functions.

Consistency: the property that invariants always hold true in the system.

Correctness: a characteristic of the model that precisely exhibits predictable behavior at all times, as defined by the system specifications (ConOps).

Reliability: the probability that a system can successfully operate continuously under specified conditions for a specified period of time.

Robustness: a characteristic of a system that is failure- and fault-tolerant and has the ability to operate under a wide range of off-nominal operating conditions.

Safety: the control and management of potentially catastrophic outcomes and risk.

Recoverability: the ease with which a failed system can be restored to operational use.

Page 44

Safety Criteria

Fault Avoidance: requirements to avoid the occurrence of software hazards.

Fault Warning: requirements to detect conditions which could be hazardous and provide warning(s) such that appropriate corrective action can be initiated.

Fault Correction: requirements for autonomous self-correction subsequent to fault detection.

Fault Tolerance: requirements for autonomous selection of alternate paths as part of corrective actions taken.

Fail Operational: requirements such that when a single failure or error occurs, the system fails operational (and safe).

Fail Safe: requirements such that when two independent failures or errors occur, the system fails safe (but not necessarily operational).

Page 45

[Figure – JWST SRM: Mission Level UC diagram, repeated from Page 19.]

Page 46

MSL Example – Dependability of Behaviors

Dependability using Behavior and Domain Modeling

Jacob Cox, [email protected]

The models included are for illustrative purposes only.

Page 47

Objective

Show how behavior and domain models can be used to derive dependability.
- Behavioral models: activity diagrams, state diagrams, sequence diagrams
- Domain models: class diagrams containing hardware and software components

Dependability is the probability of success of a behavior.

Page 48

Simplifying Assumption

- Focus is on the Main Success Scenario, because NASA IV&V has, in general, adopted the view that extension scenarios are outside the norm.
- Decisions rarely matter: most decisions between the initial node and the main-success final node are loops.

Page 49

Concept

The dependability (probability of success) of an activity, i.e., a behavior, is the product of the success probabilities of all the actions in its Main Success Scenario.
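A minimal sketch of this rule in hypothetical Python; the action names and probabilities are invented for illustration and are not from the MSL project:

```python
from math import prod

# Success probabilities of the actions on a behavior's Main Success Scenario
mss_actions = {
    "deploy parachute":    0.999,
    "jettison heatshield": 0.998,
    "fire retro rockets":  0.995,
}

behavior_dependability = prod(mss_actions.values())  # product over the MSS
print(f"P(behavior succeeds) = {behavior_dependability:.4f}")
```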

Page 50

[Figure: "Entry to Mars" behavior, with its Main Success Scenario (MSS) highlighted.]

Page 51

Concept

The dependability of an action in an activity is the product of the success probabilities of the components involved in that action.
- These components are in the domain model.
- The action can be mapped to the main component accomplishing the action.
- The relations in the domain model capture the participating components related to the action.
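Extending the earlier sketch one level down: an action's probability is derived from the components (out of the domain model) that participate in it. Component names and figures are again illustrative only:

```python
# Success probabilities of domain-model components (illustrative values)
components = {"flight computer": 0.9999, "IMU": 0.999, "thruster": 0.99}

def action_dependability(participating):
    """Product of the success probabilities of the participating components."""
    p = 1.0
    for name in participating:
        p *= components[name]
    return p

p_action = action_dependability(["flight computer", "IMU", "thruster"])
print(f"P(action succeeds) = {p_action:.4f}")
```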

Page 52

Page 53

Questions

Page 54

Safety Critical Behaviors – Analysis Based on Behavior Modeling

Diagrams are for illustrative purposes only.

Page 55

Scope Limitation

Example based on actual practice.
- Concerns the dependability of the system with respect to its capability to perform a behavior as intended, not the dependability of a particular system component or device.
- Device dependability is often measured as mean time between failures of the device.
- The system may be dependable despite incorporating devices prone to failure.

Page 56

Objective

Show how alternative behavior views assist in determining safety and dependability.
- Examples based on different paradigms for representing behavior: activities and state machines.

Dependability is the probability of successful performance of a behavior.

Page 57

Overview

Suppose a requirement has been expressed in English as: "The crew shall have the ability to override abort requests, subject to limitations in response time relative to conditions."
- Model this first in an activity diagram
- Then consider alternative state machine elaborations
- Moral: it matters which entity changes state in response to the Override signal!

Page 58

Detailed View in Next Slide

Behavior of a crew override capability, modeled as an activity enabled by a new abort request.

Page 59

Will handling a subsequent abort request be prevented by a failure to distinguish different requests from one another?

The abortRequest object starts the process.

Page 60

Life cycle assuming individually traced abort requests

Page 61

Suppose the override capability is interpreted as a change of state of the ADL, not of the command?

Page 62

Questions

