Page 1: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

1

Independent Formal Verification of Safety-Critical Systems’ User Interfaces:

a space system case study

NASA IVV Workshop

September, 2013

Manuel Sousa1, José Creissac Campos1

Miriam Alves2 and Michael D. Harrison3

1Dept. Informática/Universidade do Minho & HASLab/INESC TEC, Portugal

2Institute of Aeronautics and Space - IAE, São José dos Campos, Brazil

3Queen Mary University of London & Newcastle University, UK

* This work is funded by the ERDF - European Regional Development Fund through the ON.2 – O Novo Norte Operational Programme, within the Best Case project (ref. N-01-07-01-24-01-26).

Page 2: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

2

foreword

[Slide graphic: Dependable device, User, Dependable system?]

The impact of users on a system is hard to anticipate

users behave in unexpected ways

users’ behaviour is changed by (adapts to) the device

users must understand the device

We have been working on approaches to consider the user during the formal verification of interactive systems

Page 3: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

3

our approach

[Slide diagram with the labels: Modelling, Model, Properties, Verification, Potential Problem, Analysis]

Page 4: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

4

systematic analysis – the process

1. Models – system as designed

2. Property patterns – good design practices

3. Verification – models against patterns

4. Traces – when verification fails (analysis, scenarios, prototyping)

5. Change models – reflecting analysis findings
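
As an illustration of step 2, property patterns are typically instantiated as CTL formulas over the attributes of the model (the notation used later in the deck). A minimal sketch of a consistency/feedback pattern, with hypothetical attribute names display.value and display.shown:

AG(display.value != display.shown -> AX(display.value = display.shown))

i.e. whenever the underlying value and the value shown to the operator disagree, every next state restores the agreement.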

Page 5: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

5

systematic analysis – the process

[Process diagram: Scenarios, Prototype Model, Traces and Prototype are linked by abstraction, deployment, verification, analysis, simulation, refinement and redesign (towards a better design); supporting techniques include expert analysis, domain knowledge, property templates, expert usability inspection, universal properties/heuristics and walkthroughs; the legend distinguishes techniques, process and information flow.]

Page 6: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

6

the IVY tool

Page 7: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

7

MAL interactors

interactor MCP
  aggregates
    dial(Altitude) via ALTDial
    dial(ClimbRate) via crDial
    dial(Velocity) via asDial
  attributes
    [vis] pitchMode: PitchModes
    [vis] ALT: boolean
  actions
    [vis] enterVS enterIAS enterAH toggleALT
    enterAC
  axioms
    [asDial.set(t)] effect(enterIAS)
    [crDial.set(t)] effect(enterVS)
    [ALTDial.set(t)] ensure_ALT_is_set
    [enterVS] pitchMode' = VERT_SPD & ALT' = ALT
    [enterIAS] pitchMode' = IAS & ALT' = ALT
    [enterAH] pitchMode' = ALT_HLD & ALT' = ALT
    [toggleALT] pitchMode' = pitchMode & ALT' = !ALT
    [enterAC] pitchMode' = ALT_CAP & !ALT'

(the axioms capture the behaviour of the interactor)
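
A property of the kind IVY checks against such a model might, for example, ask whether the ALT flag can always be cleared in a single user step. The formula below is an illustrative sketch in the CTL style used later in the deck, not a property taken from the slides (attribute qualification depends on how the model's main interactor is organised):

AG(MCP.ALT -> EX(!MCP.ALT))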

Page 8: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

8

Page 9: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

9

IAE’s PTGS

[Architecture diagram of IAE's PTGS: MN, SC, EV, PR, CR and PW subsystems; interfaces and umbilical cords to the rocket; annotations for telemetry, time synchronization, operators, batteries, operational communication and operational signalization.]

Page 10: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

10

IVY analysis of EV subsystem

The system was modelled from the operations manual

model reflects knowledge provided to the operator

properties used to express expected behaviour

A three-layered model was built (see the sketch after this list)

Each type of variable modelled as an interactor

Each screen modelled as an interactor

Navigation between screens modelled on top of that

Values displayed modelled as attributes

Buttons modelled as actions
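
To make the three layers concrete, a much-simplified MAL sketch in the style of the MCP interactor shown earlier is given below. Only the names monitTMT and BD1_A are taken from the property used later in the analysis; the interactor names tmVariable, tmScreen and main, the Values/Colours/Screens/AlarmStates types, the selectTMT action and the # comment lines are illustrative placeholders, not the actual PTGS model:

interactor tmVariable
  # layer 1: one interactor per type of monitored variable
  attributes
    [vis] value: Values
    [vis] colour: Colours
    alarmState: AlarmStates
  actions
    setValue(Values)
    ack
  axioms
    # colouring axioms along the lines of the following slides go here
    [ack] alarmState' = AlaRec & keep(value, colour)

interactor tmScreen
  # layer 2: one interactor per screen, aggregating the variables it displays
  aggregates
    tmVariable via BD1_A
  actions
    [vis] ackAll

interactor main
  # layer 3: navigation between the screens
  aggregates
    tmScreen via monitTMT
  attributes
    [vis] currentScreen: Screens
  actions
    [vis] selectTMT
  axioms
    [selectTMT] currentScreen' = TMT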

Page 11: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

11

From manual to model

How the colouring scheme works (from the operations manual):

“Blinking yellow: For a critical variable, when the current value of the variable is in non acknowledged alert (value within the alert range), there is no acknowledged alarm in the variable, and the previous criterion [non acknowledged alarm criterion] is not satisfied. If over the same critical variable an acknowledged alarm exists, then Fixed Red prevails. For a non critical variable, when the current value of the variable is in non acknowledged alarm (value within the alarm range).”

Page 12: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

12

From manual to model

How the colouring scheme works (from the operations manual):

“Blinking yellow: For a critical variable, when the current value of the variable is in non acknowledged alert (value within the alert range), there is no acknowledged alarm in the variable, and the previous criterion [non acknowledged alarm criterion] is not satisfied. If over the same critical variable an acknowledged alarm exists, then Fixed Red prevails. For a non critical variable, when the current value of the variable is in non acknowledged alarm (value within the alarm range).”

critical & ((_v >= infAlarmLim & _v < infAlertLim) | (_v <= supAlarmLim & _v > supAlertLim)) & (alarmState != AlaRec & alarmState != AlaNRec)

Page 13: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

13

From manual to model

How the colouring scheme works (from the operations manual):

“Blinking yellow: For a critical variable, when the current value of the variable is in non acknowledged alert (value within the alert range), there is no acknowledged alarm in the variable, and the previous criterion [non acknowledged alarm criterion] is not satisfied. If over the same critical variable an acknowledged alarm exists, then Fixed Red prevails. For a non critical variable, when the current value of the variable is in non acknowledged alarm (value within the alarm range).”

!critical & ((_v < infAlarmLim) | (_v > supAlarmLim))

Page 14: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

14

From manual to model

Conditions for blinking yellow (the guard) and setting blinking yellow (the effect):

[setValue(_v)] (((critical & ((_v >= infAlarmLim & _v < infAlertLim) | (_v <= supAlarmLim & _v > supAlertLim))) | (!critical & ((_v < infAlarmLim) | (_v > supAlarmLim)))) & (alarmState != AlaRec & alarmState != AlaNRec))
    -> value' = _v & colour' = yellow & error' = Lim & alertState' = AleNRec & characteristic' = Blink & keep(supAlertLim, infAlertLim, supAlarmLim, infAlarmLim, unity, critical, alarmState)

How the colouring scheme works (from the operations manual):

“Blinking yellow: For a critical variable, when the current value of the variable is in non acknowledged alert (value within the alert range), there is no acknowledged alarm in the variable, and the previous criterion [non acknowledged alarm criterion] is not satisfied. If over the same critical variable an acknowledged alarm exists, then Fixed Red prevails. For a non critical variable, when the current value of the variable is in non acknowledged alarm (value within the alarm range).”

Page 15: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

15

From manual to model

(alarmState != AlaNRec & alarmState != AlaRec) becomes (alarmState = AlaRec)

How the colouring scheme works (from the operations manual):

“Blinking yellow: For a critical variable, when the current value of the variable is in non acknowledged alert (value within the alert range), there is no acknowledged alarm in the variable, and the previous criterion [non acknowledged alarm criterion] is not satisfied. If over the same critical variable an acknowledged alarm exists, then Fixed Red prevails. For a non critical variable, when the current value of the variable is in non acknowledged alarm (value within the alarm range).”
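
One way the “Fixed Red prevails” clause of the quoted rule could itself be phrased as a property to check, in the same CTL style as the analysis slide; the attribute path is hypothetical and assumes critical and alarmState are attributes of the variable interactor:

AG((monitTMT.BD1_A.critical & monitTMT.BD1_A.alarmState = AlaRec) -> monitTMT.BD1_A.colour = red)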

Page 16: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

16

the model

Page 17: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

17

analysis

Can a variable be in alarm?

Trying to prove otherwise…

False as expected, but the counterexample highlights a situation where the variable colour is fixed red under an acknowledged alert condition – which should not be possible.

AG(monitTMT.BD1_A.colour = green -> !EX (monitTMT.BD1_A.colour = red))

(i.e. a variable shown in green never turns red in the very next state)

Page 18: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

18

analysis

The manual does not state what happens to a non-critical alert

so the model becomes non-deterministic
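
In MAL, an attribute that an axiom neither updates nor lists in keep(...) is left unconstrained in the next state, so a case the manual does not cover surfaces directly as non-determinism in the model. The fragment below is an illustrative sketch, reusing the non-critical guard from the earlier slide with an abbreviated effect; the # lines are explanatory comments only:

[setValue(_v)] !critical & ((_v < infAlarmLim) | (_v > supAlarmLim))
    -> colour' = yellow & characteristic' = Blink
    # remaining effects and keep(...) as in the full blinking-yellow axiom

# no axiom covers a non-critical value that falls only within the alert
# band (e.g. _v >= infAlarmLim & _v < infAlertLim), so on that transition
# colour' is unconstrained and the model checker may choose any value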

Page 19: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

19

conclusions/lessons learnt

It was possible to build a relevant model independently (without a deep understanding of the system) and still provide insights to the client

This particular model captures understanding of the system from the operations manual/requirements document perspective

Incomplete or inconsistent information leads to unexpected system behaviour

Computer-aided verification of user interfaces is crucial for complex, critical systems

Results can help:

improve requirements / manuals

define test cases

improve system dependability

As we add complexity to the models, verification time becomes a problem – but interesting results are possible with manageable models

Page 20: Independent Formal Verification of Safety-Critical Systems’ User Interfaces: a space system case study

20

Thank you!

[email protected]

