Algebra of Concurrent Programming


Tony Hoare

Cambridge 2011

With ideas from

• Ian Wehrman
• John Wickerson
• Stephan van Staden
• Peter O’Hearn
• Bernhard Moeller
• Georg Struth
• Rasmus Petersen
• …and others

Subject matter: designs

• variables (p, q, r) stand for computer programs, designs, specifications,…

• they all describe what happens inside/around a computer that executes a given program.

• The program itself is the most precise.
• The specification is the most abstract.
• Designs come in between.

Examples

• Postcondition:– execution ends with array A sorted

• Conditional correctness:– if execution ends, it ends with A sorted

• Precondition: – execution starts with x even

• Program: x := x+1 – the final value of x is one greater than its initial value

Examples

• Safety:– There are no buffer overflows

• Termination:– execution is finite (i.e., it always ends)

• Liveness:– no infinite internal activity (livelock)

• Fairness:– a response is always given to each request

• Probability:– the ratio of a’s to b’s tends to 1 with time

Unification

• Same laws apply to programs, designs, specifications

• Same laws apply to many forms of correctness.

• Tools based on the laws serve many purposes.
• Distinctions can be drawn later
– when the need for them is apparent

Refinement: p ⊑ q

• Everything described by p is also described by q, e.g.,
– spec p implies spec q
– prog p satisfies spec q
– prog p more determinate than prog q

• stepwise development of a spec is– spec ⊒ design ⊒ program

• stepwise analysis of a program is– program ⊑ design ⊑ spec

Various terminology for p ⊑ q

• p is: below, lesser, stronger, a lower bound, more precise, more deterministic, included in (sets), the antecedent (of =>)

• q is: above, greater, weaker, an upper bound, more abstract, more non-deterministic, containing (sets), the consequent (of a predicate)

Law: ⊑ is a partial order

• ⊑ is transitive
– p ⊑ r if p ⊑ q and q ⊑ r
– needed for stepwise development/analysis

• ⊑ is antisymmetric
– p = r if p ⊑ r and r ⊑ p
– needed for abstraction

• ⊑ is reflexive
– p ⊑ p
– for convenience

Binary operator: p ; q

• sequential composition of p and q
• each execution of p;q consists of
– all events x from an execution of p
– and all events y from an execution of q
• subject to an ordering constraint, either
– strong or weak
– interruptible or inhibited

alternative constraints on p;q

• strong sequence:
– all x from p must precede all y from q
• weak sequence:
– no y from q can precede any x from p
• interruptible:
– other threads may interfere between x and y
• separated:
– updates to private variables are protected
• all our algebraic laws will apply to each alternative

Hoare triple: {p} q {r}

• defined as p;q ⊑ r
– starting in the final state of an execution of p, q ends in the final state of some execution of r
– p and r may be arbitrary designs
• example: {..x+1 ≤ n} x := x+1 {..x ≤ n}
– where ..b (finally b) describes all executions that end in a state satisfying the single-state predicate b
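To see the definition in action, here is a minimal executable sketch (an illustrative assumption, not part of the slides): designs are read as relations between initial and final states over a tiny finite state space, ; as relational composition, and ⊑ as set inclusion. Under that reading, the example triple above checks out by brute force.

    # Minimal sketch: designs as sets of (initial, final) state pairs (illustrative only).
    from itertools import product

    STATES = [{'x': x, 'n': n} for x, n in product(range(6), repeat=2)]

    def frz(s):                       # hashable snapshot of a state
        return tuple(sorted(s.items()))

    def design(pred):                 # ..b : any initial state, final state satisfies pred
        return {(frz(s), frz(t)) for s in STATES for t in STATES if pred(t)}

    def seq(p, q):                    # p ; q : relational composition
        return {(s, u) for (s, t) in p for (t2, u) in q if t == t2}

    def triple(p, q, r):              # {p} q {r}  =def  p;q ⊑ r  (⊑ read as set inclusion)
        return seq(p, q) <= r

    # x := x+1, kept inside the finite state space
    inc_x = {(frz(s), frz({**s, 'x': s['x'] + 1})) for s in STATES if s['x'] + 1 <= 5}

    # {..x+1 ≤ n}  x := x+1  {..x ≤ n}
    print(triple(design(lambda t: t['x'] + 1 <= t['n']),
                 inc_x,
                 design(lambda t: t['x'] <= t['n'])))    # expected: True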

monotonicity

• Law (; is monotonic with respect to ⊑):
– p;q ⊑ p’;q if p ⊑ p’
– p;q ⊑ p;q’ if q ⊑ q’
– compare: addition of numbers

• Rule (of consequence):– p’ ⊑ p & {p} q {r} & r ⊑ r’ implies {p’} q {r’}

• Rule is interprovable with first law

associativity

• Law (; is associative) :– (p;q);q’ = p;(q;q’)

• Rule (sequential composition):
– {p} q {s} & {s} q’ {r} implies {p} q;q’ {r}

• half the law interprovable from rule

Unit ɛ (skip)

• a program that does nothing
• Law (ɛ is the unit of ;):
– p;ɛ = p = ɛ;p

• Rule (nullity):
– {p} ɛ {p}

• a quarter of the law is interprovable from the rule

concurrent composition: p | q

• each execution of p|q consists of
– all events x of an execution of p,
– and all events y of an execution of q

• the same laws apply to the alternatives:
– interleaving: x precedes or follows y
– true concurrency: x neither precedes nor follows y
– separation: x and y are independent

• Laws: | is associative, commutative and monotonic

Separation Logic

• Law (locality of ; with respect to |):
– (s|p) ; q ⊑ s | (p;q) (left locality)
– p ; (q|s) ⊑ (p;q) | s (right locality)

• Rule (frame) :– {p} q {r} implies {p|s} q {r|s}

• Rule interprovable with left locality

Concurrency law

• Law (; exchanges with |):
– (p|q) ; (p’|q’) ⊑ (p;p’) | (q;q’)
– like the exchange law of category theory

• Rule (| is compositional):
– {p} q {r} & {p’} q’ {r’} implies {p|p’} q|q’ {r|r’}

• Rule interprovable with the law

[Picture: p, q, p’, q’ arranged in a 2×2 grid; reading the grid by columns gives (p|q) ; (p’|q’), reading it by rows gives (p;p’) | (q;q’), and the column reading is below the row reading in the ⊑ order.]

Regular language model

• p, q, r, … are languages
– descriptions of the executions of finite state machines

• p ⊑ q is inclusion of languages
• p;q is (lifted) concatenation of strings
– i.e., {st | s ∊ p & t ∊ q}

• p|q is (lifted) interleaving of strings
• ɛ = {<>} (only the empty string)
• “c” = {<c>} (only the string “c”)
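A small Python sketch of this model (illustrative only; the helper names seq, par, interleave and refines are assumptions): languages are sets of strings, ⊑ is set inclusion, ; is lifted concatenation, | is lifted interleaving, and ɛ contains only the empty string.

    # Sketch of the regular language model (illustrative assumption).
    def seq(p, q):                        # p ; q : lifted concatenation of strings
        return {s + t for s in p for t in q}

    def interleave(s, t):                 # all interleavings of two strings
        if not s:
            return {t}
        if not t:
            return {s}
        return ({s[0] + r for r in interleave(s[1:], t)} |
                {t[0] + r for r in interleave(s, t[1:])})

    def par(p, q):                        # p | q : lifted interleaving
        return {r for s in p for t in q for r in interleave(s, t)}

    def refines(p, q):                    # p ⊑ q : inclusion of languages
        return p <= q

    SKIP = {''}                           # ɛ : only the empty string

    print(seq({'a'}, {'b', 'c'}))                 # {'ab', 'ac'}
    print(par({'ab'}, {'c'}))                     # {'abc', 'acb', 'cab'}
    print(refines(seq(SKIP, {'ab'}), {'ab'}))     # ɛ is a unit of ; : True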

Left locality

• Theorem: (s|p) ; q ⊑ s | (p;q)
• Proof:
– in the lhs: s interleaves with just p, and all of q comes at the end
– in the rhs: s interleaves with all of p;q
– so the lhs is a special case of the rhs
– example: p s s ; q q q ⊑ p s q s q q

Exchange

• Theorem: (p|q) ; (p’|q’) ⊑ (p;p’) | (q;q’)
– in the lhs: all of p and q comes before all of p’ and q’
– in the rhs: the end of p may interleave with q’, or the start of p’ with q
– so the lhs is a special case of the rhs
– example: p q p ; q’ p’ q’ ⊑ p q q’ p p’ q’
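Both theorems can be checked by brute force in the same string model; the sketch below repeats the definitions so it runs on its own (again an illustrative assumption, with example languages chosen arbitrarily).

    # Checks of left locality and exchange in the string model (illustrative sketch).
    def seq(p, q):
        return {s + t for s in p for t in q}

    def interleave(s, t):
        if not s:
            return {t}
        if not t:
            return {s}
        return ({s[0] + r for r in interleave(s[1:], t)} |
                {t[0] + r for r in interleave(s, t[1:])})

    def par(p, q):
        return {r for s in p for t in q for r in interleave(s, t)}

    p, q, s = {'p', 'pp'}, {'q'}, {'s'}
    p2, q2 = {'P'}, {'Q'}

    # left locality: (s|p) ; q  ⊑  s | (p;q)
    print(seq(par(s, p), q) <= par(s, seq(p, q)))                      # expected: True

    # exchange: (p|q) ; (p2|q2)  ⊑  (p;p2) | (q;q2)
    print(seq(par(p, q), par(p2, q2)) <= par(seq(p, p2), seq(q, q2)))  # expected: True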

Conclusion

• regular expressions satisfy all our laws for ⊑ , ; , and |

• and for other operators introduced later

Part 2. More Program Control Structures

• Non-determinism, intersection
• Iteration, recursion, fixed points
• Subroutines, contracts, transactions
• Basic commands

Subject matter

• variables (p, q, r) stand for programs, designs, specifications,…

• they are all descriptions of what happens inside and around a computer that is executing a program.

• the differences between programs and specs are often defined from their syntax.

Specification syntax includes

• disjunction (or, ⊔) to express abstraction, or to keep options open

– ‘it may be painted green or blue’
• conjunction (and, ⊓) combines requirements
– ‘it must be cheaper than x and faster than y’

• negation (not) for safety and security– it must not explode

• implication (contracts)– if the user observes the protocol, so will the system

Program syntax excludes

• disjunction– non-deterministic programs difficult to test

• conjunction– inefficient to find a computation satisfying both

• negation– incomputable

• implication– which side of contract?

programs include

• sequential composition (;)
• concurrent composition (|)
• interrupts
• iteration, recursion
• contracts (declarations)
• transactions
• assignments, inputs, outputs, jumps, …

• So include these in our specifications!

Bottom ⊥

• an unimplementable specification
– like the predicate false
• a program that has no execution
– the compiler stops it from running
• define ⊥ as the least solution of _ ⊑ _
• Theorem: ⊥ ⊑ r
– ⊥ satisfies every spec,
– but cannot be run (Dijkstra’s miracle)

Algebra of ⊥

• Law (⊥ is the zero of ;):
– ⊥ ; p = ⊥ = p ; ⊥

• Theorem: {p} ⊥ {q}
• a quarter of the law is provable from the theorem

Top ⊤

• a vacuous specification,
– satisfied by anything,
– like the predicate true

• a program with an error
– for which the programmer is responsible
– e.g., subscript error, violation of contract, …
• define ⊤ as the greatest solution of _ ⊑ _

Algebra of ⊤

• Law: none
• Theorem: none
– you can’t prove anything of a program with this error
– it might admit a virus!

• A debugging implementation may supply useful laws for ⊤

Non-determinism (or): p ⊔ q

• describes all executions that either satisfy p or satisfy q
• the choice is not (yet) determined
• it may be determined later
– in development of the design
– or in writing the program
– or by the compiler
– or even at run time

lub (join): ⊔

• Define p⊔q as the least solution of p ⊑ _ & q ⊑ _
• Theorem:
– p ⊑ r & q ⊑ r iff p⊔q ⊑ r

• Theorem:
– ⊔ is associative, commutative, monotonic, idempotent and increasing
– it has unit ⊥ and zero ⊤

glb (meet): ⊓

• Define p⊓q as the greatest solution of _ ⊑ p & _ ⊑ q

Distribution

• Law (; distributes through ⊔):
– p ; (q⊔q’) = p;q ⊔ p;q’
– (q⊔q’) ; p = q;p ⊔ q’;p

• Rule (non-determinism):
– {p} q {r} & {p} q’ {r} implies {p} q⊔q’ {r}
– i.e., to prove something of q⊔q’, prove the same thing of both q and q’

• a quarter of the law is interprovable with the rule
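A quick brute-force check of both distribution laws in the string model used earlier (an illustrative sketch; ⊔ is read as set union):

    # ; distributes through ⊔ in the string model (illustrative sketch).
    def seq(p, q):                        # p ; q : lifted concatenation
        return {s + t for s in p for t in q}

    p, q, q2 = {'a', 'ab'}, {'c'}, {'d', 'dd'}

    print(seq(p, q | q2) == seq(p, q) | seq(p, q2))    # left distribution: True
    print(seq(q | q2, p) == seq(q, p) | seq(q2, p))    # right distribution: True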

Conditional: p if b else p’

• Define p ⊰b⊱ p’ as (b.. ⊓ p) ⊔ (not(b).. ⊓ p’)
– where b.. describes all executions that begin in a state satisfying b
• Theorem: p ⊰b⊱ p’ is associative, idempotent, distributive, and
– p ⊰b⊱ q = q ⊰not(b)⊱ p (skew symmetry)
– (p ⊰b⊱ p’) ⊰c⊱ (q ⊰b⊱ q’) = (p ⊰c⊱ q) ⊰b⊱ (p’ ⊰c⊱ q’) (exchange)

Transaction

• Defined as (p ⊓ ..b) ⊔ (q ⊓ ..c)
– where ..b describes all executions that end in a state satisfying the single-state predicate b
• Implementation:
– execute p first
– test the condition b afterwards
– terminate if b is true
– backtrack on failure of b
– and try the alternative q with condition c

Transaction (realistic)

• Let r describe the non-failing executions of a transaction t
– r is known when execution of t is complete
– any successful execution of t is committed
– a single failed execution of t is undone,
– and q is done instead

• Define (t if r else q)
– = t if t ⊑ r
– = (t ⊓ r) ⊔ q otherwise

Contracts

• Let q be the body of a subroutine
• Let s be its specification
• Let (q .. s) assert that q meets s
• programmer error (⊤) if not so
• the caller of the subroutine may assume that s describes all its calls
• the implementation may just execute q

Least upper bound

• Let S be an arbitrary set of designs
• Define ⊔S as the least solution of ∀s∊S . s ⊑ _
– (∀s∊S . s ⊑ r) ⇒ ⊔S ⊑ r (for all r)

• everything is an upper bound of { }, so ⊔{ } = ⊥
– a case where ⊔S ∉ S

similarly

• ⊓S is the greatest lower bound of S
• ⊓{ } = ⊤

Subroutine with contract: q .. s

• Define (q..s) as the glb of the set of solutions of q ⊑ _ & _ ⊑ s

• Theorem:
– (q..s) = q if q ⊑ s
– = ⊤ otherwise
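A tiny sketch of the contract operator in the string model (illustrative assumption; since the universal design ⊤ is an infinite language, it is represented here by a sentinel value):

    # Subroutine with contract, q .. s, in the string model (illustrative sketch).
    TOP = 'TOP'                             # stands in for the universal design ⊤

    def contract(q, s):                     # (q .. s) = q if q ⊑ s, ⊤ otherwise
        return q if q <= s else TOP

    spec = {'ab', 'aab', 'aaab'}

    print(contract({'ab', 'aab'}, spec))    # body meets its spec: the body itself
    print(contract({'ba'}, spec))           # body violates its spec: TOP (programmer error)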

Iteration (Kleene *)

• q* is the least solution of (ɛ ⊔ q;_) ⊑ _

• q* =def ⊓{s | (ɛ ⊔ q;s) ⊑ s}
– ɛ ⊔ q;q* ⊑ q*
– ɛ ⊔ q;q’ ⊑ q’ implies q* ⊑ q’
– q* = ⊔{qⁿ | n ∊ Nat} (continuity)

• Rule (invariance):
– {p} q* {p} if {p} q {p}
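In the string model, q* can be computed as the least fixed point of ɛ ⊔ q;_ by iterating from the empty language; the sketch below (an illustrative assumption, with string lengths bounded so the iteration terminates) shows the idea.

    # Kleene star as a least fixed point, with a length bound so iteration terminates.
    MAX_LEN = 6                                  # illustrative bound on string length

    def seq(p, q):                               # p ; q, truncated at MAX_LEN
        return {s + t for s in p for t in q if len(s) + len(t) <= MAX_LEN}

    def star(q):                                 # least fixed point of  ɛ ⊔ q;_
        x = set()                                # start from the least design ⊥
        while True:
            nxt = {''} | seq(q, x)               # apply  ɛ ⊔ q;_
            if nxt == x:                         # fixed point reached
                return x
            x = nxt

    print(sorted(star({'ab'})))                  # ['', 'ab', 'abab', 'ababab']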

Infinite replication

• !p is the greatest solution of _ ⊑ p|_
– as in the pi calculus

• all executions of !p are infinite
– or possibly empty

Recursion

• Let F(_) be a monotonic function between programs.

• Theorem: all functions defined by monotonic operators are monotonic.

• μF is the strongest solution of F(_) ⊑ _
• νF is the weakest solution of _ ⊑ F(_)
• Theorem (Knaster–Tarski): these solutions exist

Basic statements/assertions

• skip ɛ
• bottom ⊥
• top ⊤
• assignment: x := e(x)
• assertion: assert b
• assumption: assume b
• finally ..b
• initially b..

more

• assign through pointer: [a] := e
• output: c!e
• input: c?x
• points to: a |-> e
– a |-> _ =def exists v . a |-> v

• throw, catch
• alloc, dispose

Laws (examples)

• assume b =def b.. ⊓ ɛ
• assert b =def (b.. ⊓ ɛ) ⊔ (not(b).. ; ⊤)
• x := e(x) ; x := f(x) = x := f(e(x))
– in a sequential language
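The last law can be seen concretely by reading assignments as state transformers and ; as function composition (an illustrative sketch, not the model used in the slides):

    # x := e(x) ; x := f(x)  =  x := f(e(x)), read over state transformers (sketch).
    def assign(fn):                              # x := fn(x), as a state transformer
        return lambda s: {**s, 'x': fn(s['x'])}

    def then(p, q):                              # p ; q : function composition
        return lambda s: q(p(s))

    e = lambda x: x + 1
    f = lambda x: x * 2

    lhs = then(assign(e), assign(f))             # x := e(x) ; x := f(x)
    rhs = assign(lambda x: f(e(x)))              # x := f(e(x))

    s0 = {'x': 3, 'y': 7}
    print(lhs(s0) == rhs(s0))                    # expected: True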

more

• (p |-> _) ; [p] := e ⊑ p |-> e
– in separation logic

• c!e | c?x = x := e
– in CSP, but not in CCS or the pi calculus

• throw x ; (catch x; p) = p

Part 3. Unifying Semantic Theories

• Six familiar semantic definition styles
• their derivation from the algebra
• and vice versa

[Diagram: operational rules, algebraic laws, and deduction rules, linked to one another.]

Hoare Triple

• a method for program verification
• {p} q {r} ≝ p;q ⊑ r
– one way of achieving r is by first doing p and then doing q

• Theorem (sequential composition):
– {p} q {s} & {s} q’ {r} implies {p} q;q’ {r}
– proved by associativity

Plotkin reduction

• a method for program execution
• <p, q> -> r =def p;q ⊒ r
– if p describes the state before execution of q, then r describes a possible final state
– e.g., <..(x/2 = 18), x := x+1> -> ..(x = 37)

• Theorem (sequential composition):
– <p, q> -> s & <s, q’> -> r implies <p, q;q’> -> r

Milner transition

• a method of execution for processes
• p –q-> r ≝ p ⊒ q;r
– one of the ways of executing p is by first executing q and then executing r
– e.g., (x := x+3) –(x := x+1)-> (x := x+2)

• Theorem (sequential composition):
– p –q-> s & s –q’-> r => p –(q;q’)-> r
– (the big-step rule for ;)

partial correctness

• describes what may happen
• p [q] r =def p ⊑ q;r
– if p describes a state before execution of q, then execution of q may achieve r
• Theorem (sequential composition):
– p [q] s & s [q’] r implies p [q;q’] r
• useful if r describes error states, and p describes initial states from which a test execution of q may end in error

Summary

• {p} q {r} =def p;q ⊑ r
– Hoare triple

• <p, q> -> r =def p;q ⊒ r
– Plotkin reduction

• p –q-> r =def p ⊒ q;r
– Milner transition

• p [q] r =def p ⊑ q;r
– test generation

Sequential composition

• Law: ; is associative
• Theorem: the sequence rule is valid for all four triples

• the Law is provable from the conjunction of all of them

Skip

• Law: p ; ɛ = p = ɛ ; p

• Theorems:
– {p} ɛ {p}
– p [ɛ] p
– p –ɛ-> p
– <p, ɛ> -> p

• Law follows from conjunction of all four theorems

Left distribution of ; through ⊔

• Law: p;(q⊔q’) = p;q ⊔ p;q’
• Theorems:
– {p} q⊔q’ {r} if {p} q {r} and {p} q’ {r}
– <p, q⊔q’> -> r if <p, q> -> r or <p, q’> -> r
– p [q⊔q’] r if p [q] r or p [q’] r
– p –(q⊔q’)-> r if p –q-> r and p –q’-> r (not used in CCS)

• the law is provable from either ‘and’ rule together with either ‘or’ rule

locality and frame

• left locality: (s|p) ; q ⊑ s | (p;q)
• Hoare frame: {p} q {r} ⇒ {s|p} q {s|r}

• right locality: p ; (q|s) ⊑ (p;q) | s
• Milner frame: p –q-> r ⇒ (p|s) –q-> (r|s)

• Full locality requires both frame rules

Separation logic

• Exchange law:
– (p|p’) ; (q|q’) ⊑ (p;q) | (p’;q’)
• Theorems:
– {p} q {r} & {p’} q’ {r’} ⇒ {p|p’} q|q’ {r|r’}
– p –q-> r & p’ –q’-> r’ => p|p’ –(q|q’)-> r|r’

• the law is provable from either theorem
• for the other two triples, the rules are equivalent to the converse exchange law

usual restrictions on triples

• in {p} q {r} , p and r are of form ..b, ..c

• in p [q] r, p and r are of form b.., c..
• in <p,q> -> r, p and r are of form ..b, ..c
• in p –q-> r, p and r are programs
• in p –q-> r (small step), q is atomic
• (in all cases, q is a program)

• all laws are valid without these restrictions

Weakest precondition (-;)

• (q -; r) =def the weakest solution of (_ ; q ⊑ r)
– the same as Dijkstra’s wp(q, r)
– for backward development of programs

Weakest precondition (-;)

• Law (-; is adjoint to ;):
– p ⊑ q -; r iff p;q ⊑ r (Galois)

• Theorem:
– (q -; r) ; q ⊑ r
– p ⊑ q -; (p ; q)

• the Law is provable from the theorems
– cf. (r div q) × q ≤ r
– and r ≤ (r × q) div q

Theorems

• q’ ⊑ q & r ⊑ r’ => q-;r ⊑ q’-;r’
• (q;q’)-;r ⊑ q-;(q’-;r)
• q-;r ⊑ (q;s) -; (r;s)
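In the finite relational reading used earlier, q -; r can be computed directly as the largest design whose composition with q stays inside r; the sketch below (illustrative assumptions throughout) checks both theorems of the previous slide by brute force.

    # q -; r as the weakest solution of _;q ⊑ r, in a tiny relational model (sketch).
    from itertools import product

    S = range(4)                                     # a tiny state space
    ALL = set(product(S, S))                         # the universal design over S

    def seq(p, q):                                   # p ; q : relational composition
        return {(s, u) for (s, t) in p for (t2, u) in q if t == t2}

    def wpre(q, r):                                  # q -; r : largest p with p;q ⊑ r
        return {(s, t) for (s, t) in ALL
                if all((s, u) in r for (t2, u) in q if t2 == t)}

    inc = {(s, s + 1) for s in S if s + 1 in S}      # x := x+1
    small = {(s, u) for (s, u) in ALL if u <= 2}     # ..(x ≤ 2)

    w = wpre(inc, small)
    print(seq(w, inc) <= small)                      # (q -; r) ; q ⊑ r : True

    p = {(0, 1), (2, 1)}
    print(p <= wpre(inc, seq(p, inc)))               # p ⊑ q -; (p ; q) : True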

Specification statement (;-)

• (p ;- r) =def the weakest solution of (p ; _ ⊑ r)
– Back/Morgan’s specification statement
– for stepwise refinement of designs
– the same as p⇝r in RGSep
– the same as (requires p; ensures r) in VCC

Law of consequence

Frame laws

Part 4. Denotational Models

A model is a mathematical structure that satisfies the axioms of an algebra, and realistically describes a useful application, for example, program execution.

Models

[Diagram: denotational models linked to algebraic laws.]

Some Standard Models:

• Boolean algebra: ({0,1}, ≤, ∨, ∧, (1 − _))

• predicate algebra (Frege, Heyting):
– (ℙS, ├, ∪, ∩, (S − _), =>, ∃, ∀)

• regular expressions (Kleene):
– (ℙA*, ⊆, ∪, ; , ɛ , {<a>} , | )

• binary relations (Tarski):
– (ℙ(S×S), ⊆, ∪, ∩, ; , Id , not(_), converse(_))

• algebra of designs is a superset of these

Model: (EV, EX, PR)

• EV is an underlying set of events (x, y, ..) that can occur in any execution of any program

• EX are executions (e, f,…), modelled as sets of events

• PR are designs (p, q, r,…), modelled as sets of executions.

Set concepts

• ⊑ is ⊆ (set inclusion)
• ⊔ is ∪ (set union)
• ⊓ is ∩ (intersection of sets)
• ⊥ is { } (the empty set)
• ⊤ is EX (the universal set of executions)

With (|)

• p | q = {e ∪ f | e ∊ p & f ∊ q & e ∩ f = { }}

– each execution of p|q is the disjoint union of an execution of p and an execution of q
– p|q contains all such disjoint unions
• | generalises many binary operators

Introducing time

• TIM is a set of times for events
– partially ordered by ≤

• Let when : EV -> TIM
– map each event to its time of occurrence

Definition of <

• x < y =def not(when(y) ≤ when(x))
– x < y & y < x means that x and y occur ‘in true concurrency’
• e < f =def ∀x,y . x∊e & y∊f => x < y
– no event of f occurs before an event of e
– hence e < f implies e ∩ f = { }

• if ≤ is a total order,
– there is no concurrency,
– executions are time-ordered strings

Sequential composition (then)

• p ; q = {e ∪ f | e∊p & f∊q & e < f}

• special case: if ≤ is a total order,
– e < f means that e ∪ f is the concatenation (e⋅f) of strings
– ; is the composition of regular expressions

Theorems

• These definitions of ; and | satisfy the locality and exchange laws.

• (s|p) ; q ⊑ s | (p;q)
• (p|q) ; (p’|q’) ⊑ (p;p’) | (q;q’)
– Proof: the lhs describes fewer interleavings than the rhs

• special case: regular expressions satisfy all our laws for ⊑ , ⊔ , ; , and |
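The sketch below spells out this event-and-time model in Python and checks both theorems on a tiny example (the representation of events as (name, time) pairs with totally ordered numeric times, and all helper names, are illustrative assumptions):

    # Event/time model: executions are sets of events; an event is a (name, time) pair.
    def when(x):
        return x[1]

    def before(e, f):                        # e < f : every event of e precedes every event of f
        return all(when(x) < when(y) for x in e for y in f)

    def seq(p, q):                           # p ; q = {e ∪ f | e ∊ p, f ∊ q, e < f}
        return {e | f for e in p for f in q if before(e, f)}

    def par(p, q):                           # p | q = {e ∪ f | e ∊ p, f ∊ q, e ∩ f = {}}
        return {e | f for e in p for f in q if not (e & f)}

    def ex(*events):                         # build a single execution (a set of events)
        return frozenset(events)

    p, q, s = {ex(('a', 1))}, {ex(('b', 2))}, {ex(('c', 1)), ex(('c', 3))}
    p2, q2 = {ex(('A', 4))}, {ex(('B', 5))}

    # left locality: (s|p) ; q ⊑ s | (p;q)
    print(seq(par(s, p), q) <= par(s, seq(p, q)))                      # expected: True

    # exchange: (p|q) ; (p2|q2) ⊑ (p;p2) | (q;q2)
    print(seq(par(p, q), par(p2, q2)) <= par(seq(p, p2), seq(q, q2)))  # expected: True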

Disjoint concurrency (||)

• p||q =def (p ; q) ⊓ (q ; p)
– all events of p are concurrent with all events of q
– no interaction is possible between them

• Theorems:
– (p||q) ; r ⊒ p || (q ; r)
– (p||q) ; (p’||q’) ⊒ (p;p’) || (q;q’)
– Proof: the rhs has more disjointness constraints than the lhs
– the wrong way round!

• so make the programmer responsible for disjointness, using interfaces!

Interfaces

• Let q be the body of a subroutine
• Let s be its specification
• Let (q .. s) assert that q is correct
• the caller may assume s
• the implementer may execute q

Solution

• p*q =def (p|q .. p||q)
– = p|q if p|q ⊑ p||q
– = ⊤ otherwise
– the programmer is responsible for the absence of interaction between p and q

• Theorem: ; and * satisfy locality and exchange
– Proof: in cases where lhs ≠ rhs, rhs = ⊤

Problem

• ; is almost useless in the presence of arbitrary interleaving (interference).

• It is hard to prove the disjointness of p||q
• We need a more complex model
– which constrains the places at which a program may make changes

Separation

• PL is the set of places at which an event can occur

• each place is ‘owned’ by one thread,
– no other thread can act there

• Let where:EV -> PL map each event to its place of occurrence.

• where(e) =def {where(x) | x ∊ e }

Separation principle

• events at different places are concurrent

• events at the same place are totally ordered in time

• ∀x,y ∊ EV . where(x) = where(y) iff when(x) ≤ when(y) or when(y) ≤ when(x)

Picture

[Figure: events laid out against time and space axes.]

Theorem

• p || q = {e ∪ f | e ∊ p & f ∊ q & where(e) ∩ where(f) = { }}
• proved from the separation principle

Convexity Principle

• Each execution contains every event that occurs between any of its events.

• ∀e ∊ EX, y ∊ EV . ∀x,z ∊ e . when(x) ≤ when(y) ≤ when(z) => y ∊ e
– no event from elsewhere can interfere between any two events of an execution

A convex execution of p;q

[Figure: the events of p and q plotted against time and space.]

A non-convex ‘execution’ of p;q

[Figure: the events of p and q plotted against time and space.]

Conclusion: in Praise of Algebra

• Reusable
• Modular
• Incremental
• Unifying

• Discriminative
• Computational
• Comprehensible
• Abstract

• Beautiful!

Algebra likes pairs

• Algebra chooses as primitives
– operators with two operands: + , ×
– predicates with two places: = , ≤
– laws with two operators: ∧ with ∨ , + with ×
– algebras with two components: rings

Tuples

• Tuples are defined in terms of pairs:
– Hoare triples
– Plotkin triples
– Jones quintuples
– seventeentuples …

Semantic Links

[Diagram: deductions, transitions, and denotations, all linked to the algebra.]

Increments


Filling the gaps
