
Post on 29-May-2020


Deductive Verification, the Inductive Way

Daniel Neider

ForMal Spring School, 6 June 2019

The Need for Software Verification

(figure: software-failure statistics, credit Tricentis)

Daniel Neider: Deductive Verification, the Inductive Way 1

Software Quality Assurance

How does one develop high-quality software?

- Correct and complete specifications/design
- Good software development process
- Testing
- Formal verification
- Runtime monitoring
- . . .

Logical Reasoning Meets Machine Learning

Combine logical reasoning and machine learning

Alan Turing, Computing Machinery and Intelligence (1950)

"Machine learning studies algorithms that can learn from data and make predictions on data without being explicitly programmed"

Arthur Samuel (1959)

Deductive Reasoning (Formal Verification): infer conclusions by applying logical rules
Inductive Reasoning (Machine Learning): infer conclusions by generalizing from data

Goal: Improve the verification process by incorporating knowledge that has been learned from the program

Outline

1. A crash course on deductive software verification

2. Inductive inference of annotations


1. Crash Course on Deductive Software Verification

Program Verification 101

Program Correctness
If the proper condition to run a program holds, and the program is run, then the program will halt, and when it halts, the desired result follows.
- The proper condition to run the program is called the precondition
- The desired result is called the postcondition

It is often convenient to prove termination and correctness separately:
- Precondition implies termination (termination)
- Precondition and termination imply postcondition (partial correctness)

Running Example

1: var i, n: int;
2: assume (i == 0 && n >= 0);   // precondition
3: while (i < n)
4: {
5:   i := i + 1;
6: }
7: assert (i == n);             // postcondition

Deductive Program Verification
- Express the correctness of a program as a set of mathematical statements (i.e., formulas), called verification conditions
- Then, check their validity using automated or interactive theorem provers

Deductive Software Verification

(figure: the verification pipeline: a program and its annotations are translated into an intermediate language, a verification condition generator produces VCs (formulas), and a theorem prover discharges each VC as valid (✓) or invalid (✗))

Deductive Software Verifiers

Tools include GRASShopper, Boogie ("an intermediate verification language"), and Why3.

- Support for many programming languages (e.g., via LLVM IR)
- Support for many theorem provers (e.g., via SMT-LIB 2)
- Many industry applications (e.g., Microsoft SDV, Facebook Infer)


SMT: Satisfiability Modulo Theories

Satisfiability Modulo Theories (SMT)
Satisfiability problem for logical formulas with respect to combinations of background theories expressed in first-order logic with equality:
- the theory of real numbers, the theory of integers
- the theory of bit vectors (useful for modeling machine-level data types)
- theories of various data structures such as lists and arrays

Usually, one considers quantifier-free theories.

SMT Solvers
Numerous highly optimized solvers are available:
- Z3, CVC4, OpenSMT, . . .
- http://smtcomp.sourceforge.net/2018/


Verification Condition Generation

Verification Conditions (VCs)
Verification conditions are logic formulas derived from the program's source code:
- If the VC is valid, the program is correct
- If the VC is invalid, there are errors in the program

Floyd-Hoare-style Verification
- Hoare triples {P} S {Q} formalize the semantics of software for the purpose of deductive verification
- Verification conditions can be generated automatically using the concept of weakest preconditions

Verification Conditions: Straight-Line Code

Proving a Program Correct

1: var x: int;
2: assume x >= 1;
3: x := x + 2;
4: assert x >= 3;

VC: x2 ≥ 1 ⇒ (x3 = x2 + 2 ⇒ x3 ≥ 3)

Detecting Assertion Violations

1: var x: int;
2: assume x >= 1;
3: x := x + 2;
4: assert x < 3;

VC: x2 ≥ 1 ⇒ (x3 = x2 + 2 ⇒ x3 < 3)

- A satisfying assignment for the negation of the VC provides input to the program that violates the assertion
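The validity check above can be made concrete. In practice an SMT solver decides the negated VC symbolically over all integers; the sketch below instead brute-forces a small integer window, which is only an illustration of the same question (the window bounds and function names are illustrative, not part of the talk).

```python
# Validity check for the straight-line VCs above, sketched by brute force
# over a small integer window instead of a real SMT query.

def vc_correct(x2, x3):
    # x2 >= 1  =>  (x3 = x2 + 2  =>  x3 >= 3)
    return not (x2 >= 1) or not (x3 == x2 + 2) or (x3 >= 3)

def vc_violating(x2, x3):
    # Same program, but asserting x < 3 afterwards
    return not (x2 >= 1) or not (x3 == x2 + 2) or (x3 < 3)

def falsifying_assignment(vc, lo=-50, hi=50):
    """Search the window for a model of the VC's negation (None if valid there)."""
    for x2 in range(lo, hi):
        for x3 in range(lo, hi):
            if not vc(x2, x3):
                return (x2, x3)
    return None
```

Here `falsifying_assignment(vc_correct)` finds nothing, while for the second VC it returns an input (x2 = 1, reaching x3 = 3) that violates the assertion, exactly the role of the satisfying assignment for the negated VC.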

Verification Conditions: Branching

1: var x, y: int;
2: if (x < 0) {
3:   y := -x;
4: } else {
5:   y := x;
6: }
7: assert y >= 0;

VCs (one per branch):
- x2 < 0 ⇒ (y3 = −x2 ⇒ y3 ≥ 0)
- ¬(x2 < 0) ⇒ (y5 = x2 ⇒ y5 ≥ 0)

Design by Contract

Design by Contract or Assume-Guarantee Reasoning
Developers annotate software components with contracts (i.e., formal specifications):
- Contracts document the developer's intent
- Verification is broken down into compositional verification of individual components

Typical Contracts
- Annotations on procedure boundaries (preconditions and postconditions)
- Annotations on loop boundaries (loop invariants)
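A minimal sketch of the contract idea, assuming a hypothetical `contract` decorator: this checks the precondition and postcondition at run time (i.e., runtime monitoring), whereas deductive verification would prove them statically for all inputs.

```python
# Design-by-contract sketch in Python: runtime checking only, not the
# static verification discussed in the talk.  Names are illustrative.

def contract(requires, ensures):
    def wrap(f):
        def checked(*args):
            assert requires(*args), "precondition violated"
            result = f(*args)
            assert ensures(*args, result), "postcondition violated"
            return result
        return checked
    return wrap

@contract(requires=lambda x: x >= 1, ensures=lambda x, r: r >= 3)
def add_two(x):
    # Body corresponds to the earlier straight-line example: x := x + 2
    return x + 2
```

Calling `add_two(1)` passes both checks; calling it with `x = 0` trips the precondition, which is exactly the obligation a caller assumes under assume-guarantee reasoning.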

Method Contracts – Idea

Example
How can we verify the following program?

foo() { . . . }
bar() { . . . foo(); . . . }

First Solution
Inline foo

Second Solution
Write a contract/specification P of foo
- Assume P when checking bar:

  bar() { . . . assume P; . . . }

- Guarantee P when checking foo:

  foo() { . . . assert P; }

Method Contracts – Details

1: procedure M(x, y)
     returns (r, s)
     requires P
     ensures Q
2: {
3:   S;
4: }

is checked as

1: assume P;
2: S;
3: assert Q;

and a call site

5: call a, b := M(c, d);

is replaced by

4: x' := c; y' := d;
5: assert P';
6: assume Q';
7: a := r'; b := s';

where x', y', r', s' are fresh variables, P' is P with x', y' for x, y, and Q' is Q with x', y', r', s' for x, y, r, s.

Loops and Loop Unrolling

1: var i, n: int;
2: assume (i == 0 && n >= 0);
3: while (i < n)
4: {
5:   i := i + 1;
6: }
7: assert (i == n);

Loop Unrolling (N unrollings)

assume (i == 0 && n >= 0);
if (i < n) {
  i := i + 1;
  if (i < n) {
    ... (assume false;)
  }
}
assert (i == n);
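The unrolling scheme can be read as executable pseudocode, sketched here under the assumption that we explore the loop up to a fixed bound: deeper executions are cut off by the trailing "assume false", so the check is sound only for runs that finish within the bound.

```python
# Bounded unrolling of the running example: explore at most n_unroll loop
# iterations; if the bound is exhausted, the path is excluded, mirroring
# the "assume false" at the innermost unrolling.

def check_unrolled(n_unroll, i, n):
    assert i == 0 and n >= 0          # precondition
    for _ in range(n_unroll):
        if not (i < n):
            break                     # loop exits within the bound
        i = i + 1
    else:
        return None                   # bound exhausted: path excluded
    assert i == n                     # postcondition
    return i
```

For example, `check_unrolled(3, 0, 2)` reaches the assertion and passes, while `check_unrolled(1, 0, 5)` runs out of unrollings and proves nothing about that execution.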

Loops and Loop Invariants

1: var i, n: int;
2: assume (i == 0 && n >= 0);
3: while (i < n)
4: {
5:   i := i + 1;
6: }
7: assert (i == n);

Loop Invariants
A loop invariant is a statement I over the variables of the program (i.e., a predicate) such that
- I holds before the loop and is implied by the precondition
- I holds on every iteration of the loop (is inductive)
- I holds after the final iteration and implies the postcondition

An adequate invariant is i ≤ n:
- Precondition: (i = 0 ∧ n ≥ 0) ⇒ i ≤ n
- Inductivity: (i ≤ n ∧ i < n ∧ i' = i + 1 ∧ n' = n) ⇒ i' ≤ n'
- Postcondition: (i ≤ n ∧ ¬(i < n)) ⇒ i = n
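The three conditions for the invariant i ≤ n can be checked mechanically. The sketch below brute-forces them over a small window of integer states (the window is an illustrative assumption; a theorem prover would discharge the three implications for all integers).

```python
# Brute-force check of the three invariant conditions for I := (i <= n)
# on the running example, over a small window of integer states.

INV = lambda i, n: i <= n
WINDOW = range(-10, 10)

def precondition_ok():
    # (i = 0 and n >= 0)  =>  I
    return all(INV(i, n) for i in WINDOW for n in WINDOW
               if i == 0 and n >= 0)

def inductive_ok():
    # I and loop guard, step i' = i + 1, n' = n  =>  I on the new state
    return all(INV(i + 1, n) for i in WINDOW for n in WINDOW
               if INV(i, n) and i < n)

def postcondition_ok():
    # I and negated guard  =>  i = n
    return all(i == n for i in WINDOW for n in WINDOW
               if INV(i, n) and not (i < n))
```

All three checks succeed on this window, matching the three implications above term for term.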

Quiz: Can You Prove This Program Correct?

Example
1: var x, y: int;
2: x := -1;
3: while (x < 0)
4:   invariant ???;
5: {
6:   x := x + y;
7:   y := y + 1;
8: }
9: assert (y > 0);

One adequate invariant: x >= 0 ==> y > 0

https://horn-ice.mpi-sws.org

2. Inductive Inference of Annotations


(Software) Verification

(figure: program configurations as states of a transition system; e.g., a configuration (loc = l13, x = 2, y = 17) steps to (loc = l14, x = 3, y = 17) under x = x + 1)

(figure: the state space with initial configurations Init, described by ϕpre, and bad configurations Bad, described by ¬ϕpost; an inductive invariant Inv, described by ϕinv, separates them)

Invariant
1. Init ⊆ Inv (includes initial configurations)
2. Bad ∩ Inv = ∅ (excludes bad configurations)
3. Step(Inv) ⊆ Inv (is inductive)

Invariant Synthesis
- Abstract interpretation, predicate abstraction, Craig interpolation, IC3, etc.
- Inductive techniques from machine learning

Learning Invariants

(figure: a learner, the inductive reasoning engine that knows examples, proposes a candidate invariant H to a teacher, the deductive reasoning engine that knows the program; the teacher replies with counterexamples)

Counterexamples

Invariant
1. Init ⊆ H (includes initial configurations)
2. Bad ∩ H = ∅ (excludes bad configurations)
3. Step(H) ⊆ H (is inductive)

Refuting Non-Invariants
1. Positive counterexample: if Init ⊈ H, report c ∈ Init \ H
2. Negative counterexample: if Bad ∩ H ≠ ∅, report c ∈ Bad ∩ H
3. Implication counterexample: if Step(H) ⊈ H, report c ⇒ c' with Step(c, c'), c ∈ H, and c' ∉ H
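A teacher implementing these three checks can be sketched for the running example (i := 0; while i < n: i := i + 1; assert i == n). The sketch restricts the state space to a finite window so the checks become loops; a real teacher performs them symbolically with an SMT solver, and the window size is purely illustrative.

```python
# Brute-force "teacher" for the running example over a finite window of
# (i, n) states.  Returns a counterexample to a hypothesis H, or None.

N = 6
STATES = [(i, n) for i in range(-1, N) for n in range(N)]
INIT = [(0, n) for n in range(N)]                 # i == 0 && n >= 0
BAD = [(i, n) for (i, n) in STATES
       if not (i < n) and i != n]                 # loop exited, assert fails

def step(state):
    i, n = state
    return (i + 1, n) if i < n else None

def teacher(H):
    """Check the three conditions; report the first counterexample found."""
    for c in INIT:
        if not H(c):
            return ("positive", c)                # Init not included in H
    for c in STATES:
        if H(c) and c in BAD:
            return ("negative", c)                # H intersects Bad
    for c in STATES:
        c2 = step(c)
        if c2 is not None and H(c) and not H(c2):
            return ("implication", (c, c2))       # H is not inductive
    return None
```

The hypothesis `lambda s: s[0] <= s[1]` (the invariant i ≤ n) passes all three checks, while `lambda s: True` is refuted by a negative counterexample.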

ICE Learning [CAV '14]

(figure: the learner, which knows examples, sends a candidate invariant H; the teacher, which knows the program, answers with a positive, negative, or implication counterexample)

Teacher
1. Given a hypothesis H, check Conditions 1, 2, and 3
2. If H is not an invariant, return a positive, negative, or implication counterexample

Learner
Maintains a sample S = (Pos, Neg, Impl) and constructs a hypothesis H that is consistent with S:
- c ∈ H for each c ∈ Pos
- c ∉ H for each c ∈ Neg
- c ∈ H implies c' ∈ H for each c ⇒ c' ∈ Impl

Teacher / Tool Architecture

(figure: SMACK translates a C/C++/Java program into a Boogie program; the teacher (Boogie) generates VCs and queries an SMT solver, whose models yield counterexamples for the ICE learner; the learner's hypotheses are checked in turn until the program is verified or verification fails)

Implementing a Learner

(figure: positive and negative program configurations plotted in the (x, y) plane; the half-planes x + y ≤ −2 and x ≤ −10 separate them, yielding the hypothesis H = (x + y ≤ −2) ∨ (x ≤ −10))

Simplification
- We assume that a finite set P of predicates is given that allows separating any pair of program configurations in the ICE sample
- We will relax this restriction later
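The learner's consistency requirement is easy to state as code. The hypothesis below is the one from the figure; the sample points are made up for illustration and are not taken from the talk.

```python
# Consistency of a hypothesis with an ICE sample, following the learner's
# three requirements: cover positives, avoid negatives, respect implications.

def consistent(H, pos, neg, impl):
    return (all(H(c) for c in pos)
            and not any(H(c) for c in neg)
            and all(not H(a) or H(b) for a, b in impl))

# Hypothesis from the figure: (x + y <= -2) or (x <= -10)
H = lambda c: c[0] + c[1] <= -2 or c[0] <= -10

# Illustrative sample (not from the slides)
pos = [(-30, -20), (-15, 40)]
neg = [(0, 0), (-5, 10)]
impl = [((-40, 0), (-45, 5))]
```

With this sample, `consistent(H, pos, neg, impl)` holds; adding (0, 0) as a positive example would break consistency and force the learner to revise H.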

Three Types of ICE Learning Algorithms

A. Houdini (Flanagan and Leino, FME '01)

B. Sorcar (Madhusudan, N., and Saha)

C. ICE Learning Using Decision Trees (Garg, Madhusudan, N., Roth, POPL '16)

A. Houdini

The Houdini Algorithm

Example
Let P = {p1, p2, p3, p4, p5} be a set of predicates, and consider the sample
- positive: (1, 0, 1, 1, 0) and (1, 1, 1, 0, 1)
- implication: (1, 1, 1, 0, 0) → (0, 1, 1, 1, 1)
- negative: (0, 1, 0, 1, 1)

Starting from the full candidate p1 ∧ p2 ∧ p3 ∧ p4 ∧ p5, Houdini drops conjuncts until the candidate is consistent with the sample.

Algorithm 1: The Houdini algorithm
1  X ← P (i.e., ϕX = p1 ∧ . . . ∧ pn)
2  while X is not consistent with Pos do
3      remove predicates pi from X that "occur as 0" in a positive example
4      if the left-hand side of an implication in Impl is satisfied then
5          mark the right-hand side as positive
6  return X if no negative example in Neg is satisfied
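Algorithm 1 is short enough to run. The sketch below implements it on 0/1 valuations of the predicates and replays the slide's example; the function name and encoding are illustrative.

```python
# The Houdini algorithm on boolean valuations of predicates p1..pn.
# A configuration is a 0/1 tuple; the candidate invariant is the
# conjunction of the predicates whose indices remain in X.

def houdini(n, pos, neg, impl):
    X = set(range(n))
    satisfies = lambda c: all(c[i] == 1 for i in X)
    pos = list(pos)
    changed = True
    while changed:
        changed = False
        for c in pos:                    # drop conjuncts falsified by positives
            drop = {i for i in X if c[i] == 0}
            if drop:
                X -= drop
                changed = True
        for lhs, rhs in impl:            # satisfied LHS promotes RHS to positive
            if satisfies(lhs) and rhs not in pos:
                pos.append(rhs)
                changed = True
    if any(satisfies(c) for c in neg):
        return None                      # no conjunctive invariant over P
    return X

# The example from the slide: five predicates, two positives, one
# implication, one negative.
result = houdini(
    5,
    pos=[(1, 0, 1, 1, 0), (1, 1, 1, 0, 1)],
    neg=[(0, 1, 0, 1, 1)],
    impl=[((1, 1, 1, 0, 0), (0, 1, 1, 1, 1))],
)
```

On this sample the positives eliminate p2, p4, and p5; the implication's left-hand side is then satisfied, so its right-hand side becomes a positive and eliminates p1, leaving only p3 (index 2), which indeed refutes the negative example.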

The Houdini Algorithm

Theorem (Flanagan and Leino, [FME '01])
Houdini learns the semantically smallest inductive invariant expressible as a conjunction over P in at most |P| rounds (if one exists). The time spent in each round is proportional to |S| · |P|.

- Advantage: Houdini is independent of negative examples/the postcondition
- Disadvantage: Houdini is independent of negative examples/the postcondition

B. Sorcar

The Sorcar Algorithm

Idea: Relevant Predicates
A predicate is relevant if it has shown evidence to be useful for refuting negative examples (i.e., occurs as "0" in a negative example)

Algorithm 2: The Sorcar algorithm
1  static R ← ∅
2  Procedure Sorcar(S, P, R):
3      X ← Houdini(S, P)  // takes care of positive examples in Pos
4      while X ∩ R is not consistent with S do
5          foreach negative example in Neg not consistent with X ∩ R do
6              add a "relevant" predicate from X \ R to R
7          foreach implication in Impl not consistent with X ∩ R do
8              mark the left-hand side as negative
9      return X

The Sorcar Algorithm

Theorem (Madhusudan, N., Saha)
Sorcar learns an inductive invariant in at most 2 · |P| rounds if one is expressible as a conjunction over P. The time spent in each round is proportional to |S| · f(|P|), where f is a function capturing the complexity of finding relevant predicates.

- The conjunction Sorcar computes is always a subset of the one Houdini computes

Variants:
1. Sorcar-Max: f(n) ∈ O(n)
2. Sorcar-First: f(n) ∈ O(n)
3. Sorcar-Min: f(n) ∈ O(2^n)
4. Sorcar-Greedy: f(n) ∈ O(n^3)

Houdini vs. Sorcar

(figure: experimental comparison of Houdini and Sorcar)

Quiz: Can You Prove This Program Correct?

Example (Houdini/Sorcar)
1: var x, y, z: int;
2: assume (x < y);
3: z := y;
4: while (z > x)
5:   invariant ???;
6: {
7:   z := z - 1;
8: }
9: assert (x <= z);

One adequate invariant: x <= z && y > x

https://horn-ice.mpi-sws.org

C. ICE Learning Using Decision Trees

Decision Trees

The classic example decides whether to play tennis from attributes such as Sunny?, Hot?, and Windy?. The same idea applies to predicates over program variables:

x ≥ 0 | res < 0 | x ≤ res | Class
  0   |    1    |    1    |   –
  0   |    0    |    1    |   –
  1   |    0    |    1    |   +
  0   |    1    |    0    |   +

(decision tree: the root tests x ≥ 0; its true branch is labeled +, its false branch tests x ≤ res, whose true branch is labeled – and false branch +)

The tree corresponds to the formula x ≥ 0 ∨ (¬(x ≥ 0) ∧ ¬(x ≤ res))
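The tree from the table can be written directly as nested conditionals over the attribute values, which makes its equivalence to the formula easy to check against every row.

```python
# The decision tree above as nested conditionals over boolean attribute
# values (1 = predicate holds, 0 = it does not).

def classify(x_ge_0, x_le_res):
    # root: x >= 0; the false branch tests x <= res
    if x_ge_0:
        return "+"
    return "-" if x_le_res else "+"

# Rows of the table: (x >= 0, res < 0, x <= res, class); note that the
# attribute res < 0 is never tested by the tree.
ROWS = [
    (0, 1, 1, "-"),
    (0, 0, 1, "-"),
    (1, 0, 1, "+"),
    (0, 1, 0, "+"),
]
```

Checking `classify` against every row reproduces the Class column, confirming the formula x ≥ 0 ∨ (¬(x ≥ 0) ∧ ¬(x ≤ res)).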

Learning Decision Trees

(figure sequence: a sample of positive and negative configurations in the (x, y) plane is split recursively, first by the half-plane x − 0.1y ≤ 0, then the resulting subsamples by x ≤ −4 and x + y ≤ 2, until every leaf of the tree is pure; in the ICE setting, initially unlabeled endpoints of implication examples are labeled along the way so that each implication c ⇒ c' stays consistent)

How to Split?

Goal: Split such that the resulting subsamples are as “pure” as possible

Entropy (Shannon, 1948)
Let S = (Pos, Neg) be a sample. Then, entropy is defined as

H(S) = −[P(+) log₂ P(+) + P(–) log₂ P(–)],

where P(+) = |Pos| / (|Pos| + |Neg|) and P(–) = |Neg| / (|Pos| + |Neg|)

[Figure: H(S) plotted against P(+), rising from 0 at P(+) = 0 to its
maximum 1 at P(+) = 0.5 and falling back to 0 at P(+) = 1]

Information Gain
A split of S into S1 and S2 results in an information gain of

G(S, S1, S2) = H(S) − (H(S1) + H(S2))

Daniel Neider: Deductive Verification, the Inductive Way 37
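The two definitions above translate directly into code. A minimal sketch, using the unweighted form of information gain exactly as it appears on the slide:

```python
import math

def entropy(pos, neg):
    """Shannon entropy H(S) of a sample with pos '+' and neg '-' points."""
    total = pos + neg
    h = 0.0
    for k in (pos, neg):
        if k:                     # treat 0 * log2(0) as 0
            p = k / total
            h -= p * math.log2(p)
    return h

def gain(s, s1, s2):
    """Information gain of splitting s into s1 and s2; each argument
    is a (pos, neg) pair. Unweighted form, as on the slide."""
    return entropy(*s) - (entropy(*s1) + entropy(*s2))

print(entropy(2, 2))                  # 1.0 (maximally impure)
print(gain((2, 2), (2, 0), (0, 2)))   # 1.0 (a perfectly pure split)
```

A pure subsample has entropy 0, so a split into two pure halves realizes the full gain H(S).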


How to Split in the Presence of Implications

[Figure 1: the samples discussed in Example 1. (a) The sample before
splitting, with labeled '+' and '–' points and a pair of unlabeled '?'
points connected by an implication. (b) The samples after splitting,
where both endpoints of the implication have been classified '+'.]

Penalize Implications That Are Cut by a Split
Let c ∈ R be a constant and γ be the number of implications that go
from S1 to S2 or vice versa:

Gpenalty(S, S1, S2) = G(S, S1, S2) − c · γ

Daniel Neider: Deductive Verification, the Inductive Way 38
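The penalized gain is a one-line adjustment once γ can be counted. A small sketch; the implication pairs and the split predicate below are hypothetical data of my own:

```python
def cut_implications(implications, in_s1):
    """gamma: implications going from S1 to S2 or vice versa.
    in_s1(p) tells whether the split puts point p into S1."""
    return sum(1 for a, b in implications if in_s1(a) != in_s1(b))

def gain_penalty(g, implications, in_s1, c=0.5):
    # Gpenalty(S, S1, S2) = G(S, S1, S2) - c * gamma, as on the slide
    return g - c * cut_implications(implications, in_s1)

# Two implication pairs over points on the x-axis; the split "x <= 3"
# cuts the second pair, so the gain is penalized by c = 0.5.
imps = [((1, 0), (2, 0)), ((3, 0), (4, 0))]
split = lambda p: p[0] <= 3
print(gain_penalty(1.0, imps, split))  # 0.5
```

Splits that keep implication endpoints together are thus preferred, which is what makes the heuristic ICE-aware.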


Predicates Over Numeric Variables

A fixed set of predicates might be insufficient to construct a
consistent decision tree

[Figure: data points on the x-axis near 0, 2, and 4; candidate
thresholds such as x ≤ 3, x ≤ −0.5, x ≤ 2, and x ≤ 3.5 are tried
against the data.]

Solution
Let the learner choose the best split based on the data, which allows
separating any pair of program configurations

Daniel Neider: Deductive Verification, the Inductive Way 39
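One standard way to realize this solution is to derive candidate thresholds from the data itself: place one threshold midway between each pair of consecutive distinct values, so that any two distinct points can be separated by some split x ≤ t. A sketch with hypothetical data:

```python
def candidate_thresholds(xs):
    """Candidate splits "x <= t": one t midway between each pair of
    consecutive distinct data values, so any two distinct points
    can be separated by some candidate split."""
    xs = sorted(set(xs))
    return [(a + b) / 2 for a, b in zip(xs, xs[1:])]

print(candidate_thresholds([-1, 0, 2, 4]))  # [-0.5, 1.0, 3.0]
```

The learner then scores each candidate with the (penalized) information gain and picks the best one.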

Correctness of the Decision Tree Learner

Theorem (Garg, Madhusudan, N., Roth [POPL ’16])
Let P be a finite set of predicates that allows separating any two data
points in a sample. If the sample is non-contradictory, the presented
learner always produces a decision tree over P that is consistent with
the given ICE sample (independent of the strategy used to split).

Theorem (Garg, Madhusudan, N., Roth [POPL ’16])
By only allowing splits with values in a range [−c, c] and increasing c
only if necessary, one obtains a decision tree learner that is
guaranteed to find an inductive invariant if one can be expressed as a
decision tree.

Daniel Neider: Deductive Verification, the Inductive Way 40


Performance of the Decision Tree Learner

[Figure: two log-log scatter plots of running times, from 0.1 s to
timeout. Left: time of ICE vs. CPAChecker on imperative programs from
SV-COMP 2016. Right: time of Horn-ICE vs. Automizer on recursive
programs from SV-COMP 2018.]

Daniel Neider: Deductive Verification, the Inductive Way 41

Quiz: Can You Prove This Program Correct?
Example (Decision trees)
 1: var s, x, y: int;
 2: assume (x >= 0);
 3: s := 0;
 4: while (s < x)
 5:   invariant ???;
 6: {
 7:   s := s + 1;
 8: }
 9: y := 0;
10: while (y < s)
11:   invariant ???;
12: {
13:   y := y + 1;
14: }
15: assert (y == x);

https://horn-ice.mpi-sws.org

Daniel Neider: Deductive Verification, the Inductive Way 42

Quiz: Can You Prove This Program Correct?
Example (Decision trees)
 1: var s, x, y: int;
 2: assume (x >= 0);
 3: s := 0;
 4: while (s < x)
 5:   invariant s <= x;
 6: {
 7:   s := s + 1;
 8: }
 9: y := 0;
10: while (y < s)
11:   invariant y <= s;
12: {
13:   y := y + 1;
14: }
15: assert (y == x);

https://horn-ice.mpi-sws.org

Daniel Neider: Deductive Verification, the Inductive Way 42
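Why do these simple invariants suffice? Loop 1 exits with s <= x and s >= x, hence s == x; loop 2 then exits with y == s == x, discharging the assertion. A brute-force sketch of the proof obligations over a small integer domain (the real proof is symbolic; this only tests the obligations on sample values):

```python
R = range(-5, 6)  # small test domain

# Loop 1: while (s < x), invariant I1: s <= x
assert all(0 <= x for x in R if x >= 0)                # initiation (s = 0)
assert all(s + 1 <= x for x in R for s in R
           if s <= x and s < x)                        # consecution
assert all(s == x for x in R for s in R
           if s <= x and not s < x)                    # exit: s == x

# Loop 2: while (y < s), invariant I2: y <= s (entered with s == x >= 0)
assert all(0 <= s for s in R if s >= 0)                # initiation (y = 0)
assert all(y + 1 <= s for s in R for y in R
           if y <= s and y < s)                        # consecution
assert all(y == s for s in R for y in R
           if y <= s and not y < s)                    # exit: y == s == x

print("all proof obligations hold on the test domain")
```

Each `all(...)` mirrors one Hoare-logic obligation: initiation, consecution under the loop guard, and the exit condition combined with the invariant.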

Conclusion

Conclusion

Summary
Deductive software verification using inductive learning has been
applied in practice with great success:
I GPUVerify (Houdini)
I Microsoft’s Static Driver Verifier (Corral, Houdini)

Future Research
I Synthesizing predicates
I Learning from symbolic counterexamples
I Learning termination proofs
I Beyond software: verification of cyber-physical and AI-driven systems
I Beyond verification: program synthesis

Daniel Neider: Deductive Verification, the Inductive Way 43

Opportunities at MPI-SWS

The Max Planck Institute for Software Systems is offering
opportunities:
I Internships
I PhD positions
I PostDoc positions

https://www.mpi-sws.org

Daniel Neider: Deductive Verification, the Inductive Way 44