Syntax (logic)

From Wikipedia, the free encyclopedia

Contents

1 Atomic sentence
  1.1 Examples
    1.1.1 Assumptions
    1.1.2 Atomic sentences
    1.1.3 Atomic formulae
    1.1.4 Compound sentences
    1.1.5 Compound formulae
  1.2 Interpretations
  1.3 Translating sentences from a natural language into an artificial language
  1.4 Philosophical significance
  1.5 See also
  1.6 References

2 Formal proof
  2.1 Background
    2.1.1 Formal language
    2.1.2 Formal grammar
    2.1.3 Formal systems
    2.1.4 Interpretations
  2.2 See also
  2.3 References
  2.4 External links

3 Formal system
  3.1 Related subjects
    3.1.1 Logical system
    3.1.2 Deductive system
    3.1.3 Formal proofs
    3.1.4 Formal language
    3.1.5 Formal grammar
  3.2 See also
  3.3 References
  3.4 Further reading
  3.5 External links

4 Formation rule
  4.1 Formal language
  4.2 Formal systems
  4.3 Propositional and predicate logic
  4.4 See also

5 Logical consequence
  5.1 Formal accounts
  5.2 A priori property of logical consequence
  5.3 Proofs and models
    5.3.1 Syntactic consequence
    5.3.2 Semantic consequence
  5.4 Modal accounts
    5.4.1 Modal-formal accounts
    5.4.2 Warrant-based accounts
    5.4.3 Non-monotonic logical consequence
  5.5 See also
  5.6 Notes
  5.7 Resources
  5.8 External links

6 Logical constant
  6.1 See also
  6.2 References
  6.3 External links

7 Logical Syntax of Language
  7.1 Life and work
  7.2 Logical syntax
    7.2.1 The purpose of logical syntax
  7.3 Rejection of metaphysics
    7.3.1 The function of logical analysis
  7.4 Selected publications
  7.5 See also
  7.6 References
  7.7 Sources
  7.8 External links

8 Metasyntactic variable
  8.1 Construction
  8.2 Examples
  8.3 Words commonly used as metasyntactic variables
    8.3.1 Arabic
    8.3.2 English
    8.3.3 German
    8.3.4 French
    8.3.5 Hebrew
    8.3.6 Italian
    8.3.7 Japanese
    8.3.8 Portuguese
    8.3.9 Spanish
    8.3.10 Turkish
    8.3.11 Persian
  8.4 See also
  8.5 References
  8.6 External links

9 Metavariable
  9.1 See also
  9.2 References

10 Proposition
  10.1 Historical usage
    10.1.1 By Aristotle
    10.1.2 By the logical positivists
    10.1.3 By Russell
  10.2 Relation to the mind
  10.3 Treatment in logic
  10.4 Objections to propositions
  10.5 See also
  10.6 References
  10.7 External links

11 Propositional formula
  11.1 Propositions
    11.1.1 Relationship between propositional and predicate formulas
    11.1.2 Identity
  11.2 An algebra of propositions, the propositional calculus
    11.2.1 Usefulness of propositional formulas
    11.2.2 Propositional variables
    11.2.3 Truth-value assignments, formula evaluations
  11.3 Propositional connectives
    11.3.1 Connectives of rhetoric, philosophy and mathematics
    11.3.2 Engineering connectives
    11.3.3 CASE connective: IF ... THEN ... ELSE ...
    11.3.4 IDENTITY and evaluation
  11.4 More complex formulas
    11.4.1 Definitions
    11.4.2 Axiom and definition schemas
    11.4.3 Substitution versus replacement
  11.5 Inductive definition
  11.6 Parsing formulas
    11.6.1 Connective seniority (symbol rank)
    11.6.2 Commutative and associative laws
    11.6.3 Distributive laws
    11.6.4 De Morgan’s laws
    11.6.5 Laws of absorption
    11.6.6 Laws of evaluation: Identity, nullity, and complement
    11.6.7 Double negative (Involution)
  11.7 Well-formed formulas (wffs)
    11.7.1 Wffs versus valid formulas in inferences
  11.8 Reduced sets of connectives
    11.8.1 The stroke (NAND)
    11.8.2 IF ... THEN ... ELSE
  11.9 Normal forms
    11.9.1 Reduction to normal form
    11.9.2 Reduction by use of the map method (Veitch, Karnaugh)
  11.10 Impredicative propositions
  11.11 Propositional formula with “feedback”
    11.11.1 Oscillation
    11.11.2 Memory
  11.12 Historical development
  11.13 Footnotes
  11.14 References

12 Rule of inference
  12.1 The standard form of rules of inference
  12.2 Axiom schemas and axioms
  12.3 Example: Hilbert systems for two propositional logics
  12.4 Admissibility and derivability
  12.5 See also
  12.6 References

13 Symbol (formal)
  13.1 Can words be modeled as formal symbols?
  13.2 References
  13.3 See also

14 Syntax (logic)
  14.1 Syntactic entities
    14.1.1 Symbols
    14.1.2 Formal language
    14.1.3 Formation rules
    14.1.4 Propositions
    14.1.5 Formal theories
    14.1.6 Formal systems
    14.1.7 Interpretations
  14.2 References
  14.3 See also

15 Theorem
  15.1 Informal account of theorems
  15.2 Provability and theoremhood
  15.3 Relation with scientific theories
  15.4 Terminology
  15.5 Layout
  15.6 Lore
  15.7 Theorems in logic
    15.7.1 Syntax and semantics
    15.7.2 Derivation of a theorem
    15.7.3 Interpretation of a formal theorem
    15.7.4 Theorems and theories
  15.8 See also
  15.9 Notes
  15.10 References
  15.11 External links

16 Theory (mathematical logic)
  16.1 Theories expressed in formal language generally
    16.1.1 Subtheories and extensions
    16.1.2 Deductive theories
    16.1.3 Consistency and completeness
    16.1.4 Interpretation of a theory
    16.1.5 Theories associated with a structure
  16.2 First-order theories
    16.2.1 Derivation in a first order theory
    16.2.2 Syntactic consequence in a first order theory
    16.2.3 Interpretation of a first order theory
    16.2.4 First order theories with identity
    16.2.5 Topics related to first order theories
  16.3 Examples
  16.4 See also
  16.5 References
  16.6 Further reading

17 Unate function

18 Variable (mathematics)
  18.1 Etymology
  18.2 Genesis and evolution of the concept
  18.3 Specific kinds of variables
    18.3.1 Dependent and independent variables
    18.3.2 Examples
  18.4 Notation
  18.5 See also
  18.6 Bibliography
  18.7 References

19 Well-formed formula
  19.1 Introduction
  19.2 Propositional calculus
  19.3 Predicate logic
  19.4 Atomic and open formulas
  19.5 Closed formulas
  19.6 Properties applicable to formulas
  19.7 Usage of the terminology
  19.8 See also
  19.9 Notes
  19.10 References
  19.11 External links
  19.12 Text and image sources, contributors, and licenses
    19.12.1 Text
    19.12.2 Images
    19.12.3 Content license


Chapter 1

Atomic sentence

In logic, an atomic sentence is a type of declarative sentence which is either true or false (may also be referred to as a proposition, statement or truthbearer) and which cannot be broken down into other simpler sentences. For example, “The dog ran” is an atomic sentence in natural language, whereas “The dog ran and the cat hid.” is a molecular sentence in natural language.

From a logical analysis, the truth or falsity of sentences in general is determined by only two things: the logical form of the sentence and the truth or falsity of its simple sentences. This is to say, for example, that the truth of the sentence “John is Greek and John is happy” is a function of the meaning of "and", and the truth values of the atomic sentences “John is Greek” and “John is happy”. However, the truth or falsity of an atomic sentence is not a matter that is within the scope of logic itself, but rather whatever art or science the content of the atomic sentence happens to be talking about.[1]

Logic has developed artificial languages, for example sentential calculus and predicate calculus, partly with the purpose of revealing the underlying logic of natural-language statements, the surface grammar of which may conceal the underlying logical structure; see Analytic philosophy. In these artificial languages an atomic sentence is a string of symbols which can represent an elementary sentence in a natural language, and it can be defined as follows.

In a formal language, a well-formed formula (or wff) is a string of symbols constituted in accordance with the rules of syntax of the language. A term is a variable, an individual constant or an n-place function letter followed by n terms. An atomic formula is a wff consisting of either a sentential letter or an n-place predicate letter followed by n terms. A sentence is a wff in which any variables are bound. An atomic sentence is an atomic formula containing no variables.

It follows that an atomic sentence contains no logical connectives, variables or quantifiers. A sentence consisting of one or more sentences and a logical connective is a compound (or molecular) sentence. See the vocabulary of first-order logic.
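These definitions translate naturally into a small data model. The sketch below is purely illustrative and not part of the article; the class and function names are inventions for this example. It represents terms and atomic formulas as Python values and tests whether an atomic formula is an atomic sentence, i.e. whether it contains no variables.

```python
# Illustrative sketch: terms, atomic formulas, and the "no variables" test
# that distinguishes atomic sentences from other atomic formulas.
from dataclasses import dataclass
from typing import Tuple, Union

@dataclass(frozen=True)
class Var:            # a variable, e.g. x, y, z
    name: str

@dataclass(frozen=True)
class Const:          # an individual constant, e.g. a, b, c
    name: str

@dataclass(frozen=True)
class Func:           # an n-place function letter applied to n terms
    name: str
    args: Tuple["Term", ...]

Term = Union[Var, Const, Func]

@dataclass(frozen=True)
class Atom:           # a sentential letter (no args) or an n-place predicate letter
    pred: str
    args: Tuple[Term, ...] = ()

def has_variables(t: Term) -> bool:
    """True if the term contains any variable."""
    if isinstance(t, Var):
        return True
    if isinstance(t, Func):
        return any(has_variables(a) for a in t.args)
    return False

def is_atomic_sentence(f: Atom) -> bool:
    """An atomic sentence is an atomic formula containing no variables."""
    return not any(has_variables(a) for a in f.args)

print(is_atomic_sentence(Atom("F", (Const("a"),))))  # F(a): True
print(is_atomic_sentence(Atom("F", (Var("x"),))))    # F(x): False
```

The examples in the next section follow this pattern: F(a) and H(b, a, c) pass the test, while F(x), G(a, z) and H(x, y, z) do not.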

1.1 Examples

1.1.1 Assumptions

In the following examples:

• let F, G, H be predicate letters;

• let a, b, c be individual constants;

• let x, y, z be variables.

1.1.2 Atomic sentences

These wffs are atomic sentences; they contain no variables or conjunctions:

• F(a)

• H(b, a, c)


1.1.3 Atomic formulae

These wffs are atomic formulae, but are not sentences (atomic or otherwise) because they include free variables:

• F(x)

• G(a, z)

• H(x, y, z)

1.1.4 Compound sentences

These wffs are compound sentences. They are sentences, but are not atomic sentences because they are not atomic formulae:

• ∀x (F(x))

• ∃z (G(a, z))

• ∃x ∀y ∃z (H(x, y, z))

• ∀x ∃z (F(x) ∧ G(a, z))

• ∃x ∀y ∃z (G(a, z) ∨ H(x, y, z))

1.1.5 Compound formulae

These wffs are compound formulae. They are not atomic formulae but are built up from atomic formulae using logical connectives. They are also not sentences because they contain free variables:

• F(x) ∧ G(a, z)

• G(a, z) ∨ H(x, y, z)

1.2 Interpretations

Main article: Interpretation (logic)

A sentence is either true or false under an interpretation which assigns values to the logical variables. We might for example make the following assignments:

Individual constants:

• a: Socrates

• b: Plato

• c: Aristotle

Predicates:

• Fα: α is sleeping

• Gαβ: α hates β

• Hαβγ: α made β hit γ

Sentential variables:


• p: It is raining.

Under this interpretation the sentences discussed above would represent the following English statements:

• p: “It is raining.”

• F(a): “Socrates is sleeping.”

• H(b, a, c): “Plato made Socrates hit Aristotle.”

• ∀x (F(x)): “Everybody is sleeping.”

• ∃z (G(a, z)): “Socrates hates somebody.”

• ∃x ∀y ∃z (H(x, y, z)): “Somebody made everybody hit somebody.” (They may not have all hit the same person z, but they all did so because of the same person x.)

• ∀x ∃z (F(x) ∧ G(a, z)): “Everybody is sleeping and Socrates hates somebody.”

• ∃x ∀y ∃z (G(a, z) ∨ H(x, y, z)): “Either Socrates hates somebody or somebody made everybody hit somebody.”
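With a finite domain, an interpretation like the one above can be checked mechanically. The following sketch is illustrative only: the extensions assigned to F, G and H are arbitrary choices made for the example, not part of the article, and Python's any() and all() stand in for ∃ and ∀.

```python
# Illustrative sketch: evaluating some of the example sentences under one
# particular interpretation over a three-element domain.
domain = {"Socrates", "Plato", "Aristotle"}

a, b, c = "Socrates", "Plato", "Aristotle"      # individual constants

F = {"Socrates"}                                # F: "is sleeping"
G = {("Socrates", "Plato")}                     # G: "hates"
H = {("Plato", "Socrates", "Aristotle")}        # H: "made ... hit ..."

print(a in F)                                   # F(a): True
print(any((a, z) in G for z in domain))         # ∃z G(a, z): True
print(all(x in F for x in domain))              # ∀x F(x): False
print(any(all(any((x, y, z) in H for z in domain)
              for y in domain)
          for x in domain))                     # ∃x ∀y ∃z H(x, y, z): False
```

Under this particular assignment Socrates is sleeping and hates somebody, but not everybody is sleeping.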

1.3 Translating sentences from a natural language into an artificial language

Sentences in natural languages can be ambiguous, whereas the languages of sentential logic and predicate logic are precise. Translation can reveal such ambiguities and express precisely the intended meaning.

For example, take the English sentence “Father Ted married Jack and Jill”. Does this mean Jack married Jill? In translating we might make the following assignments:

Individual constants:

• a: Father Ted

• b: Jack

• c: Jill

Predicates:

• Mαβγ: α officiated at the marriage of β to γ

Using these assignments the sentence above could be translated as follows:

• M(a, b, c): Father Ted officiated at the marriage of Jack to Jill.

• ∃x ∃y (M(a, b, x) ∧ M(a, c, y)): Father Ted officiated at the marriage of Jack to somebody and Father Ted officiated at the marriage of Jill to somebody.

• ∃x ∃y (M(x, a, b) ∧ M(y, a, c)): Somebody officiated at the marriage of Father Ted to Jack and somebody officiated at the marriage of Father Ted to Jill.

To establish which is the correct translation of “Father Ted married Jack and Jill”, it would be necessary to ask the speaker exactly what was meant.
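To see that the candidate translations really differ, one can evaluate them against different interpretations of M. The sketch below is illustrative; the extensions M1 and M2 are invented for the example.

```python
# Illustrative sketch: two readings of "Father Ted married Jack and Jill"
# come apart under different interpretations of the predicate M.
domain = {"Ted", "Jack", "Jill", "Ann", "Bob"}
a, b, c = "Ted", "Jack", "Jill"

# Ted officiated at the marriage of Jack to Jill (listed in both orders):
M1 = {("Ted", "Jack", "Jill"), ("Ted", "Jill", "Jack")}
# Ted married Jack and Jill off to other people:
M2 = {("Ted", "Jack", "Ann"), ("Ted", "Jill", "Bob")}

def reading_1(M):   # M(a, b, c)
    return (a, b, c) in M

def reading_2(M):   # ∃x ∃y (M(a, b, x) ∧ M(a, c, y))
    return (any((a, b, x) in M for x in domain)
            and any((a, c, y) in M for y in domain))

print(reading_1(M1), reading_2(M1))   # True True
print(reading_1(M2), reading_2(M2))   # False True: the translations differ
```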


1.4 Philosophical significance

Atomic sentences are of particular interest in philosophical logic and the theory of truth and, it has been argued, there are corresponding atomic facts. An atomic sentence (or possibly the meaning of an atomic sentence) is called an elementary proposition by Wittgenstein and an atomic proposition by Russell:

• 4.2 The sense of a proposition is its agreement and disagreement with possibilities of existence and non-existence of states of affairs. 4.21 The simplest kind of proposition, an elementary proposition, asserts the existence of a state of affairs. (Wittgenstein, Tractatus Logico-Philosophicus, s:Tractatus Logico-Philosophicus)

• A proposition (true or false) asserting an atomic fact is called an atomic proposition. (Russell, Introduction to Tractatus Logico-Philosophicus, s:Tractatus Logico-Philosophicus/Introduction)

• See also [2] and [3], especially regarding elementary proposition and atomic proposition as discussed by Russell and Wittgenstein.

Note the distinction between an elementary/atomic proposition and an atomic fact.

No atomic sentence can be deduced from (is not entailed by) any other atomic sentence, no two atomic sentences are incompatible, and no sets of atomic sentences are self-contradictory. Wittgenstein made much of this in his Tractatus Logico-Philosophicus. If there are any atomic sentences then there must be “atomic facts” which correspond to those that are true, and the conjunction of all true atomic sentences would say all that was the case, i.e. “the world”, since, according to Wittgenstein, “The world is all that is the case” (TLP 1). Similarly the set of all sets of atomic sentences corresponds to the set of all possible worlds (all that could be the case).

The T-schema, which embodies the theory of truth proposed by Alfred Tarski, defines the truth of arbitrary sentences from the truth of atomic sentences.
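For reference, the T-schema has the following shape; this is a standard statement of the schema rather than a quotation from the article, with the canonical instance being “'Snow is white' is true if and only if snow is white”.

```latex
% T-schema: for every sentence \varphi of the object language,
% with \ulcorner\varphi\urcorner a name of that sentence:
\ulcorner\varphi\urcorner \text{ is true} \iff \varphi
```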

1.5 See also

• Logical atomism

• Logical constant

• Truthbearer

1.6 References

• Benson Mates, Elementary Logic, OUP, New York 1972 (Library of Congress Catalog Card no. 74-166004)

• Elliott Mendelson, Introduction to Mathematical Logic, Van Nostrand Reinhold Company, New York 1964

• Wittgenstein, Tractatus Logico-Philosophicus: s:Tractatus Logico-Philosophicus.

[1] Philosophy of Logic, Willard Van Orman Quine

[2] http://plato.stanford.edu/entries/logical-atomism/

[3] http://plato.stanford.edu/entries/wittgenstein-atomism/


Chapter 2

Formal proof

A formal proof or derivation is a finite sequence of sentences (called well-formed formulas in the case of a formal language) each of which is an axiom, an assumption, or follows from the preceding sentences in the sequence by a rule of inference. The last sentence in the sequence is a theorem of a formal system. The notion of theorem is not in general effective; therefore there may be no method by which we can always find a proof of a given sentence or determine that none exists. The concept of natural deduction is a generalization of the concept of proof.[1]

The theorem is a syntactic consequence of all the well-formed formulas preceding it in the proof. For a well-formed formula to qualify as part of a proof, it must be the result of applying a rule of the deductive apparatus of some formal system to the previous well-formed formulae in the proof sequence.

Formal proofs often are constructed with the help of computers in interactive theorem proving. Significantly, these proofs can be checked automatically, also by computer. Checking formal proofs is usually simple, while the problem of finding proofs (automated theorem proving) is usually computationally intractable and/or only semi-decidable, depending upon the formal system in use.
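Because each line of a formal proof only has to be an axiom, an assumption, or the result of a rule applied to earlier lines, a proof checker is a short program. The sketch below is illustrative, not taken from the article: it checks proofs in a toy propositional system whose only inference rule is modus ponens, with formulas written as nested tuples such as ("->", "p", "q") for p → q.

```python
# Illustrative sketch: checking a formal proof whose only rule is modus ponens.
def follows_by_modus_ponens(formula, earlier):
    """True if earlier lines include some A and the conditional A -> formula."""
    return any(("->", a, formula) in earlier for a in earlier)

def check_proof(lines, axioms, assumptions=frozenset()):
    """Each line must be an axiom, an assumption, or follow from preceding
    lines by modus ponens; if so, the last line is a theorem."""
    earlier = []
    for f in lines:
        if not (f in axioms or f in assumptions
                or follows_by_modus_ponens(f, earlier)):
            return False
        earlier.append(f)
    return True

p, q, r = "p", "q", "r"
axioms = {p, ("->", p, q), ("->", q, r)}
proof = [p, ("->", p, q), q, ("->", q, r), r]
print(check_proof(proof, axioms))   # True: r is derivable from these axioms
```

Finding such a proof automatically is the hard direction; checking one, as above, is a single pass over its lines.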

2.1 Background

2.1.1 Formal language

Main article: Formal language

A formal language is a set of finite sequences of symbols. Such a language can be defined without reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it – that is, before it has any meaning. Formal proofs are expressed in some formal language.

2.1.2 Formal grammar

Main articles: Formal grammar and Formation rule

A formal grammar (also called formation rules) is a precise description of the well-formed formulas of a formal language. It is synonymous with the set of strings over the alphabet of the formal language which constitute well-formed formulas. However, it does not describe their semantics (i.e. what they mean).

2.1.3 Formal systems

Main article: Formal system

A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions.

2.1.4 Interpretations

Main articles: Formal semantics (logic) and Interpretation (logic)

An interpretation of a formal system is the assignment of meanings to the symbols, and truth-values to the sentences, of a formal system. The study of interpretations is called formal semantics. Giving an interpretation is synonymous with constructing a model.

2.2 See also

• Proof (truth)

• Mathematical proof

• Proof theory

• Axiomatic system

2.3 References

[1] The Cambridge Dictionary of Philosophy, deduction

2.4 External links

• “A Special Issue on Formal Proof”. Notices of the American Mathematical Society. December 2008.

• 2πix.com: Logic. Part of a series of articles covering mathematics and logic.


Chapter 3

Formal system

A formal system is broadly defined as any well-defined system of abstract thought based on the model of mathematics. Euclid’s Elements is often held to be the first formal system and displays the characteristics of a formal system. The entailment of the system by its logical foundation is what distinguishes a formal system from others which may have some basis in an abstract model. Often the formal system will be the basis for or even identified with a larger theory or field (e.g. Euclidean geometry) consistent with the usage in modern mathematics such as model theory. A formal system need not be mathematical as such; for example, Spinoza’s Ethics imitates the form of Euclid’s Elements.

Each formal system has a formal language, which is composed of primitive symbols. These symbols act on certain rules of formation and are developed by inference from a set of axioms. The system thus consists of any number of formulas built up through finite combinations of the primitive symbols—combinations that are formed from the axioms in accordance with the stated rules.[1]

Formal systems in mathematics consist of the following elements:

1. A finite set of symbols (i.e. the alphabet) that can be used for constructing formulas (i.e. finite strings of symbols).

2. A grammar, which tells how well-formed formulas (abbreviated wff) are constructed out of the symbols in the alphabet. It is usually required that there be a decision procedure for deciding whether a formula is well formed or not.

3. A set of axioms or axiom schemata: each axiom must be a wff.

4. A set of inference rules.

A formal system is said to be recursive (i.e. effective) if the set of axioms and the set of inference rules are decidable sets or semidecidable sets, according to context.

Some theorists use the term formalism as a rough synonym for formal system, but the term is also used to refer to a particular style of notation, for example, Paul Dirac's bra–ket notation.
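As a concrete illustration (borrowed from Hofstadter's Gödel, Escher, Bach, listed under Further reading below, rather than from this article's text), the MIU system has the alphabet {M, I, U}, the single axiom MI, and four transformation rules; its theorems are exactly the strings derivable from the axiom. A bounded enumeration fits in a few lines:

```python
# Illustrative sketch: enumerating theorems of the MIU formal system.
def successors(s):
    out = set()
    if s.endswith("I"):                    # Rule 1: xI    -> xIU
        out.add(s + "U")
    if s.startswith("M"):                  # Rule 2: Mx    -> Mxx
        out.add("M" + s[1:] * 2)
    for i in range(len(s) - 2):            # Rule 3: xIIIy -> xUy
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])
    for i in range(len(s) - 1):            # Rule 4: xUUy  -> xy
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])
    return out

def theorems(axiom="MI", max_length=6, rounds=10):
    derived = {axiom}
    for _ in range(rounds):                # bounded closure under the rules
        derived |= {t for s in derived for t in successors(s)
                    if len(t) <= max_length}
    return derived

print(sorted(theorems()))   # includes MIU, MII, MIIU, ... but never MU
```

The system is recursive in the sense above: whether a rule applies to a given string is decidable, even though settling theoremhood of a particular string (for example, whether MU is derivable) requires a separate metatheoretic argument.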

3.1 Related subjects

3.1.1 Logical system

A logical system or, for short, logic, is a formal system together with a form of semantics, usually in the form of model-theoretic interpretation, which assigns truth values to sentences of the formal language, that is, formulae that contain no free variables. A logic is sound if all sentences that can be derived are true in the interpretation, and complete if, conversely, all true sentences can be derived.
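In the turnstile notation used in Chapter 5 of this collection, these two conditions are standardly written as follows (a standard formulation, not a quotation from the article):

```latex
\text{Soundness:}    \quad \text{if } \Gamma \vdash A \text{ then } \Gamma \models A
\text{Completeness:} \quad \text{if } \Gamma \models A \text{ then } \Gamma \vdash A
```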


3.1.2 Deductive system

A deductive system (also called a deductive apparatus of a formal system) consists of the axioms (or axiom schemata) and rules of inference that can be used to derive the theorems of the system.[2]

Such a deductive system is intended to preserve deductive qualities in the formulas that are expressed in the system. Usually the quality we are concerned with is truth as opposed to falsehood. However, other modalities, such as justification or belief, may be preserved instead.

In order to sustain its deductive integrity, a deductive apparatus must be definable without reference to any intended interpretation of the language. The aim is to ensure that each line of a derivation is merely a syntactic consequence of the lines that precede it. There should be no element of any interpretation of the language that gets involved with the deductive nature of the system.

3.1.3 Formal proofs

Main article: Formal proof

Formal proofs are sequences of well-formed formulas. For a wff to qualify as part of a proof, it might either be an axiom or be the product of applying an inference rule on previous wffs in the proof sequence. The last wff in the sequence is recognized as a theorem.

The point of view that generating formal proofs is all there is to mathematics is often called formalism. David Hilbert founded metamathematics as a discipline for discussing formal systems. Any language that one uses to talk about a formal system is called a metalanguage. The metalanguage may be a natural language, or it may be partially formalized itself, but it is generally less completely formalized than the formal language component of the formal system under examination, which is then called the object language, that is, the object of the discussion in question.

Once a formal system is given, one can define the set of theorems which can be proved inside the formal system. This set consists of all wffs for which there is a proof. Thus all axioms are considered theorems. Unlike the grammar for wffs, there is no guarantee that there will be a decision procedure for deciding whether a given wff is a theorem or not. The notion of theorem just defined should not be confused with theorems about the formal system, which, in order to avoid confusion, are usually called metatheorems.

3.1.4 Formal language

Main article: Formal language

In mathematics, logic, and computer science, a formal language is a language that is defined by precise mathematical or machine-processable formulas. Like languages in linguistics, formal languages generally have two aspects:

• the syntax of a language is what the language looks like (more formally: the set of possible expressions that are valid utterances in the language)

• the semantics of a language are what the utterances of the language mean (which is formalized in various ways, depending on the type of language in question)

A special branch of mathematics and computer science exists that is devoted exclusively to the theory of language syntax: formal language theory. In formal language theory, a language is nothing more than its syntax; questions of semantics are not included in this specialty.

3.1.5 Formal grammar

Main article: Formal grammar

In computer science and linguistics a formal grammar is a precise description of a formal language: a set of strings. The two main categories of formal grammar are that of generative grammars, which are sets of rules for how strings in a language can be generated, and that of analytic grammars (or reductive grammars),[3][4] which are sets of rules for how a string can be analyzed to determine whether it is a member of the language. In short, an analytic grammar describes how to recognize when strings are members in the set, whereas a generative grammar describes how to write only those strings in the set.
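The two viewpoints can be made concrete on a toy language. The sketch below is illustrative and not from the article: it describes the language { ab, aabb, aaabbb, ... } both generatively (a rule that produces its strings) and analytically (a test for membership).

```python
# Illustrative sketch: one toy language described by a generative grammar
# (produce the strings) and by an analytic grammar (recognize the strings).
def generate(max_n=4):
    """Generative view: the rule S -> aSb | ab yields a^n b^n for n >= 1."""
    return {"a" * n + "b" * n for n in range(1, max_n + 1)}

def recognize(s):
    """Analytic view: decide whether a given string belongs to the language."""
    n = len(s) // 2
    return len(s) >= 2 and len(s) % 2 == 0 and s == "a" * n + "b" * n

print(sorted(generate()))                        # ['aaaabbbb', 'aaabbb', 'aabb', 'ab']
print(recognize("aaabbb"), recognize("aabbb"))   # True False
```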

3.2 See also

3.3 References

[1] Encyclopædia Britannica, Formal system definition, 2007.

[2] Hunter, Geoffrey, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, University of California Press, 1971.

[3] Reductive grammar: (computer science) A set of syntactic rules for the analysis of strings to determine whether the strings exist in a language. McGraw-Hill Dictionary of Scientific and Technical Terms (6th ed.). McGraw-Hill.

[4] “There are two classes of formal-language definition compiler-writing schemes. The productive grammar approach is the most common. A productive grammar consists primarily of a set of rules that describe a method of generating all possible strings of the language. The reductive or analytical grammar technique states a set of rules that describe a method of analyzing any string of characters and deciding whether that string is in the language.” The TREE-META Compiler-Compiler System: A Meta Compiler System for the Univac 1108 and General Electric 645, University of Utah Technical Report RADC-TR-69-83. C. Stephen Carr, David A. Luther, Sherian Erdmann (PDF). Retrieved 5 January 2015.

3.4 Further reading

• Raymond M. Smullyan, 1961. Theory of Formal Systems: Annals of Mathematics Studies, Princeton University Press (April 1, 1961), 156 pages. ISBN 0-691-08047-X

• S. C. Kleene, 1967. Mathematical Logic. Reprinted by Dover, 2002. ISBN 0-486-42533-9

• Douglas Hofstadter, 1979. Gödel, Escher, Bach: An Eternal Golden Braid. ISBN 978-0-465-02656-2. 777 pages.

3.5 External links

• Encyclopædia Britannica, Formal system definition, 2007.

• What is a Formal System?: Some quotes from John Haugeland’s ‘Artificial Intelligence: The Very Idea’ (1985), pp. 48–64.

• Peter Suber, Formal Systems and Machines: An Isomorphism, 1997.


Chapter 4

Formation rule

In mathematical logic, formation rules are rules for describing which strings of symbols formed from the alphabet of a formal language are syntactically valid within the language. These rules only address the location and manipulation of the strings of the language. They do not describe anything else about a language, such as its semantics (i.e. what the strings mean). (See also formal grammar.)

4.1 Formal language

Main article: Formal language

A formal language is an organized set of symbols, the essential feature being that it can be precisely defined in terms of just the shapes and locations of those symbols. Such a language can be defined, then, without any reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it—that is, before it has any meaning. A formal grammar determines which symbols and sets of symbols are formulas in a formal language.

4.2 Formal systems

Main article: Formal system

A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions. Propositional and predicate calculi are examples of formal systems.

4.3 Propositional and predicate logic

The formation rules of a propositional calculus may, for instance, take a form such that:

• if we take Φ to be a propositional formula we can also take ¬Φ to be a formula;

• if we take Φ and Ψ to be propositional formulas we can also take (Φ & Ψ), (Φ → Ψ), (Φ ∨ Ψ) and (Φ ↔ Ψ) to also be formulas.

A predicate calculus will usually include all the same rules as a propositional calculus, with the addition of quantifiers such that, if we take Φ to be a formula of propositional logic and α as a variable, then we can take (∀α)Φ and (∃α)Φ each to be formulas of our predicate calculus.
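Formation rules of this kind amount to a recursive definition, so a recogniser for well-formed formulas is itself a short recursive function. The sketch below is illustrative and not from the article; formulas are written as nested tuples, e.g. ("&", "p", ("~", "q")) for (p & ¬q), with "v" standing in for ∨.

```python
# Illustrative sketch: the propositional formation rules as a recursive test.
BINARY = {"&", "v", "->", "<->"}

def is_formula(phi):
    if isinstance(phi, str):                    # a propositional variable
        return True
    if isinstance(phi, tuple):
        if len(phi) == 2 and phi[0] == "~":     # if Φ is a formula, so is ¬Φ
            return is_formula(phi[1])
        if len(phi) == 3 and phi[0] in BINARY:  # (Φ & Ψ), (Φ v Ψ), (Φ -> Ψ), (Φ <-> Ψ)
            return is_formula(phi[1]) and is_formula(phi[2])
    return False

print(is_formula(("->", "p", ("~", "q"))))   # True
print(is_formula(("~", "p", "q")))           # False: ¬ takes a single formula
```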


4.4 See also

• Finite state automaton


Chapter 5

Logical consequence

“Entailment” redirects here. For other uses, see Entail (disambiguation).
“Therefore” redirects here. For the therefore symbol (∴), see Therefore sign.
“Logical implication” redirects here. For the binary connective, see Material conditional.

Logical consequence (also entailment) is one of the most fundamental concepts in logic. It is the relationship between statements that holds true when one logically “follows from” one or more others. A valid logical argument is one in which the conclusions follow from its premises, and its conclusions are consequences of its premises. The philosophical analysis of logical consequence involves asking, 'in what sense does a conclusion follow from its premises?' and 'what does it mean for a conclusion to be a consequence of premises?'[1] All of philosophical logic can be thought of as providing accounts of the nature of logical consequence, as well as logical truth.[2]

Logical consequence is taken to be both necessary and formal, with examples explicated using models and proofs.[1] A sentence is said to be a logical consequence of a set of sentences, for a given language, if and only if, using logic alone (i.e. without regard to any interpretations of the sentences), the sentence must be true if every sentence in the set were to be true.[3]

Logicians make precise accounts of logical consequence with respect to a given language L by constructing a deductive system for L, or by formalizing the intended semantics for L. Alfred Tarski highlighted three salient features for which any adequate characterization of logical consequence needs to account: 1) that the logical consequence relation relies on the logical form of the sentences involved, 2) that the relation is a priori, i.e. it can be determined whether or not it holds without regard to sense experience, and 3) that the relation has a modal component.[3]

5.1 Formal accounts

The most widely prevailing view on how best to account for logical consequence is to appeal to formality. This is to say that whether statements follow from one another logically depends on the structure or logical form of the statements without regard to the contents of that form.

Syntactic accounts of logical consequence rely on schemes using inference rules. For instance, we can express the logical form of a valid argument as “All A are B. All C are A. Therefore, all C are B.” This argument is formally valid, because every instance of an argument constructed using this scheme is valid.

This is in contrast to an argument like “Fred is Mike’s brother’s son. Therefore Fred is Mike’s nephew.” Since this argument depends on the meanings of the words “brother”, “son”, and “nephew”, the statement “Fred is Mike’s nephew” is a so-called material consequence of “Fred is Mike’s brother’s son,” not a formal consequence. A formal consequence must be true in all cases; however, this is an incomplete definition of formal consequence, since even the argument “P is Q’s brother’s son, therefore P is Q’s nephew” is valid in all cases, but is not a formal argument.[1]


5.2 A priori property of logical consequence

If you know that Q follows logically from P, no information about the possible interpretations of P or Q will affect that knowledge. Our knowledge that Q is a logical consequence of P cannot be influenced by empirical knowledge.[1] Deductively valid arguments can be known to be so without recourse to experience, so they must be knowable a priori.[1] However, formality alone does not guarantee that logical consequence is not influenced by empirical knowledge. So the a priori property of logical consequence is considered to be independent of formality.[1]

5.3 Proofs and models

The two prevailing techniques for providing accounts of logical consequence involve expressing the concept in terms of proofs and via models. The study of the syntactic consequence (of a logic) is called (its) proof theory whereas the study of (its) semantic consequence is called (its) model theory.[4]

5.3.1 Syntactic consequence

See also: ∴ and ⊢

A formula A is a syntactic consequence[5][6][7][8] within some formal system FS of a set Γ of formulas if there is a formal proof in FS of A from the set Γ.

Γ ⊢FS A

Syntactic consequence does not depend on any interpretation of the formal system.[9]

5.3.2 Semantic consequence

See also: ⊨

A formula A is a semantic consequence within some formal system FS of a set of statements Γ

Γ |=FS A,

if and only if there is no model I in which all members of Γ are true and A is false.[10] Or, in other words, the set of the interpretations that make all members of Γ true is a subset of the set of the interpretations that make A true.
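For propositional logic the definition can be checked by brute force, since the relevant “models” are just truth-value assignments to the propositional variables. The sketch below is illustrative and not from the article; it uses a nested-tuple representation of formulas, e.g. ("->", "p", "q") for p → q.

```python
# Illustrative sketch: Γ ⊨ A for propositional logic, checked over all valuations.
from itertools import product

def val(phi, v):
    """Evaluate a formula under a valuation v (a dict of truth values)."""
    if isinstance(phi, str):
        return v[phi]
    op = phi[0]
    if op == "~":
        return not val(phi[1], v)
    if op == "&":
        return val(phi[1], v) and val(phi[2], v)
    if op == "->":
        return (not val(phi[1], v)) or val(phi[2], v)
    raise ValueError(f"unknown connective {op!r}")

def variables(phi, acc=None):
    """Collect the propositional variables occurring in a formula."""
    acc = set() if acc is None else acc
    if isinstance(phi, str):
        acc.add(phi)
    else:
        for sub in phi[1:]:
            variables(sub, acc)
    return acc

def entails(gamma, a):
    """Γ ⊨ A: no valuation makes every member of Γ true and A false."""
    names = sorted(set().union(*(variables(f) for f in list(gamma) + [a])))
    for values in product([True, False], repeat=len(names)):
        v = dict(zip(names, values))
        if all(val(g, v) for g in gamma) and not val(a, v):
            return False
    return True

print(entails([("->", "p", "q"), "p"], "q"))   # True: modus ponens is valid
print(entails(["p"], ("&", "p", "q")))         # False: q may be false
```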

5.4 Modal accounts

Modal accounts of logical consequence are variations on the following basic idea:

Γ ⊢ A is true if and only if it is necessary that if all of the elements of Γ are true, then A is true.

Alternatively (and, most would say, equivalently):

Γ ⊢ A is true if and only if it is impossible for all of the elements of Γ to be true and A false.

Such accounts are called “modal” because they appeal to the modal notions of logical necessity and logical possibility. 'It is necessary that' is often expressed as a universal quantifier over possible worlds, so that the accounts above translate as:

Γ ⊢ A is true if and only if there is no possible world at which all of the elements of Γ are true and A is false (untrue).

Consider the modal account in terms of the argument given as an example above:

All frogs are green.
Kermit is a frog.
Therefore, Kermit is green.

The conclusion is a logical consequence of the premises because we can't imagine a possible world where (a) all frogs are green; (b) Kermit is a frog; and (c) Kermit is not green.

5.4.1 Modal-formal accounts

Modal-formal accounts of logical consequence combine the modal and formal accounts above, yielding variations on the following basic idea:

Γ ⊢ A if and only if it is impossible for an argument with the same logical form as Γ / A to have true premises and a false conclusion.

5.4.2 Warrant-based accounts

The accounts considered above are all “truth-preservational,” in that they all assume that the characteristic feature of a good inference is that it never allows one to move from true premises to an untrue conclusion. As an alternative, some have proposed “warrant-preservational” accounts, according to which the characteristic feature of a good inference is that it never allows one to move from justifiably assertible premises to a conclusion that is not justifiably assertible. This is (roughly) the account favored by intuitionists such as Michael Dummett.

5.4.3 Non-monotonic logical consequence

Main article: Non-monotonic logic

The accounts discussed above all yield monotonic consequence relations, i.e. ones such that if A is a consequence of Γ, then A is a consequence of any superset of Γ. It is also possible to specify non-monotonic consequence relations to capture the idea that, e.g., 'Tweety can fly' is a logical consequence of

{Birds can typically fly, Tweety is a bird}

but not of

{Birds can typically fly, Tweety is a bird, Tweety is a penguin}.

For more on this, see Belief revision#Non-monotonic inference relation.

5.5 See also

5.6 Notes

[1] Beall, JC and Restall, Greg, Logical Consequence, The Stanford Encyclopedia of Philosophy (Fall 2009 Edition), Edward N. Zalta (ed.).

[2] Quine, Willard Van Orman, Philosophy of logic


[3] McKeon, Matthew, Logical Consequence Internet Encyclopedia of Philosophy.

[4] Kosta Dosen (1996). “Logical consequence: a turn in style”. In Maria Luisa Dalla Chiara, Kees Doets, Daniele Mundici, Johan van Benthem. Logic and Scientific Methods: Volume One of the Tenth International Congress of Logic, Methodology and Philosophy of Science, Florence, August 1995. Springer. p. 292. ISBN 978-0-7923-4383-7.

[5] Dummett, Michael (1993) Frege: philosophy of language Harvard University Press, p.82ff

[6] Lear, Jonathan (1986) Aristotle and Logical Theory Cambridge University Press, 136p.

[7] Creath, Richard, and Friedman, Michael (2007) The Cambridge companion to Carnap Cambridge University Press, 371p.

[8] FOLDOC: “syntactic consequence”

[9] Hunter, Geoffrey, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, University of California Press, 1971, p. 75.

[10] Etchemendy, John, Logical consequence, The Cambridge Dictionary of Philosophy

5.7 Resources

• Anderson, A.R.; Belnap, N.D., Jr. (1975), Entailment 1, Princeton, NJ: Princeton.

• Barwise, Jon; Etchemendy, John (2008), Language, Proof and Logic, Stanford: CSLI Publications.

• Brown, Frank Markham (2003), Boolean Reasoning: The Logic of Boolean Equations, 1st edition, Kluwer Academic Publishers, Norwell, MA. 2nd edition, Dover Publications, Mineola, NY, 2003.

• Davis, Martin, (editor) (1965), The Undecidable, Basic Papers on Undecidable Propositions, Unsolvable Problems And Computable Functions, New York: Raven Press. Papers include those by Gödel, Church, Rosser, Kleene, and Post.

• Dummett, Michael (1991), The Logical Basis of Metaphysics, Harvard University Press.

• Edgington, Dorothy (2001), Conditionals, Blackwell, in Lou Goble (ed.), The Blackwell Guide to Philosophical Logic.

• Edgington, Dorothy (2006), Conditionals, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy.

• Etchemendy, John (1990), The Concept of Logical Consequence, Harvard University Press.

• Goble, Lou, ed. (2001), The Blackwell Guide to Philosophical Logic, Blackwell.

• Hanson, William H (1997), “The concept of logical consequence”, The Philosophical Review 106: 365–409.

• Hendricks, Vincent F. (2005), Thought 2 Talk: A Crash Course in Reflection and Expression, New York: Automatic Press / VIP, ISBN 87-991013-7-8

• Planchette, P. A. (2001), Logical Consequence, in Goble, Lou, ed., The Blackwell Guide to Philosophical Logic. Blackwell.

• Quine, W.V. (1982), Methods of Logic, Cambridge, MA: Harvard University Press (1st ed. 1950), (2nd ed. 1959), (3rd ed. 1972), (4th edition, 1982).

• Shapiro, Stewart (2002), Necessity, meaning, and rationality: the notion of logical consequence, in D. Jacquette, ed., A Companion to Philosophical Logic. Blackwell.

• Tarski, Alfred (1936), On the concept of logical consequence. Reprinted in Tarski, A., 1983. Logic, Semantics, Metamathematics, 2nd ed. Oxford University Press. Originally published in Polish and German.

• A paper on 'implication' from math.niu.edu, Implication

• A definition of 'implicant' AllWords

• Ryszard Wójcicki (1988). Theory of Logical Calculi: Basic Theory of Consequence Operations. Springer. ISBN 978-90-277-2785-5.


Chapter 6

Logical constant

In logic, a logical constant of a language L is a symbol that has the same semantic value under every interpretation of L. Two important types of logical constants are logical connectives and quantifiers. The equality predicate (usually written '=') is also treated as a logical constant in many systems of logic. One of the fundamental questions in the philosophy of logic is “What is a logical constant?”; that is, what special feature of certain constants makes them logical in nature?[1]

Symbols that are commonly treated as logical constants include the truth values ⊤ and ⊥, the connectives ¬ (negation), ∧ (conjunction), ∨ (disjunction), → (the conditional) and ↔ (the biconditional), the quantifiers ∀ and ∃, and the equality predicate =. Many of these logical constants are sometimes denoted by alternate symbols (e.g., the use of the symbol "&" rather than "∧" to denote the logical and).

6.1 See also

• Non-logical symbol

• Logical value

• Logical connective

6.2 References

[1] Carnap

6.3 External links

• Stanford Encyclopedia of Philosophy entry on logical constants


Chapter 7

Logical Syntax of Language

Rudolf Carnap (/ˈkɑrnæp/;[1] German: [ˈkaɐ̯naːp]; May 18, 1891 – September 14, 1970) was a German-born philosopher who was active in Europe before 1935 and in the United States thereafter. He was a major member of the Vienna Circle and an advocate of logical positivism. He is considered “one of the giants among twentieth-century philosophers.”[2]

7.1 Life and work

Carnap’s Birthplace in Wuppertal

Carnap’s father had risen from the status of a poor ribbon-weaver to become the owner of a ribbon-making factory. His mother came from academic stock; her father was an educational reformer and her oldest brother was the archaeologist Wilhelm Dörpfeld. As a ten-year-old, Carnap accompanied his uncle on an expedition to Greece.[3]

He began his formal education at the Barmen Gymnasium. From 1910 to 1914, he attended the University of Jena, intending to write a thesis in physics. But he also studied Kant's Critique of Pure Reason carefully during a course taught by Bruno Bauch, and was one of very few students to attend Gottlob Frege's courses in mathematical logic.


While Carnap held moral and political opposition to World War I, he felt obligated to serve in the German army. After three years of service, he was given permission to study physics at the University of Berlin, 1917–18, where Albert Einstein was a newly appointed professor. Carnap then attended the University of Jena, where he wrote a thesis defining an axiomatic theory of space and time. The physics department said it was too philosophical, and Bruno Bauch of the philosophy department said it was pure physics. Carnap then wrote another thesis, under Bauch’s supervision, on the theory of space in a more orthodox Kantian style, published as Der Raum (Space) in a supplemental issue of Kant-Studien (1922). In it he draws a clear distinction between formal, physical and perceptual (e.g., visual) spaces.

Frege’s course exposed him to Bertrand Russell's work on logic and philosophy, which gave his studies a sense of direction. He embraced the effort to surpass traditional philosophy with logical innovations that inform the sciences. He wrote a letter to Russell, who responded by copying out by hand long passages from his Principia Mathematica for Carnap’s benefit, as neither Carnap nor his university could afford a copy of this epochal work. In 1924 and 1925, he attended seminars led by Edmund Husserl, the founder of phenomenology, and continued to write on physics from a logical positivist perspective.

Carnap discovered a kindred spirit when he met Hans Reichenbach at a 1923 conference. Reichenbach introduced Carnap to Moritz Schlick, a professor at the University of Vienna who offered Carnap a position in his department, which Carnap accepted in 1926. Carnap thereupon joined an informal group of Viennese intellectuals that came to be known as the Vienna Circle, directed largely by Moritz Schlick and including Hans Hahn, Friedrich Waismann, Otto Neurath, and Herbert Feigl, with occasional visits by Hahn’s student Kurt Gödel. When Wittgenstein visited Vienna, Carnap would meet with him. He (with Hahn and Neurath) wrote the 1929 manifesto of the Circle, and (with Hans Reichenbach) initiated the philosophy journal Erkenntnis.

In 1928, Carnap published two important books:

• The Logical Structure of the World (German: “Der logische Aufbau der Welt”), in which he developed a rigorous formal version of empiricism, defining all scientific terms in phenomenalistic terms. The formal system of the Aufbau (as the work is commonly termed) was grounded in a single primitive dyadic predicate, which is satisfied if “two” individuals “resemble” each other. The Aufbau was greatly influenced by Principia Mathematica, and warrants comparison with the mereotopological metaphysics A. N. Whitehead developed over 1916–29. It appears, however, that Carnap soon became somewhat disenchanted with this book. In particular, he did not authorize an English translation until 1967.

• Pseudoproblems in Philosophy asserted that many philosophical questions were meaningless, i.e., the way they were posed amounted to an abuse of language. An operational implication of this opinion was taken to be the elimination of metaphysics from responsible human discourse. This is the statement for which Carnap was best known for many years.

See also: Carnap–Ramsey sentence

In February 1930 Tarski lectured in Vienna, and during November 1930 Carnap visited Warsaw. On these occasions he learned much about Tarski’s model-theoretic method of semantics. Rose Rand, another philosopher in the Vienna Circle, noted, “Carnap’s conception of semantics starts from the basis given in Tarski’s work but a distinction is made between logical and non-logical constants and between logical and factual truth... At the same time he worked with the concepts of intension and extension and took these two concepts as a basis of a new method of semantics.”[4] In 1931, Carnap was appointed Professor at the German-language University of Prague. There he wrote the book that was to make him the most famous logical positivist and member of the Vienna Circle, his Logical Syntax of Language (Carnap 1934). In this work, Carnap advanced his Principle of Tolerance, according to which there is no such thing as a “true” or “correct” logic or language: one is free to adopt whatever form of language is useful for one’s purposes. In 1933, W. V. Quine met Carnap in Prague and discussed the latter’s work at some length. Thus began the lifelong mutual respect these two men shared, one that survived Quine’s eventual forceful disagreements with a number of Carnap’s philosophical conclusions.

Carnap, whose socialist and pacifist beliefs put him at risk in Nazi Germany, emigrated to the United States in 1935 and became a naturalized citizen in 1941. Meanwhile, back in Vienna, Moritz Schlick was murdered in 1936. From 1936 to 1952, Carnap was a professor of philosophy at the University of Chicago. During the late 1930s, Carnap offered an assistant position in philosophy to Carl Gustav Hempel, who accepted. The two conducted research together, including work on logical syntax.[5] Thanks partly to Quine’s help, Carnap spent the years 1939–41 at Harvard, where he was reunited with Tarski.


Carnap (1963) later expressed some irritation about his time at Chicago, where he and Charles W. Morris were the only members of the department committed to the primacy of science and logic. (Their Chicago colleagues included Richard McKeon, Mortimer Adler, Charles Hartshorne, and Manley Thompson.) Carnap’s years at Chicago were nonetheless very productive ones. He wrote books on semantics (Carnap 1942, 1943, 1956), on modal logic (the account in Carnap 1956 is very similar to the now-standard possible-worlds semantics for that logic that Saul Kripke proposed starting in 1959), and on the philosophical foundations of probability and induction (Carnap 1950, 1952). After a stint at the Institute for Advanced Study in Princeton, he joined the philosophy department at UCLA in 1954, Hans Reichenbach having died the previous year. He had earlier refused an offer of a similar job at the University of California, because accepting that position would have required him to sign a loyalty oath, a practice to which he was opposed on principle. While at UCLA, he wrote on scientific knowledge, the analytic–synthetic dichotomy, and the verification principle. His writings on thermodynamics and on the foundations of probability and induction were published posthumously as Carnap (1971, 1977, 1980).

Carnap taught himself Esperanto when he was 14 years of age and remained sympathetic to it (Carnap 1963). He later attended the World Congress of Esperanto in 1908 and 1922, and employed the language while traveling.

Carnap had four children by his first marriage to Elizabeth Schöndube, which ended in divorce in 1929. He married his second wife, Elizabeth Ina Stögner, in 1933.[3] Ina committed suicide in 1964.

7.2 Logical syntax

Carnap’s Logical Syntax of Language can be regarded as a response to Wittgenstein’s Tractatus. Carnap elaborated and extended the concept of logical syntax proposed by Wittgenstein in the Tractatus (Section 3.325).

3.325. In order to avoid such errors we must make use of a sign-language that excludes them by not using the same sign for different symbols and by not using in a superficially similar way signs that have different modes of signification: that is to say, a sign-language that is governed by logical grammar—by logical syntax. ...

—Wittgenstein, Section 3.325, Tractatus

However, Wittgenstein stated that propositions cannot represent logical form.

4.121. Propositions cannot represent logical form: it is mirrored in them. What finds its reflection in language, language cannot represent. What expresses itself in language, we cannot express by means of language.

Propositions show the logical form of reality. They display it.

—Wittgenstein, Section 4.121, Tractatus

Carnap disagreed. While Wittgenstein proposed the idea of logical syntax, it was Carnap who designed, formulated, and worked out the details of logical syntax in philosophical analysis. Carnap defined logical syntax as follows:

By the logical syntax of a language, we mean the formal theory of the linguistic forms of that language – the systematic statement of the formal rules which govern it together with the development of the consequences which follow from these rules.

A theory, a rule, a definition, or the like is to be called formal when no reference is made in it either to the meaning of the symbols (for example, the words) or to the sense of the expressions (e.g. the sentences), but simply and solely to the kinds and order of the symbols from which the expressions are constructed.

—Carnap, Page 1, Logical Syntax of Language

In the U.S., the concept of logical syntax helped the development of natural language processing and compiler design.
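
As a concrete illustration of what Carnap means by a “formal” rule (a sketch of my own, not taken from the book; the toy grammar and function names are invented), the following check decides whether a string is a well-formed expression of a tiny propositional language by consulting only the kinds and order of its symbols, never their meanings.

    # A purely syntactic (formal) check: well-formedness of a toy language with
    # variables p, q, r, negation ~, and parenthesized conjunction (A&B).
    # The rules look only at symbol kinds and order, never at what symbols mean.

    def parse(s):
        """Return the unconsumed remainder if s starts with a formula, else None."""
        if not s:
            return None
        if s[0] in "pqr":                        # a variable is a formula
            return s[1:]
        if s[0] == "~":                          # ~A is a formula whenever A is
            return parse(s[1:])
        if s[0] == "(":                          # (A&B) is a formula whenever A and B are
            rest = parse(s[1:])
            if rest and rest[0] == "&":
                rest = parse(rest[1:])
                if rest and rest[0] == ")":
                    return rest[1:]
        return None

    def well_formed(s):
        return parse(s) == ""                    # the whole string must be consumed

    print(well_formed("(p&~q)"))    # True
    print(well_formed("p&q"))       # False: this grammar requires the parentheses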

7.2.1 The purpose of logical syntax

The purpose of logical syntax is to provide a system of concepts, a language, by the help of which the results of logical analysis will be exactly formulable.


Carnap stated:

Philosophy is to be replaced by the logic of science – that is to say, by the logical analysis of the concepts and sentences of the sciences, for the logic of science is nothing other than the logical syntax of the language of science.

—Carnap, Foreword, Logical Syntax of Language

... According to this view, the sentences of metaphysics are pseudo-sentences which on logical analysis are proved to be either empty phrases or phrases which violate the rules of syntax. Of the so-called philosophical problems, the only questions which have any meaning are those of the logic of science. To share this view is to substitute logical syntax for philosophy.

—Carnap, Page 8, Logical Syntax of Language

Carnap wanted only to end metaphysics, not philosophy.

7.3 Rejection of metaphysics

Carnap, in his book Philosophy and Logical Syntax, used the concept of verifiability to reject metaphysics.

7.3.1 The function of logical analysis

Carnap used the method of logical analysis to reject metaphysics.

The function of logical analysis is to analyse all knowledge, all assertions of science and of everyday life, in order to make clear the sense of each such assertion and the connections between them. One of the principal tasks of the logical analysis of a given proposition is to find out the method of verification for that proposition.

—Carnap, Pages 9–10, Philosophy and Logical Syntax

7.4 Selected publications

For links to Carnap’s publications and discussions of his work, see “"Carnap" in All Fields of Study”. Microsoft Academic Search. Retrieved 2013-05-26.

• 1922. Der Raum: Ein Beitrag zur Wissenschaftslehre, Kant-Studien, Ergänzungshefte, no. 56. His Ph.D. thesis.

• 1926. Physikalische Begriffsbildung. Karlsruhe: Braun.

• 1928. Scheinprobleme in der Philosophie (Pseudoproblems of Philosophy). Berlin: Weltkreis-Verlag.

• 1928. Der Logische Aufbau der Welt. Leipzig: Felix Meiner Verlag. English translation by Rolf A. George, 1967. The Logical Structure of the World. Pseudoproblems in Philosophy. University of California Press. ISBN 0-812-69523-2

• 1929. Abriss der Logistik, mit besonderer Berücksichtigung der Relationstheorie und ihrer Anwendungen. Springer.[6]

• 1934. Logische Syntax der Sprache. English translation 1937, The Logical Syntax of Language. Kegan Paul.[7]

• 1996 (1935). Philosophy and Logical Syntax. Bristol UK: Thoemmes. Excerpt.

• 1939, Foundations of Logic and Mathematics in International Encyclopedia of Unified Science, Vol. I, no. 3. University of Chicago Press.[8]

• 1942. Introduction to Semantics. Harvard Uni. Press.

• 1943. Formalization of Logic. Harvard Uni. Press.


• 1945. On Inductive Logic in Philosophy of Science, Vol. 12, pp. 72–97.

• 1945. The Two Concepts of Probability in Philosophy and Phenomenological Research, Vol. 5, No. 4 (Jun), pp. 513–532.

• 1947. On the Application of Inductive Logic in Philosophy and Phenomenological Research, Vol. 8, pp. 133–148.

• 1956 (1947). Meaning and Necessity: a Study in Semantics and Modal Logic. University of Chicago Press.

• 1950. Logical Foundations of Probability. University of Chicago Press. Pp. 3–15 online.

• 1950. "Empiricism, Semantics, Ontology", Revue Internationale de Philosophie 4: 20–40.

• 1952. The Continuum of Inductive Methods. University of Chicago Press.

• 1958. Introduction to Symbolic Logic with Applications. Dover.

• 1963, “Intellectual Autobiography” in Schilpp (1963: 1–84).

• 1966. Philosophical Foundations of Physics. Martin Gardner, ed. Basic Books. Online excerpt.

• 1971. Studies in inductive logic and probability, Vol. 1. University of California Press.

• 1977. Two essays on entropy. Shimony, Abner, ed. University of California Press.

• 1980. Studies in inductive logic and probability, Vol. 2. Jeffrey, R. C., ed. University of California Press.

• 2000. Untersuchungen zur Allgemeinen Axiomatik. Edited from unpublished manuscript by T. Bonk and J. Mosterín. Darmstadt: Wissenschaftliche Buchgesellschaft. 167 pp. ISBN 3-534-14298-5.

Online bibliography. Under construction, with no entries dated later than 1937. For a more comprehensive bibliography, see also http://fr.wikipedia.org/wiki/Rudolf_Carnap

7.5 See also

• Analytic–synthetic distinction

• Carnap–Ramsey sentence

• Internal–external distinction

• Meta-ontology

• Philosophy of science

• Skepticism

• Visual space

7.6 References

[1] “Carnap”. Random House Webster’s Unabridged Dictionary.

[2] http://texts.cdlib.org/view?docId=hb6h4nb3q7&doc.view=frames&chunk.id=div00004&toc.depth=1&toc.id=

[3] Quine, W.V. and Rudolf Carnap (1990). Dear Carnap, Dear Van: The Quine-Carnap Correspondence and Related Work. Berkeley, CA: University of California Press. p. 23.

[4] Rand, Rose. “Reading Notes and Summaries on Works by Rudolph Carnap, 1932 and Undated” (PDF). Rose Rand Papers. Special Collections Department, University of Pittsburgh. Retrieved May 16, 2013.

[5] Carnap, Rudolf. “Rudolf Carnap Papers”. Special Collections Department, University of Pittsburgh. Retrieved September 17, 2013.


[6] Weiss, Paul (1929). “Review: Abriss der Logistik by Rudolf Carnap” (PDF). Bull. Amer. Math. Soc. 35 (6): 880. doi:10.1090/s0002-9904-1929-04818-3.

[7] Mac Lane, Saunders (1938). “Review: The Logical Syntax of Language by Rudolf Carnap, translated from the German by Amethe Smeaton” (PDF). Bull. Amer. Math. Soc. 44 (3): 171–176.

[8] Church, Alonzo (1939). “Review: Foundations of Logic and Mathematics by Rudolf Carnap” (PDF). Bull. Amer. Math. Soc. 45 (11): 821–822. doi:10.1090/s0002-9904-1939-07085-7.

7.7 Sources

• Richard Creath, Michael Friedman, ed. (2007). The Cambridge companion to Carnap. Cambridge University Press. ISBN 0521840155.

• Roger F Gibson, ed. (2004). The Cambridge companion to Quine. Cambridge University Press. ISBN 0521639492.

• Ivor Grattan-Guinness, 2000. In Search of Mathematical Roots. Princeton Uni. Press.

• Thomas Mormann, 2000. “Rudolf Carnap” (book). München, Beck.

• Willard Quine

• 1951, Two Dogmas of Empiricism. The Philosophical Review 60: 20–43. Reprinted in his 1953 From a Logical Point of View. Harvard University Press.

• 1985, The Time of My Life: An Autobiography. MIT Press.

• Richardson, Alan W., 1998. Carnap’s construction of the world: the Aufbau and the emergence of logical empiricism. Cambridge Uni. Press.

• Schilpp, P. A., ed., 1963. The Philosophy of Rudolf Carnap. LaSalle IL: Open Court.

• Spohn, Wolfgang, ed., 1991. Erkenntnis Orientated: A Centennial Volume for Rudolf Carnap and Hans Reichenbach. Kluwer Academic Publishers.

• 1991. Logic, Language, and the Structure of Scientific Theories: Proceedings of the Carnap-Reichenbach Centennial, University of Konstanz, May 21–24, 1991. University of Pittsburgh Press.

• Wagner, Pierre, ed., 2009. Carnap’s Logical Syntax of Language. Palgrave Macmillan.

• Wagner, Pierre, ed., 2012. Carnap’s Ideal of Explication and Naturalism. Palgrave Macmillan.

7.8 External links

• Rudolf Carnap entry by Mauro Murzi in the Internet Encyclopedia of Philosophy

• Carnap’s Modal Logic entry by M.J. Cresswell in the Internet Encyclopedia of Philosophy

• Rudolf Carnap Webpage and Directory of Internet Resources

• Homepage of the Collected Works of Rudolf Carnap. Department of Philosophy, Carnegie Mellon University

• Precis of Carnap’s philosophy.

• The Life of Rudolf Carnap, Philosophy at RBJones.com

• R. Carnap: “Von der Erkenntnistheorie zur Wissenschaftslogik”, Paris Congress in 1935, Paris, 1936.

• R. Carnap: "Über die Einheitssprache der Wissenschaft”, Paris Congress in 1935, Paris, 1936.

• R. Carnap: “Wahrheit und Bewährung”, Paris Congress in 1935, Paris, 1936.

• Rudolf Carnap Papers: (Rudolf Carnap Papers, 1905-1970, ASP.1974.01, Special Collections Department, University of Pittsburgh.)


• Das Fremdpsychische bei Rudolf Carnap (German) by Robert Bauer.

• FBI file on Rudolph Carnap

• Luchte, James, 2007. “Martin Heidegger and Rudolf Carnap: Radical Phenomenology, Logical Positivism and the Roots of the Continental/Analytic Divide,” Philosophy Today, Vol. 51, No. 3, 241–260.


Chapter 8

Metasyntactic variable

This article is about metasyntactic variables in computer science and hacker culture. For metasyntactic variables as used in formal logic, see Metavariable (logic). For general usage, see placeholder name.

A metasyntactic variable is a placeholder name used in computer science: a word without meaning of its own, intended to be substituted by some object pertaining to the context where it is used. The word foo as used in IETF Requests for Comments is a good example.[1]

By mathematical analogy, a metasyntactic variable is a word that is a variable for other words, just as in algebra letters are used as variables for numbers.[1] Any symbol or word which does not violate the syntactic rules of the language can be used as a metasyntactic variable. For specifications written in natural language, nonsense words are commonly used as metasyntactic variables.

Metasyntactic variables have a secondary, implied meaning to the reader (often students), which makes them different from normal metavariables. It is understood by those who have studied computer science that certain words are placeholders or examples only and should or must be replaced in a production-level computer program.

In hacker culture, “metasyntactic variable” has come to denote some typical (otherwise meaningless) words used as metavariables in computing; see reification. For example, The Hacker’s Dictionary (1st ed.) defined FOO as “the first metasyntactic variable” and BAR as “the second metasyntactic variable”, explaining that “When you have to invent an arbitrary temporary name for something for the sake of exposition, FOO is usually used. If you need a second one, BAR or BAZ is usually used; there is a slight preference at MIT for bar and at Stanford for baz. Clearly, bar was the original, for the concatenation FOOBAR is widely used also, and this in turn can be traced to the obscene acronym 'FUBAR' that arose in the armed forces during World War II. [...] A hacker avoids using 'foo' as the real name of anything. Indeed, a standard convention is that any file with 'foo' in its name is temporary and can be deleted on sight.”[2] The names of these consecrated “metasyntactic variables” are also commonly used as actual identifiers (for variables, functions, etc.) in tutorial programming examples when their purpose is to emphasize syntax; in this usage, “metasyntactic variable” is synonymous with “meaningless word”.[3]

8.1 Construction

• meta- means providing information about, or transcending,

• syntax denotes the grammatical arrangement of words or the grammatical rules of a programming language, and

• a variable is something that can assume a value, or something likely to vary.

So metasyntactic variable denotes a word that “transcends grammar and can assume a value” or one that is “more comprehensive than suggested by its grammatical arrangement and is likely to vary”. It may also denote a word that provides information about the grammatical arrangement of words by being able to assume a value that is expected to vary.


8.2 Examples

RFC 772 (cited in RFC 3092) contains, for instance:

All is well; now the recipients can be specified.

S: MRCP TO:<Foo@Y> <CRLF>
R: 200 OK
S: MRCP TO:<Raboof@Y> <CRLF>
R: 553 No such user here
S: MRCP TO:<bar@Y> <CRLF>
R: 200 OK
S: MRCP TO:<@Y,@X,fubar@Z> <CRLF>
R: 200 OK

Note that the failure of “Raboof” has no effect on the storage of mail for “Foo”, “bar” or the mail to be forwarded to “fubar@Z” through host “X”.

Both the IETF RFCs and computer programming languages are rendered in plain text, making it necessary to distinguish metasyntactic variables by a naming convention, more or less obvious from context. If rich text formatting is available, e.g. as in the HTML produced from texinfo sources, then a typographical convention may be used, as done for the example in the GNU Fortran manual:[4]

A metasyntactic variable—that is, a name used in this document to serve as a placeholder for whatever text is used by the user or programmer—appears as shown in the following example:

“The INTEGER ivar statement specifies that ivar is a variable or array of type INTEGER.”

In the above example, any valid text may be substituted for the metasyntactic variable ivar to make the statement apply to a specific instance, as long as the same text is substituted for both occurrences of ivar.

The above example uses italics to denote metavariables (borrowing from the common convention to use italics for variables in mathematics), although italics are also used in the same text for emphasizing other words. (The documentation for texinfo emphasizes the distinction between metavariables and mere variables used in a programming language being documented in some texinfo file as: “Use the @var command to indicate metasyntactic variables. A metasyntactic variable is something that stands for another piece of text. For example, you should use a metasyntactic variable in the documentation of a function to describe the arguments that are passed to that function. Do not use @var for the names of particular variables in programming languages. These are specific names from a program, so @code is correct for them.”[5]) Another point reflected in the above example is the convention that a metavariable is to be uniformly substituted with the same instance in all its appearances in a given schema. This is in contrast with nonterminal symbols in formal grammars, where the nonterminals on the right of a production can be substituted by different instances.[6]

A third example of the use of the “metasyntactic variables” foo and bar, this time as actual identifiers in a programming (interview) example, is contrasting the following C++ function prototypes for their different argument-passing mechanisms:[7]

void foo(Fruit bar);   // pass by value
void foo(Fruit* bar);  // pass by pointer
void foo(Fruit& bar);  // pass by reference

8.3 Words commonly used as metasyntactic variables

8.3.1 Arabic

In Arabic, the word “kedha” (كذا) is often used in the same way English speakers use the word “bla”, as in “kedha, kedha, kedha” to mean “this, that, and the other thing” or “such and such”. Similarly, the names “Fullan” (فلان) and “'Allan” (علان) are used to refer to non-specific persons, a practice which has been adopted in other languages (see Portuguese, Spanish, Turkish and Persian below).

8.3.2 English

A “standard list of metasyntactic variables used in syntax examples” often used in the United States is: foo, bar, baz, qux, quux, corge, grault, garply, waldo, fred, plugh, xyzzy, thud.[1] The word foo occurs in over 330 RFCs and bar occurs in over 290.[8] Wibble, wobble, wubble, Fred and flob are often used in the UK.[9]

Because English is the foundation language, or lingua franca, of most computer programming languages, these variables are also commonly seen even in programs and examples of programs written for other spoken-language audiences.


The typical names may, however, depend on the subculture that has developed around a given programming language. For example, spam, ham, and eggs are the principal metasyntactic variables used in the Python programming language.[10] This is a reference to the famous comedy sketch Spam by Monty Python, the eponym of the language.[11]

The R programming language often adds norf to the list.[12][13]

8.3.3 German

In German, the words bla, blubb and blabla are commonly used as names for metasyntactic variables (comparable with English blah, blah-blah).

8.3.4 French

In French, the words toto, titi, tata, tutu, truc, bidule, machin and azerty are commonly used (AZERTY being the order of the first letters on French keyboards).

8.3.5 Hebrew

In Hebrew, the words chupchick and stam are commonly used.

8.3.6 Italian

In Italian, the word pippo is commonly used. Curiously, besides being a diminutive of the first names Giuseppe (Joseph) and Filippo (Philip), pippo is the Italian name of the Disney character Goofy, but it is probably used simply because of its odd sound; moreover, the name can be typed very quickly on a computer keyboard, as it involves three adjacent keys (P, I and O). Sometimes the words pluto and paperino (the Italian name of Donald Duck) are hence used as additional terms.

8.3.7 Japanese

In Japanese, the words hoge and piyo are commonly used, with other common words and variants being fuga, hogera, and hogehoge.[14] Note that -ra is a pluralizing ending in Japanese, and reduplication is also used for pluralizing. The origin of hoge as a metasyntactic variable is not known, but it is believed to date to the early 1980s.[14] Hoge was also used extensively in the lyrics of the theme song for the Dororo animated adaptation, in 1969.

8.3.8 Portuguese

In Portuguese, the words fulano, sicrano and beltrano are commonly used to refer to people.[15] To refer to objects in general, the most common placeholder name is XPTO.

8.3.9 Spanish

In Spanish, the words fulano,[16] mengano[17] and zutano[18] are commonly used, often followed by de tal, mocking a last name in Spanish form (e.g. Fulano de Tal). These words have the constraint that they can only be used to refer to people, as is the case with Portuguese. Also, when referring to an example of some person performing a certain action, Perico de los Palotes can be used as a placeholder for a real name. In place of people or objects (including numbers, etc.) the usual X, Y, Z are used (e.g. Person X, Quantity Z).

8.3.10 Turkish

In Turkish, the words falan, filan, hede, hödö, hebele, hübele are commonly used.


8.3.11 Persian

In Persian, the word folân is used for Foo, and the words bahmān and bisār are used for Bar.

8.4 See also

• Metavariable (logic)

• Alice and Bob

• John Doe

• Fnord

• Free variables and bound variables

• Gadget

• Lorem ipsum

• Nonce word

• Placeholder name

• Widget

• Smurf

8.5 References

[1] RFC 3092 (rfc3092) - Etymology of “Foo”

[2] As reproduced in Hank Bromley; Richard Lamson (1987). Lisp lore: a guide to programming the Lisp machine. Kluwer Academic. p. 291.

[3] Mark Slagell (2002). Sams Teach Yourself Ruby in 21 Days. Sams Publishing. p. 108. ISBN 978-0-672-32252-5.

[4] http://gcc.gnu.org/onlinedocs/gcc-3.3.5/g77/Notation-Used.html

[5] http://sunsite.ualberta.ca/Documentation/Gnu/texinfo-4.0/html_chapter/texinfo_10.html

[6] R. D. Tennent (2002). Specifying Software: A Hands-On Introduction. Cambridge University Press. pp. 36–37 and 210. ISBN 978-0-521-00401-5.

[7] John Mongan; Noah Kindler; Eric Giguere (2012). Programming Interviews Exposed: Secrets to Landing Your Next Job. John Wiley & Sons. p. 242. ISBN 978-1-118-28720-0.

[8] RFC-Editor.org

[9] wibble. (n.d.). Jargon File 4.4.7. Retrieved February 23, 2010, from

[10] Python Tutorial

[11] General Python FAQ

[12] http://www.vidyokarma.com/article.php?c=R-programming-language&page=1

[13] http://use-r.com/page/2/

[14] (Japanese)

[15] http://www.priberam.pt/dlpo/fulano

[16] http://lema.rae.es/drae/?val=fulano

[17] http://lema.rae.es/drae/?val=mengano

[18] http://lema.rae.es/drae/?val=zutano


8.6 External links

• Definition of metasyntactic variable, with examples.

• Examples of metasyntactic variables used in Commonwealth Hackish, such as wombat.

• Variable “foo” and Other Programming Oddities


Chapter 9

Metavariable

For the term as used in computer science and hacking culture, see Metasyntactic variable.

In logic, a metavariable (also metalinguistic variable[1] or syntactical variable[2]) is a symbol or symbol string which belongs to a metalanguage and stands for elements of some object language. For instance, in the sentence

Let A and B be two sentences of a language ℒ

the symbols A and B are part of the metalanguage in which the statement about the object language ℒ is formulated. John Corcoran considers this terminology unfortunate because it obscures the use of schemata and because such “variables” do not actually range over a domain.[3]:220

The convention is that a metavariable is to be uniformly substituted with the same instance in all its appearances in a given schema. This is in contrast with nonterminal symbols in formal grammars, where the nonterminals on the right of a production can be substituted by different instances.[4]

Attempts to formalize the notion of metavariable result in some kind of type theory.[5]

In computing one often needs to specify and document the syntax and semantics of a computer language, more or less formally. A term often used for metavariable in that area is “metasyntactic variable”. Furthermore, because of the common practice in hacker culture to use nonsense words like “foo” as metavariables, the term “metasyntactic variable” has come to denote such words by themselves; for instance, “foo” is referred to as “the first metasyntactic variable” in the first edition of The Hacker’s Dictionary.
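
A small illustrative sketch of the uniform-substitution convention (the schema and helper function here are my own, not from the cited sources): every occurrence of the same metavariable in a schema must be replaced by the same object-language formula.

    # Uniform substitution of metavariables in a schema: every occurrence of "A"
    # gets the same instance, every occurrence of "B" gets the same instance.
    def instantiate(schema, assignment):
        # assignment maps metavariables (single letters) to object-language formulas
        return "".join(assignment.get(ch, ch) for ch in schema)

    schema = "A->(B->A)"                          # an axiom schema
    print(instantiate(schema, {"A": "p", "B": "(q&r)"}))
    # p->((q&r)->p)  -- both occurrences of A became the same formula p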

9.1 See also

• Explicit substitution

9.2 References

[1] Geoffrey Hunter, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, p. 13

[2] Shoenfield, Joseph R. (2001) [1967], Mathematical Logic (2nd ed.), A K Peters, p. 7, ISBN 978-1-56881-135-2

[3] Corcoran, J. 2006. Schemata: the Concept of Schema in the History of Logic. Bulletin of Symbolic Logic 12: 219–40

[4] R. D. Tennent (2002). Specifying Software: A Hands-On Introduction. Cambridge University Press. pp. 36–37 and 210. ISBN 978-0-521-00401-5.

[5] Masahiko Sato, Takafumi Sakurai, Yukiyoshi Kameyama, and Atsushi Igarashi. “Calculi of Meta-variables” in Computer Science Logic. 17th International Workshop CSL 2003. 12th Annual Conference of the EACSL. 8th Kurt Gödel Colloquium, KGC 2003, Vienna, Austria, August 25-30, 2003. Proceedings, Springer Lecture Notes in Computer Science 2803. ISBN 3-540-40801-0. pp. 484–497


Chapter 10

Proposition

This article is about the term in logic and philosophy. For other uses, see Proposition (disambiguation). Not to be confused with preposition.

The term proposition has a broad use in contemporary philosophy. It is used to refer to some or all of the following: the primary bearers of truth-value, the objects of belief and other “propositional attitudes” (i.e., what is believed, doubted, etc.), the referents of that-clauses and the meanings of declarative sentences. Propositions are the sharable objects of attitudes and the primary bearers of truth and falsity. This stipulation rules out certain candidates for propositions, including thought- and utterance-tokens, which are not sharable, and concrete events or facts, which cannot be false.[1]

10.1 Historical usage

10.1.1 By Aristotle

Aristotelian logic identifies a proposition as a sentence which affirms or denies a predicate of a subject. An Aristotelian proposition may take the form “All men are mortal” or “Socrates is a man.” In the first example the subject is “All men” and the predicate “are mortal.” In the second example the subject is “Socrates” and the predicate is “is a man.”

10.1.2 By the logical positivists

Often propositions are related to closed sentences to distinguish them from what is expressed by an open sentence. In this sense, propositions are “statements” that are truth-bearers. This conception of a proposition was supported by the philosophical school of logical positivism.

Some philosophers argue that some (or all) kinds of speech or actions besides the declarative ones also have propositional content. For example, yes–no questions present propositions, being inquiries into the truth value of them. On the other hand, some signs can be declarative assertions of propositions without forming a sentence or even being linguistic; e.g. traffic signs convey definite meaning which is either true or false.

Propositions are also spoken of as the content of beliefs and similar intentional attitudes such as desires, preferences, and hopes. For example, “I desire that I have a new car,” or “I wonder whether it will snow” (or, whether it is the case that “it will snow”). Desire, belief, and so on, are thus called propositional attitudes when they take this sort of content.

10.1.3 By Russell

Bertrand Russell held that propositions were structured entities with objects and properties as constituents. Wittgenstein held that a proposition is the set of possible worlds/states of affairs in which it is true.


One important difference between these views is that on the Russellian account, two propositions that are true in all the same states of affairs can still be differentiated. For instance, the proposition that two plus two equals four is distinct on a Russellian account from the proposition that three plus three equals six. If propositions are sets of possible worlds, however, then all mathematical truths (and all other necessary truths) are the same set (the set of all possible worlds).

10.2 Relation to the mind

In relation to the mind, propositions are discussed primarily as they fit into propositional attitudes. Propositional attitudes are simply attitudes characteristic of folk psychology (belief, desire, etc.) that one can take toward a proposition (e.g. 'it is raining,' 'snow is white,' etc.). In English, propositions usually follow folk psychological attitudes by a “that clause” (e.g. “Jane believes that it is raining”). In philosophy of mind and psychology, mental states are often taken to primarily consist in propositional attitudes. The propositions are usually said to be the “mental content” of the attitude. For example, if Jane has a mental state of believing that it is raining, her mental content is the proposition 'it is raining.' Furthermore, since such mental states are about something (namely propositions), they are said to be intentional mental states. Philosophical debates surrounding propositions as they relate to propositional attitudes have also recently centered on whether they are internal or external to the agent, or whether they are mind-dependent or mind-independent entities (see the entry on internalism and externalism in philosophy of mind).

10.3 Treatment in logic

As noted above, in Aristotelian logic a proposition is a particular kind of sentence, one which affirms or denies a predicate of a subject. Aristotelian propositions take forms like “All men are mortal” and “Socrates is a man.”

Propositions show up in formal logic as objects of a formal language. A formal language begins with different types of symbols. These types can include variables, operators, function symbols, predicate (or relation) symbols, quantifiers, and propositional constants. (Grouping symbols are often added for convenience in using the language but do not play a logical role.) Symbols are concatenated together according to recursive rules in order to construct strings to which truth-values will be assigned. The rules specify how the operators, function and predicate symbols, and quantifiers are to be concatenated with other strings. A proposition is then a string with a specific form. The form that a proposition takes depends on the type of logic.

The type of logic called propositional, sentential, or statement logic includes only operators and propositional constants as symbols in its language. The propositions in this language are propositional constants, which are considered atomic propositions, and composite propositions, which are composed by recursively applying operators to propositions. Application here is simply a short way of saying that the corresponding concatenation rule has been applied.

The types of logics called predicate, quantificational, or n-order logic include variables, operators, predicate and function symbols, and quantifiers as symbols in their languages. The propositions in these logics are more complex. First, terms must be defined. A term is (i) a variable or (ii) a function symbol applied to the number of terms required by the function symbol’s arity. For example, if + is a binary function symbol and x, y, and z are variables, then x+(y+z) is a term, which might be written with the symbols in various orders. A proposition is (i) a predicate symbol applied to the number of terms required by its arity, (ii) an operator applied to the number of propositions required by its arity, or (iii) a quantifier applied to a proposition. For example, if = is a binary predicate symbol and ∀ is a quantifier, then ∀x,y,z [(x = y) → (x+z = y+z)] is a proposition. This more complex structure of propositions allows these logics to make finer distinctions between inferences, i.e., to have greater expressive power.

In this context, propositions are also called sentences, statements, statement forms, formulas, and well-formed formulas, though these terms are usually not synonymous within a single text. This definition treats propositions as syntactic objects, as opposed to semantic or mental objects. That is, propositions in this sense are meaningless, formal, abstract objects. They are assigned meaning and truth-values by mappings called interpretations and valuations, respectively.
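
As a rough sketch of this syntactic view (my own illustration, not part of the article; the class names are invented), terms and atomic propositions can be represented as bare syntactic objects whose construction checks only arity, with meaning and truth-values deferred to an interpretation:

    # Formulas as purely syntactic objects: construction checks only arity, not meaning.
    class Var:
        def __init__(self, name):
            self.name = name

    class Func:                      # e.g. "+" with arity 2
        def __init__(self, symbol, arity, *args):
            assert len(args) == arity, "wrong number of terms for this symbol"
            self.symbol, self.args = symbol, args

    class Pred:                      # an atomic proposition, e.g. "=" with arity 2
        def __init__(self, symbol, arity, *args):
            assert len(args) == arity, "wrong number of terms for this symbol"
            self.symbol, self.args = symbol, args

    x, y = Var("x"), Var("y")
    t = Func("+", 2, x, Func("+", 2, y, Var("z")))   # the term x+(y+z)
    p = Pred("=", 2, t, Var("w"))                     # the atomic proposition x+(y+z) = w
    # Neither t nor p has a truth value yet; that requires an interpretation and a valuation.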

10.4 Objections to propositions

Attempts to provide a workable definition of proposition include

Two meaningful declarative sentences express the same proposition if and only if they mean the same thing.


thus defining proposition in terms of synonymity. For example, “Snow is white” (in English) and “Schnee ist weiß” (in German) are different sentences, but they say the same thing, so they express the same proposition.

Two meaningful declarative sentence-tokens express the same proposition if and only if they mean the same thing.

Unfortunately, the above definition has the result that two sentences/sentence-tokens which have the same meaning, and thus express the same proposition, could have different truth-values, e.g. “I am Spartacus” said by Spartacus and said by John Smith, and “It is Wednesday” said on a Wednesday and on a Thursday.

A number of philosophers and linguists claim that all definitions of a proposition are too vague to be useful. For them, it is just a misleading concept that should be removed from philosophy and semantics. W.V. Quine maintained that the indeterminacy of translation prevented any meaningful discussion of propositions, and that they should be discarded in favor of sentences.[2] Strawson advocated the use of the term “statement”.

10.5 See also

• Main contention


10.6 References

[1] “Propositions (Stanford Encyclopedia of Philosophy)”. Plato.stanford.edu. Retrieved 2014-06-23.

[2] Quine, W.V. Philosophy of Logic. Prentice-Hall, NJ, USA: 1970, pp. 1–14.

10.7 External links

• Stanford Encyclopedia of Philosophy articles on:

• Propositions, by Matthew McGrath
• Singular Propositions, by Greg Fitch
• Structured Propositions, by Jeffrey C. King


Chapter 11

Propositional formula

In propositional logic, a propositional formula is a type of syntactic formula which is well formed and has a truth value. If the values of all variables in a propositional formula are given, it determines a unique truth value. A propositional formula may also be called a propositional expression, a sentence, or a sentential formula.

A propositional formula is constructed from simple propositions, such as “five is greater than three”, or propositional variables such as P and Q, using connectives such as NOT, AND, OR, and IMPLIES; for example:

(P AND NOT Q) IMPLIES (P OR Q).

In mathematics, a propositional formula is often more briefly referred to as a “proposition”, but, more precisely, a propositional formula is not a proposition but a formal expression that denotes a proposition, a formal object under discussion, just like an expression such as “x + y” is not a value, but denotes a value. In some contexts, maintaining the distinction may be of importance.
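
As a quick illustration (a sketch of my own, not part of the article), the example formula above determines a unique truth value once P and Q are each assigned a value:

    # Evaluate (P AND NOT Q) IMPLIES (P OR Q) for every assignment of truth values.
    def formula(P, Q):
        antecedent = P and (not Q)
        consequent = P or Q
        return (not antecedent) or consequent    # material implication

    for P in (True, False):
        for Q in (True, False):
            print(P, Q, formula(P, Q))           # this particular formula is true on every row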

11.1 Propositions

For the purposes of the propositional calculus, propositions (utterances, sentences, assertions) are considered to be either simple or compound.[1] Compound propositions are considered to be linked by sentential connectives, some of the most common of which are “AND”, “OR”, “IF ... THEN ...”, “NEITHER ... NOR ...”, “... IS EQUIVALENT TO ...”. The linking semicolon “;” and the connective “BUT” are considered to be expressions of “AND”. A sequence of discrete sentences is considered to be linked by “AND”s, and formal analysis applies a recursive “parenthesis rule” with respect to sequences of simple propositions (see more below about well-formed formulas).

For example: The assertion “This cow is blue. That horse is orange but this horse here is purple.” is actually a compound proposition linked by “AND”s: ( (“This cow is blue” AND “that horse is orange”) AND “this horse here is purple” ).

Simple propositions are declarative in nature, that is, they make assertions about the condition or nature of a particular object of sensation, e.g. “This cow is blue”, “There’s a coyote!” (“That coyote IS there, behind the rocks.”).[2] Thus the simple “primitive” assertions must be about specific objects or specific states of mind. Each must have at least a subject (an immediate object of thought or observation), a verb (in the active voice and present tense preferred), and perhaps an adjective or adverb. “Dog!” probably implies “I see a dog” but should be rejected as too ambiguous.

Example: “That purple dog is running”, “This cow is blue”, “Switch M31 is closed”, “This cap is off”, “Tomorrow is Friday”.

For the purposes of the propositional calculus a compound proposition can usually be reworded into a series of simple sentences, although the result will probably sound stilted.


11.1.1 Relationship between propositional and predicate formulas

The predicate calculus goes a step further than the propositional calculus to an “analysis of the inner structure of propositions”.[3] It breaks a simple sentence down into two parts: (i) its subject (the object (singular or plural) of discourse) and (ii) a predicate (a verb or possibly verb-clause that asserts a quality or attribute of the object(s)). The predicate calculus then generalizes the “subject|predicate” form (where | symbolizes concatenation (stringing together) of symbols) into a form with the following blank-subject structure “ ___|predicate”, and the predicate in turn is generalized to all things with that property.

Example: “This blue pig has wings” becomes two sentences in the propositional calculus: “This pig has wings” AND “This pig is blue”, whose internal structure is not considered. In contrast, in the predicate calculus, the first sentence breaks into “this pig” as the subject, and “has wings” as the predicate. Thus it asserts that object “this pig” is a member of the class (set, collection) of “winged things”. The second sentence asserts that object “this pig” has an attribute “blue” and thus is a member of the class of “blue things”. One might choose to write the two sentences connected with AND as:

p|W AND p|B

The generalization of “this pig” to a (potential) member of two classes “winged things” and “blue things” means that it has a truth-relationship with both of these classes. In other words, given a domain of discourse “winged things”, either we find p to be a member of this domain or not. Thus we have a relationship W (wingedness) between p (pig) and { T, F }: W(p) evaluates to { T, F }. Likewise for B (blueness) and p (pig) and { T, F }: B(p) evaluates to { T, F }. So we can now analyze the connected assertion “B(p) AND W(p)” for its overall truth-value, i.e.:

( B(p) AND W(p) ) evaluates to { T, F }

In particular, simple sentences that employ notions of “all”, “some”, “a few”, “one of”, etc. are treated by the predicate calculus. Along with the new function symbolism “F(x)”, two new symbols are introduced: ∀ (For all) and ∃ (There exists ..., At least one of ... exists, etc.). The predicate calculus, but not the propositional calculus, can establish the formal validity of the following statement:

“All blue pigs have wings but some pigs have no wings, hence some pigs are not blue”.
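
A small sketch (illustrative only; the three-pig domain and the extensions of W and B are invented, though the predicate letters follow the pig example above) of treating predicates as functions from a domain of discourse to truth values:

    # Predicates as functions from a domain of discourse to { T, F }.
    domain = {"pig1", "pig2", "pig3"}        # hypothetical objects
    winged = {"pig1", "pig2"}                # extension of W ("has wings")
    blue   = {"pig1"}                        # extension of B ("is blue")

    def W(x): return x in winged
    def B(x): return x in blue

    all_blue_have_wings = all(W(x) for x in domain if B(x))   # "All blue pigs have wings"
    some_have_no_wings  = any(not W(x) for x in domain)       # "some pigs have no wings"
    some_are_not_blue   = any(not B(x) for x in domain)       # "some pigs are not blue"
    print(all_blue_have_wings, some_have_no_wings, some_are_not_blue)
    # Whenever the first two come out true, the third must as well; that is the
    # inference the predicate calculus can validate formally.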

11.1.2 Identity

Tarski asserts that the notion of IDENTITY (as distinguished from LOGICAL EQUIVALENCE) lies outside the propositional calculus; however, he notes that if a logic is to be of use for mathematics and the sciences it must contain a “theory” of IDENTITY.[4] Some authors refer to “predicate logic with identity” to emphasize this extension. See more about this below.

11.2 An algebra of propositions, the propositional calculus

An algebra (and there are many different ones), loosely defined, is a method by which a collection of symbols called variables, together with some other symbols such as parentheses (, ) and some subset of symbols such as *, +, ~, &, V, =, ≡, ⋀, ¬, are manipulated within a system of rules. These symbols, and well-formed strings of them, are said to represent objects, but in a specific algebraic system these objects do not have meanings. Thus work inside the algebra becomes an exercise in obeying certain laws (rules) of the algebra’s syntax (symbol-formation) rather than in semantics (meaning) of the symbols. The meanings are to be found outside the algebra.

For a well-formed sequence of symbols in the algebra (a formula) to have some usefulness outside the algebra, the symbols are assigned meanings and eventually the variables are assigned values; then by a series of rules the formula is evaluated.

When the values are restricted to just two and applied to the notion of simple sentences (e.g. spoken utterances or written assertions) linked by propositional connectives, this whole algebraic system of symbols and rules and evaluation-methods is usually called the propositional calculus or the sentential calculus.


While some of the familiar rules of arithmetic algebra continue to hold in the algebra of propositions (e.g. the commutative and associative laws for AND and OR), some do not (e.g. the distributive laws for AND, OR and NOT).

11.2.1 Usefulness of propositional formulas

Analysis: In deductive reasoning, philosophers, rhetoricians and mathematicians reduce arguments to formulas and then study them (usually with truth tables) for correctness (soundness). For example: Is the following argument sound?

“Given that consciousness is sufficient for an artificial intelligence and only conscious entities can pass the Turing test, before we can conclude that a robot is an artificial intelligence the robot must pass the Turing test.”

Engineers analyze the logic circuits they have designed using synthesis techniques and then apply various reduction and minimization techniques to simplify their designs.

Synthesis: Engineers in particular synthesize propositional formulas (that eventually end up as circuits of symbols) from truth tables. For example, one might write down a truth table for how binary addition should behave given the addition of variables “b” and “a” and “carry_in” “ci”, and the results “carry_out” “co” and “sum” Σ:

Example: in row 5, ( (b+a) + ci ) = ( (1+0) + 1 ) = the number “2”. Written as a binary number this is 10₂, where “co” = 1 and Σ = 0 as shown in the right-most columns.
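
A sketch of the formulas such a synthesis produces for this full adder (standard textbook results; the variable names simply follow the example above):

    # Full-adder formulas over a, b and carry_in ci, with 1 = T and 0 = F.
    def full_adder(a, b, ci):
        s  = a ^ b ^ ci                    # sum bit (XOR)
        co = (a & b) | (ci & (a ^ b))      # carry-out bit
        return co, s

    print(full_adder(0, 1, 1))   # row 5 of the table: (1, 0), i.e. co=1, sum=0, binary 10 = 2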

11.2.2 Propositional variables

The simplest type of propositional formula is a propositional variable. Propositions that are simple (atomic), symbolic expressions are often denoted by variables named a, b, or A, B, etc. A propositional variable is intended to represent an atomic proposition (assertion), such as “It is Saturday” = a (here the symbol = means “ ... is assigned the variable named ...”) or “I only go to the movies on Monday” = b.

11.2.3 Truth-value assignments, formula evaluations

Evaluation of a propositional formula begins with assignment of a truth value to each variable. Because each variable represents a simple sentence, the truth values are being applied to the “truth” or “falsity” of these simple sentences.

Truth values in rhetoric, philosophy and mathematics: The truth values are only two: { TRUTH “T”, FALSITY “F” }. An empiricist puts all propositions into two broad classes: analytic—true no matter what (e.g. tautology)—and synthetic—derived from experience and thereby susceptible to confirmation by third parties (the verification theory of meaning).[5] Empiricists hold that, in general, to arrive at the truth-value of a synthetic proposition, meanings (pattern-matching templates) must first be applied to the words, and then these meaning-templates must be matched against whatever it is that is being asserted. For example, my utterance “That cow is blue!” Is this statement a TRUTH? Truly I said it. And maybe I am seeing a blue cow—unless I am lying, my statement is a TRUTH relative to the object of my (perhaps flawed) perception. But is the blue cow “really there”? What do you see when you look out the same window? In order to proceed with a verification, you will need a prior notion (a template) of both “cow” and “blue”, and an ability to match the templates against the object of sensation (if indeed there is one).

Truth values in engineering: Engineers try to avoid notions of truth and falsity that bedevil philosophers, but in the final analysis engineers must trust their measuring instruments. In their quest for robustness, engineers prefer to pull known objects from a small library—objects that have well-defined, predictable behaviors even in large combinations (hence their name for the propositional calculus: “combinatorial logic”). The fewest behaviors of a single object are two (e.g. { OFF, ON }, { open, shut }, { UP, DOWN }, etc.), and these are put in correspondence with { 0, 1 }. Such elements are called digital; those with a continuous range of behaviors are called analog. Whenever decisions must be made in an analog system, quite often an engineer will convert an analog behavior (the door is 45.32146% UP) to digital (e.g. DOWN = 0) by use of a comparator.[6]


Thus an assignment of meaning of the variables and the two value-symbols { 0, 1 } comes from “outside” the formula that represents the behavior of the (usually) compound object. An example is a garage door with two “limit switches”, one for UP labelled SW_U and one for DOWN labelled SW_D, and whatever else is in the door’s circuitry. Inspection of the circuit (either the diagram or the actual objects themselves—door, switches, wires, circuit board, etc.) might reveal that, on the circuit board, “node 22” goes to +0 volts when the contacts of switch “SW_D” are mechanically in contact (“closed”) and the door is in the “down” position (95% down), and “node 29” goes to +0 volts when the door is 95% UP and the contacts of switch SW_U are in mechanical contact (“closed”).[7] The engineer must define the meanings of these voltages and all possible combinations (all 4 of them), including the “bad” ones (e.g. both nodes 22 and 29 at 0 volts, meaning that the door is open and closed at the same time). The circuit mindlessly responds to whatever voltages it experiences without any awareness of TRUTH or FALSEHOOD, RIGHT or WRONG, SAFE or DANGEROUS.
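
A brief sketch of the four switch combinations the engineer must assign meanings to (the wording of the state labels is mine, not the article's):

    # The four combinations of the two limit-switch signals, with one hypothetical
    # assignment of meanings. sw_d = 1: DOWN switch closed; sw_u = 1: UP switch closed.
    states = {
        (0, 0): "door somewhere in between (neither switch closed)",
        (1, 0): "door fully DOWN",
        (0, 1): "door fully UP",
        (1, 1): "fault: door reported both DOWN and UP at once",
    }
    for (sw_d, sw_u), meaning in states.items():
        print(sw_d, sw_u, "->", meaning)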

11.3 Propositional connectives

Arbitrary propositional formulas are built from propositional variables and other propositional formulas using propositional connectives. Examples of connectives include:

• The unary negation connective. If α is a formula, then ¬α is a formula.

• The classical binary connectives ∧, ∨, →, ↔. Thus, for example, if α and β are formulas, so is (α → β).

• Other binary connectives, such as NAND, NOR, and XOR

• The ternary connective IF ... THEN ... ELSE ...

• Constant 0-ary connectives ⊤ and ⊥ (alternately, constants { T, F }, { 1, 0 } etc. )

• The “theory-extension” connective EQUALS (alternately, IDENTITY, or the sign “ = ” as distinguished from the “logical connective” ↔)

11.3.1 Connectives of rhetoric, philosophy and mathematics

The following are the connectives common to rhetoric, philosophy and mathematics, together with their truth tables. The symbols used will vary from author to author and between fields of endeavor. In general the abbreviations “T” and “F” stand for the evaluations TRUTH and FALSITY applied to the variables in the propositional formula (e.g. the assertion “That cow is blue” will have the truth-value “T” for Truth or “F” for Falsity, as the case may be). The connectives go by a number of different word-usages, e.g. “a IMPLIES b” is also said “IF a THEN b”. Some of these are shown in the table.

11.3.2 Engineering connectives

In general, the engineering connectives are just the same as the mathematics connectives, except that they tend to evaluate with “1” = “T” and “0” = “F”. This is done for the purposes of analysis/minimization and synthesis of formulas by use of the notion of minterms and Karnaugh maps (see below). Engineers also use the words logical product from Boole's notion (a*a = a) and logical sum from Jevons' notion (a+a = a).[8]
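
A tiny check of the idempotent laws behind “logical product” and “logical sum” (my own illustration), with 1 = T and 0 = F:

    # Idempotence: logical product (Boole) a*a = a and logical sum (Jevons) a+a = a.
    for a in (0, 1):
        assert (a & a) == a
        assert (a | a) == a
    print("a*a = a and a+a = a hold for a in {0, 1}")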

11.3.3 CASE connective: IF ... THEN ... ELSE ...

The IF ... THEN ... ELSE ... connective appears as the simplest form of CASE operator of recursion theory and computation theory and is the connective responsible for conditional goto’s (jumps, branches). From this one connective all other connectives can be constructed (see more below). Although “ IF c THEN b ELSE a ” sounds like an implication, it is, in its most reduced form, a switch that makes a decision and offers as outcome only one of two alternatives, “a” or “b” (hence the name switch statement in the C programming language).[9]

The following three propositions are equivalent (as indicated by the logical equivalence sign ≡ ):


[Figure caption: Engineering symbols have varied over the years, but these are commonplace. Sometimes they appear simply as boxes with symbols in them. “a” and “b” are called “the inputs” and “c” is called “the output”. An output will typically “connect to” an input (unless it is the final connective); this accomplishes the mathematical notion of substitution.]

• (1) ( IF 'counter is zero' THEN 'go to instruction b' ELSE 'go to instruction a' ) ≡

• (2) ( (c → b) & (~c → a) ) ≡ ( ( IF 'counter is zero' THEN 'go to instruction b' ) AND ( IF 'it is NOT the case that counter is zero' THEN 'go to instruction a' ) ) ≡

• (3) ( (c & b) V (~c & a) ) = ( ( 'counter is zero' AND 'go to instruction b' ) OR ( 'it is NOT the case that counter is zero' AND 'go to instruction a' ) )

Thus IF ... THEN ... ELSE—unlike implication—does not evaluate to an ambiguous “TRUTH” when the first proposition is false, i.e. when c = F in (c → b). For example, most people would reject the following compound proposition as a nonsensical non sequitur because the second sentence is not connected in meaning to the first.[10]

Example: The proposition " IF 'Winston Churchill was Chinese' THEN 'The sun rises in the east' "evaluates as a TRUTH given that 'Winston Church was Chinese' is a FALSEHOOD and 'The sun risesin the east' evaluates as a TRUTH.

In recognition of this problem, the sign → of formal implication in the propositional calculus is called material implication to distinguish it from the everyday, intuitive implication.[11]

The use of the IF ... THEN ... ELSE construction avoids controversy because it offers a completely deterministic choice between two stated alternatives; it offers two “objects” (the two alternatives b and a), and it selects between them exhaustively and unambiguously.[12] In the truth table below, d1 is the formula: ( (IF c THEN b) AND (IF NOT-c THEN a) ). Its fully reduced form d2 is the formula: ( (c AND b) OR (NOT-c AND a) ). The two formulas are equivalent as shown by the columns "=d1” and "=d2”. Electrical engineers call the fully reduced formula the AND-OR-SELECT operator. The CASE (or SWITCH) operator is an extension of the same idea to n possible, but mutually exclusive outcomes. Electrical engineers call the CASE operator a multiplexer.

11.3.4 IDENTITY and evaluation

The first table of this section stars *** the entry logical equivalence to note the fact that "Logical equivalence" is not the same thing as “identity”. For example, most would agree that the assertion “That cow is blue” is identical to the assertion “That cow is blue”. On the other hand logical equivalence sometimes appears in speech as in this example: " 'The sun is shining' means 'I'm biking' ". Translated into a propositional formula the words become: “IF 'the sun is shining' THEN 'I'm biking', AND IF 'I'm biking' THEN 'the sun is shining'”:[13]

“IF 's’ THEN 'b' AND IF 'b' THEN 's’ " is written as ((s → b) & (b → s)) or in an abbreviated form as (s ↔ b). As the rightmost symbol string is a definition for a new symbol in terms of the symbols on the left, the use of the IDENTITY sign = is appropriate:

((s → b) & (b → s)) = (s ↔ b)

Different authors use different signs for logical equivalence: ↔ (e.g. Suppes, Goodstein, Hamilton), ≡ (e.g. Robbin), ⇔ (e.g. Bender and Williamson). Typically identity is written as the equals sign =. One exception to this rule is found in Principia Mathematica. For more about the philosophy of the notion of IDENTITY see Leibniz’s law.

As noted above, Tarski considers IDENTITY to lie outside the propositional calculus, but he asserts that without the notion, “logic” is insufficient for mathematics and the deductive sciences. In fact the sign comes into the propositional calculus when a formula is to be evaluated.[14]

In some systems there are no truth tables, but rather just formal axioms (e.g. strings of symbols from a set { ~, →, (, ), variables p1, p2, p3, ... }) and formula-formation rules (rules about how to make more symbol strings from previous strings by use of e.g. substitution and modus ponens). The result of such a calculus will be another formula (i.e. a well-formed symbol string). Eventually, however, if one wants to use the calculus to study notions of validity and truth, one must add axioms that define the behavior of the symbols called “the truth values” {T, F} (or {1, 0}, etc.) relative to the other symbols.

For example, Hamilton uses two symbols = and ≠ when he defines the notion of a valuation v of any wffs A and B in his “formal statement calculus” L. A valuation v is a function from the wffs of his system L to the range (output) { T, F }, given that each variable p1, p2, p3 in a wff is assigned an arbitrary truth value { T, F }.

• (i) v(A) ≠ v(~A)

• (ii) v(A→ B) = F if and only if v(A) = T and v(B) = F

The two definitions (i) and (ii) define the equivalent of the truth tables for the ~ (NOT) and → (IMPLICATION) connectives of his system. The first one derives F ≠ T and T ≠ F, in other words " v(A) does not mean v(~A)". Definition (ii) specifies the third row in the truth table, and the other three rows then come from an application of definition (i). In particular (ii) assigns the value F (or a meaning of “F”) to the entire expression. The definitions also serve as formation rules that allow substitution of a value previously derived into a formula:

Some formal systems specify these valuation axioms at the outset in the form of certain formulas such as the law of contradiction or laws of identity and nullity. The choice of which ones to use, together with laws such as commutation and distribution, is up to the system’s designer as long as the set of axioms is complete (i.e. sufficient to form and to evaluate any well-formed formula created in the system).
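These two clauses can be turned directly into a small evaluator. The following Python sketch is an illustration only, not Hamilton's notation; the tuple encoding of wffs and the function name v are assumptions made here:

# Hypothetical encoding of wffs over { ~, -> }: a variable is a string,
# ('~', A) is a negation and ('->', A, B) an implication.
def v(wff, assignment):
    """Clause (i): v(~A) differs from v(A).
    Clause (ii): v(A -> B) is F exactly when v(A) = T and v(B) = F."""
    if isinstance(wff, str):              # propositional variable p1, p2, ...
        return assignment[wff]
    if wff[0] == '~':
        return not v(wff[1], assignment)  # clause (i)
    if wff[0] == '->':
        a, b = v(wff[1], assignment), v(wff[2], assignment)
        return not (a and not b)          # clause (ii)
    raise ValueError("unknown connective")

# v(p1 -> ~p2) is F when p1 = T and p2 = T, exactly as definition (ii) requires.
print(v(('->', 'p1', ('~', 'p2')), {'p1': True, 'p2': True}))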

11.4 More complex formulas

As shown above, the CASE (IF c THEN b ELSE a) connective is constructed either from the 2-argument connectives IF...THEN... and AND, or from OR and AND and the 1-argument NOT. Connectives such as the n-argument AND (a & b & c & ... & n) and OR (a V b V c V ... V n) are constructed from strings of two-argument AND and OR and written in abbreviated form without the parentheses. These, and other connectives as well, can then be used as building blocks for yet further connectives. Rhetoricians, philosophers, and mathematicians use truth tables and the various theorems to analyze and simplify their formulas.

Electrical engineers use drawn symbols and connect them with lines that stand for the mathematical acts of substitution and replacement. They then verify their drawings with truth tables and simplify the expressions as shown below by use of Karnaugh maps or the theorems. In this way engineers have created a host of “combinatorial logic” (i.e. connectives without feedback) such as “decoders”, “encoders”, “multifunction gates”, “majority logic”, “binary adders”, “arithmetic logic units”, etc.

11.4.1 Definitions

A definition creates a new symbol and its behavior, often for the purposes of abbreviation. Once the definition is presented, either form of the equivalent symbol or formula can be used. The symbolism =D follows the convention of Reichenbach.[15] Some examples of convenient definitions drawn from the symbol set { ~, &, (, ) } and variables follow. Each definition produces a logically equivalent formula that can be used for substitution or replacement.

• definition of a new variable: (c & d) =D s
• OR: ~(~a & ~b) =D (a V b)
• IMPLICATION: (~a V b) =D (a → b)
• XOR: (~a & b) V (a & ~b) =D (a ⊕ b)
• LOGICAL EQUIVALENCE: ( (a → b) & (b → a) ) =D ( a ≡ b )

11.4.2 Axiom and definition schemas

The definitions above for OR, IMPLICATION, XOR, and logical equivalence are actually schemas (or “schemata”), that is, they are models (demonstrations, examples) for a general formula format but shown (for illustrative purposes) with specific letters a, b, c for the variables, whereas any variable letters can go in their places as long as the letter substitutions follow the rule of substitution below.

Example: In the definition (~a V b) =D (a → b), other variable-symbols such as “SW2” and “CON1” might be used, i.e. formally:

a =D SW2, b =D CON1, so we would have as an instance of the definition schema (~SW2 V CON1) =D (SW2 → CON1)

11.4.3 Substitution versus replacement

Substitution: The variable or sub-formula to be substituted with another variable, constant, or sub-formula must be replaced in all instances throughout the overall formula.

Example: (c & d) V (p & ~(c & ~d)), but (q1 & ~q2) ≡ d. Now wherever variable “d” occurs, substitute (q1 & ~q2):

(c & (q1 & ~q2)) V (p & ~(c & ~(q1 & ~q2)))
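In other words, the substitution touches every occurrence of the variable at once. A crude Python sketch (illustrative only; it treats the formula as a plain string and assumes the variable name does not occur inside any other identifier) reproduces the example above:

formula = "(c & d) V (p & ~(c & ~d))"
substituted = formula.replace("d", "(q1 & ~q2)")   # every occurrence of "d" is replaced
print(substituted)
# (c & (q1 & ~q2)) V (p & ~(c & ~(q1 & ~q2)))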

Replacement: (i) the formula to be replaced must be within a tautology, i.e. logically equivalent (connected by ≡ or ↔) to the formula that replaces it, and (ii) unlike substitution it is permissible for the replacement to occur only in one place (i.e. for one formula).

Example: Use this set of formula schemas/equivalences: 1: ( (a V 0) ≡ a ). 2: ( (a & ~a) ≡ 0 ). 3: ( (~a V b) =D (a → b) ). 6: ( ~(~a) ≡ a )

• start with “a": a• Use 1 to replace “a” with (a V 0): (a V 0)• Use the notion of “schema” to substitute b for a in 2: ( (a & ~a) ≡ 0 )• Use 2 to replace 0 with (b & ~b): ( a V (b & ~b) )• (see below for how to distribute “a V” over (b & ~b), etc

11.5 Inductive definition

The classical presentation of propositional logic (see Enderton 2002) uses the connectives ¬,∧,∨,→,↔ . The set of formulas over a given set of propositional variables is inductively defined to be the smallest set of expressions such that:

• Each propositional variable in the set is a formula,

• (¬α) is a formula whenever α is, and

• (α□β) is a formula whenever α and β are formulas and □ is one of the binary connectives ∧,∨,→,↔ .

This inductive definition can be easily extended to cover additional connectives.

The inductive definition can also be rephrased in terms of a closure operation (Enderton 2002). Let V denote a set of propositional variables and let XV denote the set of all strings from an alphabet including symbols in V, left and right parentheses, and all the logical connectives under consideration. Each logical connective corresponds to a formula building operation, a function that takes strings in XV and returns a string in XV:

• Given a string z, the operation E¬(z) returns (¬z) .

• Given strings y and z, the operation E∧(y, z) returns (y ∧ z) . There are similar operations E∨ , E→ , and E↔ corresponding to the other binary connectives.

The set of formulas over V is defined to be the smallest subset of XV containing V and closed under all the formula building operations.
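As an illustration (not from the source), the formula building operations can be written as ordinary string functions, and a finite approximation of the closure can be generated by iterating them; the variable names p and q and the helper names are arbitrary choices made here:

def E_not(z):        return "(¬" + z + ")"
def E_bin(op, y, z): return "(" + y + " " + op + " " + z + ")"

V = ["p", "q"]              # propositional variables
formulas = set(V)           # stage 0: the variables themselves
for _ in range(2):          # two rounds of closure (the full set is infinite)
    new = set()
    for y in formulas:
        new.add(E_not(y))
        for z in formulas:
            for op in ("∧", "∨", "→", "↔"):
                new.add(E_bin(op, y, z))
    formulas |= new
print(len(formulas), "formulas built after two applications of the operations")
print(E_bin("→", E_not("p"), "q"))   # ((¬p) → q)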

11.6 Parsing formulas

The following “laws” of the propositional calculus are used to “reduce” complex formulas. The “laws” can be easily verified with truth tables. For each law, the principal (outermost) connective is associated with logical equivalence ≡ or identity =. A complete analysis of all 2^n combinations of truth-values for its n distinct variables will result in a column of 1’s (T’s) underneath this connective. This finding makes each law, by definition, a tautology. And, for a given law, because its formula on the left and right are equivalent (or identical) they can be substituted for one another.

Example: The following truth table is De Morgan’s law for the behavior of NOT over OR: ~(a V b) ≡ (~a & ~b). To the left of the principal connective ≡ (yellow column labelled “taut”) the formula ~(b V a) evaluates to (1, 0, 0, 0) under the label “P”. On the right of “taut” the formula (~(b) & ~(a)) also evaluates to (1, 0, 0, 0) under the label “Q”. As the two columns have equivalent evaluations, the logical equivalence ≡ under “taut” evaluates to (1, 1, 1, 1), i.e. P ≡ Q. Thus either formula can be substituted for the other if it appears in a larger formula.

Enterprising readers might challenge themselves to invent an “axiomatic system” that uses the symbols { V, &, ~, (, ), variables a, b, c }, the formation rules specified above, and as few as possible of the laws listed below, and then derive as theorems the others as well as the truth-table valuations for V, &, and ~. One set attributed to Huntington (1904) (Suppes:204) uses 8 of the laws defined below.

Note that if used in an axiomatic system, the symbols 1 and 0 (or T and F) are considered to be wffs and thus obey all the same rules as the variables. Thus the laws listed below are actually axiom schemas, that is, they stand in place of an infinite number of instances. Thus ( x V y ) ≡ ( y V x ) might be used in one instance as ( p V 0 ) ≡ ( 0 V p ) and in another instance as ( 1 V q ) ≡ ( q V 1 ), etc.

11.6.1 Connective seniority (symbol rank)

In general, to avoid confusion during analysis and evaluation of propositional formulas, make liberal use of parentheses. However, quite often authors leave them out. To parse a complicated formula one first needs to know the seniority, or rank, that each of the connectives (excepting *) has over the other connectives. To “well-form” a formula, start with the connective with the highest rank and add parentheses around its components, then move down in rank (paying close attention to the connective’s scope over which it is working). From most- to least-senior, with the predicate signs ∀x and ∃x, and the IDENTITY = and arithmetic signs added for completeness:[16]

≡ (LOGICAL EQUIVALENCE), → (IMPLICATION), & (AND), V (OR), ~ (NOT), ∀x (FOR ALL x), ∃x (THERE EXISTS AN x), = (IDENTITY), + (arithmetic sum), * (arithmetic multiply), ' (s, arithmetic successor).

Thus the formula can be parsed—but note that, because NOT does not obey the distributive law, the parentheses around the inner formula (~c & ~d) are mandatory:

Example: " d & c V w " rewritten is ( (d & c) V w )Example: " a & a → b ≡ a & ~a V b " rewritten (rigorously) is

• ≡ has seniority: ( ( a & a → b ) ≡ ( a & ~a V b ) )
• → has seniority: ( ( a & (a → b) ) ≡ ( a & ~a V b ) )
• & has seniority on both sides: ( ( ( (a) & (a → b) ) ) ≡ ( ( (a) & (~a V b) ) ) )
• ~ has seniority: ( ( ( (a) & (a → b) ) ) ≡ ( ( (a) & (~(a) V b) ) ) )
• check 10 ( -parentheses and 10 ) -parentheses: ( ( ( (a) & (a → b) ) ) ≡ ( ( (a) & (~(a) V b) ) ) )

Example:

d & c V p & ~(c & ~d) ≡ c & d V p & c V p & ~d rewritten is ( ( (d & c) V ( p & ~(c & ~(d)) ) ) ≡ ( (c & d) V (p & c) V (p & ~(d)) ) )

11.6.2 Commutative and associative laws

Both AND and OR obey the commutative law and associative law:

• Commutative law for OR: ( a V b ) ≡ ( b V a )

• Commutative law for AND: ( a & b ) ≡ ( b & a )

• Associative law for OR: (( a V b ) V c ) ≡ ( a V (b V c) )

• Associative law for AND: (( a & b ) & c ) ≡ ( a & (b & c) )

Omitting parentheses in strings of AND and OR: The connectives are considered to be unary (one-variable, e.g. NOT) and binary (i.e. two-variable AND, OR, IMPLIES). For example:

( (c & d) V (p & c) V (p & ~d) ) above should be written ( ((c & d) V (p & c)) V (p & ~(d)) ) or possibly ( (c & d) V ( (p & c) V (p & ~(d)) ) )

However, a truth-table demonstration shows that the form without the extra parentheses is perfectly adequate.

Omitting parentheses with regard to a single-variable NOT: While ~(a), where a is a single variable, is perfectly clear, ~a is adequate and is the usual way this literal would appear. When the NOT is over a formula with more than one symbol, then the parentheses are mandatory, e.g. ~(a V b).

11.6.3 Distributive laws

OR distributes over AND and AND distributes over OR. NOT does not distribute over either AND or OR. See De Morgan’s laws below:

• Distributive law for OR: ( c V ( a & b) ) ≡ ( (c V a) & (c V b) )

• Distributive law for AND: ( c & ( a V b) ) ≡ ( (c & a) V (c & b) )

11.6.4 De Morgan’s laws

NOT, when distributed over OR or AND, does something peculiar (again, these can be verified with a truth-table):

• De Morgan’s law for OR: ~(a V b) ≡ (~a & ~b)

• De Morgan’s law for AND: ~(a & b) ≡ (~a V ~b)

11.6.5 Laws of absorption

Absorption, in particular the first one, causes the “laws” of logic to differ from the “laws” of arithmetic:

• Absorption (idempotency) for OR: (a V a) ≡ a

• Absorption (idempotency) for AND: (a & a) ≡ a

11.6.6 Laws of evaluation: Identity, nullity, and complement

The sign " = " (as distinguished from logical equivalence ≡, alternately ↔ or⇔) symbolizes the assignment of value ormeaning. Thus the string (a & ~(a)) symbolizes “1”, i.e. itmeans the same thing as symbol “1” ". In some “systems”this will be an axiom (definition) perhaps shown as ( (a & ~(a)) =D 1 ) ; in other systems, it may be derived in thetruth table below:

• Commutation of equality: (a = b) ≡ (b = a)

• Identity for OR: (a V 0) = a or (a V F) = a

• Identity for AND: (a & 1) = a or (a & T) = a

• Nullity for OR: (a V 1) = 1 or (a V T) = T

• Nullity for AND: (a & 0) = 0 or (a & F) = F

• Complement for OR: (a V ~a) = 1 or (a V ~a) = T, law of excluded middle

• Complement for AND: (a & ~a) = 0 or (a & ~a) = F, law of contradiction

11.6.7 Double negative (Involution)

• ~(~a) = a

11.7 Well-formed formulas (wffs)

A key property of formulas is that they can be uniquely parsed to determine the structure of the formula in terms of its propositional variables and logical connectives. When formulas are written in infix notation, as above, unique readability is ensured through an appropriate use of parentheses in the definition of formulas. Alternatively, formulas can be written in Polish notation or reverse Polish notation, eliminating the need for parentheses altogether.

The inductive definition of infix formulas in the previous section can be converted to a formal grammar in Backus-Naur form:

<formula> ::= <propositional variable>
            | ( ¬ <formula> )
            | ( <formula> ∧ <formula> )
            | ( <formula> ∨ <formula> )
            | ( <formula> → <formula> )
            | ( <formula> ↔ <formula> )

It can be shown that any expression matched by the grammar has a balanced number of left and right parentheses, and any nonempty initial segment of a formula has more left than right parentheses.[17] This fact can be used to give an algorithm for parsing formulas. For example, suppose that an expression x begins with (¬ . Starting after the second symbol, match the shortest subexpression y of x that has balanced parentheses. If x is a formula, there is exactly one symbol left after this expression, this symbol is a closing parenthesis, and y itself is a formula. This idea can be used to generate a recursive descent parser for formulas.
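The idea translates directly into a small recursive descent parser. The sketch below is an illustration written for the grammar above, with space-separated tokens assumed for simplicity; it returns a nested tuple for each well-formed formula:

CONNECTIVES = {"∧", "∨", "→", "↔"}

def parse(tokens, i=0):
    """Parse one <formula> starting at index i; return (tree, next_index)."""
    t = tokens[i]
    if t != "(":                              # a propositional variable
        return t, i + 1
    if tokens[i + 1] == "¬":                  # ( ¬ <formula> )
        sub, j = parse(tokens, i + 2)
        assert tokens[j] == ")"
        return ("¬", sub), j + 1
    left, j = parse(tokens, i + 1)            # ( <formula> op <formula> )
    op = tokens[j]
    assert op in CONNECTIVES
    right, k = parse(tokens, j + 1)
    assert tokens[k] == ")"
    return (op, left, right), k + 1

tree, end = parse("( ( ¬ p ) → ( q ∨ r ) )".split())
print(tree)   # ('→', ('¬', 'p'), ('∨', 'q', 'r'))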

Example of parenthesis counting:

This method locates as “1” the principal connective -- the connective under which the overall evaluation of the formula occurs for the outer-most parentheses (which are often omitted).[18] It also locates the inner-most connective where one would begin evaluation of the formula without the use of a truth table, e.g. at “level 6”.

11.7.1 Wffs versus valid formulas in inferences

The notion of valid argument is usually applied to inferences in arguments, but arguments reduce to propositional formulas and can be evaluated the same as any other propositional formula. Here a valid inference means: “The formula that represents the inference evaluates to ‘truth’ beneath its principal connective, no matter what truth-values are assigned to its variables”, i.e. the formula is a tautology.[19] Quite possibly a formula will be well-formed but not valid. Another way of saying this is: “Being well-formed is necessary for a formula to be valid but it is not sufficient.” The only way to find out if it is both well-formed and valid is to submit it to verification with a truth table or by use of the “laws":

Example 1: What does one make of the following difficult-to-follow assertion? Is it valid? “If it’s sunny, but if the frog is croaking then it’s not sunny, then it’s the same as saying that the frog isn't croaking.” Convert this to a propositional formula as follows:

" IF (a AND (IF b THEN NOT-a) THEN NOT-a” where " a " represents “its sunny” and " b" represents “the frog is croaking":( ( (a) & ( (b) → ~(a) ) ≡ ~(b) )

This is well-formed, but is it valid? In other words, when evaluated will this yield a tautology (all T) beneath the logical-equivalence symbol ≡ ? The answer is NO, it is not valid. However, if reconstructed as an implication then the argument is valid.

“Saying it’s sunny, but if the frog is croaking then it’s not sunny, implies that the frog isn't croaking.”

Other circumstances may be preventing the frog from croaking: perhaps a crane ate it.

Example 2 (from Reichenbach via Bertrand Russell):

“If pigs have wings, some winged animals are good to eat. Some winged animals are good to eat, so pigs have wings.”

( ((a) → (b)) & (b) → (a) ) is well formed, but an invalid argument as shown by the red evaluation under the principal implication:

11.8 Reduced sets of connectives

A set of logical connectives is called complete if every propositional formula is tautologically equivalent to a formula with just the connectives in that set. There are many complete sets of connectives, including {∧,¬} , {∨,¬} , and {→,¬} . There are two binary connectives that are complete on their own, corresponding to NAND and NOR, respectively.[20] Some pairs are not complete, for example {∧,∨} .

11.8.1 The stroke (NAND)

The binary connective corresponding to NAND is called the Sheffer stroke, and written with a vertical bar | or vertical arrow ↑. The completeness of this connective was noted in Principia Mathematica (1927:xvii). Since it is complete on its own, all other connectives can be expressed using only the stroke. For example, where the symbol " ≡ " represents logical equivalence:

~p ≡ p|p
p → q ≡ p|~q
p V q ≡ ~p|~q
p & q ≡ ~(p|q)
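Each of these identities can be confirmed by brute force over the four truth-value rows. A minimal Python sketch (not from the source; the function name nand is introduced here) does so:

from itertools import product

def nand(p, q):                       # the Sheffer stroke p | q
    return not (p and q)

for p, q in product([False, True], repeat=2):
    assert (not p)        == nand(p, p)                     # ~p  ≡ p|p
    assert ((not p) or q) == nand(p, nand(q, q))            # p → q ≡ p|~q
    assert (p or q)       == nand(nand(p, p), nand(q, q))   # p V q ≡ ~p|~q
    assert (p and q)      == (not nand(p, q))               # p & q ≡ ~(p|q)
print("All stroke definitions verified over the four truth-value rows.")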

The engineering symbol for the NAND connective (the 'stroke') can be used to build any propositional formula. The notion that truth (1) and falsity (0) can be defined in terms of this connective is shown in the sequence of NANDs on the left, and the derivations of the four evaluations of a NAND b are shown along the bottom. The more common method is to use the definition of the NAND from the truth table.

In particular, the zero-ary connectives ⊤ (representing truth) and ⊥ (representing falsity) can be expressed using the stroke:

⊤ ≡ (a|(a|a))

⊥ ≡ (⊤|⊤)

11.8.2 IF ... THEN ... ELSE

This connective together with { 0, 1 } (or { F, T } or { ⊥, ⊤ }) forms a complete set. In the following, the IF...THEN...ELSE relation (c, b, a) = d represents ( (c → b) & (~c → a) ) ≡ ( (c & b) V (~c & a) ) = d

(c, b, a):
(c, 0, 1) ≡ ~c
(c, b, 1) ≡ (c → b)
(c, c, a) ≡ (c V a)
(c, b, c) ≡ (c & b)

Example: The following shows how a theorem-based proof of "(c, b, 1) ≡ (c → b)" would proceed; below the proof is its truth-table verification. (Note: (c → b) is defined to be (~c V b)):

• Begin with the reduced form: ( (c & b) V (~c & a) )
• Substitute “1” for a: ( (c & b) V (~c & 1) )
• Identity (~c & 1) = ~c: ( (c & b) V (~c) )
• Law of commutation for V: ( (~c) V (c & b) )
• Distribute "~c V” over (c & b): ( ((~c) V c) & ((~c) V b) )
• Law of excluded middle ( ((~c) V c) = 1 ): ( (1) & ((~c) V b) )
• Distribute "(1) &" over ((~c) V b): ( ((1) & (~c)) V ((1) & b) )
• Commutativity and Identity ( (1 & ~c) = (~c & 1) = ~c, and (1 & b) ≡ (b & 1) ≡ b ): ( ~c V b )
• ( ~c V b ) is defined as c → b. Q.E.D.

In the following truth table the column labelled “taut” for tautology evaluates logical equivalence (symbolized here by ≡) between the two columns labelled d. Because all four rows under “taut” are 1’s, the equivalence indeed represents a tautology.

11.9 Normal forms

An arbitrary propositional formula may have a very complicated structure. It is often convenient to work with formulas that have simpler forms, known as normal forms. Some common normal forms include conjunctive normal form and disjunctive normal form. Any propositional formula can be reduced to its conjunctive or disjunctive normal form.

11.9.1 Reduction to normal form

Reduction to normal form is relatively simple once a truth table for the formula is prepared. But further attempts to minimize the number of literals (see below) require some tools: reduction by De Morgan’s laws and truth tables can be unwieldy, but Karnaugh maps are very suitable for a small number of variables (5 or less). Some sophisticated tabular methods exist for more complex circuits with multiple outputs, but these are beyond the scope of this article; for more see the Quine–McCluskey algorithm.

Literal, term and alterm

In electrical engineering a variable x or its negation ~(x) is lumped together into a single notion called a literal. A string of literals connected by ANDs is called a term. A string of literals connected by ORs is called an alterm. Typically the literal ~(x) is abbreviated ~x. Sometimes the &-symbol is omitted altogether in the manner of algebraic multiplication.

Example: a, b, c, d are variables. ((( a & ~(b) ) & ~(c) ) & d) is a term. This can be abbreviated as (a & ~b & ~c & d), or a~b~cd.

Example: p, q, r, s are variables. ((( p V ~(q) ) V r ) V ~(s)) is an alterm. This can be abbreviated as (p V ~q V r V ~s).

Minterms

In the same way that a 2^n-row truth table displays the evaluation of a propositional formula for all 2^n possible values of its variables, n variables produce a 2^n-square Karnaugh map (even though we cannot draw it in its full-dimensional realization). For example, 3 variables produce 2^3 = 8 rows and 8 Karnaugh squares; 4 variables produce 16 truth-table rows and 16 squares and therefore 16 minterms. Each Karnaugh-map square and its corresponding truth-table evaluation represents one minterm.

Any propositional formula can be reduced to the “logical sum” (OR) of the active (i.e. “1"- or “T"-valued) minterms. When in this form the formula is said to be in disjunctive normal form. But even though it is in this form, it is not necessarily minimized with respect to either the number of terms or the number of literals.

In the following table, observe the peculiar numbering of the rows: (0, 1, 3, 2, 6, 7, 5, 4, 0). The first column is the decimal equivalent of the binary equivalent of the digits “cba”, in other words:

Example: cba₂ = c*2² + b*2¹ + a*2⁰:

cba = (c=1, b=0, a=1) = 101₂ = 1*2² + 0*2¹ + 1*2⁰ = 5₁₀

This numbering comes about because as one moves down the table from row to row only one variable at a time changes its value. Gray code is derived from this notion. This notion can be extended to three- and four-dimensional hypercubes called Hasse diagrams where each corner’s variables change only one at a time as one moves around the edges of the cube. Hasse diagrams (hypercubes) flattened into two dimensions are either Veitch diagrams or Karnaugh maps (these are virtually the same thing).

When working with Karnaugh maps one must always keep in mind that the top edge “wraps around” to the bottom edge, and the left edge wraps around to the right edge—the Karnaugh diagram is really a three- or four- or n-dimensional flattened object.

11.9.2 Reduction by use of the map method (Veitch, Karnaugh)

Veitch improved the notion of Venn diagrams by converting the circles to abutting squares, and Karnaugh simplified the Veitch diagram by converting the minterms, written in their literal-form (e.g. ~abc~d), into numbers.[21] The method proceeds as follows:

(1) Produce the formula’s truth table

Produce the formula’s truth table. Number its rows using the binary equivalents of the variables (usually just sequentially 0 through 2^n−1) for n variables.

Technically, the propositional function has been reduced to its (unminimized) disjunctive normal form: each row has its minterm expression and these can be OR'd to produce the formula in its (unminimized) disjunctive normal form.

Example: ((c & d) V (p & ~(c & (~d)))) = q in disjunctive normal form is:

( (~p & d & c ) V (p & d & c) V (p & d & ~c) V (p & ~d & ~c) ) = q
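The unminimized form can be read straight off the truth table: one minterm per row on which the formula evaluates to 1. A short Python sketch (illustrative, not the article's tabular method) recovers exactly these four minterms, though possibly in a different order:

from itertools import product

def q(p, d, c):                         # the example formula of this section
    return (c and d) or (p and not (c and not d))

minterms = []
for p, d, c in product([False, True], repeat=3):
    if q(p, d, c):
        literals = [name if val else "~" + name
                    for name, val in (("p", p), ("d", d), ("c", c))]
        minterms.append("(" + " & ".join(literals) + ")")
print(" V ".join(minterms))   # the unminimized disjunctive normal form of q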

However, this formula can be reduced both in the number of terms (from 4 to 3) and in the total count of its literals (12 to 6).

(2) Create the formula’s Karnaugh map

Use the values of the formula (e.g. “p”) found by the truth-table method and place them into their respective (associated) Karnaugh squares (these are numbered per the Gray code convention). If values of “d” for “don't care” appear in the table, this adds flexibility during the reduction phase.

(3) Reduce minterms

Minterms of adjacent (abutting) 1-squares (T-squares) can be reduced with respect to the number of their literals, and the number of terms also will be reduced in the process. Two abutting squares (2 x 1 horizontal or 1 x 2 vertical, even the edges represent abutting squares) lose one literal, four squares in a 4 x 1 rectangle (horizontal or vertical) or 2 x 2 square (even the four corners represent abutting squares) lose two literals, eight squares in a rectangle lose 3 literals, etc. (One seeks out the largest squares or rectangles and ignores the smaller squares or rectangles contained totally within them.) This process continues until all abutting squares are accounted for, at which point the propositional formula is minimized. For example, squares #3 and #7 abut; this pair of abutting squares can lose one literal (e.g. “p” from squares #3 and #7).

Example: The map method usually is done by inspection. The following example expands the algebraic method to show the “trick” behind the combining of terms on a Karnaugh map:

Minterms #3 and #7 abut, #7 and #6 abut, and #4 and #6 abut (because the table’s edges wrap around). So each of these pairs can be reduced.

Observe that by the Idempotency law (A V A) = A, we can create more terms. Then by the association and distributive laws the variables to disappear can be paired, and then “disappeared” with the Law of contradiction (x & ~x) = 0. The following uses brackets [ and ] only to keep track of the terms; they have no special significance:

• Put the formula in conjunctive normal form with the formula to be reduced:

q = ( (~p & d & c) V (p & d & c) V (p & d & ~c) V (p & ~d & ~c) ) = ( #3 V #7 V #6 V #4 )

• Idempotency (absorption) (A V A) = A:

( #3 V [ #7 V #7 ] V [ #6 V #6 ] V #4 )

• Associative law (x V (y V z)) = ( (x V y) V z )

( [ #3 V #7 ] V [ #7 V #6 ] V [ #6 V #4 ] )
[ (~p & d & c) V (p & d & c) ] V [ (p & d & c) V (p & d & ~c) ] V [ (p & d & ~c) V (p & ~d & ~c) ]

• Distributive law ( x & (y V z) ) = ( (x & y) V (x & z) ) :

( [ (d & c) V (~p & p) ] V [ (p & d) V (~c & c) ] V [ (p & ~c) V (c & ~c) ] )

• Commutative law and law of contradiction (x & ~x) = (~x & x) = 0:

( [ (d & c) V (0) ] V [ (p & d) V (0) ] V [ (p & ~c) V (0) ] )

• Law of identity ( x V 0 ) = x leading to the reduced form of the formula:

q = ( (d & c) V (p & d) V (p & ~c) )

(4) Verify reduction with a truth table
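A minimal Python sketch (not part of the source; the helper names are introduced here) performs this verification for the example above by comparing the original formula with its reduced form on all eight rows:

from itertools import product

def original(p, d, c):   # ((c & d) V (p & ~(c & ~d)))
    return (c and d) or (p and not (c and not d))

def reduced(p, d, c):    # ( (d & c) V (p & d) V (p & ~c) )
    return (d and c) or (p and d) or (p and not c)

assert all(original(p, d, c) == reduced(p, d, c)
           for p, d, c in product([False, True], repeat=3))
print("Truth table confirms the Karnaugh-map reduction.")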

11.10 Impredicative propositions

Given the following examples-as-definitions, what does one make of the subsequent reasoning:

(1) “This sentence is simple.” (2) “This sentence is complex, and it is conjoined by AND.”

Then assign the variable “s” to the left-most sentence “This sentence is simple”. Define “compound” c = “not simple” ~s, and assign c = ~s to “This sentence is compound"; assign “j” to “It [this sentence] is conjoined by AND”. The second sentence can be expressed as:

( NOT(s) AND j )

If truth values are to be placed on the sentences c = ~s and j, then all are clearly FALSEHOODS: e.g. “This sentence is complex” is a FALSEHOOD (it is simple, by definition). So their conjunction (AND) is a falsehood. But when taken in its assembled form, the sentence is a TRUTH.

This is an example of the paradoxes that result from an impredicative definition—that is, when an object m has a property P, but the object m is defined in terms of property P.[22] The best advice for a rhetorician or one involved in deductive analysis is to avoid impredicative definitions but at the same time be on the lookout for them because they can indeed create paradoxes. Engineers, on the other hand, put them to work in the form of propositional formulas with feedback.

11.11 Propositional formula with “feedback”

The notion of a propositional formula appearing as one of its own variables requires a formation rule that allows the assignment of the formula to a variable. In general there is no stipulation (in either axiomatic or truth-table systems of objects and relations) that forbids this from happening.[23]

The simplest case occurs when an OR formula becomes one of its own inputs, e.g. p = q. Begin with (p V s) = q, then let p = q. Observe that q’s “definition” depends on itself “q” as well as on “s” and the OR connective; this definition of q is thus impredicative. Either of two conditions can result:[24] oscillation or memory.

It helps to think of the formula as a black box. Without knowledge of what is going on “inside” the formula-"box”, from the outside it would appear that the output is no longer a function of the inputs alone. That is, sometimes one looks at q and sees 0 and other times 1. To avoid this problem one has to know the state (condition) of the “hidden” variable p inside the box (i.e. the value of q fed back and assigned to p). When this is known the apparent inconsistency goes away.

To understand [predict] the behavior of formulas with feedback requires the more sophisticated analysis of sequential circuits. Propositional formulas with feedback lead, in their simplest form, to state machines; they also lead to memories in the form of Turing tapes and counter-machine counters. From combinations of these elements one can build any sort of bounded computational model (e.g. Turing machines, counter machines, register machines, Macintosh computers, etc.).

11.11.1 Oscillation

In the abstract (ideal) case the simplest oscillating formula is a NOT fed back to itself: ~(~(p=q)) = q. Analysis of an abstract (ideal) propositional formula in a truth-table reveals an inconsistency for both the p=1 and p=0 cases: when p=1, q=0, and this cannot be because p=q; ditto for when p=0 and q=1.

Oscillation with delay: If a delay[25] (ideal or non-ideal) is inserted in the abstract formula between p and q then p will oscillate between 1 and 0: 101010...101... ad infinitum. If either the delay or the NOT is not abstract (i.e. not ideal), the type of analysis to be used will depend upon the exact nature of the objects that make up the oscillator; such things fall outside mathematics and into engineering.

Analysis requires a delay to be inserted and then the loop cut between the delay and the input “p”. The delay must be viewed as a kind of proposition that has “qd” (q-delayed) as output for “q” as input. This new proposition adds another column to the truth table. The inconsistency is now between “qd” and “p” as shown in red; two stable states result:

11.11.2 Memory

Without delay, inconsistencies must be eliminated from a truth table analysis. With the notion of “delay”, this condition presents itself as a momentary inconsistency between the fed-back output variable q and p = q_delayed.

A truth table reveals the rows where inconsistencies occur between p = q_delayed at the input and q at the output. After “breaking” the feed-back,[26] the truth table construction proceeds in the conventional manner. But afterwards, in every row the output q is compared to the now-independent input p and any inconsistencies between p and q are noted (i.e. p=0 together with q=1, or p=1 and q=0); when the “line” is “remade” both are rendered impossible by the Law of contradiction ~(p & ~p). Rows revealing inconsistencies are either considered transient states or just eliminated as inconsistent and hence “impossible”.

Once-flip memory

About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output “q” feeds back into “p”. Given that the formula is first evaluated (initialized) with p=0 & q=0, it will “flip” once when “set” by s=1. Thereafter, output “q” will sustain “q” in the “flipped” condition (state q=1). This behavior, now time-dependent, is shown by the state diagram to the right of the once-flip.

Flip-flop memory

The next simplest case is the “set-reset” flip-flop shown below the once-flip. Given that r=0 & s=0 and q=0 at the outset, it is “set” (s=1) in a manner similar to the once-flip. It however has a provision to “reset” q=0 when “r"=1. An additional complication occurs if both set=1 and reset=1. In this formula, set=1 forces the output q=1, so when and if (s=0 & r=1) the flip-flop will be reset. Or, if (s=1 & r=0) the flip-flop will be set. In the abstract (ideal) instance in which s=1 => s=0 & r=1 => r=0 simultaneously, the formula q will be indeterminate (undecidable). Due to delays in “real” OR, AND and NOT the result will be unknown at the outset but thereafter predictable.

Clocked flip-flop memory

The formula known as “clocked flip-flop” memory (“c” is the “clock” and “d” is the “data”) is given below. It works as follows: When c = 0 the data d (either 0 or 1) cannot “get through” to affect output q. When c = 1 the data d “gets through” and output q “follows” d’s value. When c goes from 1 to 0 the last value of the data remains “trapped” at output “q”. As long as c=0, d can change value without causing q to change.

Example: ( ( c & d ) V ( p & ( ~( c & ~( d ) ) ) ) ) = q, but now let p = q:

Example: ( ( c & d ) V ( q & ( ~( c & ~( d ) ) ) ) ) = q
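The time-dependent behavior is easy to simulate once a unit delay is assumed: evaluate the formula with the previous output fed back in place of q. The following Python sketch is an illustration under that assumption; the function name and the input sequence are arbitrary choices made here:

def clocked_flip_flop(c, d, q_prev):
    """One evaluation of ( (c & d) V (q & ~(c & ~d)) ) with the previous
    output fed back in place of q (a unit delay is assumed)."""
    return (c and d) or (q_prev and not (c and not d))

q = False                                      # initial state
trace = []
# (clock, data) inputs over time: data changes freely while c = 0,
# is captured while c = 1, and stays trapped after c returns to 0.
for c, d in [(0, 1), (0, 0), (1, 1), (0, 0), (0, 1), (1, 0), (0, 1)]:
    q = clocked_flip_flop(bool(c), bool(d), q)
    trace.append(int(q))
print(trace)   # [0, 0, 1, 1, 1, 0, 0]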

The state diagram is similar in shape to the flip-flop’s state diagram, but with different labelling on the transitions.

11.12 Historical development

Bertrand Russell (1912:74) lists three laws of thought that derive from Aristotle: (1) The law of identity: “Whatever is, is.”, (2) The law of contradiction: “Nothing can both be and not be”, and (3) The law of excluded middle: “Everything must either be or not be.”

Example: Here O is an expression about an object's BEING or QUALITY:

(1) Law of Identity: O = O
(2) Law of contradiction: ~(O & ~(O))
(3) Law of excluded middle: (O V ~(O))

The use of the word “everything” in the law of excluded middle renders Russell’s expression of this law open to debate. If restricted to an expression about BEING or QUALITY with reference to a finite collection of objects (a finite “universe of discourse”) -- the members of which can be investigated one after another for the presence or absence of the assertion -- then the law is considered intuitionistically appropriate. Thus an assertion such as: “This object must either BE or NOT BE (in the collection)", or “This object must either have this QUALITY or NOT have this QUALITY (relative to the objects in the collection)" is acceptable. See more at Venn diagram.

Although a propositional calculus originated with Aristotle, the notion of an algebra applied to propositions had to wait until the early 19th century. In an (adverse) reaction to the 2000 year tradition of Aristotle’s syllogisms, John Locke's Essay concerning human understanding (1690) used the word semiotics (theory of the use of symbols). By 1826 Richard Whately had critically analyzed the syllogistic logic with a sympathy toward Locke’s semiotics. George Bentham's work (1827) resulted in the notion of “quantification of the predicate” (nowadays symbolized as ∀ ≡ “for all”). A “row” instigated by William Hamilton over a priority dispute with Augustus De Morgan “inspired George Boole to write up his ideas on logic, and to publish them as MAL [Mathematical Analysis of Logic] in 1847” (Grattan-Guinness and Bornet 1997:xxviii).

About his contribution Grattan-Guinness and Bornet comment:

“Boole’s principal single innovation was [the] law [ xⁿ = x ] for logic: it stated that the mental acts of choosing the property x and choosing x again and again is the same as choosing x once... As consequence of it he formed the equations x•(1-x)=0 and x+(1-x)=1 which for him expressed respectively the law of contradiction and the law of excluded middle” (p. xxviiff). For Boole “1” was the universe of discourse and “0” was nothing.

Gottlob Frege's massive undertaking (1879) resulted in a formal calculus of propositions, but his symbolism is so daunting that it had little influence excepting on one person: Bertrand Russell. First as the student of Alfred North Whitehead he studied Frege’s work and suggested a (famous and notorious) emendation with respect to it (1904) around the problem of an antinomy that he discovered in Frege’s treatment (cf Russell’s paradox). Russell’s work led to a collaboration with Whitehead that, in the year 1912, produced the first volume of Principia Mathematica (PM). It is here that what we consider “modern” propositional logic first appeared. In particular, PM introduces NOT and OR and the assertion symbol ⊦ as primitives. In terms of these notions they define IMPLICATION → (def. *1.01: ~p V q), then AND (def. *3.01: ~(~p V ~q) ), then EQUIVALENCE p ←→ q (*4.01: (p → q) & ( q → p )).

• Henry M. Sheffer (1921) and Jean Nicod demonstrate that only one connective, the “stroke” |, is sufficient to express all propositional formulas.

• Emil Post (1921) develops the truth-table method of analysis in his “Introduction to a general theory of elementary propositions”. He notes Nicod’s stroke | .

• Whitehead and Russell add an introduction to their 1927 re-publication of PM adding, in part, a favorable treatment of the “stroke”.

Computation and switching logic:

• William Eccles and F. W. Jordan (1919) describe a “trigger relay” made from a vacuum tube.

• George Stibitz (1937) invents the binary adder using mechanical relays. He builds this on his kitchen table.

Example: Given binary bits aᵢ and bᵢ and carry-in ( c_inᵢ), their summation Σᵢ and carry-out (c_outᵢ) are:

• ( ( aᵢ XOR bᵢ ) XOR c_inᵢ ) = Σᵢ
• ( ( aᵢ & bᵢ ) V ( c_inᵢ & ( aᵢ XOR bᵢ ) ) ) = c_outᵢ
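A one-bit full adder built from these connectives can be sketched in Python as follows (an illustration only; the function name is introduced here, and the carry-out uses the standard construction (a & b) V (c_in & (a XOR b))):

def full_adder(a, b, c_in):
    """One-bit full adder built from XOR, AND and OR."""
    s     = (a ^ b) ^ c_in                      # sum bit:   (a XOR b) XOR c_in
    c_out = (a and b) or (c_in and (a ^ b))     # carry-out
    return int(s), int(c_out)

# 1 + 1 with carry-in 1 gives sum 1, carry 1 (binary 11 = 3).
print(full_adder(True, True, True))   # (1, 1)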

• Alan Turing builds a multiplier using relays (1937–1938). He has to hand-wind his own relay coils to do this.

• Textbooks about “switching circuits” appear in early 1950s.

• Willard Quine 1952 and 1955, E. W. Veitch 1952, and M. Karnaugh (1953) develop map-methods for simplifying propositional functions.

• George H. Mealy (1955) and Edward F. Moore (1956) address the theory of sequential (i.e. switching-circuit) “machines”.

• E. J. McCluskey and H. Shorr develop a method for simplifying propositional (switching) circuits (1962).

11.13 Footnotes

[1] Hamilton 1978:1

[2] PM p. 91 eschews “the” because they require a clear-cut “object of sensation"; they stipulate the use of “this”

[3] (italics added) Reichenbach p.80.

[4] Tarski p. 54-68. Suppes calls IDENTITY a “further rule of inference” and has a brief development around it; Robbin, Bender and Williamson, and Goodstein introduce the sign and its usage without comment or explanation. Hamilton p. 37 employs two signs ≠ and = with respect to the valuation of a formula in a formal calculus. Kleene p. 70 and Hamilton p. 52 place it in the predicate calculus, in particular with regards to the arithmetic of natural numbers.

[5] Empiricists eschew the notion of a priori (built-in, born-with) knowledge. “Radical reductionists” such as John Locke and David Hume “held that every idea must either originate directly in sense experience or else be compounded of ideas thus originating"; quoted from Quine reprinted in 1996 The Emergence of Logical Empiricism, Garland Publishing Inc. http://www.marxists.org/reference/subject/philosophy/works/us/quine.htm

[6] Neural net modelling offers a good mathematical model for a comparator as follows: Given a signal S and a threshold “thr”, subtract “thr” from S and substitute this difference d into a sigmoid function: for large “gains” k, e.g. k=100, 1/( 1 + e^(−k*d) ) = 1/( 1 + e^(−k*(S−thr)) ) = { ≃0, ≃1 }. For example, if “The door is DOWN” means “The door is less than 50% of the way up”, then a threshold thr=0.5 corresponding to 0.5*5.0 = +2.50 volts could be applied to a “linear” measuring-device with an output of 0 volts when fully closed and +5.0 volts when fully open.

[7] In actuality the digital 1 and 0 are defined over non-overlapping ranges e.g. { “1” = +5/+0.2/−1.0 volts, 0 = +0.5/−0.2 volts }. When a value falls outside the defined range(s) the value becomes “u” -- unknown; e.g. +2.3 would be “u”.

[8] While the notion of logical product is not so peculiar (e.g. 0*0=0, 0*1=0, 1*0=0, 1*1=1), the notion of 1+1=1 is peculiar; in fact (a "+" b) = (a + (b - a*b)) where "+" is the “logical sum” but + and - are the true arithmetic counterparts. Occasionally all four notions do appear in a formula: A AND B = 1/2*( A plus B minus ( A XOR B ) ) (cf p. 146 in John Wakerly 1978, Error Detecting Codes, Self-Checking Circuits and Applications, North-Holland, New York, ISBN 0-444-00259-6 pbk.)

[9] A careful look at its Karnaugh map shows that IF...THEN...ELSE can also be expressed, in a rather round-about way, interms of two exclusive-ORs: ( (b AND (c XOR a)) OR (a AND (c XOR b)) ) = d.

[10] Robbin p. 3.

[11] Rosenbloom p. 30 and p. 54ff discusses this problem of implication at some length. Most philosophers and mathematicians just accept the material definition as given above. But some do not, including the intuitionists; they consider it a form of the law of excluded middle misapplied.

[12] Indeed, exhaustive selection between alternatives -- mutual exclusion -- is required by the definition that Kleene gives the CASE operator (Kleene 1952:229)

[13] The use of quote marks around the expressions is not accidental. Tarski comments on the use of quotes in his “18. Identity of things and identity of their designations; use of quotation marks” p. 58ff.

[14] Hamilton p. 37. Bender and Williamson p. 29 state “In what follows, we'll replace “equals” with the symbol " ⇔ " (equivalence) which is usually used in logic. We use the more familiar " = " for assigning meaning and values.”

[15] Reichenbach p. 20-22; he follows the conventions of PM. The symbol =D is in the metalanguage and is not a formal symbol; it has the following meaning: “the symbol ' s ' is to have the same meaning as the formula '(c & d)' ”.

[16] Rosenbloom 1950:32. Kleene 1952:73-74 ranks all 11 symbols.

[17] cf Minsky 1967:75, section 4.2.3 “The method of parenthesis counting”. Minsky presents a state machine that will do the job, and by use of induction (recursive definition) Minsky proves the “method” and presents a theorem as the result. A fully generalized “parenthesis grammar” requires an infinite state machine (e.g. a Turing machine) to do the counting.

[18] Robbin p. 7

[19] cf Reichenbach p. 68 for a more involved discussion: “If the inference is valid and the premises are true, the inference is called conclusive.”

[20] As well as the first three, Hamilton pp.19-22 discusses logics built from only | (NAND), and ↓ (NOR).

[21] Wickes 1967:36ff. Wickes offers a good example of 8 of the 2 x 4 (3-variable) maps and 16 of the 4 x 4 (4-variable) maps. As an arbitrary 3-variable map could represent any one of 2^8 = 256 2 x 4 maps, and an arbitrary 4-variable map could represent any one of 2^16 = 65,536 different formula-evaluations, writing down every one is infeasible.

[22] This definition is given by Stephen Kleene. Both Kurt Gödel and Kleene believed that the classical paradoxes are uniformly examples of this sort of definition. But Kleene went on to assert that the problem has not been solved satisfactorily and impredicative definitions can be found in analysis. He gives as example the definition of the least upper bound (l.u.b.) u of M. Given a Dedekind cut of the number line C and the two parts into which the number line is cut, i.e. M and (C - M), l.u.b. = u is defined in terms of the notion M, whereas M is defined in terms of C. Thus the definition of u, an element of C, is defined in terms of the totality C and this makes its definition impredicative. Kleene asserts that attempts to argue this away can be used to uphold the impredicative definitions in the paradoxes. (Kleene 1952:43)

[23] McCluskey comments that “it could be argued that the analysis is still incomplete because the word statement “The outputs are equal to the previous values of the inputs” has not been obtained"; he goes on to dismiss such worries because “English is not a formal language in a mathematical sense, [and] it is not really possible to have a formal procedure for obtaining word statements” (p. 185).

[24] More precisely, given enough “loop gain”, either oscillation or memory will occur (cf McCluskey p. 191-2). In abstract(idealized) mathematical systems adequate loop gain is not a problem.

[25] The notion of delay and the principle of local causation as caused ultimately by the speed of light appears in Robin Gandy (1980), “Church’s thesis and Principles for Mechanisms”, in J. Barwise, H. J. Keisler and K. Kunen, eds., The Kleene Symposium, North-Holland Publishing Company (1980) 123-148. Gandy considered this to be the most important of his principles: “Contemporary physics rejects the possibility of instantaneous action at a distance” (p. 135). Gandy was Alan Turing's student and close friend.

[26] McCluskey p. 194-5 discusses “breaking the loop” and inserts “amplifiers” to do this; Wickes (p. 118-121) discusses inserting delays. McCluskey p. 195ff discusses the problem of “races” caused by delays.

11.14 References

• Bender, Edward A. and Williamson, S. Gill, 2005, A Short Course in Discrete Mathematics, Dover Publications, Mineola NY, ISBN 0-486-43946-1. This text is used in a “lower division two-quarter [computer science] course” at UC San Diego.

• Enderton, H. B., 2002, A Mathematical Introduction to Logic. Harcourt/Academic Press. ISBN 0-12-238452-0

• Goodstein, R. L., (Pergamon Press 1963), 1966, (Dover edition 2007), Boolean Algebra, Dover Publications, Inc., Mineola, New York, ISBN 0-486-45894-6. Emphasis on the notion of “algebra of classes” with set-theoretic symbols such as ∩, ∪, ' (NOT), ⊂ (IMPLIES). Later Goodstein replaces these with &, ∨, ¬, → (respectively) in his treatment of “Sentence Logic” pp. 76–93.

• Ivor Grattan-Guinness and Gérard Bornet 1997, George Boole: Selected Manuscripts on Logic and its Philosophy, Birkhäuser Verlag, Basel, ISBN 978-0-8176-5456-6 (Boston).

• A. G. Hamilton 1978, Logic for Mathematicians, Cambridge University Press, Cambridge UK, ISBN 0-521-21838-1.

• E. J. McCluskey 1965, Introduction to the Theory of Switching Circuits, McGraw-Hill Book Company, New York. No ISBN. Library of Congress Catalog Card Number 65-17394. McCluskey was a student of Willard Quine and developed some notable theorems with Quine and on his own. For those interested in the history, the book contains a wealth of references.

• Marvin L. Minsky 1967, Computation: Finite and Infinite Machines, Prentice-Hall, Inc., Englewood Cliffs, N.J. No ISBN. Library of Congress Catalog Card Number 67-12342. Useful especially for computability, plus good sources.

• Paul C. Rosenbloom 1950, Dover edition 2005, The Elements of Mathematical Logic, Dover Publications, Inc.,Mineola, New York, ISBN 0-486-44617-4.

• Joel W. Robbin 1969, 1997, Mathematical Logic: A First Course, Dover Publications, Inc., Mineola, New York, ISBN 0-486-45018-X (pbk.).

• Patrick Suppes 1957 (1999 Dover edition), Introduction to Logic, Dover Publications, Inc., Mineola, New York. ISBN 0-486-40687-3 (pbk.). This book is in print and readily available.

• On his page 204, in a footnote, he references his set of axioms to E. V. Huntington, “Sets of Independent Postulates for the Algebra of Logic”, Transactions of the American Mathematical Society, Vol. 5 (1904) pp. 288-309.

• Alfred Tarski 1941 (1995 Dover edition), Introduction to Logic and to the Methodology of Deductive Sciences, Dover Publications, Inc., Mineola, New York. ISBN 0-486-28462-X (pbk.). This book is in print and readily available.

• Jean van Heijenoort 1967, 3rd printing with emendations 1976, From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931, Harvard University Press, Cambridge, Massachusetts. ISBN 0-674-32449-8 (pbk.). Translation/reprints of Frege (1879), Russell’s letter to Frege (1902) and Frege’s letter to Russell (1902), Richard’s paradox (1905), and Post (1921) can be found here.

• Alfred North Whitehead and Bertrand Russell 1927 2nd edition, paperback edition to *53 1962, Principia Mathematica, Cambridge University Press, no ISBN. In the years between the first edition of 1912 and the 2nd edition of 1927, H. M. Sheffer 1921 and M. Jean Nicod (no year cited) brought to Russell’s and Whitehead’s attention that what they considered their primitive propositions (connectives) could be reduced to a single |, nowadays known as the “stroke” or NAND (NOT-AND, NEITHER ... NOR...). Russell and Whitehead discuss this in their “Introduction to the Second Edition” and make the definitions as discussed above.

• William E. Wickes 1968, Logic Design with Integrated Circuits, John Wiley & Sons, Inc., New York. No ISBN. Library of Congress Catalog Card Number: 68-21185. Tight presentation of engineering’s analysis and synthesis methods, references McCluskey 1965. Unlike Suppes, Wickes’ presentation of “Boolean algebra” starts with a set of postulates of a truth-table nature and then derives the customary theorems from them (p. 18ff).

A truth table will contain 2^n rows, where n is the number of variables (e.g. three variables “p”, “d”, “c” produce 2^3 = 8 rows). Each row represents a minterm. Each minterm can be found on the Hasse diagram, on the Veitch diagram, and on the Karnaugh map. (The evaluations of “p” shown in the truth table are not shown in the Hasse, Veitch and Karnaugh diagrams; these are shown in the Karnaugh map of the following section.)

Steps in the reduction using a Karnaugh map. The final result is the OR (logical “sum”) of the three reduced terms.

About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output “q” feeding back into “p”. The next simplest is the “flip-flop” shown below the once-flip. Analysis of these sorts of formulas can be done by either cutting the feedback path(s) or inserting (ideal) delay in the path. A cut path and an assumption that no delay occurs anywhere in the “circuit” results in inconsistencies for some of the total states (combinations of inputs and outputs, e.g. (p=0, s=1, r=1) results in an inconsistency). When delay is present these inconsistencies are merely transient and expire when the delay(s) expire. The drawings on the right are called state diagrams.

A “clocked flip-flop” memory (“c” is the “clock” and “d” is the “data”). The data can change at any time when clock c=0; when clock c=1 the output q “tracks” the value of data d. When c goes from 1 to 0 it “traps” d = q’s value and this continues to appear at q no matter what d does (as long as c remains 0).

Chapter 12

Rule of inference

In logic, a rule of inference, inference rule, or transformation rule is a logical form consisting of a function which takes premises, analyzes their syntax, and returns a conclusion (or conclusions). For example, the rule of inference called modus ponens takes two premises, one in the form “If p then q” and another in the form “p”, and returns the conclusion “q”. The rule is valid with respect to the semantics of classical logic (as well as the semantics of many other non-classical logics), in the sense that if the premises are true (under an interpretation), then so is the conclusion.

Typically, a rule of inference preserves truth, a semantic property. In many-valued logic, it preserves a general designation. But a rule of inference’s action is purely syntactic, and does not need to preserve any semantic property: any function from sets of formulae to formulae counts as a rule of inference. Usually only rules that are recursive are important; i.e. rules such that there is an effective procedure for determining whether any given formula is the conclusion of a given set of formulae according to the rule. An example of a rule that is not effective in this sense is the infinitary ω-rule.[1]

Popular rules of inference in propositional logic include modus ponens, modus tollens, and contraposition. First-order predicate logic uses rules of inference to deal with logical quantifiers.

12.1 The standard form of rules of inference

In formal logic (and many related areas), rules of inference are usually given in the following standard form:

Premise#1
Premise#2
...
Premise#n
Conclusion

This expression states that whenever in the course of some logical derivation the given premises have been obtained, the specified conclusion can be taken for granted as well. The exact formal language that is used to describe both premises and conclusions depends on the actual context of the derivations. In a simple case, one may use logical formulae, such as in:

A→ B

A

B

This is the modus ponens rule of propositional logic. Rules of inference are often formulated as schemata employing metavariables.[2] In the rule (schema) above, the metavariables A and B can be instantiated to any element of the universe (or sometimes, by convention, a restricted subset such as propositions) to form an infinite set of inference rules.

A proof system is formed from a set of rules chained together to form proofs, also called derivations. Any derivation has only one final conclusion, which is the statement proved or derived. If premises are left unsatisfied in the derivation, then the derivation is a proof of a hypothetical statement: "if the premises hold, then the conclusion holds.”


12.2 Axiom schemas and axioms

Inference rules may also be stated in this form: (1) zero or more premises, (2) a turnstile symbol ⊢, which means “infers”, “proves”, or “concludes”, and (3) a conclusion. This form usually embodies the relational (as opposed to functional) view of a rule of inference, where the turnstile stands for a deducibility relation holding between premises and conclusion.

An inference rule containing no premises is called an axiom schema or, if it contains no metavariables, simply an axiom.[2]

Rules of inference must be distinguished from axioms of a theory. In terms of semantics, axioms are valid assertions. Axioms are usually regarded as starting points for applying rules of inference and generating a set of conclusions. Or, in less technical terms: rules are statements about the system, axioms are statements in the system. For example:

• The rule that from ⊢ p you can infer ⊢ Provable(p) is a statement that says if you've proven p, then it is provable that p is provable. This rule holds in Peano arithmetic, for example.

• The axiom p → Provable(p) would mean that every true statement is provable. This axiom does not hold in Peano arithmetic.

Rules of inference play a vital role in the specification of logical calculi as they are considered in proof theory, such as the sequent calculus and natural deduction.

12.3 Example: Hilbert systems for two propositional logics

In a Hilbert system, the premises and conclusion of the inference rules are simply formulae of some language, usually employing metavariables. For graphical compactness of the presentation and to emphasize the distinction between axioms and rules of inference, this section uses the sequent notation (⊢) instead of a vertical presentation of rules. The formal language for classical propositional logic can be expressed using just negation (¬), implication (→) and propositional symbols. A well-known axiomatization, comprising three axiom schemata and one inference rule (modus ponens), is:

(CA1) ⊢ A → (B → A)
(CA2) ⊢ (A → (B → C)) → ((A → B) → (A → C))
(CA3) ⊢ (¬A → ¬B) → (B → A)
(MP) A, A → B ⊢ B
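As a worked illustration of how these schemata and modus ponens chain together into a proof, here is the standard Hilbert-style derivation of ⊢ A → A in this system; each line records the schema instance or rule application that justifies it.

1. ⊢ (A → ((A → A) → A)) → ((A → (A → A)) → (A → A))   (instance of CA2, taking B := A → A and C := A)
2. ⊢ A → ((A → A) → A)   (instance of CA1, taking B := A → A)
3. ⊢ (A → (A → A)) → (A → A)   (from 2 and 1 by MP)
4. ⊢ A → (A → A)   (instance of CA1, taking B := A)
5. ⊢ A → A   (from 4 and 3 by MP)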

It may seem redundant to have two notions of inference in this case, ⊢ and →. In classical propositional logic, they indeed coincide; the deduction theorem states that A ⊢ B if and only if ⊢ A → B. There is however a distinction worth emphasizing even in this case: the first notation describes a deduction, that is an activity of passing from sentences to sentences, whereas A → B is simply a formula made with a logical connective, implication in this case. Without an inference rule (like modus ponens in this case), there is no deduction or inference. This point is illustrated in Lewis Carroll's dialogue called "What the Tortoise Said to Achilles".[3]

For some non-classical logics, the deduction theorem does not hold. For example, the three-valued logic Ł3 of Łukasiewicz can be axiomatized as:[4]

(CA1) ⊢ A → (B → A)
(LA2) ⊢ (A → B) → ((B → C) → (A → C))
(CA3) ⊢ (¬A → ¬B) → (B → A)
(LA4) ⊢ ((A → ¬A) → A) → A
(MP) A, A → B ⊢ B

This sequence differs from classical logic by the change in axiom 2 and the addition of axiom 4. The classical deduction theorem does not hold for this logic; however, a modified form does hold, namely A ⊢ B if and only if ⊢ A → (A → B).[5]


12.4 Admissibility and derivability

Main article: Admissible rule

In a set of rules, an inference rule could be redundant in the sense that it is admissible or derivable. A derivable rule is one whose conclusion can be derived from its premises using the other rules. An admissible rule is one whose conclusion holds whenever the premises hold. All derivable rules are admissible. To appreciate the difference, consider the following set of rules for defining the natural numbers (the judgment n nat asserts the fact that n is a natural number):

⊢ 0 nat
n nat ⊢ s(n) nat

The first rule states that 0 is a natural number, and the second states that s(n) is a natural number if n is. In this proof system, the following rule, demonstrating that the second successor of a natural number is also a natural number, is derivable:

n nat ⊢ s(s(n)) nat

Its derivation is the composition of two uses of the successor rule above. The following rule for asserting the existence of a predecessor for any nonzero number is merely admissible:

s(n) nat ⊢ n nat

This is a true fact of natural numbers, as can be proven by induction. (To prove that this rule is admissible, assume a derivation of the premise and induct on it to produce a derivation of n nat.) However, it is not derivable, because it depends on the structure of the derivation of the premise. Because of this, derivability is stable under additions to the proof system, whereas admissibility is not. To see the difference, suppose the following nonsense rule were added to the proof system:

⊢ s(−3) nat

In this new system, the double-successor rule is still derivable. However, the rule for finding the predecessor is no longer admissible, because there is no way to derive −3 nat. The brittleness of admissibility comes from the way it is proved: since the proof can induct on the structure of the derivations of the premises, extensions to the system add new cases to this proof, which may no longer hold.

Admissible rules can be thought of as theorems of a proof system. For instance, in a sequent calculus where cut elimination holds, the cut rule is admissible.
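The contrast can also be made concrete in a short sketch (illustrative only; the encoding and function names are assumptions, not part of the article): derivations in the system above are built solely from the zero and successor rules, and the double-successor rule is literally the composition of two successor steps, which is why it stays derivable no matter what rules are later added. A predecessor rule, by contrast, would have to inspect the derivation handed to it, which is exactly the dependence that makes admissibility fragile.

    # Terms are 'zero' or ('s', t).  A derivation of the judgment "t nat" is built
    # only by the two rules of the system: zero_nat (no premises) and succ_nat.

    def zero_nat():
        return ('zero', 'nat')                     # axiom: 0 nat

    def succ_nat(derivation):
        term, judgement = derivation
        assert judgement == 'nat'
        return (('s', term), 'nat')                # rule: from n nat infer s(n) nat

    def double_succ_nat(derivation):
        """The derivable rule: its derivation is just two uses of succ_nat."""
        return succ_nat(succ_nat(derivation))

    print(double_succ_nat(zero_nat()))             # (('s', ('s', 'zero')), 'nat')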

12.5 See also

• Inference objection

• Immediate inference

• Law of thought

• List of rules of inference

• Logical truth

• Structural rule


12.6 References

[1] Boolos, George; Burgess, John; Jeffrey, Richard C. (2007). Computability and logic. Cambridge: Cambridge University Press. p. 364. ISBN 0-521-87752-0.

[2] John C. Reynolds (2009) [1998]. Theories of Programming Languages. Cambridge University Press. p. 12. ISBN 978-0-521-10697-9.

[3] Kosta Dosen (1996). “Logical consequence: a turn in style”. In Maria Luisa Dalla Chiara, Kees Doets, Daniele Mundici, Johan van Benthem. Logic and Scientific Methods: Volume One of the Tenth International Congress of Logic, Methodology and Philosophy of Science, Florence, August 1995. Springer. p. 290. ISBN 978-0-7923-4383-7. Preprint (with different pagination).

[4] Bergmann, Merrie (2008). An introduction to many-valued and fuzzy logic: semantics, algebras, and derivation systems. Cambridge University Press. p. 100. ISBN 978-0-521-88128-9.

[5] Bergmann, Merrie (2008). An introduction to many-valued and fuzzy logic: semantics, algebras, and derivation systems. Cambridge University Press. p. 114. ISBN 978-0-521-88128-9.


Chapter 13

Symbol (formal)

For other uses see Symbol (disambiguation)

This diagram shows the syntactic entities that may be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

A logical symbol is a fundamental concept in logic, tokens of which may be marks or a configuration of marks which form a particular pattern. Although the term “symbol” in common use refers at some times to the idea being symbolized, and at other times to the marks on a piece of paper or chalkboard which are being used to express that idea, in the formal languages studied in mathematics and logic the term “symbol” refers to the idea, and the marks are considered to be a token instance of the symbol. In logic, symbols build literal utility to illustrate ideas.

Symbols of a formal language need not be symbols of anything. For instance there are logical constants which do not refer to any idea, but rather serve as a form of punctuation in the language (e.g. parentheses). Symbols of a formal language must be capable of being specified without any reference to any interpretation of them.

A symbol or string of symbols may comprise a well-formed formula if it is consistent with the formation rules of the language.

In a formal system a symbol may be used as a token in formal operations. The set of formal symbols in a formal language is referred to as an alphabet (hence each symbol may be referred to as a “letter”).[1]

A formal symbol as used in first-order logic may be a variable (member from a universe of discourse), a constant, a function (mapping to another member of the universe) or a predicate (mapping to T/F).

Formal symbols are usually thought of as purely syntactic structures, composed into larger structures using a formal grammar, though sometimes they may be associated with an interpretation or model (a formal semantics).

13.1 Can words be modeled as formal symbols?

The move to view units in natural language (e.g. English) as formal symbols was initiated by Noam Chomsky (it was this work that resulted in the Chomsky hierarchy in formal languages). The generative grammar model looked upon syntax as autonomous from semantics. Building on these models, the logician Richard Montague proposed that semantics could also be constructed on top of the formal structure:

There is in my opinion no important theoretical difference between natural languages and the artificial languages of logicians; indeed, I consider it possible to comprehend the syntax and semantics of both kinds of language within a single natural and mathematically precise theory. On this point I differ from a number of philosophers, but agree, I believe, with Chomsky and his associates.[2]

This is the philosophical premise underlying Montague grammar.

However, this attempt to equate linguistic symbols with formal symbols has been challenged widely, particularly in the tradition of cognitive linguistics, by philosophers like Stevan Harnad, and linguists like George Lakoff and Ronald Langacker.

13.2 References

[1] John Hopcroft, Rajeev Motwani and Jeffrey Ullman, Introduction to Automata Theory, Languages, and Computation, 2000

[2] Richard Montague, Universal Grammar, 1970

13.3 See also

• List of mathematical symbols


Chapter 14

Syntax (logic)

This diagram shows the syntactic entities which may be constructed from formal languages.[1] The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language is identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

In logic, syntax is anything having to do with formal languages or formal systems without regard to any interpretation or meaning given to them. Syntax is concerned with the rules used for constructing, or transforming the symbols and words of a language, as contrasted with the semantics of a language which is concerned with its meaning.

The symbols, formulas, systems, theorems, proofs, and interpretations expressed in formal languages are syntactic entities whose properties may be studied without regard to any meaning they may be given, and, in fact, need not be given any.

Syntax is usually associated with the rules (or grammar) governing the composition of texts in a formal language that constitute the well-formed formulas of a formal system.

In computer science, the term syntax refers to the rules governing the composition of meaningful texts in a formal language, such as a programming language, that is, those texts for which it makes sense to define the semantics or meaning, or otherwise provide an interpretation.[2]

14.1 Syntactic entities

14.1.1 Symbols

Main article: Symbol (formal)

A symbol is an idea, abstraction or concept, tokens of which may be marks or a configuration of marks which form a particular pattern. Symbols of a formal language need not be symbols of anything. For instance there are logical constants which do not refer to any idea, but rather serve as a form of punctuation in the language (e.g. parentheses). A symbol or string of symbols may comprise a well-formed formula if the formulation is consistent with the formation rules of the language. Symbols of a formal language must be capable of being specified without any reference to any interpretation of them.

14.1.2 Formal language

Main article: Formal language

A formal language is a syntactic entity which consists of a set of finite strings of symbols which are its words (usually called its well-formed formulas). Which strings of symbols are words is determined by fiat by the creator of the language, usually by specifying a set of formation rules. Such a language can be defined without reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it – that is, before it has any meaning.

14.1.3 Formation rules

Main article: Formation rule

Formation rules are a precise description of which strings of symbols are the well-formed formulas of a formal language. It is synonymous with the set of strings over the alphabet of the formal language which constitute well-formed formulas. However, it does not describe their semantics (i.e. what they mean).
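As an illustration (the particular language below is an assumption made here, not taken from the article), the formation rules of a tiny propositional language with symbols p, q, r, negation ¬ and implication → can be turned directly into a recursive test; a string over the alphabet is a well-formed formula exactly when the test succeeds.

    # Formation rules of a toy language:
    #   1. 'p', 'q', 'r' are well-formed formulas (wffs);
    #   2. if A is a wff, then '¬' followed by A is a wff;
    #   3. if A and B are wffs, then '(' + A + '→' + B + ')' is a wff;
    #   4. nothing else is a wff.

    def parse(s, i=0):
        """Return the index just after one wff starting at position i, or -1."""
        if i < len(s) and s[i] in 'pqr':
            return i + 1
        if i < len(s) and s[i] == '¬':
            return parse(s, i + 1)
        if i < len(s) and s[i] == '(':
            j = parse(s, i + 1)
            if j != -1 and j < len(s) and s[j] == '→':
                k = parse(s, j + 1)
                if k != -1 and k < len(s) and s[k] == ')':
                    return k + 1
        return -1

    def is_wff(s):
        return parse(s) == len(s)

    print(is_wff('(p→¬q)'))   # True: built by the formation rules
    print(is_wff('p→q('))     # False: a nonsense string over the same alphabet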

14.1.4 Propositions

Main article: Proposition

A proposition is a sentence expressing something true or false. A proposition is identified ontologically as an idea, concept or abstraction whose token instances are patterns of symbols, marks, sounds, or strings of words.[3] Propositions are considered to be syntactic entities and also truthbearers.

14.1.5 Formal theories

Main article: Theory (mathematical logic)


A formal theory is a set of sentences in a formal language.

14.1.6 Formal systems

Main article: Formal system

A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions. Formal systems, like other syntactic entities, may be defined without any interpretation given to them (as being, for instance, a system of arithmetic).

Syntactic consequence within a formal system

A formula A is a syntactic consequence[4][5][6][7] within some formal system FS of a set Γ of formulas if there is a derivation in formal system FS of A from the set Γ.

Γ ⊢FS A

Syntactic consequence does not depend on any interpretation of the formal system.[8]

Syntactic completeness of a formal system

Main article: Completeness (logic)

A formal system S is syntactically complete[9][10][11][12] (also deductively complete, maximally complete, negation complete or simply complete) iff for each formula A of the language of the system either A or ¬A is a theorem of S. In another sense, a formal system is syntactically complete iff no unprovable axiom can be added to it as an axiom without introducing an inconsistency. Truth-functional propositional logic and first-order predicate logic are semantically complete, but not syntactically complete (for example the propositional logic statement consisting of a single variable “a” is not a theorem, and neither is its negation, but these are not tautologies). Gödel’s incompleteness theorem shows that no recursive system that is sufficiently powerful, such as the Peano axioms, can be both consistent and complete.
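The parenthetical example can be checked on the semantic side: since the propositional calculus is sound, a formula that is not a tautology cannot be a theorem, and a brute-force truth table (a sketch, not from the article; the helper name is an assumption) confirms that neither the single variable a nor its negation is a tautology.

    from itertools import product

    def is_tautology(formula, variables):
        """True if the formula holds under every truth-value assignment."""
        return all(formula(dict(zip(variables, values)))
                   for values in product([True, False], repeat=len(variables)))

    a = lambda v: v['a']            # the formula consisting of the single variable a
    not_a = lambda v: not v['a']    # its negation

    print(is_tautology(a, ['a']))       # False: falsified by a = False
    print(is_tautology(not_a, ['a']))   # False: falsified by a = True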

14.1.7 Interpretations

Main articles: Formal semantics (logic) and Interpretation (logic)

An interpretation of a formal system is the assignment of meanings to the symbols, and truth values to the sentences of a formal system. The study of interpretations is called formal semantics. Giving an interpretation is synonymous with constructing a model. An interpretation is expressed in a metalanguage, which may itself be a formal language, and as such itself is a syntactic entity.

14.2 References

[1] Dictionary Definition

[2] Abstract Syntax and Logic Programming

[3] Metalogic, Geoffrey Hunter

[4] Dummett, M. (1981). Frege: Philosophy of Language. Harvard University Press. p. 82. ISBN 9780674319318. Retrieved 2014-10-15.


[5] Lear, J. (1986). Aristotle and Logical Theory. Cambridge University Press. p. 1. ISBN 9780521311786. Retrieved 2014-10-15.

[6] Creath, R.; Friedman, M. (2007). The Cambridge Companion to Carnap. Cambridge University Press. p. 189. ISBN 9780521840156. Retrieved 2014-10-15.

[7] “syntactic consequence from FOLDOC”. swif.uniba.it. Retrieved 2014-10-15.

[8] Hunter, Geoffrey, Metalogic: An Introduction to the Metatheory of Standard First-Order Logic, University of California Press, 1971, p. 75.

[9] “A Note on Interaction and Incompleteness” (PDF). Retrieved 2014-10-15.

[10] “Normal forms and syntactic completeness proofs for functional independencies”. portal.acm.org. Retrieved 2014-10-15.

[11] Barwise, J. (1982). Handbook of Mathematical Logic. Elsevier Science. p. 236. ISBN 9780080933641. Retrieved 2014-10-15.

[12] “syntactic completeness from FOLDOC”. swif.uniba.it. Retrieved 2014-10-15.

14.3 See also

• Symbol (formal)

• Formation rule

• Formal grammar

• Syntax (linguistics)

• Syntax (programming languages)

• Mathematical logic

• Well-formed formula


Chapter 15

Theorem

For the Italian film, see Teorema (film).

In mathematics, a theorem is a statement that has been proven on the basis of previously established statements, such as other theorems, and generally accepted statements, such as axioms. The proof of a mathematical theorem is a logical argument for the theorem statement given in accord with the rules of a deductive system. The proof of a theorem is often interpreted as justification of the truth of the theorem statement. In light of the requirement that theorems be proved, the concept of a theorem is fundamentally deductive, in contrast to the notion of a scientific theory, which is empirical.[2]

Many mathematical theorems are conditional statements. In this case, the proof deduces the conclusion from conditions called hypotheses or premises. In light of the interpretation of proof as justification of truth, the conclusion is often viewed as a necessary consequence of the hypotheses, namely, that the conclusion is true in case the hypotheses are true, without any further assumptions. However, the conditional could be interpreted differently in certain deductive systems, depending on the meanings assigned to the derivation rules and the conditional symbol.

Although they can be written in a completely symbolic form, for example, within the propositional calculus, theorems are often expressed in a natural language such as English. The same is true of proofs, which are often expressed as logically organized and clearly worded informal arguments, intended to convince readers of the truth of the statement of the theorem beyond any doubt, and from which a formal symbolic proof can in principle be constructed. Such arguments are typically easier to check than purely symbolic ones; indeed, many mathematicians would express a preference for a proof that not only demonstrates the validity of a theorem, but also explains in some way why it is obviously true. In some cases, a picture alone may be sufficient to prove a theorem.

Because theorems lie at the core of mathematics, they are also central to its aesthetics. Theorems are often described as being “trivial”, or “difficult”, or “deep”, or even “beautiful”. These subjective judgments vary not only from person to person, but also with time: for example, as a proof is simplified or better understood, a theorem that was once difficult may become trivial. On the other hand, a deep theorem may be simply stated, but its proof may involve surprising and subtle connections between disparate areas of mathematics. Fermat’s Last Theorem is a particularly well-known example of such a theorem.

15.1 Informal account of theorems

Logically, many theorems are of the form of an indicative conditional: if A, then B. Such a theorem does not assert B, only that B is a necessary consequence of A. In this case A is called the hypothesis of the theorem (note that “hypothesis” here is something very different from a conjecture) and B the conclusion (formally, A and B are termed the antecedent and consequent). The theorem “If n is an even natural number then n/2 is a natural number” is a typical example in which the hypothesis is "n is an even natural number” and the conclusion is "n/2 is also a natural number”.

To be proven, a theorem must be expressible as a precise, formal statement. Nevertheless, theorems are usually expressed in natural language rather than in a completely symbolic form, with the intention that the reader can produce a formal statement from the informal one.

It is common in mathematics to choose a number of hypotheses within a given language and declare that the theory consists of all statements provable from these hypotheses. These hypotheses form the foundational basis of the theory and are called axioms or postulates. The field of mathematics known as proof theory studies formal languages, axioms and the structure of proofs.

Some theorems are “trivial”, in the sense that they follow from definitions, axioms, and other theorems in obvious ways and do not contain any surprising insights. Some, on the other hand, may be called “deep”, because their proofs may be long and difficult, involve areas of mathematics superficially distinct from the statement of the theorem itself, or show surprising connections between disparate areas of mathematics.[3] A theorem might be simple to state and yet be deep. An excellent example is Fermat’s Last Theorem, and there are many other examples of simple yet deep theorems in number theory and combinatorics, among other areas.

Other theorems have a known proof that cannot easily be written down. The most prominent examples are the four color theorem and the Kepler conjecture. Both of these theorems are only known to be true by reducing them to a computational search that is then verified by a computer program. Initially, many mathematicians did not accept this form of proof, but it has become more widely accepted. The mathematician Doron Zeilberger has even gone so far as to claim that these are possibly the only nontrivial results that mathematicians have ever proved.[4] Many mathematical theorems can be reduced to more straightforward computation, including polynomial identities, trigonometric identities and hypergeometric identities.[5]

15.2 Provability and theoremhood

To establish a mathematical statement as a theorem, a proof is required, that is, a line of reasoning from axioms in the system (and other, already established theorems) to the given statement must be demonstrated. However, the proof is usually considered as separate from the theorem statement. Although more than one proof may be known for a single theorem, only one proof is required to establish the status of a statement as a theorem. The Pythagorean theorem and the law of quadratic reciprocity are contenders for the title of theorem with the greatest number of distinct proofs.

15.3 Relation with scientific theories

Theorems in mathematics and theories in science are fundamentally different in their epistemology. A scientific theory cannot be proven; its key attribute is that it is falsifiable, that is, it makes predictions about the natural world that are testable by experiments. Any disagreement between prediction and experiment demonstrates the incorrectness of the scientific theory, or at least limits its accuracy or domain of validity. Mathematical theorems, on the other hand, are purely abstract formal statements: the proof of a theorem cannot involve experiments or other empirical evidence in the same way such evidence is used to support scientific theories.

Nonetheless, there is some degree of empiricism and data collection involved in the discovery of mathematical theorems. By establishing a pattern, sometimes with the use of a powerful computer, mathematicians may have an idea of what to prove, and in some cases even a plan for how to set about doing the proof. For example, the Collatz conjecture has been verified for start values up to about 2.88 × 10^18. The Riemann hypothesis has been verified for the first 10 trillion zeroes of the zeta function. Neither of these statements is considered proven.

Such evidence does not constitute proof. For example, the Mertens conjecture is a statement about natural numbers that is now known to be false, but no explicit counterexample (i.e., a natural number n for which the Mertens function M(n) equals or exceeds the square root of n) is known: all numbers less than 10^14 have the Mertens property, and the smallest number that does not have this property is only known to be less than the exponential of 1.59 × 10^40, which is approximately 10 to the power 4.3 × 10^39. Since the number of particles in the universe is generally considered less than 10 to the power 100 (a googol), there is no hope to find an explicit counterexample by exhaustive search.

Note that the word “theory” also exists in mathematics, to denote a body of mathematical axioms, definitions and theorems, as in, for example, group theory. There are also “theorems” in science, particularly physics, and in engineering, but they often have statements and proofs in which physical assumptions and intuition play an important role; the physical axioms on which such “theorems” are based are themselves falsifiable.

15.4 Terminology

A number of different terms for mathematical statements exist; these terms indicate the role statements play in a particular subject. The distinction between different terms is sometimes rather arbitrary, and the usage of some terms has evolved over time.


• An axiom or postulate is a statement that is accepted without proof and regarded as fundamental to a subject. Historically these have been regarded as “self-evident”, but more recently they are considered assumptions that characterize the subject of study. In classical geometry, axioms are general statements, while postulates are statements about geometrical objects.[6] A definition is also accepted without proof since it simply gives the meaning of a word or phrase in terms of known concepts.

• An unproven statement that is believed true is called a conjecture (or sometimes a hypothesis, but with a different meaning from the one discussed above). To be considered a conjecture, a statement must usually be proposed publicly, at which point the name of the proponent may be attached to the conjecture, as with Goldbach’s conjecture. Other famous conjectures include the Collatz conjecture and the Riemann hypothesis. On the other hand, Fermat’s last theorem has always been known by that name, even before it was proven; it was never known as “Fermat’s conjecture”.

• A proposition is a theorem of no particular importance. This term sometimes connotes a statement with a simple proof, while the term theorem is usually reserved for the most important results or those with long or difficult proofs. In classical geometry, a proposition may be a construction that satisfies given requirements; for example, Proposition 1 in Book I of Euclid’s Elements is the construction of an equilateral triangle.[7]

• A lemma is a “helping theorem”, a proposition with little applicability except that it forms part of the proof of a larger theorem. In some cases, as the relative importance of different theorems becomes more clear, what was once considered a lemma is now considered a theorem, though the word “lemma” remains in the name. Examples include Gauss’s lemma, Zorn’s lemma, and the Fundamental lemma.

• A corollary is a proposition that follows with little or no proof from another theorem or definition.[8]

• A converse of a theorem is a statement formed by interchanging what is given in a theorem and what is to be proved. For example, the isosceles triangle theorem states that if two sides of a triangle are equal then two angles are equal. In the converse, the given (that two sides are equal) and what is to be proved (that two angles are equal) are swapped, so the converse is the statement that if two angles of a triangle are equal then two sides are equal. In this example, the converse can be proven as another theorem, but this is often not the case. For example, the converse to the theorem that two right angles are equal angles is the statement that two equal angles must be right angles, and this is clearly not always the case.[9]

• A generalization is a theorem which includes a previously proven theorem as a special case and hence as a corollary.

There are other terms, less commonly used, that are conventionally attached to proven statements, so that certain theorems are referred to by historical or customary names. For example:

• An identity is an equality, contained in a theorem, between two mathematical expressions that holds regardless of what values are used for any variables or parameters appearing in the expressions. Examples include Euler’s formula and Vandermonde’s identity.

• A rule is a theorem, such as Bayes’ rule and Cramer’s rule, that establishes a useful formula.

• A law or a principle is a theorem that applies in a wide range of circumstances. Examples include the law of large numbers, the law of cosines, Kolmogorov’s zero-one law, Harnack’s principle, the least upper bound principle, and the pigeonhole principle.[10]

A few well-known theorems have even more idiosyncratic names. The division algorithm (see Euclidean division) is a theorem expressing the outcome of division in the natural numbers and more general rings. Bézout’s identity is a theorem asserting that the greatest common divisor of two numbers may be written as a linear combination of these numbers. The Banach–Tarski paradox is a theorem in measure theory that is paradoxical in the sense that it contradicts common intuitions about volume in three-dimensional space.


15.5 Layout

A theorem and its proof are typically laid out as follows:

Theorem (name of person who proved it and year of discovery, proof or publication).
Statement of theorem (sometimes called the proposition).
Proof.
Description of proof.
End mark.

The end of the proof may be signalled by the letters Q.E.D. (quod erat demonstrandum) or by one of the tombstone marks "□" or "∎" meaning “End of Proof”, introduced by Paul Halmos following their usage in magazine articles. The exact style depends on the author or publication. Many publications provide instructions or macros for typesetting in the house style.

It is common for a theorem to be preceded by definitions describing the exact meaning of the terms used in the theorem. It is also common for a theorem to be preceded by a number of propositions or lemmas which are then used in the proof. However, lemmas are sometimes embedded in the proof of a theorem, either with nested proofs, or with their proofs presented after the proof of the theorem.

Corollaries to a theorem are either presented between the theorem and the proof, or directly after the proof. Sometimes, corollaries have proofs of their own that explain why they follow from the theorem.

15.6 Lore

It has been estimated that over a quarter of a million theorems are proved every year.[11]

The well-known aphorism, “A mathematician is a device for turning coffee into theorems”, is probably due to Alfréd Rényi, although it is often attributed to Rényi’s colleague Paul Erdős (and Rényi may have been thinking of Erdős), who was famous for the many theorems he produced, the number of his collaborations, and his coffee drinking.[12]

The classification of finite simple groups is regarded by some to be the longest proof of a theorem. It comprises tens of thousands of pages in 500 journal articles by some 100 authors. These papers are together believed to give a complete proof, and several ongoing projects hope to shorten and simplify this proof.[13] Another theorem of this type is the four color theorem, whose computer-generated proof is too long for a human to read. It is certainly the longest known proof of a theorem whose statement can be easily understood by a layman.

15.7 Theorems in logic

Logic, especially in the field of proof theory, considers theorems as statements (called formulas or well-formed formulas) of a formal language. The statements of the language are strings of symbols and may be broadly divided into nonsense and well-formed formulas. A set of deduction rules, also called transformation rules or rules of inference, must be provided. These deduction rules tell exactly when a formula can be derived from a set of premises. The set of well-formed formulas may be broadly divided into theorems and non-theorems. However, according to Hofstadter, a formal system often simply defines all its well-formed formulas as theorems.[14]

Different sets of derivation rules give rise to different interpretations of what it means for an expression to be a theorem. Some derivation rules and formal languages are intended to capture mathematical reasoning; the most common examples use first-order logic. Other deductive systems describe term rewriting, such as the reduction rules for λ calculus.

The definition of theorems as elements of a formal language allows for results in proof theory that study the structure of formal proofs and the structure of provable formulas. The most famous result is Gödel’s incompleteness theorem; by representing theorems about basic number theory as expressions in a formal language, and then representing this language within number theory itself, Gödel constructed examples of statements that are neither provable nor disprovable from axiomatizations of number theory.


A theorem may be expressed in a formal language (or “formalized”). A formal theorem is the purely formal analogue of a theorem. In general, a formal theorem is a type of well-formed formula that satisfies certain logical and syntactic conditions. The notation ⊢ S is often used to indicate that S is a theorem.

Formal theorems consist of formulas of a formal language and the transformation rules of a formal system. Specifically, a formal theorem is always the last formula of a derivation in some formal system, each formula of which is a logical consequence of the formulas that came before it in the derivation. The initially accepted formulas in the derivation are called its axioms, and are the basis on which the theorem is derived. A set of theorems is called a theory.

What makes formal theorems useful and of interest is that they can be interpreted as true propositions and their derivations may be interpreted as a proof of the truth of the resulting expression. A set of formal theorems may be referred to as a formal theory. A theorem whose interpretation is a true statement about a formal system is called a metatheorem.

15.7.1 Syntax and semantics

Main articles: Syntax (logic) and Formal semantics (logic)

The concept of a formal theorem is fundamentally syntactic, in contrast to the notion of a true proposition, which introduces semantics. Different deductive systems can yield other interpretations, depending on the presumptions of the derivation rules (i.e. belief, justification or other modalities). The soundness of a formal system depends on whether or not all of its theorems are also validities. A validity is a formula that is true under any possible interpretation, e.g. in classical propositional logic validities are tautologies. A formal system is considered semantically complete when all of its tautologies are also theorems.

15.7.2 Derivation of a theorem

Main article: Formal proof

The notion of a theorem is very closely connected to its formal proof (also called a “derivation”). To illustrate how derivations are done, we will work in a very simplified formal system. Let us call ours FS. Its alphabet consists only of two symbols { A, B } and its formation rule for formulas is:

FS

The single axiom of FS is:

ABBA.

The only rule of inference (transformation rule) for FS is:

Any occurrence of "A" in a theorem may be replaced by an occurrence of the string "AB" and the result is a theorem.

Theorems in FS are defined as those formulae for which there is a derivation ending with that formula. For example,

1. ABBA (Given as axiom)

2. ABBBA (by applying the transformation rule)

3. ABBBAB (by applying the transformation rule)

is a derivation. Therefore, "ABBBAB" is a theorem of FS. The notion of truth (or falsity) cannot be applied to the formula "ABBBAB" until an interpretation is given to its symbols. Thus in this example, the formula does not yet represent a proposition, but is merely an empty abstraction.

Two metatheorems of FS are:


Every theorem begins with "A".
Every theorem has exactly two "A"s.
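These metatheorems can be checked mechanically. The following sketch is illustrative (the function names are assumptions, not from the article): it starts from the axiom ABBA, repeatedly applies the single transformation rule, and confirms both that ABBBAB is derivable and that every theorem generated up to a given length begins with "A" and contains exactly two "A"s.

    # Enumerate theorems of FS up to a given length by applying the rule
    # "replace an occurrence of 'A' with 'AB'" to already derived theorems.

    def successors(theorem):
        """All theorems obtainable from `theorem` by one rule application."""
        return {theorem[:i] + 'AB' + theorem[i + 1:]
                for i, ch in enumerate(theorem) if ch == 'A'}

    def theorems_up_to(max_len, axiom='ABBA'):
        derived, frontier = {axiom}, {axiom}
        while frontier:
            frontier = {t for s in frontier for t in successors(s)
                        if len(t) <= max_len} - derived
            derived |= frontier
        return derived

    thms = theorems_up_to(8)
    print('ABBBAB' in thms)                                              # True
    print(all(t.startswith('A') and t.count('A') == 2 for t in thms))    # True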

15.7.3 Interpretation of a formal theorem

Main article: Interpretation (logic)

15.7.4 Theorems and theories

Main articles: Theory and Theory (mathematical logic)

15.8 See also

• Inference

• List of theorems

• Toy theorem

• Metamath – a language for developing strictly formalized mathematical definitions and proofs accompanied by a proof checker for this language and a growing database of thousands of proved theorems

15.9 Notes

[1] For full text of 2nd edition of 1940, see Elisha Scott Loomis. “The Pythagorean proposition: its demonstrations analyzed and classified, and bibliography of sources for data of the four kinds of proofs” (PDF). Education Resources Information Center. Institute of Education Sciences (IES) of the U.S. Department of Education. Retrieved 2010-09-26. Originally published in 1940 and reprinted in 1968 by National Council of Teachers of Mathematics.

[2] However, both theorems and theories are investigations. See Heath 1897 Introduction, The terminology of Archimedes, p. clxxxii: “theorem (θεώρημα) from θεωρεῖν, to investigate”.

[3] Weisstein, Eric W., “Deep Theorem”, MathWorld.

[4] Doron Zeilberger. “Opinion 51”.

[5] Petkovsek et al. 1996.

[6] Wentworth, G.; Smith, D.E. (1913). “Art. 46, 47”. Plane Geometry. Ginn & Co.

[7] Wentworth & Smith Art. 50

[8] Wentworth & Smith Art. 51

[9] Follows Wentworth & Smith Art. 79

[10] The word law can also refer to an axiom, a rule of inference, or, in probability theory, a probability distribution.

[11] Hoffman 1998, p. 204.

[12] Hoffman 1998, p. 7.

[13] An enormous theorem: the classification of finite simple groups, Richard Elwes, Plus Magazine, Issue 41 December 2006.

[14] Hofstadter 1980


15.10 References

• Heath, Sir Thomas Little (1897). The works of Archimedes. Dover. Retrieved 2009-11-15.

• Hoffman, P. (1998). The Man Who Loved Only Numbers: The Story of Paul Erdős and the Search for Mathematical Truth. Hyperion, New York. ISBN 1-85702-829-5.

• Hofstadter, Douglas (1979). Gödel, Escher, Bach: An Eternal Golden Braid. Basic Books.

• Hunter, Geoffrey (1996) [1973]. Metalogic: An Introduction to the Metatheory of Standard First Order Logic. University of California Press. ISBN 0-520-02356-0.

• Mates, Benson (1972). Elementary Logic. Oxford University Press. ISBN 0-19-501491-X.

• Petkovsek, Marko; Wilf, Herbert; Zeilberger, Doron (1996). A = B. A.K. Peters, Wellesley, Massachusetts. ISBN 1-56881-063-6.

15.11 External links

• Weisstein, Eric W., “Theorem”, MathWorld.

• Theorem of the Day


The Pythagorean theorem has at least 370 known proofs[1]


A planar map with five colors such that no two regions with the same color meet. It can actually be colored in this way with only four colors. The four color theorem states that such colorings are possible for any planar map, but every known proof involves a computational search that is too long to check by hand.


The Collatz conjecture: one way to illustrate its complexity is to extend the iteration from the natural numbers to the complex numbers. The result is a fractal, which (in accordance with universality) resembles the Mandelbrot set.


This diagram shows the syntactic entities that can be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.


Chapter 16

Theory (mathematical logic)

In mathematical logic, a theory (also called a formal theory) is a set of sentences in a formal language. Usually a deductive system is understood from context. An element ϕ ∈ T of a theory T is then called an axiom of the theory, and any sentence that follows from the axioms (T ⊢ ϕ) is called a theorem of the theory. Every axiom is also a theorem. A first-order theory is a set of first-order sentences.

16.1 Theories expressed in formal language generally

When defining theories for foundational purposes, additional care must be taken and normal set-theoretic language may not be appropriate.

The construction of a theory begins by specifying a definite non-empty conceptual class E, the elements of which are called statements. These initial statements are often called the primitive elements or elementary statements of the theory, to distinguish them from other statements which may be derived from them.

A theory T is a conceptual class consisting of certain of these elementary statements. The elementary statements which belong to T are called the elementary theorems of T and said to be true. In this way, a theory is a way of designating a subset of E which consists entirely of true statements.

This general way of designating a theory stipulates that the truth of any of its elementary statements is not known without reference to T. Thus the same elementary statement may be true with respect to one theory, and not true with respect to another. This is as in ordinary language, where statements such as “He is a terrible person.” cannot be judged to be true or false without reference to some interpretation of who “He” is and for that matter what a “terrible person” is under this theory.[1]

16.1.1 Subtheories and extensions

A theory S is a subtheory of a theory T if S is a subset of T. If T is a subset of S, then S is an extension or supertheory of T.

16.1.2 Deductive theories

A theory is said to be a deductive theory if T is an inductive class; that is, its content is based on some formal deductive system and some of its elementary statements are taken as axioms. In a deductive theory, any sentence which is a logical consequence of one or more of the axioms is also a sentence of that theory.[1]

16.1.3 Consistency and completeness

Main articles: Consistency and Completeness (logic)


A syntactically consistent theory is a theory from which not every sentence in the underlying language can be proven (with respect to some deductive system which is usually clear from context). In a deductive system (such as first-order logic) that satisfies the principle of explosion, this is equivalent to requiring that there is no sentence φ such that both φ and its negation can be proven from the theory.

A satisfiable theory is a theory that has a model. This means there is a structure M that satisfies every sentence in the theory. Any satisfiable theory is syntactically consistent, because the structure satisfying the theory will satisfy exactly one of φ and the negation of φ, for each sentence φ.

A consistent theory is sometimes defined to be a syntactically consistent theory, and sometimes defined to be a satisfiable theory. For first-order logic, the most important case, it follows from the completeness theorem that the two meanings coincide. In other logics, such as second-order logic, there are syntactically consistent theories that are not satisfiable, such as ω-inconsistent theories.

A complete consistent theory (or just a complete theory) is a consistent theory T such that for every sentence φ in its language, either φ is provable from T or T ∪ {φ} is inconsistent. For theories closed under logical consequence, this means that for every sentence φ, either φ or its negation is contained in the theory. An incomplete theory is a consistent theory that is not complete.

See also ω-consistent theory for a stronger notion of consistency.

16.1.4 Interpretation of a theory

Main article: Interpretation (logic)

An interpretation of a theory is the relationship between a theory and some contensive subject matter when there is a many-to-one correspondence between certain elementary statements of the theory, and certain contensive statements related to the subject matter. If every elementary statement in the theory has a contensive correspondent it is called a full interpretation, otherwise it is called a partial interpretation.[2]

16.1.5 Theories associated with a structure

Each structure has several associated theories. The complete theory of a structure A is the set of all first-order sentences over the signature of A which are satisfied by A. It is denoted by Th(A). More generally, the theory of K, a class of σ-structures, is the set of all first-order σ-sentences that are satisfied by all structures in K, and is denoted by Th(K). Clearly Th(A) = Th({A}). These notions can also be defined with respect to other logics.

For each σ-structure A, there are several associated theories in a larger signature σ' that extends σ by adding one new constant symbol for each element of the domain of A. (If the new constant symbols are identified with the elements of A which they represent, σ' can be taken to be σ ∪ A.) The cardinality of σ' is thus the larger of the cardinality of σ and the cardinality of A.

The diagram of A consists of all atomic or negated atomic σ'-sentences that are satisfied by A and is denoted by diagA. The positive diagram of A is the set of all atomic σ'-sentences which A satisfies. It is denoted by diag+A. The elementary diagram of A is the set eldiagA of all first-order σ'-sentences that are satisfied by A or, equivalently, the complete (first-order) theory of the natural expansion of A to the signature σ'.

16.2 First-order theories

Further information: List of first-order theories

A first-order theory QS is a set of sentences in a first-order formal language Q.

16.2.1 Derivation in a first order theory

Main article: First order logic § Deductive systems


There are many formal derivation (“proof”) systems for first-order logic.

16.2.2 Syntactic consequence in a first order theory

Main article: First-order logic § Validity, satisfiability, and logical consequence

A formula A is a syntactic consequence of a first-order theory QS if there is a derivation of A using only formulas in QS as non-logical axioms. Such a formula A is also called a theorem of QS. The notation "QS ⊢ A" indicates that A is a theorem of QS.

16.2.3 Interpretation of a first order theory

Main article: Structure (mathematical logic)

An interpretation of a first-order theory provides a semantics for the formulas of the theory. An interpretation is said to satisfy a formula if the formula is true according to the interpretation. A model of a first-order theory QS is an interpretation in which every formula of QS is satisfied.

16.2.4 First order theories with identity

Main article: First order logic § Equality and its axioms

A first-order theory QS is a first-order theory with identity if QS includes the identity relation symbol "=" and the reflexivity and substitution axiom schemes for this symbol.

16.2.5 Topics related to first order theories

• Compactness theorem

• Consistent set

• Deduction theorem

• Enumeration theorem

• Lindenbaum’s lemma

• Löwenheim–Skolem theorem

16.3 Examples

One way to specify a theory is to define a set of axioms in a particular language. The theory can be taken to include just those axioms, or their logical or provable consequences, as desired. Theories obtained this way include ZFC and Peano arithmetic.

A second way to specify a theory is to begin with a structure and then let the theory be the set of sentences that are satisfied by the structure. This is one method for producing complete theories. Examples of theories of this sort include the sets of true sentences in the structures (N, +, ×, 0, 1, =) and (R, +, ×, 0, 1, =), where N is the set of natural numbers and R is the set of real numbers. The first of these, called the theory of true arithmetic, cannot be written as the set of logical consequences of any enumerable set of axioms. The theory of (R, +, ×, 0, 1, =) was shown by Tarski to be decidable; it is the theory of real closed fields.


16.4 See also

• Axiomatic system

• List of first-order theories

16.5 References

[1] Curry, Haskell, Foundations of Mathematical Logic

[2] Curry, Haskell, Foundations of Mathematical Logic, p. 48

16.6 Further reading

• Hodges, Wilfrid (1997). A shorter model theory. Cambridge University Press. ISBN 0-521-58713-1.


Chapter 17

Unate function

A unate function is a type of boolean function which has monotonic properties. They have been studied extensively in switching theory.

A function f(x1, x2, . . . , xn) is said to be positive unate in xi if for all possible values of xj, j ≠ i,

f(x1, x2, . . . , xi−1, 1, xi+1, . . . , xn) ≥ f(x1, x2, . . . , xi−1, 0, xi+1, . . . , xn).

Likewise, it is negative unate in xi if

f(x1, x2, . . . , xi−1, 0, xi+1, . . . , xn) ≥ f(x1, x2, . . . , xi−1, 1, xi+1, . . . , xn).

If for every xi, f is either positive or negative unate in the variable xi, then it is said to be unate (note that some xi may be positive unate and some negative unate and still satisfy the definition of a unate function). A function is binate if it is not unate (i.e., it is neither positive unate nor negative unate in at least one of its variables).

For example, the logical disjunction function OR, with the boolean values 1 for true and 0 for false, is positive unate.

NB: positive unateness can also be thought of as passing a transition with the same slope (the output changes in the same direction as the input), negative unateness as passing the opposite slope, and non-unate behaviour as dependence on more than one input (of the same or different slopes).
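These definitions can be checked by brute force over all input combinations. The sketch below is illustrative (the helper names are assumptions, not from the article): it tests positive and negative unateness in each variable, classifying OR as positive unate and XOR as binate.

    from itertools import product

    def unate_in(f, n, i, positive=True):
        """Check positive (or negative) unateness of f(x1,...,xn) in variable i (0-based)."""
        for xs in product([0, 1], repeat=n):
            lo = f(*[(0 if j == i else x) for j, x in enumerate(xs)])
            hi = f(*[(1 if j == i else x) for j, x in enumerate(xs)])
            if positive and hi < lo:        # raising x_i must never lower f
                return False
            if not positive and lo < hi:    # raising x_i must never raise f
                return False
        return True

    def is_unate(f, n):
        return all(unate_in(f, n, i, True) or unate_in(f, n, i, False) for i in range(n))

    print(is_unate(lambda a, b: a | b, 2))   # True: OR is positive unate in both inputs
    print(is_unate(lambda a, b: a ^ b, 2))   # False: XOR is binate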


Chapter 18

Variable (mathematics)

For variables in computer science, see Variable (computer science). For other uses, see Variable (disambiguation).

In elementary mathematics, a variable is an alphabetic character representing a number, called the value of the variable, which is either arbitrary or not fully specified or unknown. Making algebraic computations with variables as if they were explicit numbers allows one to solve a range of problems in a single computation. A typical example is the quadratic formula, which allows one to solve every quadratic equation by simply substituting the numeric values of the coefficients of the given equation for the variables that represent them.

The concept of variable is also fundamental in calculus. Typically, a function y = f(x) involves two variables, y and x, representing respectively the value and the argument of the function. The term “variable” comes from the fact that, when the argument (also called the “variable of the function”) varies, then the value varies accordingly.[1]
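A sketch of that single computation (the helper name is hypothetical, not from the article): the quadratic formula is written once in terms of the variables a, b and c, and solving any particular equation is then just a substitution of numbers for those variables.

    import cmath   # complex square root, so the formula works for any coefficients

    def solve_quadratic(a, b, c):
        """Roots of a*x**2 + b*x + c = 0 by the quadratic formula (requires a != 0)."""
        d = cmath.sqrt(b * b - 4 * a * c)
        return (-b + d) / (2 * a), (-b - d) / (2 * a)

    print(solve_quadratic(1, -3, 2))   # roots of x^2 - 3x + 2 = 0: ((2+0j), (1+0j))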

In more advanced mathematics, a variable is a symbol that denotes a mathematical object, which could be a number, a vector, a matrix, or even a function. In this case, the original property of “variability” of a variable is not kept (except, sometimes, for informal explanations).

Similarly, in computer science, a variable is a name (commonly an alphabetic character or a word) representing some value stored in computer memory. In mathematical logic, a variable is either a symbol representing an unspecified term of the theory, or a basic object of the theory, which is manipulated without referring to its possible intuitive interpretation.

18.1 Etymology

“Variable” comes from a Latin word, variābilis, with "vari(us)" meaning “various” and "-ābilis" meaning "-able”, meaning “capable of changing”.[2]

18.2 Genesis and evolution of the concept

François Viète introduced at the end of the 16th century the idea of representing known and unknown numbers by letters, nowadays called variables, and of computing with them as if they were numbers, in order to obtain, at the end, the result by a simple replacement. Viète's convention was to use consonants for known values and vowels for unknowns.[3]

In 1637, René Descartes “invented the convention of representing unknowns in equations by x, y, and z, and knowns by a, b, and c".[4] Contrary to Viète’s convention, Descartes’ is still commonly in use.

Starting in the 1660s, Isaac Newton and Gottfried Wilhelm Leibniz independently developed the infinitesimal calculus, which essentially consists of studying how an infinitesimal variation of a variable quantity induces a corresponding variation of another quantity which is a function of the first variable (quantity). Almost a century later, Leonhard Euler fixed the terminology of infinitesimal calculus and introduced the notation y = f(x) for a function f, its variable x and its value y. Until the end of the 19th century, the word variable referred almost exclusively to the arguments and the values of functions.


In the second half of the 19th century, it appeared that the foundation of infinitesimal calculus was not formalized enough to deal with apparent paradoxes such as a continuous function which is nowhere differentiable. To solve this problem, Karl Weierstrass introduced a new formalism consisting of replacing the intuitive notion of limit by a formal definition. The older notion of limit was “when the variable x varies and tends toward a, then f(x) tends toward L", without any accurate definition of “tends”. Weierstrass replaced this sentence by the formula

(∀ϵ > 0)(∃η > 0)(∀x) |x− a| < η ⇒ |L− f(x)| < ϵ,

in which none of the five variables is considered as varying.

This static formulation led to the modern notion of variable, which is simply a symbol representing a mathematical object which either is unknown or may be replaced by any element of a given set; for example, the set of real numbers.

18.3 Specific kinds of variables

It is common for many variables to appear in the same mathematical formula, and they play different roles. Some names or qualifiers have been introduced to distinguish them. For example, in the general cubic equation

ax³ + bx² + cx + d = 0,

there are five variables. Four of them, a, b, c, d, represent given numbers, and the last one, x, represents the unknown number, which is a solution of the equation. To distinguish them, the variable x is called an unknown, and the other variables are called parameters or coefficients, or sometimes constants, although this last terminology is incorrect for an equation and should be reserved for the function defined by the left-hand side of this equation.

In the context of functions, the term variable refers commonly to the arguments of the functions. This is typically the case in sentences like "function of a real variable", "x is the variable of the function f: x ↦ f(x)", "f is a function of the variable x" (meaning that the argument of the function is referred to by the variable x).

In the same context, variables that are independent of x define constant functions and are therefore called constant. For example, a constant of integration is an arbitrary constant function that is added to a particular antiderivative to obtain the other antiderivatives. Because of the strong relationship between polynomials and polynomial functions, the term “constant” is often used to denote the coefficients of a polynomial, which are constant functions of the indeterminates.

This use of “constant” as an abbreviation of “constant function” must be distinguished from the normal meaning of the word in mathematics. A constant, or mathematical constant, is a well and unambiguously defined number or other mathematical object, as, for example, the numbers 0, 1, π and the identity element of a group.

Here are other specific names for variables.

• An unknown is a variable for which an equation has to be solved.

• An indeterminate is a symbol, commonly called a variable, that appears in a polynomial or a formal power series. Formally speaking, an indeterminate is not a variable, but a constant in the polynomial ring or the ring of formal power series. However, because of the strong relationship between polynomials or power series and the functions that they define, many authors consider indeterminates as a special kind of variable.

• A parameter is a quantity (usually a number) which is a part of the input of a problem and remains constant during the whole solution of this problem. For example, in mechanics the mass and the size of a solid body are parameters for the study of its movement. In computer science, parameter has a different meaning and denotes an argument of a function.

• Free variables and bound variables

• A random variable is a kind of variable that is used in probability theory and its applications.

All these denominations of variables are of a semantic nature; the way of computing with them (the syntax) is the same for all.
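To make the distinction concrete, here is a minimal Python sketch (the particular coefficient values are arbitrary): the parameters a, b, c, d of the cubic equation above are given once, while the unknown x is what is solved for.

# a, b, c, d are parameters: given numbers, fixed for the whole problem.
# x is the unknown: the quantity the equation ax^3 + bx^2 + cx + d = 0 is solved for.
import numpy as np

a, b, c, d = 1.0, -6.0, 11.0, -6.0      # arbitrary example values
roots = np.roots([a, b, c, d])          # numpy.roots takes coefficients, highest degree first
print(roots)                            # approximately [3. 2. 1.]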


18.3.1 Dependent and independent variables

Main article: Dependent and independent variables

In calculus and its application to physics and other sciences, it is rather common to consider a variable, say y, whose possible values depend on the value of another variable, say x. In mathematical terms, the dependent variable y represents the value of a function of x. To simplify formulas, it is often useful to use the same symbol for the dependent variable y and the function mapping x onto y. For example, the state of a physical system depends on measurable quantities such as the pressure, the temperature, the spatial position, ..., and all these quantities vary when the system evolves, that is, they are functions of the time. In the formulas describing the system, these quantities are represented by variables which are dependent on the time, and thus considered implicitly as functions of the time.

Therefore, in a formula, a dependent variable is a variable that is implicitly a function of another (or several other) variables. An independent variable is a variable that is not dependent.[5]

Whether a variable is dependent or independent often depends on the point of view and is not intrinsic. For example, in the notation f(x, y, z), the three variables may all be independent, in which case the notation represents a function of three variables. On the other hand, if y and z depend on x (are dependent variables), then the notation represents a function of the single independent variable x.[6]
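A minimal Python sketch of this change of point of view (the particular functions chosen for y and z are arbitrary):

def f(x, y, z):                  # three independent variables
    return x + y * z

print(f(1.0, 2.0, 3.0))          # 7.0

def y_of(x): return x ** 2       # y and z made dependent on x
def z_of(x): return 2 * x

def g(x):                        # the same expression, now a function of the single variable x
    return f(x, y_of(x), z_of(x))

print(g(1.0))                    # 1 + 1*2 = 3.0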

18.3.2 Examples

If one defines a function f from the real numbers to the real numbers by

f(x) = x² + sin(x + 4)

then x is a variable standing for the argument of the function being defined, which can be any real number. In the identity

∑_{i=1}^{n} i = (n² + n)/2

the variable i is a summation variable which designates in turn each of the integers 1, 2, ..., n (it is also called an index because its variation is over a discrete set of values), while n is a parameter (it does not vary within the formula).

In the theory of polynomials, a polynomial of degree 2 is generally denoted as ax² + bx + c, where a, b and c are called coefficients (they are assumed to be fixed, i.e., parameters of the problem considered) while x is called a variable. When studying this polynomial for its polynomial function, this x stands for the function argument. When studying the polynomial as an object in itself, x is taken to be an indeterminate, and would often be written with a capital letter instead to indicate this status.
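The different roles of i (index) and n (parameter) can be checked concretely with a short Python sketch of the identity above (the tested values of n are an arbitrary sample):

def triangular(n):
    # i ranges over 1, ..., n inside the sum; n is fixed from outside the formula.
    return sum(i for i in range(1, n + 1))

for n in (1, 5, 10, 100):
    assert triangular(n) == (n ** 2 + n) // 2
print("identity holds for the sampled values of n")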

18.4 Notation

In mathematics, variables are generally denoted by a single letter. However, this letter is frequently followed by a subscript, as in x₂, and this subscript may be a number, another variable (xᵢ), a word or the abbreviation of a word (x_in and x_out), or even a mathematical expression. Under the influence of computer science, one may encounter in pure mathematics some variable names consisting of several letters and digits.

Following the 17th-century French philosopher and mathematician René Descartes, letters at the beginning of the alphabet, e.g. a, b, c, are commonly used for known values and parameters, and letters at the end of the alphabet, e.g. x, y, z, and t, are commonly used for unknowns and variables of functions.[7] In printed mathematics, the norm is to set variables and constants in an italic typeface.[8]

For example, a general quadratic function is conventionally written as:

ax² + bx + c,


where a, b and c are parameters (also called constants, because they are constant functions), while x is the variable of the function. A more explicit way to denote this function is

x ↦ ax² + bx + c,

which makes the function-argument status of x clear, and thereby implicitly the constant status of a, b and c. Since c occurs in a term that is a constant function of x, it is called the constant term.[9]:18
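A small Python sketch of this reading (the coefficient values are arbitrary): the parameters a, b and c are fixed when the function is built, and only the argument x varies between calls.

def make_quadratic(a, b, c):
    # a, b, c are parameters; the returned function has x as its only variable.
    return lambda x: a * x ** 2 + b * x + c

q = make_quadratic(2, -3, 1)     # fixes the parameters
print(q(0), q(1), q(2))          # 1 0 3 -- the constant term c is the value q(0)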

Specific branches and applications of mathematics usually have specific naming conventions for variables. Variables with similar roles or meanings are often assigned consecutive letters. For example, the three axes in 3D coordinate space are conventionally called x, y, and z. In physics, the names of variables are largely determined by the physical quantity they describe, but various naming conventions exist. A convention often followed in probability and statistics is to use X, Y, Z for the names of random variables, keeping x, y, z for variables representing corresponding actual values.

There are many other notational usages. Usually, variables that play a similar role are represented by consecutive letters or by the same letter with different subscripts. Below are some of the most common usages.

• a, b, c, and d (sometimes extended to e and f) often represent parameters or coefficients.

• a₀, a₁, a₂, ... play a similar role, when otherwise too many different letters would be needed.

• aᵢ or uᵢ is often used to denote the i-th term of a sequence or the i-th coefficient of a series.

• f and g (sometimes h) commonly denote functions.

• i, j, and k (sometimes l or h) are often used to denote varying integers or indices in an indexed family.

• l and w are often used to represent the length and width of a figure.

• l is also used to denote a line. In number theory, l often denotes a prime number not equal to p.

• n usually denotes a fixed integer, such as a count of objects or the degree of an equation.

• When two integers are needed, for example for the dimensions of a matrix, one commonly uses m and n.

• p often denotes a prime number or a probability.

• q often denotes a prime power or a quotient.

• r often denotes a remainder.

• t often denotes time.

• x, y and z usually denote the three Cartesian coordinates of a point in Euclidean geometry. By extension, theyare used to name the corresponding axes.

• z typically denotes a complex number, or, in statistics, a normal random variable.

• α, β, γ, θ and φ commonly denote angle measures.

• ε usually represents an arbitrarily small positive number.

• ε and δ commonly denote two small positive numbers.

• λ is used for eigenvalues.

• σ often denotes a sum, or, in statistics, the standard deviation.


18.5 See also

• Free variables and bound variables (bound variables are also known as dummy variables)

• Variable (programming)

• Mathematical expression

• Physical constant

• Coefficient

• Constant of integration

• Constant term of a polynomial

• Indeterminate (variable)

• Lambda calculus

18.6 Bibliography

• J. Edwards (1892). Differential Calculus. London: MacMillan and Co. pp. 1 ff.

• Karl Menger, “On Variables in Mathematics and in Natural Science”, The British Journal for the Philosophy of Science 5:18:134–142 (August 1954). JSTOR 685170.

• Jaroslav Peregrin, “Variables in Natural Language: Where do they come from?", in M. Boettner, W. Thümmel, eds., Variable-Free Semantics, 2000, pp. 46–65.

• W. V. Quine, “Variables Explained Away”, Proceedings of the American Philosophical Society 104:343–347 (1960).

18.7 References

[1] Syracuse University. “Appendix One Review of Constants and Variables”. cstl.syr.edu.

[2] “‘Variable’ Origin”. dictionary.com. Retrieved 18 May 2015.

[3] Fraleigh, John B. (1989). A First Course in Abstract Algebra (4th ed.). United States: Addison-Wesley. p. 276. ISBN 0-201-52821-5.

[4] Tom Sorell, Descartes: A Very Short Introduction, (2000). New York: Oxford University Press. p. 19.

[5] Edwards Art. 5

[6] Edwards Art. 6

[7] Edwards Art. 4

[8] William L. Hosch (editor), The Britannica Guide to Algebra and Trigonometry, Britannica Educational Publishing, The Rosen Publishing Group, 2010, ISBN 1615302190, 9781615302192, page 71.

[9] Foerster, Paul A. (2006). Algebra and Trigonometry: Functions and Applications, Teacher’s Edition (Classics ed.). Upper Saddle River, NJ: Prentice Hall. ISBN 0-13-165711-9.


Chapter 19

Well-formed formula

[Diagram: nested sets, from outermost to innermost: symbols and strings of symbols; well-formed formulas; theorems.]
This diagram shows the syntactic entities which may be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

In mathematical logic, a well-formed formula, abbreviated wff, often simply formula, is a word (i.e. a finite sequence of symbols from a given alphabet) that is part of a formal language.[1] A formal language can be considered to be identical to the set containing all and only its formulas.

A formula is a syntactic object that can be given a semantic meaning by means of an interpretation.


19.1 Introduction

A key use of formulae is in propositional logic and predicate logics such as first-order logic. In those contexts, a formula is a string of symbols φ for which it makes sense to ask “is φ true?", once any free variables in φ have been instantiated. In formal logic, proofs can be represented by sequences of formulas with certain properties, and the final formula in the sequence is what is proven.

Although the term “formula” may be used for written marks (for instance, on a piece of paper or chalkboard), it is more precisely understood as the sequence of symbols being expressed, with the marks being a token instance of formula. It is not necessary for the existence of a formula that there be any actual tokens of it. A formal language may thus have an infinite number of formulas regardless of whether each formula has a token instance. Moreover, a single formula may have more than one token instance, if it is written more than once.

Formulas are quite often interpreted as propositions (as, for instance, in propositional logic). However, formulas are syntactic entities, and as such must be specified in a formal language without regard to any interpretation of them. An interpreted formula may be the name of something, an adjective, an adverb, a preposition, a phrase, a clause, an imperative sentence, a string of sentences, a string of names, etc. A formula may even turn out to be nonsense, if the symbols of the language are specified so that it does. Furthermore, a formula need not be given any interpretation.

19.2 Propositional calculus

The formulas of propositional calculus, also called propositional formulas,[2] are expressions such as (A ∧ (B ∨ C)). Their definition begins with the arbitrary choice of a set V of propositional variables. The alphabet consists of the letters in V along with the symbols for the propositional connectives and the parentheses "(" and ")", all of which are assumed not to be in V. The formulas will be certain expressions (that is, strings of symbols) over this alphabet.

The formulas are inductively defined as follows (a short executable sketch of this definition is given after the list):

• Each propositional variable is, on its own, a formula.

• If φ is a formula, then ¬ φ is a formula.

• If φ and ψ are formulas, and • is any binary connective, then (φ • ψ) is a formula. Here • could be (but is not limited to) the usual operators ∨, ∧, →, or ↔.
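The three clauses translate directly into a recursive well-formedness test. The following Python sketch represents formulas as nested tuples rather than strings, so the parenthesizing convention is implicit in the nesting; the chosen set of variable names is arbitrary.

VARIABLES = {"p", "q", "r", "s", "t", "u"}
BINARY = {"∧", "∨", "→", "↔"}

def is_formula(phi):
    if isinstance(phi, str):                                    # a propositional variable on its own
        return phi in VARIABLES
    if isinstance(phi, tuple) and len(phi) == 2 and phi[0] == "¬":
        return is_formula(phi[1])                               # ¬φ
    if isinstance(phi, tuple) and len(phi) == 3 and phi[1] in BINARY:
        return is_formula(phi[0]) and is_formula(phi[2])        # (φ • ψ)
    return False

# (((p → q) ∧ (r → s)) ∨ (¬q ∧ ¬s)) is well formed:
example = ((("p", "→", "q"), "∧", ("r", "→", "s")), "∨", (("¬", "q"), "∧", ("¬", "s")))
print(is_formula(example))            # True
print(is_formula(("p", "?", "q")))    # False: "?" is not a connective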

This definition can also be written as a formal grammar in Backus–Naur form, provided the set of variables is finite:

<alpha set> ::= p | q | r | s | t | u | ... (the arbitrary finite set of propositional variables)
<form> ::= <alpha set> | ¬<form> | (<form> ∧ <form>) | (<form> ∨ <form>) | (<form> → <form>) | (<form> ↔ <form>)

Using this grammar, the sequence of symbols

(((p → q) ∧ (r → s)) ∨ ( ¬ q ∧ ¬ s))

is a formula, because it is grammatically correct. The sequence of symbols

((p → q)→ (qq))p))

is not a formula, because it does not conform to the grammar.

A complex formula may be difficult to read, owing to, for example, the proliferation of parentheses. To alleviate this phenomenon, precedence rules (akin to the standard mathematical order of operations) are assumed among the operators, making some operators more binding than others. For example, assume the precedence (from most binding to least binding) 1. ¬  2. →  3. ∧  4. ∨. Then the formula

(((p → q) ∧ (r → s)) ∨ ( ¬ q ∧ ¬ s))


may be abbreviated as

p → q ∧ r → s ∨ ¬ q ∧ ¬ s

This is, however, only a convention used to simplify the written representation of a formula. If the precedence were assumed, for example, to be left-right associative, in the following order: 1. ¬  2. ∧  3. ∨  4. →, then the same formula above (without parentheses) would be rewritten as

(p → (q ∧ r))→ (s ∨ (( ¬ q) ∧ ( ¬ s)))

19.3 Predicate logic

The definition of a formula in first-order logic QS is relative to the signature of the theory at hand. This signature specifies the constant symbols, relation symbols, and function symbols of the theory, along with the arities of the function and relation symbols.

The definition of a formula comes in several parts. First, the set of terms is defined recursively. Terms, informally, are expressions that represent objects from the domain of discourse.

1. Any variable is a term.

2. Any constant symbol from the signature is a term.

3. An expression of the form f(t1,...,tn), where f is an n-ary function symbol and t1,...,tn are terms, is again a term.

The next step is to define the atomic formulas.

1. If t1 and t2 are terms then t1 = t2 is an atomic formula.

2. If R is an n-ary relation symbol, and t1,...,tn are terms, then R(t1,...,tn) is an atomic formula.

Finally, the set of formulas is defined to be the smallest set containing the set of atomic formulas such that the following holds:

1. ¬ϕ is a formula when ϕ is a formula

2. (ϕ ∧ ψ) and (ϕ ∨ ψ) are formulas when ϕ and ψ are formulas;

3. ∃xϕ is a formula when x is a variable and ϕ is a formula;

4. ∀xϕ is a formula when x is a variable and ϕ is a formula (alternatively, ∀xϕ could be defined as an abbreviation for ¬∃x¬ϕ).

If a formula has no occurrences of ∃x or ∀x, for any variable x, then it is called quantifier-free. An existential formula is a formula starting with a sequence of existential quantification followed by a quantifier-free formula.
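The two-layer definition (terms first, then formulas) is naturally mirrored by two families of recursive data types. The following Python sketch covers only a fragment of the connectives; the class names are illustrative choices, not a standard library.

from dataclasses import dataclass

@dataclass(frozen=True)
class Var:        # any variable is a term
    name: str

@dataclass(frozen=True)
class Const:      # any constant symbol from the signature is a term
    name: str

@dataclass(frozen=True)
class Func:       # f(t1, ..., tn) for an n-ary function symbol f and terms t1, ..., tn
    name: str
    args: tuple

@dataclass(frozen=True)
class Eq:         # atomic formula t1 = t2
    left: object
    right: object

@dataclass(frozen=True)
class Rel:        # atomic formula R(t1, ..., tn)
    name: str
    args: tuple

@dataclass(frozen=True)
class Not:        # ¬φ
    body: object

@dataclass(frozen=True)
class Exists:     # ∃x φ
    var: Var
    body: object

# ∃x R(x, c): an existential formula whose body R(x, c) is quantifier-free.
phi = Exists(Var("x"), Rel("R", (Var("x"), Const("c"))))
print(phi)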

19.4 Atomic and open formulas

Main article: Atomic formula

An atomic formula is a formula that contains no logical connectives or quantifiers, or equivalently a formula that has no strict subformulas. The precise form of atomic formulas depends on the formal system under consideration; for propositional logic, for example, the atomic formulas are the propositional variables. For predicate logic, the atoms are predicate symbols together with their arguments, each argument being a term.

According to some terminology, an open formula is formed by combining atomic formulas using only logical connectives, to the exclusion of quantifiers.[3] This is not to be confused with a formula which is not closed.


19.5 Closed formulas

Main article: Sentence (mathematical logic)

A closed formula, also called a ground formula or sentence, is a formula in which there are no free occurrences of any variable. If A is a formula of a first-order language in which the variables v1, ..., vn have free occurrences, then A preceded by ∀v1 ... ∀vn is a closure of A.
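A closure can be computed mechanically from the free variables of a formula. A minimal Python sketch (formulas as nested tuples, and, for simplicity, atomic formulas whose arguments are all variables):

def free_vars(phi):
    op = phi[0]
    if op in ("∀", "∃"):                        # the quantifier binds its variable
        return free_vars(phi[2]) - {phi[1]}
    if op == "¬":
        return free_vars(phi[1])
    if op in ("∧", "∨", "→", "↔"):
        return free_vars(phi[1]) | free_vars(phi[2])
    return set(phi[1:])                         # atomic case: R(v1, ..., vn)

def closure(phi):
    for v in sorted(free_vars(phi)):            # prefix one universal quantifier per free variable
        phi = ("∀", v, phi)
    return phi

A = ("∀", "x", ("R", "x", "y"))
print(free_vars(A))     # {'y'}  -- A is not closed
print(closure(A))       # ('∀', 'y', ('∀', 'x', ('R', 'x', 'y')))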

19.6 Properties applicable to formulas

• A formula A in a language Q is valid if it is true for every interpretation of Q.

• A formula A in a language Q is satisfiable if it is true for some interpretation of Q. (For the propositional case, a brute-force sketch of these two notions follows this list.)

• A formula A of the language of arithmetic is decidable if it represents a decidable set, i.e. if there is an effective method which, given a substitution of the free variables of A, says that either the resulting instance of A is provable or its negation is.
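In the propositional case, validity and satisfiability can be decided by brute force over all truth assignments. The following Python sketch (formulas as nested tuples, as in the earlier sketch) is only an illustration of the two definitions, not a practical decision procedure.

from itertools import product

def variables(phi):
    if isinstance(phi, str):
        return {phi}
    if phi[0] == "¬":
        return variables(phi[1])
    return variables(phi[0]) | variables(phi[2])

def evaluate(phi, assignment):
    if isinstance(phi, str):
        return assignment[phi]
    if phi[0] == "¬":
        return not evaluate(phi[1], assignment)
    left, op, right = phi
    l, r = evaluate(left, assignment), evaluate(right, assignment)
    return {"∧": l and r, "∨": l or r, "→": (not l) or r, "↔": l == r}[op]

def truth_values(phi):
    vs = sorted(variables(phi))
    for bits in product((False, True), repeat=len(vs)):
        yield evaluate(phi, dict(zip(vs, bits)))

def valid(phi):        return all(truth_values(phi))
def satisfiable(phi):  return any(truth_values(phi))

print(valid(("p", "∨", ("¬", "p"))))        # True: a tautology
print(satisfiable(("p", "∧", ("¬", "p"))))  # False: unsatisfiable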

19.7 Usage of the terminology

In earlier works on mathematical logic (e.g. by Church[4]), formulas referred to any strings of symbols, and among these strings, well-formed formulas were the strings that followed the formation rules of (correct) formulas.

Several authors simply say formula.[5][6][7][8] Modern usages (especially in the context of computer science with mathematical software such as model checkers, automated theorem provers, interactive theorem provers) tend to retain of the notion of formula only the algebraic concept and to leave the question of well-formedness, i.e. of the concrete string representation of formulas (using this or that symbol for connectives and quantifiers, using this or that parenthesizing convention, using Polish or infix notation, etc.), as a mere notational problem.

However, the expression well-formed formulas can still be found in various works,[9][10][11] these authors using the name well-formed formula without necessarily opposing it to the old sense of formula as an arbitrary string of symbols, so that it is no longer common in mathematical logic to refer to arbitrary strings of symbols in the old sense of formulas.

The expression “well-formed formulas” (WFF) has also pervaded popular culture. Indeed, WFF is part of an esoteric pun used in the name of the academic game “WFF 'N PROOF: The Game of Modern Logic”, by Layman Allen,[12] developed while he was at Yale Law School (he was later a professor at the University of Michigan). The suite of games is designed to teach the principles of symbolic logic to children (in Polish notation).[13] Its name is an echo of whiffenpoof, a nonsense word used as a cheer at Yale University made popular in The Whiffenpoof Song and The Whiffenpoofs.[14]

19.8 See also

• Ground expression

19.9 Notes

[1] Formulas are a standard topic in introductory logic, and are covered by all introductory textbooks, including Enderton (2001), Gamut (1990), and Kleene (1967).

[2] First-order logic and automated theorem proving, Melvin Fitting, Springer, 1996

[3] Handbook of the History of Logic (Vol. 5, Logic from Russell to Church), “Tarski’s logic” by Keith Simmons, D. Gabbay and J. Woods, eds., p. 568.

[4] Alonzo Church, [1996] (1944), Introduction to mathematical logic, page 49

Page 102: Syntax (Logic)

19.10. REFERENCES 95

[5] Hilbert, David; Ackermann, Wilhelm (1950) [1937], Principles of Mathematical Logic, New York: Chelsea

[6] Hodges, Wilfrid (1997), A shorter model theory, Cambridge University Press, ISBN 978-0-521-58713-6

[7] Barwise, Jon, ed. (1982), Handbook of Mathematical Logic, Studies in Logic and the Foundations of Mathematics, Amsterdam: North-Holland, ISBN 978-0-444-86388-1

[8] Cori, Rene; Lascar, Daniel (2000), Mathematical Logic: A Course with Exercises, Oxford University Press, ISBN 978-0-19-850048-3

[9] Enderton, Herbert [2001] (1972), A mathematical introduction to logic (2nd ed.), Boston, MA: Academic Press, ISBN 978-0-12-238452-3

[10] R. L. Simpson (1999), Essentials of Symbolic Logic, page 12

[11] Mendelson, Elliott [2010] (1964), An Introduction to Mathematical Logic (5th ed.), London: Chapman & Hall

[12] Ehrenburg 2002

[13] More technically, propositional logic using the Fitch-style calculus.

[14] Allen (1965) acknowledges the pun.

19.10 References

• Allen, Layman E. (1965), “Toward Autotelic Learning of Mathematical Logic by the WFF 'N PROOF Games”, Mathematical Learning: Report of a Conference Sponsored by the Committee on Intellective Processes Research of the Social Science Research Council, Monographs of the Society for Research in Child Development 30 (1): 29–41

• Boolos, George; Burgess, John; Jeffrey, Richard (2002), Computability and Logic (4th ed.), Cambridge University Press, ISBN 978-0-521-00758-0

• Ehrenberg, Rachel (Spring 2002). “He’s Positively Logical”. Michigan Today (University of Michigan). Retrieved 2007-08-19.

• Enderton, Herbert (2001), A mathematical introduction to logic (2nd ed.), Boston, MA: Academic Press, ISBN 978-0-12-238452-3

• Gamut, L.T.F. (1990), Logic, Language, and Meaning, Volume 1: Introduction to Logic, University of Chicago Press, ISBN 0-226-28085-3

• Hodges, Wilfrid (2001), “Classical Logic I: First-Order Logic”, in Goble, Lou, The Blackwell Guide to Philosophical Logic, Blackwell, ISBN 978-0-631-20692-7

• Hofstadter, Douglas (1980), Gödel, Escher, Bach: An Eternal Golden Braid, Penguin Books, ISBN 978-0-14-005579-5

• Kleene, Stephen Cole (2002) [1967], Mathematical logic, New York: Dover Publications, ISBN 978-0-486-42533-7, MR 1950307

• Rautenberg, Wolfgang (2010), A Concise Introduction to Mathematical Logic (3rd ed.), New York: Springer Science+Business Media, doi:10.1007/978-1-4419-1221-3, ISBN 978-1-4419-1220-6

19.11 External links

• Well-Formed Formula for First Order Predicate Logic - includes a short Java quiz.

• Well-Formed Formula at ProvenMath

• WFF N PROOF game site
