

Basic Symbolic Logic

Text for

Logic 2003A: Symbolic Logic I

Kent A. Peacock

Department of Philosophy, University of Lethbridge

August 27, 2007


Copyright © 2005, 2006, 2007 Kent A. Peacock


Preface

The purpose of this text is to provide a correct but flexible command of symbolic logic up to but not including the point at which one can begin to study metalogic (the “logic of logic”). As such, it covers most of the material usually covered in a typical second-year university course on symbolic logic, including such time-honoured topics as truth tables, truth trees, and natural deduction for propositional and first-order predicate logic (without and with identity). A working command of these techniques is simply assumed in the literature on analytic philosophy, mathematical logic, foundations of mathematics, computer science, and much of physical science.

A concluding chapter introduces the reader briefly and qualitatively to some of the riches that lie beyond, including modal and deviant logics, Gödel, Turing, artificial intelligence, and quantum computing. Not everyone will wish to explore such advanced topics in detail, but every student of philosophy and logic should be aware that such things exist, and have at least a passing acquaintance with them.

The aim of this text is not to present a rigorous development of propositional and predicate logic from the simplest and most general first principles available, but to offer a workaday (but of course valid!) system of logic that will be as useful as possible, both for beginning logicians who may later wish to go on to a more foundational treatment, and also for those many others who are interested in logic more as a tool than for its own sake.

One might naturally ask why the world needs to be blessed with yet another logic text, when there are so many excellent texts on the market now. Like almost everyone who has taught logic, I could not avoid reconstructing it in my own way. I ended up with an approach that I believe is a useful complement to other methods of teaching this sort of logic. And there is room for new approaches, for the fact remains that elementary logic, unlike certain parts of elementary mathematics, is still a developing subject. I modestly believe that the approach I present here has an advantage over many other texts at a similar level because of its emphasis on flexibility and ease of use.

The choice of topics in this text was designed in particular to meet the needs of the second-year symbolic logic course at the University of Lethbridge. We have a first-year course that is mostly informal logic and critical thinking (with a smattering of “baby” deductive logic), a second-year course called Symbolic Logic I, that introduces the formal methods of classical two-valued Boolean logic (primarily propositional and predicate calculus) up to the point at which one can begin to do metatheory, and then a third-year course, Symbolic Logic II, in which those who are so inclined can finally dip into metatheory.

The tabular notation for natural deduction used here is adapted, with some simplifications, from that used in the superb but now largely outdated texts by Mates and Lemmon [18, 17], supplemented with the highly efficient truth tree techniques pioneered by Jeffrey [14]. I build my natural deduction system on basic rules similar to those of Lemmon, but I have streamlined the presentation of natural deduction techniques in such a way as to make them as easy to use as possible. I also use Lemmon’s efficient way of setting up truth tables, with a few shortcuts of my own thrown in. In the end, I have written the logic text that I think would have benefited me the most as a student, and I hope that it will be useful to others as well. Although the book is aimed in the first instance at college and university students, I hope that it will also be useful for professionals in disciplines such as physics or computer science, or indeed anyone at all who may wish to refresh their knowledge of the techniques of logic.

The advantage of the tabular notation for natural deduction over the popular and graphically explicit Fitch system (as used in, e.g., [2, 13]) is that it is more compact and more flexible, and that it allows one to introduce a theorem at any point in the deduction. (This will all be explained below.) Logic does not yet have one notation that has become universally accepted (unlike most branches of mathematics, such as calculus), but once you have learned one notation it is not very hard to learn another. Some students may in the end prefer the Fitch system over the slightly more abstract tabular system because Fitch lets us see at a glance when we are in a subproof. On the other hand, the concision and flexibility of the tabular system certainly help to make symbolic logic useful — which places it squarely in the spirit of this text.

Although I tremendously admire Lemmon’s book, I base my teaching of logic on a different pedagogical philosophy. Lemmon, and indeed numerous other authors of logic texts, believe that the student should learn how to derive every result from first principles, using as few rules as possible. This “logical boot camp” approach certainly develops technical adroitness in those students who are willing to tough it out. But it has the disadvantage that one often requires derivations of great length and complexity in order to prove results that should be very simple and obvious. My view is that the subject is difficult enough as it is. I think that (especially for beginning students) the most useful approach is to do just what one does in mathematics, which is to use, as creatively and as early as possible, previously derived results to get new results. The ultimate purpose is to streamline derivations and get useful or interesting formulas as quickly and painlessly as possible, just as one would in (say) a trigonometry text. Trig would be nearly useless if one had to prove an identity such as cos²θ + sin²θ = 1 every time one needed to use it. I therefore encourage students to freely use earlier theorems and sequents, and I provide a formalism that allows one to do this in an efficient and clear manner. (Lemmon allows the introduction of sequents and theorems, too, but he does not emphasize it nearly as much as I think it would be useful to do so.) Again, the aim is to develop a system of logic that could actually be useful.

Another respect in which this book differs from some other treatments of symbolic logic is in its views about the purpose of learning natural deduction. Once the student has mastered semantic methods of verifying a given deduction, such as truth trees, she may well wonder what is the point of struggling through the process of constructing a syntactic derivation of a result that can be quickly checked with a tree or table.

The experienced logician will know that one reason for learning syntactic methods is that not all logics are complete, so that once we get beyond first-order predicate logic, syntactic methods are not merely a redundant gloss on semantic methods.

Another reason for learning natural deduction is that it is a formalization of the way we actually tend to think about things: presented with a collection of premisses, we do indeed go through something like natural deduction (although usually sloppily, and with a free admixture of inductive steps) to arrive at a conclusion. Like Molière’s M. Jourdain, who was astonished to learn that he had been speaking prose his whole life, we may be surprised to learn that we use natural deduction whether we know it or not; the methods of logic simply make it more explicit and precise.

There is yet another reason that natural deduction is worth learning, however, a reason that I do not think has received sufficient attention. What often happens in real life is that we are presented with a collection of givens (assumptions or known facts) and we then try to deduce what may follow from those givens. We often do not have a pre-cooked conclusion any more than we know the sum total of a list of expenses before we add it up. There are a number of exercises included here which are meant to develop the skill of using the rules of natural deduction to work from a given collection of premisses to a conclusion that may have certain desired or known properties but which has not yet been found. Once we have a complete chain of reasoning from premisses to conclusion we can, if we wish, use semantic methods to check the argument structure we have constructed. The relation of natural deduction to truth trees and tables is something like the relation of long division to multiplication. In long division we have to guess our trial divisors, but once we have an answer we can check it in a purely mechanical way using multiplication. Similarly, in natural deduction we have to guess which rules will help us uncover a conclusion with the desired properties, but once we have a conclusion we can check the deduction mechanically with semantic methods. In some of the exercises in this book the student is presented with a collection of premisses without a conclusion, and invited to explore the consequences of these premisses using the rules of natural deduction.

A number of rules and other results are used in this course that cannot be proven by means of the tools introduced in the course. (An example is the Deduction Theorem.) Rest assured that all of the rules I use are valid and can be rigorously established in a more advanced approach.

Logic resembles gymnastics or language study in that it can be learned only by practice. To this end, I have included a generous selection of exercises. I follow the practice of many old-fashioned math texts and divide these exercises into A, B, and C levels, together with occasional Questions for Discussion. The A exercises are “finger exercises” that test your understanding of the definitions and techniques introduced in the chapter. If you cannot do the A exercises then you have missed something basic in the chapter. The B exercises are a bit more complex than the As and usually combine several concepts and techniques, but they should be doable by anyone who has worked through the A exercises diligently. The B exercises are therefore tests of basic competency in the material of the chapter or section. The C exercises are advanced and may require quite a bit of thought, or even additional research beyond the material in this text. The solutions to most of the exercises are presented at the end of this book, except for some of the more open-ended problems, not all of which have a solution in closed form.

The author of any text on logic must at some point decide how to spell the word “premiss” (plural “premisses”). I have decided to follow Lemmon’s convention and use the version with double s’s because it avoids confusion with the legalistic sense of the term (as in, for instance, “keep off the premises!”) and because it is probably more widely used in logic.

Although the book is elementary, its pace is brisk. I assume that my readers are like me in that they do not enjoy being patronized; this is not Logic for the Complete Dummy. I would be very grateful for any suggestions as to how this document can be improved.

Acknowledgements

I am grateful to the University of Lethbridge and the Social Sciences and Humanities Research Council of Canada for essential material and financial support. Many thanks for discussions, advice, suggestions, shorter proofs, or error corrections to Bryson Brown, Sheldon Chow, Dawn Collins, Mark Ebner, Herb Korte, Laura Smith, Mark Smith, Sam Woodruff, and John Woods. A special thanks to Ardis Anderson for her help in tracking down M. Jourdain. Any errors that remain (and a quick induction suggests that there must be some!) are entirely my own responsibility.

The difficult and the easy complement each other.

— Lao Tzu


Contents

Preface

1 Introduction to Logic: Getting Your Feet Wet
    1.1 Logic, Logical Form, and Validity
    1.2 How to Teach Yourself Logic
        1.2.1 How To Use This Text
    1.3 Questions for Discussion

2 Propositional Logic: Symbolization and Translation
    2.1 Notation in Propositional Logic
    2.2 Common English Equivalents for Propositional Connectives
        2.2.1 Negation
        2.2.2 Conjunction
        2.2.3 Disjunction
        2.2.4 Material Implication
        2.2.5 The Biconditional
        2.2.6 Commutativity and Associativity
        2.2.7 “Unless”
        2.2.8 Exercises
    2.3 Truth Table Definitions of the Propositional Connectives
        2.3.1 Exercises
    2.4 Interpreting the Material Conditional
        2.4.1 “Ifs” versus “Only ifs”
        2.4.2 The “Paradoxes” of Material Implication
    2.5 Syntactic Conventions
        2.5.1 Scope of a Connective
        2.5.2 Main Connective
        2.5.3 Bracketing
    2.6 Combinations
        2.6.1 Exercises
    2.7 Negated Forms
        2.7.1 Exercises
    2.8 Intensional Contexts
    2.9 Alternative Notation for Propositional Connectives

3 Introduction to Natural Deduction
    3.1 Basic Rules of Derivation
        3.1.1 Alternative Names for the Basic Rules
    3.2 Comments on Basic Rules
    3.3 Simple Examples of the Basic Rules
        3.3.1 Exercises
    3.4 Some Useful Sequents
        3.4.1 One-Way Sequents
        3.4.2 Interderivability Results
        3.4.3 Form of Justification When Using Derived Rules
    3.5 Summary of Basic Sequents and Rules
        3.5.1 Basic Rules and Sequents
        3.5.2 Summary of Derived Sequents
    3.6 Fallacy Alert!
        3.6.1 A Common Misuse of vE
        3.6.2 Affirming the Consequent
        3.6.3 Denying the Antecedent
        3.6.4 Avoid Circularity!
    3.7 Exercises on Chapter 3

4 Techniques and Applications
    4.1 Recognition of Logical Form
        4.1.1 Exercises
    4.2 Combining Steps
        4.2.1 Exercises
    4.3 Theorems
        4.3.1 The “Laws of Thought”
    4.4 Substitution
        4.4.1 Uniform Substitution
        4.4.2 Nonuniform Substitution
    4.5 Introduction Rules
        4.5.1 Exercises
    4.6 The Deduction Metatheorem
        4.6.1 Exercises
        4.6.2 A Turnstile is not an Arrow!
    4.7 Boolean Algebra
        4.7.1 Exercises
    4.8 Filling in the Gaps in an Argument

5 Propositional Semantics: Truth Tables
    5.1 Syntax versus Semantics
    5.2 Truth Table Definitions of the Propositional Connectives
        5.2.1 How Many Possible Truth Tables Are There?
        5.2.2 Expressive Completeness
        5.2.3 Evaluating Complex Wffs
    5.3 Semantic Classification of Single Wffs
        5.3.1 Tautology
        5.3.2 Inconsistency
        5.3.3 One-to-One Relation Between Inconsistencies and Tautologies
        5.3.4 Contingency
        5.3.5 Consistency
    5.4 Relation of Syntax to Semantics
    5.5 Semantic Classification of Pairs of Wffs
        5.5.1 Implication
        5.5.2 Equivalence
        5.5.3 Contrariety
        5.5.4 Subcontrariety
        5.5.5 Inconsistency
        5.5.6 Consistency
        5.5.7 Independence
    5.6 Relations Between Semantic Properties of Wffs
    5.7 Semantic Classification of Sets of Wffs
        5.7.1 Equivalence
        5.7.2 Inconsistency
        5.7.3 Consistency
        5.7.4 Independence
    5.8 Testing Validity of Sequents Using Truth Tables
        5.8.1 Deduction Theorem from the Semantic Viewpoint
        5.8.2 Short-cuts
    5.9 Using Short-Cuts in Testing for Other Semantic Properties
    5.10 Evaluating Long Conjunctions or Disjunctions
    5.11 Sheffer and Nicod Strokes
    5.12 Review Exercises on Chapter 5

6 Propositional Semantics: Truth Trees
    6.1 Branches and Stacks
    6.2 Paths
    6.3 Stacking and Branching Rules
        6.3.1 Substitution of Equivalent Expressions
    6.4 Checking Semantic Properties of Formulas with Trees
        6.4.1 Tautology, Contingency, and Consistency
        6.4.2 Equivalence
    6.5 Checking Validity of Sequents
    6.6 Trees or Tables?
    6.7 Exercises
    6.8 Questions for Further Research

7 Predicate Logic: Symbolization and Translation
    7.1 Why We Need Predicate Logic
    7.2 Predicate Logic Notation
        7.2.1 Universe of Discourse
        7.2.2 Terms and Variables
        7.2.3 Monadic Predicates
        7.2.4 Relations
        7.2.5 Propositions and Propositional Functions
        7.2.6 Quantifiers
        7.2.7 Order
        7.2.8 Categorical Forms
        7.2.9 Don’t Be Fooled. . . !
        7.2.10 Combinations of Properties
        7.2.11 Use of “Except”
        7.2.12 Use of “Only”
        7.2.13 Variants of Categorical Forms Using Relations
        7.2.14 The Indefinite Article
    7.3 Exercises

8 Predicate Logic: Natural Deduction
    8.1 Duality Relations
    8.2 Introduction and Elimination Rules
        8.2.1 Universal Elimination (UE)
        8.2.2 Universal Introduction (UI)
        8.2.3 Fallacy Alert!
        8.2.4 Existential Introduction (EI)
        8.2.5 Existential Elimination (EE)
        8.2.6 Fallacy Alert!
    8.3 Summary of Introduction and Elimination Rules for Quantifiers
    8.4 Examples
    8.5 Distribution Rules for Quantifiers
        8.5.1 Summary of Distribution Rules
    8.6 Proofs of Duality Rules
    8.7 Rules of Passage
    8.8 Interderivability ⇔ Equivalence Theorem
    8.9 What to Memorize
    8.10 A Shortcut
    8.11 Fallacy Alert!
    8.12 Fallacy Alert!
    8.13 Review Exercises for Chapter 8

9 Predicate Logic With Identity
    9.1 Examples
    9.2 Fallacy Alert!
    9.3 Expressing Numerical Relations
    9.4 Superlatives
    9.5 Uniqueness
    9.6 Numerically Definite Quantifiers
    9.7 Definite Descriptions
        9.7.1 Example
    9.8 The Definite Article
    9.9 Review Exercises

10 Solutions to Selected Exercises
    Solutions for Chapter 2
    Solutions for Chapter 3
    Solutions for Chapter 4
    Solutions for Chapter 5
    Solutions for Chapter 6
    Solutions for Chapter 7
    Solutions for Chapter 8
    Solutions for Chapter 9


Chapter 1

Introduction to Logic: Getting Your Feet Wet

1.1 Logic, Logical Form, and Validity

What is logic? There is much disagreement among professionals about this question. I will offer two definitions which, while no doubt debatable, give us something to work with:

• Logic is the art and science of reasoning.

• Logic (especially symbolic logic) is the mathematics of language.

As the very word “logic” suggests (its root is the Greek term “logos,” meaning word), logic has a lot to do with language. But it also points to something beyond language, for logic instantiates deep laws of form that appear in many guises in many subjects and disciplines [5]. One can get into interesting philosophical debates about the nature of these mathematical forms. (The Platonists, who believe in the independent reality of mathematical objects, have been debating for millennia with the nominalists, who insist that the “truths” of logic and mathematics are nothing more than linguistic conventions.) We could also ask which comes first, logic or mathematics? The school of logicism, headed by Bertrand Russell and Gottlob Frege (who themselves contributed in important ways to the discipline that we now call modern symbolic logic), argued that all of mathematics is based on laws of pure logic. But it is now generally felt that this ambitious project did not work, and that it is more likely the other way around; i.e., that logic is a sort of applied mathematics. Unfortunately, we cannot get into such fascinating disputes in an introductory work like this. For more on the philosophy and foundations of mathematics and logic, see [1, 5].

We should begin by noting the important distinction between inductive and deductive logic. In deductive logic we extract information already contained, or implicit, in a set of premisses (statements that are accepted as given in the context of an argument or investigation). We might accept the premisses for several reasons: they may be known facts whose implications we are trying to uncover, for instance, or they might be hypotheses or possibilities whose consequences we wish to explore. But in deductive reasoning we never go beyond the information contained in the premisses; rather, we rearrange that implicit information into forms that may be more interesting or useful to us.


Inductive reasoning is said to be ampliative in that it expands upon the information given in the premisses. Inductive reasoning is usually implicitly or explicitly probabilistic, and it becomes quantitative in many scientific contexts. Induction, especially as it is used to solve problems in science and engineering, also involves factors that are very difficult to define formally, such as creativity, intuition, and the grasp of aesthetic relationships.

Most applications of reasoning in real-life situations involve a mixture of inductive and deductive methods. However, it is very useful to study deductive methods in isolation, because of their importance as the foundation for an understanding of all other methods of reasoning. This course is almost entirely concerned with deductive reasoning.

Now, why do we do symbolic logic? The advantages of symbols in logic are much the same as their advantages in mathematics: precision, concision, and their ability to express relationships that are too complex to express efficiently, or at all, in words. Also, symbols have the advantage of greater universality: arguments in different languages that have the same logical form can be expressed in the same symbols.

A lot of what logicians do (though not all) is analyze arguments. For the logician, an argument is not a dispute or a shouting match; rather, it is an attempt to establish a conclusion on the basis of given premisses (singular, premiss). The premisses are statements that are assumed to be given; as logicians we don’t know where they come from.

Warning! Please spell the word “argument” correctly! One mark off any assignment or test any time you write “arguement”!

Here’s a simple deductive argument:

If Sundin stays healthy, the Leafs have a chance.
But if Sundin takes his vitamins, he will stay healthy.
∴ If Sundin takes his vitamins, the Leafs have a chance.

(There’s our first symbol: “∴”, which means “therefore”.)

Notice that we don’t worry too much about tense in the examples used here; there are, however, “tensed” logics for which the point in time at which a proposition was uttered is important.

Now, we need to distinguish between validity and soundness. Consider the following argument:

If you are in Lethbridge, then you are in Alberta.
If you are in Alberta, then you are in Canada.
∴ If you are in Lethbridge, you are in Canada.

Both premisses and the conclusion happen to be true statements. Substitute the word “France” for “Alberta” and we would have an argument with false premisses; but the conclusion would still seem to follow from the premisses. Therefore, there are arguments that intuitively seem to be valid, in the sense that the conclusions somehow follow from the premisses, but which still have something missing.

To clarify what is missing, we make a crucial distinction between validity and soundness:

Validity: A deductive argument (or argument form) is valid if and only if (or iff) it is impossible for its conclusion to be false when its premisses are true.


Soundness: A deductive argument is sound iff it is valid and has true premisses.

This is the modern usage of the terms “valid” and “sound” as they are applied to arguments or argument forms. Some older books, such as Lemmon’s Beginning Logic [17], use the word “sound” where we will use “valid”; that terminology is now out of date.

The following rough-and-ready concept of validity is also useful:

An argument is valid if its conclusion follows from its premisses by correct application of certain rules of deduction.

Eventually, we have to ask whether or not the rules of deduction are themselves valid in the deeper sense given above. In fact, they are, so that these two definitions of validity turn out to be the same thing in the long run. Later in the course we will learn other ways of expressing the notion of validity, all of which are equivalent to the first definition given above. In a sense, this course is simply about all the ways of defining and testing for deductive validity, and for validly deriving conclusions from premisses.

The practical value of valid deductive reasoning stems from the fact that if a valid argument has true premisses, the conclusion is guaranteed to be true as well. Validity transmits truth from premisses to conclusions.

Surprisingly, there are some arguments that can be valid, for formal reasons, but which can never be sound. Later on we will see some examples of rather suspicious-looking arguments that are, in fact, valid.

Common parlance often confuses the terms “valid” and “true.” Be careful to avoid this ambiguity when you are doing formal logic. Arguments can be valid or invalid, and propositions can be true or untrue, but there are no such things as “valid” statements or “true” arguments. As we shall see, however, valid arguments can be transformed by the techniques of natural deduction into certain kinds of true statements called logical truths.

As logicians, we do not worry about where the premisses came from or whether or not they are true; we are only concerned about the validity of arguments, not their soundness. In this respect, logicians might be compared to automotive mechanics, who ensure that your car runs properly but do not ask where you drive it. As Bertrand Russell once quipped, “Pure mathematics [which Russell saw as an extension of logic] can be defined as that subject in which we do not know what we are talking about, nor whether what we are saying is true.” [20]

Now consider the following argument:

If you are in Hobbiton, then you are in the Shire.
If you are in the Shire, then you are in Middle Earth.
∴ If you are in Hobbiton, you are in Middle Earth.

Compare this to the argument given above about being in Alberta. What is common to these two arguments is their form, which we could represent in a more or less obvious way, without even defining the symbols, as follows:

P → Q
Q → R
∴ P → R.
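Readers who like the computing analogy developed below may find it helpful to see the definition of validity made mechanical. The following sketch (in Python, which is not part of this course; the helper names are my own) checks a form by brute force: it is valid just in case no assignment of truth values makes every premiss true and the conclusion false.

from itertools import product

def is_valid(premisses, conclusion, letters):
    """Valid iff no assignment makes every premiss true and the conclusion false."""
    for row in product([True, False], repeat=letters):
        if all(p(*row) for p in premisses) and not conclusion(*row):
            return False   # found a counterexample row
    return True

# The form P -> Q, Q -> R, therefore P -> R, reading -> as material implication.
implies = lambda a, b: (not a) or b
premisses = [lambda p, q, r: implies(p, q),
             lambda p, q, r: implies(q, r)]
conclusion = lambda p, q, r: implies(p, r)
print(is_valid(premisses, conclusion, 3))   # True: the form is valid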


So we can distinguish between particular arguments in a given natural language (such as English, French, Russian, etc.) and their forms, which are more universal. One of the main skills that you will develop in learning logic is the ability to perceive logical form. Our main business as logicians is to analyze the validity of arguments by determining the validity of argument forms.

There are different kinds of logical form that capture different ways in which an argument can be valid. We can intuitively recognize some kinds of validity by means of what I shall call “linguistic common sense”; but the powerful tools of logic allow us to test the validity of arbitrarily complex argument forms.

A deductive form is very much like a computer program which accepts input (the premisses) and outputs a conclusion. In fact, computers can easily be programmed to reproduce the steps of a deductive argument. The validity of a deductive form can be compared to the reliability of a piece of software. A reliable accounting program, for instance, will correctly tell you whether you have a profit or a loss in your business so long as you correctly enter all income and expenses. This is like a valid argument form that can be guaranteed to return a true conclusion if the premisses are true. Of course, if some of the premisses are false then there is no telling whether the conclusion will be true or false, just as if you put incorrect data into your accounting program you might, or might not, get the right answer back from it. Just as the job of software engineers and programmers is to write reliable code, the job of logicians (so far as deductive logic is concerned) is to devise valid arguments.

Another helpful analogy between logic and computing is that we can think of logic as being like a computer game with levels, where we have to master each level before we can go on to the next. In this course we cover three main levels, each one of which gives us more machinery to capture validity. They are:

1. Propositional logic — also known as sentential logic, sentential calculus, or propositional calculus.

2. Elementary first-order predicate logic.

3. Elementary first-order predicate logic with identity.

Each level has extra machinery that allows us to represent a kind of validity that cannot be represented by the lower level.

We begin, in the next chapter, with propositional logic, which is an absolutely basic discipline that appears in a hundred useful contexts, and serves as the foundation for all higher-level work in logic.

1.2 How to Teach Yourself Logic

I entitled this section “how to teach yourself logic”, not “how to learn logic” or “how to be taught logic”, because I want to emphasize that logic, like any other complex set of inter-related skills, is ultimately self-taught no matter what courses you take or what books you read.

The secret of learning any complex set of skills is attentive repetition, together with the cultivation of a calm intensity that does not lose its focus on the desired goal, but which at the same time awaits improvement with almost infinite patience.


I once had the temerity to take a course called Introduction to Theoretical Physics, given at the University of Toronto by a great teacher and scholar named Professor Lynn Trainor. The very first equation he presented to us will be the first formula of this text as well:

Understanding = Familiarity.

What Professor Trainor meant was that in learning a complex technical subject such as logic or physics, one must accept the fact that fully confident understanding may not come immediately; rather, it follows from becoming so familiar with the operations of the subject that they become second nature. In fact, learning logic is very much like learning a language; in the beginning, you have to work very hard simply to familiarize yourself with the basic definitions and rules, some of which at first may seem arbitrary and strange. After a while the structure of the language becomes so obvious that you cannot quite figure out why you once could not understand it. Yes, in order to learn logic you have to memorize things, and then diligently practice their use. Learning a subject involves putting the details of the subject into your head, and becoming adept in their use by dint of long practice.

There are, however, also some important differences between learning logic and learning a language. Although there are things that must be memorized in learning logic, there are far fewer of them than in learning a language, where one must memorize thousands of words as a basis for full fluency. Furthermore, as one becomes more experienced and familiar with the use of logic, the inter-relations between its concepts become more and more obvious, and mere memorization becomes less important, until it finally dawns on you that all of logic is in a deep sense just many different ways of saying the same thing.

1.2.1 How To Use This Text

• Work through on paper the demonstrations given in the text.

• There are a generous number of practice exercises, and solutions to about half of them are given in the back. Try to work as many of these as you can. Try the A’s first, and if you cannot do all of them you have missed something important in the previous section of the text. When you can do the A’s, move on to the B’s.

• If you do all of the exercises in this text — do them again! — and perhaps again, to the point at which they become as obvious to you as the streets and alleys of your home town. I repeat, repetition is the secret of learning.

• You may also find it very helpful to try and make up your own problems.

• It is very important to be neat and tidy when writing out your proofs, truth tables, and so forth. A certain amount of doing logic correctly is nothing more than clerical accuracy. Writing things out completely, correctly, and legibly will facilitate careful and accurate thought — and will also encourage our over-worked teaching assistant to give you better marks.

1.3 Questions for Discussion

1. If all deductive logic is simply a process of reworking information given in the premisses, why aren’t all deductive arguments circular? Or are they?


2. Is logic just a branch of mathematics? Or the other way around? Or neither?

3. Are logic and mathematics discovered or invented?


Chapter 2

Propositional Logic: Symbolization and Translation

2.1 Notation in Propositional Logic

As explained in Chapter 1, propositional logic is the first “level” of logic. In propositional logic we study the logical behavior of propositions, which are linguistic expressions or utterances that are intended by their user to indicate a fact or state of affairs. Propositions (also sometimes called statements) are distinct from questions, requests, commands, or exclamations in that they can have a truth value (though as we shall see, it may sometimes be difficult to say just what that value is). Examples of propositions are “This is an example of a proposition,” “2 is a number,” “D-Day was June 6, 1944,” and “E = mc².” Propositions are so-called because in effect the speaker “proposes” that something is the case. Propositional logic is also known as propositional calculus, sentential logic, or sentential calculus.

We will represent propositions by letters such as P, Q, R, . . . , P1, P2, . . . .

Some logicians will say that “It is raining” and “Il pleut” represent the same proposition expressed in different languages; others (perhaps more commonsensically) will say that these represent different propositions that happen to point to the same state of affairs. We need not concern ourselves with this subtle distinction in an introductory course.

Propositions can be joined together by certain symbols called propositional connectives. These are also called truth-functional connectives because, as we shall eventually see, the truth value of compound propositions formed by linking simpler propositions together with these particular connectives is strictly a function of the truth values of the simpler propositions. Not all ways of forming compound propositions from simpler propositions are truth-functional. However, truth-functional combinations of propositions are the main ones we are concerned with here, because all that propositional logic is really concerned with are the truth values of propositions; it is really a calculus of truth values.

We shall be mainly concerned with the following truth-functional connectives: − (not, or negation), & (and), ∨ (inclusive or), → (the material conditional), and ↔ (the biconditional). Common translations of these symbols are given in the next section.
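For readers who think computationally, here is a minimal sketch in Python (the function names are my own, and the definitions anticipate the truth tables of Section 2.3) of the idea that each connective is simply a function from truth values to truth values:

# The five connectives as functions on truth values (see the tables in Section 2.3).
def neg(p):        return not p           # -P
def conj(p, q):    return p and q         # P & Q
def disj(p, q):    return p or q          # P v Q  (inclusive "or")
def implies(p, q): return (not p) or q    # P -> Q (material conditional)
def iff(p, q):     return p == q          # P <-> Q

# A compound is evaluated from the truth values of its parts, for instance -(A & B)
# on the assignment where B is true and A is false:
B, A = True, False
print(neg(conj(A, B)))   # True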

This is by no means the smallest set of connectives with which one can do propositional logic. These connectives have been chosen because they are fairly close to the less formal logical connective concepts that are used in many natural languages. It is possible to define other truth-functional connectives, and we will briefly discuss this in Chapter 5.

Combinations of propositional letters and connectives will be called formulas. Those strings of symbols that can represent propositions are called well-formed formulas or wffs. It is possible to give a precise recursive definition of the set of all possible wffs, but we will not do that in this course. (See, for instance, Lemmon’s Beginning Logic [17].) Instead, we will simply rely on our knowledge of English to tell us which formulas can stand for complete sentences and which cannot.

Propositions can be atomic or molecular. Atomic propositions are those that cannot be broken down into truth-functional combinations of other propositions, while molecular propositions are those that can. An example of an atomic proposition is “Today is Tuesday,” while an example of a molecular proposition is “Today is Tuesday and I have to go to a meeting.”

2.2 Common English Equivalents for Propositional Connectives

Below I list some of the most commonly encountered English equivalents for the standard propositional connectives. It is impossible to list all of the possible English expressions that could be correctly translated into the standard symbols such as &. We have to rely on our knowledge of the language, what might be called “linguistic common sense,” in order to work out a speaker or author’s implicit logical sense. Expressions in natural languages such as English can be ambiguous, imprecise, or capable of multiple meanings and subtle rhetorical nuances, and may not always have a unique logical translation.

The rules for translation that we present here certainly do not capture all of the conceivable ways that logical structure can be expressed in a natural language. However, many years of practical experience show that the rules we present here (or ones very similar to them) allow us to define a system of logic that is very useful and which can form a basis for more advanced logics.

In the examples in this chapter we’ll use the following propositional symbols:

B := “Bob goes to the store”
A := “Alice goes to the store”
C := “Cheryl goes to work”
T := “Tad stays home”

The symbol := means “is defined as . . . ”, and we will use the double arrow ⇔ to mean “can be translated as . . . ”.

2.2.1 Negation

To negate a proposition is to deny that it is true or that it is the case. Thus, −P can be read as “not P,” “it is not the case that P,” “P is not true,” or “P is false.”

Negation is a unary connective, in that it only applies to the proposition written immediately to its right.

Example: Bob does not go to the store. ⇔ −B.

The negated proposition could be a combination of other propositions, however:


Example: It is not the case that both Bob and Alice go to the store. ⇔ −(A & B)

2.2.2 Conjunction

To conjoin two propositions is to assert or to hold them jointly. We will write conjunction in the form P & Q. The two conjoined propositions P and Q are called the conjuncts. Although conjunction is usually taken to be binary, meaning that it applies to only two propositions, and while we will usually consider cases where only two propositions are conjoined, there is no reason in principle why any number of conjuncts cannot be linked in the same conjunction (although we will not prove that fact formally in this course). The most general conjunction can be written in the form P1 & . . . & Pn.

Here are some common expressions that would all be translated into symbols as P &Q:

P and Q.
Both P and Q.
P but Q.
P although Q.
P despite Q.
P even though Q.
P whereas Q.
P in spite of Q.
In addition to Q, P.
etc.!

Note that the subtle rhetorical distinctions between “but,” “and,” “whereas,” etc., are all lost in moving to propositional notation. If these distinctions make a difference to the logic of an argument, then they must somehow be expressed clearly enough that they can be translated into propositional notation.

Example 1 : Cheryl goes to work and Tad stays home ⇔ C &T .

Example 2 : Alice goes to the store while Tad stays home ⇔ A&T .

Example 3 : Bob and Alice both go to the store ⇔ B&A.

Example 4 : Bob and Alice go to the store while Cheryl goes to work ⇔ B&A&C.

Note that order does not matter in any of these examples, or, indeed, in any correct usage of &. Thus, Example 2 could be translated just as well as T & A.

2.2.3 Disjunction

To disjoin two propositions is to say that one or the other, or possibly both, is the case. We write disjunction in the form P ∨ Q. The two disjoined propositions P and Q are called the disjuncts. Like conjunction, disjunction is usually taken to be binary, but also like conjunction it is perfectly admissible to write disjunctions of any number of disjuncts. The most general disjunction can be written in the form P1 ∨ . . . ∨ Pn.

Here are common expressions that would be translated as P ∨ Q:


P or Q.
Either P or Q.
P or Q or both.

In propositional logic we will almost always take “or” in the inclusive sense, meaning that if we accept “P or Q” as true, we are willing to grant that either P or Q is true on their own, or that P and Q may be true together. This is how “or” is usually used in English and many other natural languages. Here’s a typical example: “Either negligence or incompetence accounted for the sinking of the Titanic.” Clearly, it might have been both.

Occasionally, it will be useful to express the concept of exclusive disjunction, also known as exclusive or, or in circuit theory as XOR. Later on we will see how to construct exclusive “or” in terms of the usual connectives. Here’s an example in which the fact that a disjunction is exclusive might make a logical difference: “Every natural number is either even or odd.” Again, we can usually take “or” to be inclusive unless clearly stated otherwise.
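As a preview, here is a small Python sketch (the function names are my own) contrasting the two senses; one standard construction of exclusive “or” from the usual connectives, though not necessarily the one we will use later, is (P ∨ Q) & −(P & Q).

def inclusive_or(p, q):   # P v Q: true when at least one disjunct is true
    return p or q

def exclusive_or(p, q):   # one or the other but not both: (P v Q) & -(P & Q)
    return (p or q) and not (p and q)

# The two senses disagree only when both disjuncts are true:
for p in (True, False):
    for q in (True, False):
        print(p, q, inclusive_or(p, q), exclusive_or(p, q))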

In Latin there are two distinct words for the two senses of “or”: vel for the inclusive sense and aut for the exclusive sense — and thus less chance of confusing the two concepts. The usual symbol for inclusive “or,” ∨, was probably adapted from the Latin vel.

Example 1 : Bob or Alice go to the store ⇔ B ∨ A.

Example 2 : Either Cheryl goes to work or Bob goes to the store ⇔ C ∨ B.

Again, as with &, order does not matter. Thus, Example 2 could just as correctly have been written as B ∨ C.

Example 3 : Either Bob goes to the store, Alice goes to the store, or Cheryl goes to work ⇔ B ∨ A ∨ C.

Example 3 shows that, as with &, any number of propositions can be disjoined.

2.2.4 Material Implication

It is probably fair to say that all or virtually all types of logic, formal or informal, involve some notion of logical consequence — the idea that one thing may be taken to follow from certain others. In propositional logic the consequence relation is called material implication, and it is crucial to understand its properties, which are sometimes counterintuitive. Material implication is a strictly binary connective, and is symbolized in this course by →.

In a statement of the form P → Q the proposition P is the antecedent, and Q is the consequent. The form P → Q is often called the conditional.

Here are common translations of P → Q:

If P then Q.
P only if Q.
Q if P.
If P, Q.
P is a sufficient condition for Q.
Q is a necessary condition for P.
P implies Q.
Q is implied by P.
P entails Q.
Q is entailed by P.
Q while P.
While P, Q.
Given P, Q.

Innumerable minor linguistic variations of these forms are possible.

It is very important not to confuse “P only if Q” with “P if Q”. In these two forms the implication goes in opposite directions.

Example 1 : If Bob goes to the store then Tad stays home ⇔ B → T .

Example 2 : Bob goes to the store only if Tad stays home ⇔ B → T .

Example 3 : Bob goes to the store if Tad stays home ⇔ T → B.

Example 4 : In order for Alice to go to the store, it is necessary for Cheryl to go to work ⇔ A → C.

Note carefully that if P → Q, it does not always follow that Q → P. Also, unlike ∨ and &, implication is strictly binary.

2.2.5 The Biconditional

The symbol ↔ is called the biconditional, and is defined such that P ↔ Q if and only if both P implies Q and Q implies P. It is, of course, not generally true of any two propositions chosen at random that they entail each other.

Here are the most common translations of P ↔ Q.

P if and only if Q. (This is often abbreviated as “P iff Q.”)
P is a necessary and sufficient condition for Q.
P is equivalent to Q.
P implies Q and Q implies P.

Again, there are many possible ways of expressing this notion in natural languages.

Example: Bob goes to the store if and only if Tad stays home ⇔ B ↔ T .

2.2.6 Commutativity and Associativity

Commutativity: An operation is commutative if it returns the same result regardless of the order of the propositions to which it is applied. For example, P & Q has the same truth value as Q & P.

Associativity: An operation is associative if it returns the same result regardless of how we group the terms to which it is applied. For instance, if we take bracketing to indicate that the bracketed term is evaluated first, it can be shown that (P ∨ Q) ∨ R has the same truth value as P ∨ (Q ∨ R).

Conjunction, disjunction, and the biconditional are commutative and associative, while material implication is not.
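Because the connectives are truth-functional, claims like these can be checked exhaustively. Here is a brief sketch (Python, with my own helper names, using the truth-table definition of → given in Section 2.3) confirming that disjunction is associative while material implication is neither commutative nor associative:

from itertools import product

implies = lambda p, q: (not p) or q
rows2 = list(product([True, False], repeat=2))
rows3 = list(product([True, False], repeat=3))

# Disjunction is associative: (P v Q) v R always agrees with P v (Q v R).
print(all(((p or q) or r) == (p or (q or r)) for p, q, r in rows3))    # True

# Material implication is not commutative ...
print(all(implies(p, q) == implies(q, p) for p, q in rows2))           # False

# ... and not associative either.
print(all(implies(implies(p, q), r) == implies(p, implies(q, r))
          for p, q, r in rows3))                                       # False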


2.2.7 “Unless”

“Unless,” like “or,” is a word that is used in more than one way in the English language, and some confusion has grown up over the right way to translate it into propositional symbols.

Like “or,” “unless” can be used in an inclusive or exclusive sense. Here is an example of the inclusive sense of “unless”:

Johnny will fail the exam unless he studies.

Sadly, we know that Johnny studying for the exam does not exclude the possibility of Johnny failing the exam.

Here is an example of the exclusive use of “unless:”

We will have a picnic unless it rains.

The very thing we wish to deny is that we are going to have a picnic in the rain, while at the same time we affirm that we will have a picnic if it does not rain. This is most easily translated as exclusive “or”.

In the chapter on propositional semantics we will examine the truth table behavior of the various senses of “or” and “unless”.

It would be very rare for the logic of an argument to depend on whether or not a usage of “unless” was exclusive. Therefore, in the large majority of cases, you can follow the advice of most logic books, which is to translate “P unless Q” as P ∨ Q. But check the context carefully.

2.2.8 Exercises

A

1. Translate the following into symbols:

(a) Tad stays home while Cheryl goes to work.

(b) Tad stays home but Cheryl does not go to work.

(c) If Tad stays home then Cheryl does not go to work.

(d) Either Bob goes to the store or Tad stays home.

(e) Alice going to the store is implied by Bob not going to the store.

(f) Bob does not go to the store only if Alice does.

(g) Cheryl goes to work if Bob goes to the store.

(h) For Cheryl to go to work it is sufficient that Tad stays home.

(i) For Cheryl not to go to work it is necessary that Tad not stay home.

(j) Cheryl going to work implies Alice going to the store, and Alice going to the store implies Cheryl going to work.

(k) Bob does not go to the store if Alice does.

(l) Bob going to the store is equivalent to Tad not staying home.

(m) Tad stays home unless Cheryl does not go to work.

2. Translate the following into clear, grammatically correct, idiomatic English:


(a) B → −A.

(b) B&C.

(c) −B ∨ C.

(d) −C ↔ T .

(e) −A→ B.

(f) −A& − T .

(g) −A ∨ −T .

(h) B → T .

(i) T → B.

(j) −T → −B.

2.3 Truth Table Definitions of the Propositional Connectives

All of the propositional connectives we have used in natural deduction can be defined in terms of their characteristic truth tables. We introduce truth tables now because they will help us to understand why some of the connectives are translated in certain ways.

Here is the simplest truth table, that of a single proposition P :

P

TF

This merely expresses the assumption that any proposition is taken to be either definitelyT or definitely F. The table is constructed simply by listing all the possible truth values ofthe proposition in a column below it.

Tables for propositions containing two variables have four lines, in order to representall possible combinations of truth values of the atomic propositions. We can call these the“input” values. The truth value of the whole expression is written in a column underneaththe connective; call these values the “outputs”.

Here are the truth-table definitions of the five most commonly used propositional connectives:

Negation:
− P
F T
T F

Conjunction:
(P & Q)
 T T T
 T F F
 F F T
 F F F

Disjunction:
(P ∨ Q)
 T T T
 T T F
 F T T
 F F F


Material Implication:
(P → Q)
 T T T
 T F F
 F T T
 F T F

Biconditional:
(P ↔ Q)
 T T T
 T F F
 F F T
 F T F

It is absolutely necessary for beginners in logic to memorize these basic tables, because they are the basis for a large part of what we do in this text. We will study truth tables in much more detail in Chapter Four, where we will see that they offer a powerful method for the analysis of the validity of arguments.
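These tables are also easy to generate mechanically, which is one reason truth-functional logic matters so much in computing. The short Python sketch below is only an illustration (the function names and the T/F formatting helper are mine, not part of the text's apparatus); it prints the same five tables, with the input values on the left and the output under the connective on the right.

    from itertools import product

    def tf(value):
        # Print Python's True/False as the T/F used in the tables above.
        return "T" if value else "F"

    print("Negation: -P")
    for p in (True, False):
        print(tf(p), tf(not p))

    # Truth-functional definitions of the four binary connectives.
    connectives = {
        "P & Q":   lambda p, q: p and q,
        "P v Q":   lambda p, q: p or q,
        "P -> Q":  lambda p, q: (not p) or q,
        "P <-> Q": lambda p, q: p == q,
    }

    for name, fn in connectives.items():
        print("\n" + name)
        for p, q in product((True, False), repeat=2):
            print(tf(p), tf(q), "=>", tf(fn(p, q)))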

2.3.1 Exercises

A

1. If it is true that Bob goes to the store and false that Cheryl goes to work, then what are the truth-values of the following propositions?

(a) Bob goes to the store and Cheryl goes to work.

(b) If Bob goes to the store then Cheryl goes to work.

(c) Either Bob goes to the store or Cheryl goes to work.

(d) Bob goes to the store if and only if Cheryl does not go to work.

(e) For Bob to go to the store it is sufficient that Cheryl not go to work.

2. If P is T and (P &Q) is F, then state the truth values of

(a) P ∨ Q

(b) P → Q

(c) −P ∨ Q

(d) P ↔ Q

(e) Q→ P

2.4 Interpreting the Material Conditional

2.4.1 “Ifs” versus “Only ifs”

As we learn more about how deduction works, it will become apparent that the truth table for → is the backbone of deductive logic. However, this table has some counterintuitive features which you may only become fully comfortable with after a lot of experience with logic. We can note right away, however, that the table will help us to understand the various ways in which → is translated into English. The reading “if P then Q” should be clear: from the table, we can see that on all the lines where P → Q is T, Q is true whenever P is T. In other words, for Q to be T, it is sufficient for P to be true.

The reading “P only if Q” sometimes confuses students (and a few professors from time to time as well). However, it makes perfect sense if you notice that on all the lines of the table where P → Q is T, P is T only in the lines in which Q is true also. Hence, also, the reading “Q is a necessary condition for P”: given P → Q, P cannot be T unless Q is T.

The concepts of necessity and sufficiency are easy to mix up. Concrete examples may help to make them clear. For a person to be adequately nourished, it is sufficient that they eat East Indian food. However, there are many other kinds of nourishing food, and it is not necessary to eat East Indian food in order to be adequately nourished. (Lovers of curry might disagree with me about this.) On the other hand, it is necessary to ingest a certain amount of protein in order to be adequately nourished.

Here is a nice example (thanks to a student!) which illustrates the proper reading of P → Q. Consider the following statements:

(1) You will win the lottery only if you buy a ticket.

(2) You will win the lottery if you buy a ticket.

These are two very different statements. The first is the generally true claim that you have to buy a ticket in order to have a chance to win; but this, of course, does not guarantee that you will win the lottery. The second says that you will win if you buy a ticket, which is certainly false for most lotteries!

Statement (1) above is equivalent to the following:

To win the lottery it is necessary that you buy a ticket.

On the other hand, statement (2) is equivalent to this:

To win the lottery it is sufficient that you buy a ticket.

Again, the truth conditions for these two statements are clearly quite different.

2.4.2 The “Paradoxes” of Material Implication

Consider the following garden-variety implication:

If you change your oil regularly then your engine will last longer.

We recognize this as true because we know about the causal relevance of regular oil changes to engine longevity. Here's another example:

If a party in a Parliamentary system wins the most seats in an election, it gets to form the government.

Here the relevance of antecedent to consequent is not causal, but legal. Still, anyone familiar with the way Parliamentary governments work will recognize the statement as true. But now consider the following:

If Ottawa is the capital of Canada then Caesar crossed the Rubicon.

Both antecedent and consequent are true, and so by the truth table for “if-then” the whole statement is true. And yet, there seems to be little if any relevance of antecedent to consequent. This illustrates a fact that is often called the Paradox of Material Implication:


The truth value of a material conditional has absolutely nothing to do with the meaning of the antecedent and consequent, beyond their truth values.

This is a “paradox” only because it is an affront to our linguistic common sense. In propositional logic the only “meaning” the propositional letters P, Q, . . . have is their truth values; propositional logic is purely an algebra of truth values, and any formula is true or false depending only on the truth values of the input variables and the logical relations between them.

The challenge to common sense becomes even more acute when we consider cases in which the antecedent is false, for the table says that such conditionals are true regardless of the truth value of the consequent. Thus, both of the following are true statements:

If 2 + 2 = 17π, then Canada is in North America.
If 1 = 0 then 1 = 2.

These examples are an illustration of the following principle of classical two-valued logic:

Ex Falso Quodlibet: from a falsehood anything follows.

A free translation of the Latin phrase is, “from a falsehood, whatever. . . ” (This proves that there were teenagers in ancient Rome.) This principle is sometimes abbreviated Ex Falso, and we shall encounter it again in the next chapter.

There is an advanced branch of logic called relevance (or relevant) logic in which logicians attempt to construct notions of implication which can express relevance of antecedent to consequent in a way that is closer to ordinary language. However, long experience has shown that the simple, stripped-down notion of material consequence presented here is on the whole the most useful notion of implication we have, since it can be adapted in so many ways.

2.5 Syntactic Conventions

2.5.1 Scope of a Connective

The scope of a propositional connective is the set of formulas that it operates on. For instance, in −P, the scope of the negation sign is just P; in −P & Q, the scope of the sign of conjunction is −P and Q; and in −(P & Q) the scope of the negation sign is all of (P & Q).

2.5.2 Main Connective

The main connective of a propositional formula is the connective whose scope is the whole formula. For instance, in (P ∨ Q) → (P & Q), the → is the main connective; while in −(P & Q) the main connective is the negation sign. As these examples indicate, scope can be fixed by brackets if there is any danger of ambiguity.

Is there a main connective in a conjunction or disjunction with more than two terms? Since conjunction and disjunction are fully associative, it would seem that no one connective in (P1 & . . . & Pn) or (P1 ∨ . . . ∨ Pn) should be privileged. This question illustrates the drawback of the usual notation we use for disjunction and conjunction, which (as George Spencer Brown pointed out [5]) are usually represented as if they were binary when in fact they are not restricted to two variables. The highly flexible notation used by Spencer Brown in his calculus of indications [5] gets around this problem elegantly. It is also possible, although we will not rely on it here, to represent conjunctions and disjunctions respectively by writing something like this:

∧(P1, P2, . . . , Pn) = ⋀i Pi   and   ∨(P1, P2, . . . , Pn) = ⋁i Pi,

where the connectives in front operate on the whole list of propositions given in brackets and thus, in effect, are the main connectives for the whole conjunction or disjunction.
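This list-at-a-time reading of conjunction and disjunction is familiar from programming. As a rough analogue only (my own illustration, using Python's built-in functions), all() plays the role of the big conjunction and any() the role of the big disjunction:

    # Truth values of P1, ..., P4.
    props = [True, True, False, True]

    big_conjunction = all(props)   # False: the conjunction fails because P3 is false
    big_disjunction = any(props)   # True: at least one disjunct is true

    print(big_conjunction, big_disjunction)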

In Chapter 5 we will introduce the highly flexible Reverse Polish Notation (RPN), and show how it can be used to streamline the evaluation of complex formulas, especially those containing conjunctions or disjunctions with more than two terms.

2.5.3 Bracketing

There are almost as many bracketing conventions as there are logic books. In this book we will use a very simple convention whose sole purpose is to avoid ambiguity.

Brackets indicate the scope of a connective. Bracket off a sub-formula if there is any possibility of confusion about which connective applies to which part of the formula. For instance, in (P & Q) → R, the & connects P and Q, and the → connects the whole subformula (P & Q) with R. Brackets can be nested, and the general rule is that one evaluates inner brackets first, and then works outwards. Putting brackets around a whole formula is optional.

Example: In P & (Q ∨ R), the subformula Q ∨ R is considered as a unit, and this formula would be translated as “P and either Q or R.” By comparison, (P & Q) ∨ R would be translated as “Either P and Q or R;” or, perhaps more clearly, as “R or both P and Q.”

Be careful! In English and many other natural languages, the scope of logical connectives is ambiguous and must be understood from context and a knowledge of common usage.

Example: “If Kerry wins then Bush loses and Cheney has to retire” would usually be understood to be bracketed as follows: “Kerry wins → (Bush loses & Cheney has to retire).” Strictly speaking, it could also be read as “(Kerry wins → Bush loses) & Cheney has to retire.” However, few experienced speakers of English would read it this way. In Chapter Four we will learn how to show that these two versions of the sentence are not logically equivalent. I discuss further examples like this in the section on Combinations below.
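Even before we have the formal method, a brute-force comparison shows that the two bracketings really do come apart. Here is a minimal Python sketch (my own illustration; K, B, and C stand for the Kerry, Bush, and Cheney propositions):

    from itertools import product

    def implies(p, q):
        # Material implication.
        return (not p) or q

    # Compare "K -> (B & C)" with "(K -> B) & C" on every assignment.
    for K, B, C in product([True, False], repeat=3):
        reading1 = implies(K, B and C)
        reading2 = implies(K, B) and C
        if reading1 != reading2:
            # Prints the first assignment on which the two readings disagree.
            print("The readings differ when K, B, C =", K, B, C)
            break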

Remember that we allow unlimited associativity for conjunction, disjunction, and the biconditional. This means that we can write conjunctions of n formulas as (P1 & . . . & Pn), disjunctions of n formulas as (P1 ∨ . . . ∨ Pn), and equivalences of n formulas as (P1 ↔ . . . ↔ Pn), without having to bracket off pairs of conjuncts or disjuncts. (There is an exception to this when we do truth tables, which will be explained in Chapter 5.)

2.6 Combinations

Truth-functional combinations of propositions can be created in literally an infinite variety of ways. As indicated in the discussion of bracketing above, a sensitivity to context and conventional usage is sometimes required to translate combined expressions either from a natural language to a formula or the reverse.

Example 1: If Bob and Alice go to the store then Tad doesn't stay home ⇔ (B & A) → −T.

Example 2: Either Bob goes to the store or Tad stays home and either Alice goes to the store or Cheryl goes to work ⇔ (B ∨ T) & (A ∨ C).

Example 3: Either Bob goes to the store only if Alice goes to the store or else Cheryl goes to work only if Tad stays home ⇔ (B → A) ∨ (C → T).

Propositional logic can express relationships that would be far too complicated to state conveniently in ordinary language.

Example: The expression (P1 & . . . & Pn) → (Q1 ∨ . . . ∨ Qm) is a perfectly legitimate formula, and yet few people would try to say anything like this in words if n and m were much larger than, say, 2 or 3.

2.6.1 Exercises

A

1. Using the propositional letters defined at the beginning of this chapter, translate the following English sentences into formulas:

(a) If Bob goes to the store and Tad stays home then Alice goes to the store only if Cheryl goes to work.

(b) Bob and Alice don't go to the store if and only if Tad doesn't stay home.

(c) If Tad stays home then Cheryl goes to work but Cheryl goes to work only if Tad stays home.

(d) When Bob goes to the store, either Alice also goes to the store or Cheryl doesn't go to work.

(e) Either Bob or Alice goes to the store if Tad doesn’t stay home.

2. Translate the following formulas into clear, grammatically correct, idiomatic English:

(a) (A & B & C) → −T

(b) (A & B) ↔ (C ∨ −T)

(c) A→ (B → C)

(d) (A→ B)→ C

(e) (A→ B)→ (C → T )

2.7 Negated Forms

It is not always obvious how to translate English expressions involving negations that operate over or within conjunctions, disjunctions, conditionals, and biconditionals. Here are a few of the more frequently encountered negated forms (a quick mechanical check of two of them follows the list):

Neither P nor Q: −(P ∨ Q).


Either −P or −Q: −P ∨ −Q.

Not both P and Q: −(P &Q).

Both not P and not Q: −P & −Q.

P does not imply Q: −(P → Q).

P implies not-Q: P → −Q.

P is not equivalent to Q: −(P ↔ Q).

P is equivalent to not-Q: P ↔ −Q.
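Notice that the first and second forms on this list are not interchangeable, and neither are the third and fourth. What does hold (these are de Morgan's Laws, which we prove in the next chapter) is that “neither P nor Q” agrees with “both not P and not Q”, and “not both P and Q” agrees with “either not P or not Q”, on every assignment of truth values. A quick Python check of this, offered only as an illustration:

    from itertools import product

    for p, q in product([True, False], repeat=2):
        # "Neither P nor Q" matches "both not-P and not-Q" ...
        assert (not (p or q)) == ((not p) and (not q))
        # ... and "not both P and Q" matches "either not-P or not-Q".
        assert (not (p and q)) == ((not p) or (not q))

    # But "neither P nor Q" and "either not-P or not-Q" come apart:
    print(any((not (p or q)) != ((not p) or (not q))
              for p, q in product([True, False], repeat=2)))   # prints True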

2.7.1 Exercises

A

1. Using the propositional letters defined at the beginning of this chapter, translate the following English sentences into formulas:

(a) Neither Bob nor Alice goes to the store.

(b) Either Cheryl doesn’t go to work or Tad doesn’t stay home.

(c) If Tad stays home it does not necessarily mean that Alice goes to the store.

(d) Cheryl going to work is insufficient to guarantee that Tad stays home.

(e) Cheryl goes to work only if Tad doesn’t stay home.

2. Translate the following formulas into clear, grammatically correct, idiomatic English:

(a) −(−A ∨ B)

(b) −(B → (−C ∨ T ))

(c) −(A↔ B)

(d) A↔ −B

(e) −(A ∨ B)

2.8 Intensional Contexts

Intensional contexts are statements that report on how a proposition is represented to or in a person or thing capable of representation. For instance, they might report on someone's state of mind or utterances. Here are typical examples:

Joe believes that Santa is real.
Joe wishes that Santa could be real.
“Santa came last night,” said Joe.
Science proves that Santa does not exist.

How should we translate these? The key is to see that the truth value of the whole statement — for instance, whether or not it is true that Joe said that Santa came last night — is entirely independent of the truth value of the statement itself that Joe made. Therefore, the connection between Joe's statement itself and the claim that he made a certain statement is not truth-functional. If we are given any statements like the four examples above to translate into propositional symbols, we have no choice but to treat them as atomic propositions, and give them simply a single propositional letter such as P.

2.9 Alternative Notation for Propositional Connectives

Virtually every elementary calculus text published anywhere in the world uses nearly the same notation for the basic operations of the theory. Logic, however, has not achieved the same standardization, which is a reflection of the fact that logic is still a subject that is very much under development.

Here are some other notations that one might encounter for the symbols we use in this text.

−P is often written as ∼P, ¬P, or P̄ (P with a bar over it). (The symbol ∼ is called the tilde.)

P → Q is frequently written as P ⊃ Q. (The symbol ⊃ is pronounced “horseshoe.”)

P &Q is often written as P ∧Q and sometimes as P ·Q, or occasionally as PQ.

P ∨ Q is often written as P +Q.

P ↔ Q is often written as P ≡ Q.

Chapter 3

Propositional Logic: Introduction to Natural Deduction

We begin by setting forth rules and procedures for what logicians call natural deduction. Some students may well find it to be decidedly unnatural at first (though familiarity will bring ease of use). However, this system of deductive reasoning is, in fact, a very useful distillation of the most important deductive principles that we use in ordinary reasoning. Indeed, these rules represent more than merely a sort of anthropology of reasoning: they are normative, in the sense that they are an attempt to codify, clarify, and make more precise our day-to-day reasoning. (A normative rule is one that says what we should do; for instance, “Honour thy father and mother,” or “Change your oil every 3000 km.”) The sort of logic we set forth here by no means exhausts all possible useful logics. The justification for the approach we take, out of the many approaches that are mathematically possible, is pragmatic, in that extensive experience shows that what we set forth here can be applied to a great deal of the day-to-day reasoning that we actually carry out, and can also serve as a very useful foundation for further developments.

Logicians call this way of reasoning natural deduction because it is meant to come as close as one reasonably can to the way people tend to reason deductively in ordinary natural languages (such as English). There are also numerous axiomatic systems of logic that are focused on logical economy and generality. Such axiomatic systems play an important role in research into the basis of mathematics and logic, and in discovering new systems of logic that might be useful. However, a natural deductive approach designed more for ease of use and closeness to intuitive methods of reasoning is much more helpful for beginners in logic, especially those who might want to learn logic mainly because of its numerous useful applications in other fields such as philosophy, mathematics, computing, and science.

We shall symbolize the logical forms of arguments by sequents, which are expressions of the general form

P1, P2, . . . , Pn ` Q, (3.1)

where the propositions Pi are the premisses (the statements that are taken for granted in the context of the problem), and Q is the conclusion (the proposition to be arrived at by valid deductive steps from the premisses).

The symbol ` (called the turnstile) can be read as “therefore,” “proves,” or (more fussily) as “syntactically entails.” If P ` Q we can also say that Q is derivable from P.


It amounts to the claim that the stated conclusion can be derived validly from the given premisses by accepted rules of deduction.

Since the premisses of a sequent are taken to be held conjointly, a sequent of the form (3.1) above is just the same thing as saying

(P1 & . . . &Pn) ` Q. (3.2)

Note carefully that in general deductive entailment goes only one way; that is, given that we can show P ` R, we cannot automatically conclude that R ` P. There is an important class of formulas, however, for which the entailment goes both ways — each formula entails the other. In such a case, we will write a sequent of the form

P a` Q. (3.3)

Such a formula expresses an interderivability between P and Q. To establish interderivability we generally have to carry out two proofs, one showing P ` Q, and the other showing Q ` P, and two such proofs are often quite different in form.

It will be shown that some formulas can be derived without any premisses left over. The sequents for such formulas can be written in the form

` P, (3.4)

which can be read “Provable P,” or “P is a theorem.” Theorems are propositions that can be asserted without the qualification of any premisses at all. Such theorems are sometimes referred to by the grand name of logical truths.

There is no reason we can’t concatenate sequents together to get something like this:

P,Q ` R ` S ` T, (3.5)

in which a sequence of conclusions is derived one after the other. This reflects common linguistic practice, where it would not be uncommon to make an argument of the form “P and Q, therefore R; therefore S, etc.” However, in this book we will be almost entirely concerned with sequents with only one turnstile.

The proof or derivation of a sequent will consist of a sequence of lines, beginning with the statement of the premisses as assumptions, and ending (one hopes) with the desired conclusion. Each line has to show four things: the dependencies (if any), the line number, the proposition asserted on the line, and the justification for asserting that proposition.

The dependencies are the line numbers of the assumptions on which the line is based; if the line itself is merely an assumption, the dependency is the number of the line itself. If the line is a theorem there will be no dependencies. We shall say that a proposition depends upon the assumptions listed to the left of it; or equivalently, that the assumptions to the left of a proposition in a line govern that proposition.

The justification is a statement of the rule used to get the line, together with the numbers of the previous lines (if any were needed) from which the line was derived by means of the indicated rule.

The meaning of these terms will be clearer when we consider examples of proofs.


3.1 Basic Rules of Derivation

The following list of rules is very similar to that in Lemmon's Beginning Logic [17], except that we will derive MT instead of taking it as basic. Also, I have made explicit certain assumptions that Lemmon uses but does not state.

In the following, the proposition letters P, Q, etc., may denote either atomic or molecular formulas.

1. Rule of Assumption (A)
We may introduce any proposition anywhere in a proof that we wish — so long as we never forget that it is an unproven assumption!
Dependency: The assumption so introduced depends only on itself, and the line number at which it is introduced appears in the assumption column to its left.
Form of justification: A

2. Modus Ponens (MP)
From P and P → Q we can derive Q.
Dependency: The conclusion Q depends upon whatever assumptions P and P → Q depend upon.
Form of justification: m,n MP, where m and n are the lines for the premisses.

3. Double Negation (DN)
From P we can derive −−P, or from −−P we can derive P.
Dependency: The double negation of a line in a proof always depends upon the same assumption as the line double-negated.
Form of justification: m DN, where m is the line double-negated.

4. &-Introduction (&I)
Given P and Q separately, we can derive P & Q or Q & P.
Dependency: The conclusion depends on whatever assumptions P and Q depended upon.
Form of justification: m,n &I, where m and n are the lines for the premisses.

5. &-Elimination (&E)
Given the conjunction P & Q, we can derive either of the conjuncts P or Q separately.
Dependency: The conclusion depends upon whatever assumptions the original conjunction depended upon.
Form of justification: m &E, where m is the line containing the conjunction used.

6. v-Introduction (vI)
Given P, we may derive P ∨ Q or Q ∨ P, where Q is any proposition whatever.
Dependency: The conclusion depends upon whatever assumptions the premiss depended upon.
Form of justification: m vI, where m is the line of the proposition “or-ed” onto.

7. v-Elimination (vE)
From a disjunction P1 ∨ . . . ∨ Pn, we may conclude Q, so long as we can find a derivation of Q from all n of the disjuncts separately.
Dependency: The conclusion Q depends upon any assumptions that the starting disjunction depended upon, together with any additional assumptions that had to be introduced (but which could not be again discharged) in the derivations of Q from the individual disjuncts.
General form of justification: k, l1–m1, . . . , ln–mn vE, where k is the line of the starting disjunction, li is the first line of the i-th subproof, and mi is the last line of the i-th subproof. See examples below.

8. Conditional Proof (CP)
If Q follows from the assumption P, then we can conclude P → Q.
Dependency: The conclusion P → Q depends upon whatever assumptions P depended upon, together with any additional assumptions that had to be introduced (but which could not again be discharged) in the derivation of Q from P.
Form of justification: m–n CP, where m is the line number of the antecedent of the conditional to be proven, and n is the line number of the consequent.

9. Reductio ad Absurdum (RAA)
If from the assumption P (or −P) we can derive a contradiction of the form Q & −Q, we can conclude −P (or P respectively).
Dependency: The conclusion depends upon any assumptions that governed any lines used in the derivation of the contradiction, but which were not themselves discharged in that derivation.
Form of justification: m–n RAA, where m is the line at which we assume the negation of the conclusion to be established, and n is the line at which we arrive at the contradiction that negates the initial assumption.

10. Reflection (Rfl)
A very basic assumption of the sort of natural deduction we use here, which Lemmon does not explicitly state himself but which he certainly uses, is that any proposition may be taken to follow from itself. (If it is true that my name is Kent, then it would be an odd sort of logic in which I could not immediately though perhaps redundantly infer that my name is Kent.) This allows us to cite a single proposition as both the beginning and end of certain kinds of sub-proofs in vE, RAA, and CP, and enormously streamlines certain derivations. I'll call this the principle of Reflection, just to give it a name; though we need not cite it explicitly when it is used.

11. Reiteration (R)
Any line may be reiterated or repeated anywhere it is needed in a proof.
Dependency: The reiterated line has the same dependencies as the line it was copied from.
Form of justification: m R, where m is the number of the line being reiterated.

12. Definition Introduction (DfIntro)
If P := Q, then P may be substituted for Q wherever Q occurs, and vice versa.

3.1.1 Alternative Names for the Basic Rules.

MP is also known as Detachment, Modus Ponendo Ponens, or Conditional Elimination.

&I is also sometimes called Conjunction.

&E is also sometimes called Simplification.

vI is also sometimes called Addition.

RAA is sometimes called Indirect Proof or Negation Introduction.

CP is sometimes called Conditional Introduction, → Intro, or ⊃ Intro.

3.2 Comments on Basic Rules

Some students may feel that it is a little fishy to be allowed to assume anything we want at any time. The fact is that an assumption is nothing more than that — an unjustified adoption of a proposition, perhaps merely for the sake of argument. There is nothing logically illegitimate about doing this, so long as you remember that it was just an assumption. Adopting an assumption is rather like borrowing money on a credit card that has no credit limit, but on which all debts must ultimately be paid. Once we have adopted an assumption and use it in the subsequent derivation, everything from that point on is dependent on the assumption and cannot be asserted unconditionally. Assumptions can be discharged by CP, RAA, and vE; this is analogous to the process of paying back borrowed money.

MP is the pattern for all deductive inferences. It expresses the fundamental notion of logical consequence, that one proposition may follow from another.

If you cannot think of any other way to get a letter into the proof, vI may be the best way to do it. To some people vI seems a bit like cheating, since it allows us to introduce any proposition whatsoever into a derivation whenever we need it. You should be able to see that nothing illegitimate is going on, however, if you can see that the proposition added to or “or-ed” onto P is not being asserted categorically. An example might help: if it is true that my name is Kent, it is also safe (although perhaps odd) to say that either my name is Kent or the moon is made of green cheese.

vE allows us to derive conclusions from disjunctions. Although the general statement of the rule given above allows us to derive conclusions from disjunctions containing any number of terms, usually in the propositional logic section of this course it will be applied to disjunctions of the form P ∨ Q containing only two terms. When we do predicate logic we will use a rule called Existential Elimination (EE) which is, in effect, a form of vE over arbitrary numbers of disjuncts. Students often find vE to be difficult at first, but the difficulties are mostly notational. Practice will make perfect.

RAA is one of the most powerful and useful rules. If you cannot think of any other way of getting to the desired conclusion, introduce as an assumption the negation of the desired conclusion and then try to get a contradiction. Sometimes even very well educated people are surprisingly reluctant to use RAA, since they are uncomfortable assuming for the sake of argument a proposition that they know or wish to believe is false. This is unfortunate, since the ability to freely entertain hypotheses, no matter how outrageous they may seem, is one of the most useful tools of thought. RAA would not be valid in a logic that was not bivalent (i.e., a logic in which not all propositions were either definitely true or definitely false).

Reiteration is, of course, a consequence of Reflection. Strictly speaking, Reiteration is not needed in a system like ours because one can always cite a line that has already been written (so long as you don't forget to also cite the assumptions upon which it depends) anywhere else in the proof. Despite this, beginners will sometimes find Reiteration to be helpful, and so I have allowed it as a rule even though it is redundant. Think of it as analogous to training wheels on a bicycle, which you will come to do without as your skill increases.

It will become increasingly evident as the course progresses that these rules of deduction are by no means the simplest rules upon which a valid system of propositional logic could be based. These rules are chosen because they are fairly close to the way people actually reason in ordinary natural languages; they are therefore designed mainly for ease of use, not logical economy. It should also become evident that other rules could just as easily be taken as basic. For instance, the reader might find it a good exercise to show that we could have taken Disjunctive Syllogism (DS, defined below) as a basic rule instead of MP, introduced → as a defined symbol (taking the rule of Implication, also given below, as a definition instead of a derived result), and then derived MP using DS.

3.3 Simple Examples of the Basic Rules

In this section I state and prove several sequents which demonstrate the use of the Basic Rules in especially simple and clear ways.

1  (P & Q) → R, P, −−Q ` R   (This uses MP, DN, and &I.)

1        (1) (P & Q) → R    A
2        (2) P              A
3        (3) −−Q            A
3        (4) Q              3 DN
2,3      (5) P & Q          2,4 &I
1,2,3    (6) R              1,5 MP

This derivation illustrates the basic pattern for all natural deductions. The centre column lists the propositions that lead one after the other to the conclusion; the numbers in brackets are the line numbers. The left-hand column lists the dependencies, that is, the assumptions that the proposition in the centre column depends upon. The right hand column lists the rule that justifies the proposition written in that line, together with the lines to which the rule was applied.

Those who have some familiarity with accountancy can think of the left column as analogous to a tabulation of debit, and the centre column as credit. Making an assumption is like borrowing money; the assumptions we make allow us (by means of the rules) to assert other propositions, but we must keep track of the fact that we are relying on certain assumptions just as we must keep track of money we owe. Debits and credits must balance in the end. If we discharge all the assumptions and still have a proposition left over, that proposition counts as a theorem, that is, a proposition that can be held unconditionally — like a property with no mortgage on it!

2 P &Q,P → R ` R (This uses &E and MP.)

1      (1) P & Q     A
2      (2) P → R     A
1      (3) P         1 &E
1,2    (4) R         2,3 MP

3  P ` P ∨ (Q & R)   (This uses vI.)

1      (1) P              A
1      (2) P ∨ (Q & R)    1 vI

4 P ∨ Q,P → R,Q→ R ` R (This uses vE and MP.)

1        (1) P ∨ Q    A
2        (2) P → R    A
3        (3) Q → R    A
4        (4) P        A (vE)
2,4      (5) R        2,4 MP
6        (6) Q        A (vE)
3,6      (7) R        3,6 MP
1,2,3    (8) R        1,4–5,6–7 vE

This is one of the simplest examples I have been able to think of that demonstrates the use of vE. In some treatments of propositional logic, this sequent is actually called “v-Elimination.” Here we will call this inference pattern Convergence, since variations on it will occur often enough that it will be useful to give it a name.

The whole idea of vE is that we use it to establish the claim that from either P or Q we can get to the desired conclusion. It is like a claim that we can get to Calgary from Lethbridge via either Highway 2 or the Vulcan Highway. To demonstrate the claim, you have to test both routes — because the claim is that either route works. This is why we use a subproof for each disjunct in the starting disjunction. Each subproof starts by temporarily assuming the disjunct to be tested; when the subproof is done, that temporary assumption is discharged.

Notice also that in lines 4 and 6 I annotate the assumptions that start the subproofs with an indication of the rule that I am trying to use in the subproof. This is an entirely optional step, and whether or not you do it does not affect the validity of the proof. However, it can help you keep track of which assumption is part of which subproof; this is especially helpful when you use nested subproofs. (See the proof of DS below.) I have borrowed this device from Jacquette [13].

5 P ` Q→ P (This demonstrates CP and R.)

1    (1) P        A
2    (2) Q        A (CP)
1    (3) P        1 R
1    (4) Q → P    2,3 CP


As in the proof of sequent 4, I have marked the assumption in line 2 with the rule that I intend to use in the subproof.

6  P → Q, P ` Q   (This demonstrates RAA.)

1        (1) P → Q     A
2        (2) P         A
3        (3) −Q        A (RAA)
1,2      (4) Q         1,2 MP
1,2,3    (5) Q & −Q    3,4 &I
1,2      (6) Q         3–5 RAA

Of course, this sequent is merely the statement of MP. But it demonstrates in a very clear way how to use RAA.

3.3.1 Exercises

A

1. Find natural deduction proofs for the following sequents.

(a) P, P → −Q ` −Q

(b) −− P, P → Q ` Q

(c) P &Q,P → R ` R

(d) P, P → R,P → S ` R&S

(e) −− P ` P ∨ R

B

1. Construct proofs of the following sequents.

(a) P → (Q&R), P ` R ∨ S

(b) P → Q,R→ Q,P ∨ R ` Q ∨ S

(c) P ∨ Q,Q→ R ` P ∨ R

(d) P, (P &Q)→ R ` Q→ R

(e) P → Q,Q→ R,−R ` −P (Hint: try RAA)

3.4 Some Useful Sequents

In this section we derive from the Basic Rules a series of sequents that are very widely used throughout logic, and which will be found to be very useful in our future propositional and predicate natural deductions. Many of these sequents have common names, and you should learn these names and acquire the ability to recognize these forms in their innumerable variations.


3.4.1 One-Way Sequents

7 P → Q,−Q ` −P Modus Tollens (MT)

1        (1) P → Q     A
2        (2) −Q        A
3        (3) P         A (RAA)
1,3      (4) Q         1,3 MP
1,2,3    (5) Q & −Q    2,4 &I
1,2      (6) −P        3–5 RAA

MT is one of the most widely used rules in deductive logic.

Verbal example: If Bob goes to work then Alice stays home. But Alice did not stay home. Therefore, Bob did not go to work.

8 P ∨ Q,−Q ` P Disjunctive Syllogism (DS)

1      (1) P ∨ Q      A
2      (2) −Q         A
3      (3) P          A (vE)
4      (4) Q          A (vE)
5      (5) −P         A (RAA)
2,4    (6) Q & −Q     2,4 &I
2,4    (7) P          5–6 RAA
1,2    (8) P          1,3–3,4–7 vE

Example in Words: Either Bob or Alice paid the phone bill; however, it was not Bob. Hence, it was Alice who paid the phone bill.

The proof of DS will repay careful study, since it illustrates the use of vE and RAA. The object is to derive P from P ∨ Q given the condition −Q. The first disjunct is simply the desired conclusion; from the assumption of the second disjunct we can derive a contradiction and thereby prove P — or, indeed, anything we want. The conclusion follows by vE. In the justification for the conclusion at line (8), 1 is the line at which the premiss P ∨ Q was introduced; 3 is the line at which the working assumption P was introduced and also the line at which it is discharged (since P is the very conclusion we are aiming at); 4 is the line at which the second working assumption was introduced and 7 the line at which it was discharged. If we were working with a disjunction with more than two terms, there would have to be two line numbers for each working assumption introduced and then discharged; but we shall rarely use vE on disjunctions with more than two terms.

Note the use of Reflection in line 3, in which P is both assumption and conclusion of a sub-proof. Reflection also allows us, for example, to establish the theorem ` P → P as follows:

9 ` P → P Law of Identity (Id)


1    (1) P        A
     (2) P → P    1–1 CP

Example in Words: Paul Martin is Prime Minister of Canada only if Paul Martin is Prime Minister of Canada. More generally, every proposition implies itself; this theorem could be just as well called “self-implication”.

This example illustrates the fact that the utter triviality of many of the basic rules of logic becomes apparent when they are put into words. It is fascinating how powerful these apparently “trivial” inference rules can be, however.

Here is another simple sequent that can sometimes come in handy:

10 −P ` −(P &Q) Denial (Den)

1      (1) −P           A
2      (2) P & Q        A (RAA)
2      (3) P            2 &E
1,2    (4) P & −P       1,3 &I
1      (5) −(P & Q)     2–4 RAA

The reader will notice that the peculiar way in which we used RAA in the proof of DS would apparently, in principle, allow us to deduce any proposition whatsoever from a contradiction. This is, indeed, the case, and this phenomenon, which we have already encountered in the last chapter, is important enough to deserve a name of its own: Ex falso quodlibet, or more briefly Ex Falso. This Latin phrase literally means “from a contradiction (or falsehood), whatever. . . ”, or “from a contradiction anything follows.”

11 P,−P ` Q Ex Falso Quodlibet (ExF)

1      (1) P          A
2      (2) −P         A
3      (3) −Q         A (RAA)
1,2    (4) P & −P     1,2 &I
1,2    (5) Q          3–4 RAA

Example in Words: If 2 + 2 = 4 and 2 + 2 ≠ 4 then Captain Kangaroo is the Prime Minister of Australia.

Here is a slightly different version of ExF that helps to illustrate how the methods work:

12 P & − P ` Q

There are at least two ways to prove this. A “learner’s permit” method uses Reiteration:

1    (1) P & −P    A
2    (2) −Q        A (RAA)
1    (3) P & −P    1 R
1    (4) Q         2–3 RAA

It can be done more compactly as follows:


1    (1) P & −P    A
2    (2) −Q        A (RAA)
1    (3) Q         2–1 RAA

Note that line 1 is only “derived” from line 2 in the sense that it can be asserted when 2 is asserted simply because it has already been asserted. But that is enough to make RAA work. Ex Falso will seem less odd when we come to truth table semantics in Chapter 5.
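Looking ahead to that semantic picture: a sequent fails only if some assignment of truth values makes every premiss true and the conclusion false, and no assignment can make both P and −P true. The little Python sketch below (my own illustration, not the text's official method) makes the point by searching for such a counterexample to P, −P ` Q and finding none:

    from itertools import product

    # A counterexample would make P and -P true while making Q false.
    counterexamples = [
        (p, q)
        for p, q in product([True, False], repeat=2)
        if p and (not p) and (not q)
    ]
    print(counterexamples)   # [] -- no such assignment exists, so the sequent is valid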

13 P → Q ` P → (P &Q) Absorption (Abs)

1      (1) P → Q          A
2      (2) P              A
1,2    (3) Q              1,2 MP
1,2    (4) P & Q          2,3 &I
1      (5) P → (P & Q)    2–4 CP

Example in Words: Given that Bob works only if Alice stays home, then if Bob works then Bob works and Alice stays home.

This proof is a typical usage of CP. Note how the assumption on line 2 is discharged when we arrive at the conclusion.

14 P → Q,Q→ R ` P → R Hypothetical Syllogism (HS)

1        (1) P → Q    A
2        (2) Q → R    A
3        (3) P        A (CP)
1,3      (4) Q        1,3 MP
1,2,3    (5) R        2,4 MP
1,2      (6) P → R    3–5 CP

Example in Words: If Bob works only if Alice stays home, and if Alice stays home only if she is off shift, then Bob works only if Alice is off shift.

The next derivation is a good illustration of the use of vE:

15 P → Q,R→ S, P ∨ R ` Q ∨ S Constructive Dilemma (CD)

1        (1)  P → Q    A
2        (2)  R → S    A
3        (3)  P ∨ R    A
4        (4)  P        A (vE)
1,4      (5)  Q        1,4 MP
1,4      (6)  Q ∨ S    5 vI
7        (7)  R        A (vE)
2,7      (8)  S        2,7 MP
2,7      (9)  Q ∨ S    8 vI
1,2,3    (10) Q ∨ S    3,4–6,7–9 vE

Example in Words: If Bob works only if Alice stays home, and if Tad works only if Cheryl stays home, and if either Bob or Tad works, then either Alice or Cheryl stays home.

The following variant on Constructive Dilemma is easily established:

16 P → Q,R→ S,−Q ∨ −S ` −P ∨ −R Destructive Dilemma (DD)

The proof is left as a B exercise.

3.4.2 Interderivability Results

In this section we state and prove a number of interderivability results of the general form

P a` Q. (3.6)

It can be shown that for every interderivability of the form (3.6), there is a theorem of the form

` P ↔ Q. (3.7)

We shall say that any wffs P and Q for which ` P ↔ Q holds are provably equivalent. In Chapter 5 we will see that expressions are provably equivalent iff they have the same truth tables.
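As a preview of that semantic test, here is a small Python sketch (my own illustration, not part of the text's apparatus) confirming that the two sides of Transposition and of Exportation, both derived later in this chapter, have identical truth tables:

    from itertools import product

    def implies(p, q):
        # Material implication.
        return (not p) or q

    # Transposition:  P -> Q  has the same table as  -Q -> -P.
    print(all(implies(p, q) == implies(not q, not p)
              for p, q in product([True, False], repeat=2)))     # prints True

    # Exportation:  (P & Q) -> R  has the same table as  P -> (Q -> R).
    print(all(implies(p and q, r) == implies(p, implies(q, r))
              for p, q, r in product([True, False], repeat=3)))  # prints True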

In syntactic terms (that is, in terms of the use to which this fact can be put in derivations), this implies that any occurrence of a formula of the form P can be replaced by a formula of the form Q, or vice versa, wherever it occurs. We will call this metarule Equivalence Introduction, and this rule, along with some other introduction rules, will be discussed in greater detail in the next chapter. For now, we merely note the fact that because of Equivalence Introduction these interderivabilities are very powerful tools of deduction. We will use Equivalence Introduction in some of the proofs further on in this chapter.

De Morgan’s Laws

The following rules, called de Morgan’s Laws, are very important and widely used in logic:

17 −(P &Q) a` −P ∨ −Q de Morgan’s Laws (deM)

(a) −(P & Q) ` −P ∨ −Q

1      (1)  −(P & Q)                     A
2      (2)  −(−P ∨ −Q)                   A (RAA)
3      (3)  −P                           A (RAA)
3      (4)  −P ∨ −Q                      3 vI
2,3    (5)  (−P ∨ −Q) & −(−P ∨ −Q)       2,4 &I
2      (6)  P                            3–5 RAA
7      (7)  −Q                           A (RAA)
7      (8)  −P ∨ −Q                      7 vI
2,7    (9)  (−P ∨ −Q) & −(−P ∨ −Q)       2,8 &I
2      (10) Q                            7–9 RAA
2      (11) P & Q                        6,10 &I
1,2    (12) (P & Q) & −(P & Q)           1,11 &I
1      (13) −P ∨ −Q                      2–12 RAA

This is an adaptation of the ingenious proof given by Lemmon for his Sequent 36(a). It shows that RAAs can be nested.

(b) −P ∨ −Q ` −(P & Q)

1      (1)  −P ∨ −Q     A
2      (2)  −P          A (vE)
3      (3)  P & Q       A (RAA)
3      (4)  P           3 &E
2,3    (5)  −P & P      2,4 &I
2      (6)  −(P & Q)    3–5 RAA
7      (7)  −Q          A (vE)
8      (8)  P & Q       A (RAA)
8      (9)  Q           8 &E
7,8    (10) −Q & Q      7,9 &I
7      (11) −(P & Q)    8–10 RAA
1      (12) −(P & Q)    1,2–6,7–11 vE

18 −(P ∨ Q) a` −P & −Q de Morgan (deM)

(a) −(P ∨ Q) ` −P & −Q

1      (1)  −(P ∨ Q)               A
2      (2)  P                      A (RAA)
2      (3)  P ∨ Q                  2 vI
1,2    (4)  (P ∨ Q) & −(P ∨ Q)     1,3 &I
1      (5)  −P                     2–4 RAA
6      (6)  Q                      A (RAA)
6      (7)  P ∨ Q                  6 vI
1,6    (8)  (P ∨ Q) & −(P ∨ Q)     1,7 &I
1      (9)  −Q                     6–8 RAA
1      (10) −P & −Q                5,9 &I

(b) −P & −Q ` −(P ∨ Q)

1      (1) −P & −Q      A
2      (2) P ∨ Q        A (RAA)
1      (3) −P           1 &E
1,2    (4) Q            2,3 DS
1      (5) −Q           1 &E
1,2    (6) Q & −Q       4,5 &I
1      (7) −(P ∨ Q)     2–6 RAA

This can, of course, be proven without the use of the derived rule DS by simply, in effect, filling in the proof of DS between lines (3) and (4).


The de Morgan Laws amount to rules for the distribution of the negation sign; they show that when we distribute negation over a conjunction it becomes a disjunction of the negated terms, and when we distribute negation over a disjunction it becomes a conjunction of the negated terms.

Implication Rules

The following two rules about the implication relation → are frequently used. They can both be cited the same way.

19 P → Q a` −P ∨ Q Implication (Imp)

(a) P → Q ` −P ∨ Q

1      (1) P → Q         A
2      (2) −(−P ∨ Q)     A (RAA)
2      (3) −−P & −Q      2 deM
2      (4) P & −Q        3 DN
2      (5) P             4 &E
1,2    (6) Q             1,5 MP
2      (7) −Q            4 &E
1,2    (8) Q & −Q        6,7 &I
1      (9) −P ∨ Q        2–8 RAA

The converse is left as a B exercise.

20 P → Q a` −(P & −Q) Implication (Imp)

The proofs of this version of Imp use Sequent 18, and they are left as a B exercise.

Example: Here is an example of a valid use of Imp and deM. It is justifiable by the metarule of Equivalence Introduction, although you do not have to cite that explicitly.

1    (1) (P → Q) & (R → S)       A
1    (2) (−P ∨ Q) & (R → S)      1 Imp
1    (3) −(P & −Q) & (R → S)     2 deM

This example illustrates the fact that an interderivability rule such as deM allows us to substitute an interderivable formula for a subformula in a larger formula.

Transposition is also very useful:

21 P → Q a` −Q→ −P Transposition (Trans)

P → Q ` −Q → −P

1      (1) P → Q       A
2      (2) −Q          A (CP)
1,2    (3) −P          1,2 MT
1      (4) −Q → −P     2–3 CP

The converse is left as a B exercise.


Idempotency

22 P &P a` P Idempotency (Idem)

These easy proofs are left as A exercises.

23 P ∨ P a` P Idempotency (Idem)

P ∨ P ` P

1    (1) P ∨ P    A
2    (2) P        A (vE)
3    (3) P        A (vE)
1    (4) P        1,2–2,3–3 vE

This is the shortest possible application of vE. The converse is left as an A exercise.

Commutative Laws

The commutative laws are really just a restatement of the Basic Rules of &I and vI, which allow us to “and” and “or” two propositions together in any order. Their proofs illustrate that something which is “obvious” still may require a little work in order to be correctly demonstrated from the Basic Rules.

24 P &Q a` Q&P Commutativity (Comm)

P &Q ` Q&P

1    (1) P & Q    A
1    (2) P        1 &E
1    (3) Q        1 &E
1    (4) Q & P    2,3 &I

25 P ∨ Q a` Q ∨ P Commutativity (Comm)

P ∨ Q ` Q ∨ P

1    (1) P ∨ Q    A
2    (2) P        A (vE)
2    (3) Q ∨ P    2 vI
4    (4) Q        A (vE)
4    (5) Q ∨ P    4 vI
1    (6) Q ∨ P    1,2–3,4–5 vE

Associative Laws

26 P & (Q&R) a` (P &Q) &R Associativity (Assoc)

This proof is left as a B exercise.


27 P ∨ (Q ∨ R) a` (P ∨ Q) ∨ R Associativity (Assoc)

This proof is a little tricky, since it requires nested vEs. It is left as a C exercise.

As noted previously, Associativity can be extended to conjunctions and disjunctions of arbitrary numbers of terms, but the proof of this fact requires mathematical induction, which is beyond (though not far beyond) the scope of this course.

Rules of Dominance

The Rules of Dominance may seem a little strange, but they have a very natural interpretation from the semantic point of view. We will review them in Chapter 5. These rules simplify the derivation of Equiv (below).

The proofs of the Rules of Dominance are simple but quite instructive. I give some here, and leave the rest as exercises.

28 P & (Q ∨ −Q) a` P Dominance (Dom)

P ` P & (Q ∨ −Q)

1      (1) P                     A
2      (2) −(P & (Q ∨ −Q))       A (RAA)
2      (3) −P ∨ −(Q ∨ −Q)        2 deM
2      (4) −P ∨ (−Q & −−Q)       3 deM
2      (5) P → (−Q & −−Q)        4 Imp
1,2    (6) (−Q & −−Q)            1,5 MP
1      (7) P & (Q ∨ −Q)          2–6 RAA

The proof of the converse is left as an A exercise.

29 P & (Q& −Q) a` Q& −Q Dominance (Dom)

30 P ∨ (Q ∨ −Q) a` Q ∨ −Q Dominance (Dom)

31 P ∨ (Q& −Q) a` P Dominance (Dom)

P ∨ (Q & −Q) ` P

1    (1) P ∨ (Q & −Q)    A
2    (2) P               A (vE)
3    (3) Q & −Q          A (vE)
3    (4) P               3 ExF
1    (5) P               1,2–2,3–4 vE

The converse is left as an A exercise.

Distributive Laws

32 P & (Q ∨ R) a` (P &Q) ∨ (P &R) Distributive Laws (Dist)

(a) P & (Q ∨ R) ` (P &Q) ∨ (P &R)


1      (1)  P & (Q ∨ R)                 A
2      (2)  −((P & Q) ∨ (P & R))        A (RAA)
2      (3)  −(P & Q) & −(P & R)         2 deM
2      (4)  (−P ∨ −Q) & (−P ∨ −R)       3 deM
2      (5)  (P → −Q) & (P → −R)         4 Imp
2      (6)  P → −Q                      5 &E
2      (7)  P → −R                      5 &E
1      (8)  P                           1 &E
1      (9)  Q ∨ R                       1 &E
1,2    (10) −Q                          6,8 MP
1,2    (11) −R                          7,8 MP
1,2    (12) −Q & −R                     10,11 &I
1,2    (13) −(Q ∨ R)                    12 deM
1,2    (14) −(Q ∨ R) & (Q ∨ R)          9,13 &I
1      (15) (P & Q) ∨ (P & R)           2–14 RAA

Note the use of RAA.

(b) (P & Q) ∨ (P & R) ` P & (Q ∨ R)

1    (1)  (P & Q) ∨ (P & R)    A
2    (2)  P & Q                A (vE)
2    (3)  Q                    2 &E
2    (4)  P                    2 &E
2    (5)  Q ∨ R                3 vI
2    (6)  P & (Q ∨ R)          4,5 &I
7    (7)  P & R                A (vE)
7    (8)  P                    7 &E
7    (9)  R                    7 &E
7    (10) Q ∨ R                9 vI
7    (11) P & (Q ∨ R)          8,10 &I
1    (12) P & (Q ∨ R)          1,2–6,7–11 vE

This is a straightforward application of vE.

33 P ∨ (Q&R) a` (P ∨ Q) & (P ∨ R) Distributive Laws (Dist)

(P ∨ Q) & (P ∨ R) ` P ∨ (Q & R)

1      (1)  (P ∨ Q) & (P ∨ R)      A
1      (2)  (−P → Q) & (−P → R)    1 Imp
1      (3)  −P → Q                 2 &E
1      (4)  −P → R                 2 &E
5      (5)  −P                     A (CP)
1,5    (6)  Q                      3,5 MP
1,5    (7)  R                      4,5 MP
1,5    (8)  Q & R                  6,7 &I
1      (9)  −P → (Q & R)           5–8 CP
1      (10) P ∨ (Q & R)            9 Imp


The converse is left as a B exercise.

The Distributive Laws show how “ORs” can be distributed over “ANDs,” and vice versa. There is an important branch of advanced logic called quantum logic in which the seemingly “obvious” distributive laws break down. (For a nicely accessible introduction to this intriguing subject, see [12].)

Equivalence Rules

There are two Equivalence rules that are widely used.

34 P ↔ Q a` (P → Q) & (Q→ P ) Equivalence (Def↔)

This follows by Definition Introduction, since P ↔ Q := (P → Q) & (Q→ P )

35 P ↔ Q a` (P &Q) ∨ (−P & −Q) Equivalence (Equiv)

I give the proof of both directions here, since they are simple but instructive.

(a) P ↔ Q ` (P &Q) ∨ (−P & −Q)

1    (1) P ↔ Q                                            A
1    (2) (P → Q) & (Q → P)                                1 Def↔
1    (3) (−P ∨ Q) & (−Q ∨ P)                              2 Imp
1    (4) (−P & (−Q ∨ P)) ∨ (Q & (−Q ∨ P))                 3 Dist (× 2)
1    (5) ((−P & −Q) ∨ (−P & P)) ∨ ((Q & −Q) ∨ (Q & P))    4 Dist (× 2)
1    (6) (−P & −Q) ∨ (Q & P)                              5 Dom
1    (7) (P & Q) ∨ (−P & −Q)                              6 Comm (× 2)

Note how Dom allows us to “cancel” out contradictory expressions such as −P &P .

(b) (P &Q) ∨ (−P & −Q) ` P ↔ Q

1     (1)  (P & Q) ∨ (−P & −Q)    A
2     (2)  P & Q                  A (vE)
2     (3)  P                      2 &E
2     (4)  Q                      2 &E
2     (5)  −Q ∨ P                 3 vI
2     (6)  −P ∨ Q                 4 vI
2     (7)  Q → P                  5 Imp
2     (8)  P → Q                  6 Imp
2     (9)  (P → Q) & (Q → P)      7,8 &I
10    (10) −P & −Q                A (vE)
10    (11) −P                     10 &E
10    (12) −Q                     10 &E
10    (13) −P ∨ Q                 11 vI
10    (14) −Q ∨ P                 12 vI
10    (15) P → Q                  13 Imp
10    (16) Q → P                  14 Imp
10    (17) (P → Q) & (Q → P)      15,16 &I
1     (18) (P → Q) & (Q → P)      1,2–9,10–17 vE
1     (19) P ↔ Q                  18 Def↔

Negated Arrows

The following are easy, given the rules we have now developed.

36 −(P → Q) a` P & −Q Negation of Implication (Neg→)

−(P → Q) ` P & −Q

1    (1) −(P → Q)     A
1    (2) −(−P ∨ Q)    1 Imp
1    (3) −−P & −Q     2 deM
1    (4) P & −Q       3 DN

We get the converse essentially by reversing these steps.

37 −(P ↔ Q) a` (P & −Q) ∨ (−P &Q) Negation of Equivalence (Neg↔)

−(P ↔ Q) ` (P & −Q) ∨ (−P & Q)

1    (1) −(P ↔ Q)                  A
1    (2) −((P → Q) & (Q → P))      1 Def↔
1    (3) −(P → Q) ∨ −(Q → P)       2 deM
1    (4) (P & −Q) ∨ (Q & −P)       3 Neg→
1    (5) (P & −Q) ∨ (−P & Q)       4 Comm

Again, we obtain the converse by reversing the steps. The derivation of this sequent by using only the Basic Rules would run to many more than five lines.

Distributive Laws for →

It is possible to prove interderivability relations that act, in effect, like distributive laws for implication. (Thanks to Mark Smith for drawing this to my attention.)

38 P → (Q&R) a` (P → Q) & (P → R) Distributive Law for → (Dist→)

The proofs are left as B exercises.

39 P → (Q ∨ R) a` (P → Q) ∨ (P → R) Distributive Law for → (Dist→)

P → (Q ∨ R) ` (P → Q) ∨ (P → R)

1    (1) P → (Q ∨ R)             A
1    (2) −P ∨ (Q ∨ R)            1 Imp
1    (3) (−P ∨ Q) ∨ R            2 Assoc
1    (4) ((−P ∨ Q) ∨ R) ∨ −P     3 vI
1    (5) (−P ∨ Q) ∨ (R ∨ −P)     4 Assoc
1    (6) (−P ∨ Q) ∨ (−P ∨ R)     5 Comm
1    (7) (P → Q) ∨ (P → R)       6 Imp

(In the next chapter we will show that it is possible to combine steps in proofs like this.)

The converse is left as a B exercise.

Finally, here is a rule that looks almost like a distributive law for →, but isn't quite.

40 (P &Q)→ R a` P → (Q→ R) Exportation (Exp)

(P & Q) → R ` P → (Q → R)

1        (1) (P & Q) → R      A
2        (2) P                A (CP)
3        (3) Q                A (CP)
2,3      (4) P & Q            2,3 &I
1,2,3    (5) R                1,4 MP
1,2      (6) Q → R            3–5 CP
1        (7) P → (Q → R)      2–6 CP

This proof of Exp demonstrates nested CPs. The converse is left as a B exercise.

3.4.3 Form of Justification When Using Derived Rules

The general structure of a justification notation in a proof line is that we state the rule we used to get the line, and give the numbers of the lines we used. RAA, CP, and vE are slightly different, in that we only have to give the beginning and end lines of the subproofs; but they are the only rules for which the usual pattern does not hold.

If you are using a Derived Rule such as MT, HS, or CD, the same basic pattern holds, except that you need to mention the line numbers of all of the propositions that were used as inputs to the use of the rule. For instance, a line obtained by a rule that takes three premisses will need to have those three lines mentioned in its justification.

Here is an example to make this clear:

41 P ∨ Q,−P ∨ R,−Q ∨ R ` R

1        (1) P ∨ Q     A
2        (2) −P ∨ R    A
3        (3) −Q ∨ R    A
2        (4) P → R     2 Imp
3        (5) Q → R     3 Imp
1,2,3    (6) R ∨ R     1,4,5 CD
1,2,3    (7) R         6 Idem

Since CD requires three premisses, three lines must be cited in the justification.

3.5 Summary of Basic Sequents and Rules

In this section, I present for convenience a concise summary of the Basic and Derived rules and sequents we have covered in this chapter. These elementary sequents can occur as fragments of longer derivations, and it is very important that you learn to recognize them when they appear. One of the most important skills in logic is the ability to recognize logical form; for instance, to spot that a very complicated-looking argument is really just a case of a simple form such as MP, MT, or HS.

Don't forget that in the following statement of the rules, the propositional variables P, Q, R, etc., could be either atomic or molecular propositions.

3.5.1 Basic Rules and Sequents

Some of the Basic Rules are really metarules, meaning that they are statements about what sort of inferences we can make, given that certain other inferences are possible. Other Basic Rules can be expressed as sequents.

Basic Metarules

Assumption (A):
Any proposition whatever may be assumed, so long as we do not forget that it was an assumption.

∨-Elimination (vE):
If P ` R and Q ` R are provable separately, then we can conclude P ∨ Q ` R.

Conditional Proof (CP):
From P, Q ` R we can conclude P ` Q → R.

Reductio ad Absurdum (RAA):
From P ` R & −R we can conclude −P.

Reiteration (R):
Any line may be repeated anywhere in a proof, so long as we carry its dependencies with it.

Definition Introduction (DefIntro):
If a set of symbols is defined as a certain truth-functional combination of propositional variables, then one may be substituted for the other wherever they occur.

Basic Sequents

Modus Ponens (MP):
P, P → Q ` Q

&-Introduction (&I):
P, Q ` P & Q
P, Q ` Q & P

&-Elimination (&E):
P & Q ` P
P & Q ` Q

v-Introduction (vI):
P ` P ∨ Q
P ` Q ∨ P


Double Negation (DN):
−−P a` P
Note that DN is the only Basic Rule that is an interderivability result.

3.5.2 Summary of Derived Sequents

I summarize here the sequents stated and (mostly) derived earlier in this chapter. There is a denumerable infinity of propositional sequents that can be derived from the Basic Rules. The rules in this list are simply those that are used often enough to be given names.

One-Way Sequents

Modus Tollens (MT):
P → Q, −Q ` −P

Disjunctive Syllogism (DS):
P ∨ Q, −Q ` P

Denial (Den):
−P ` −(P & Q)

Ex Falso Quodlibet (ExF):
P, −P ` Q

Hypothetical Syllogism (HS):
P → Q, Q → R ` P → R

Constructive Dilemma (CD):
P → Q, R → S, P ∨ R ` Q ∨ S

Destructive Dilemma (DD):
P → Q, R → S, −Q ∨ −S ` −P ∨ −R

Convergence (Conv):
P ∨ Q, P → R, Q → R ` R

Absorption (Abs):
P → Q ` P → (P & Q)

Interderivability Sequents

de Morgan's Laws (deM):
−(P & Q) a` −P ∨ −Q
−(P ∨ Q) a` −P & −Q

Implication Rules (Imp):
P → Q a` −P ∨ Q
P → Q a` −(P & −Q)

Transposition (Trans):
P → Q a` −Q → −P


Idempotency (Idem):
P & P a` P
P ∨ P a` P

Commutativity (Comm):
P & Q a` Q & P
P ∨ Q a` Q ∨ P

Associativity (Assoc):
P & (Q & R) a` (P & Q) & R
P ∨ (Q ∨ R) a` (P ∨ Q) ∨ R

Dominance (Dom):
P & (Q ∨ −Q) a` P
P & (Q & −Q) a` Q & −Q
P ∨ (Q ∨ −Q) a` Q ∨ −Q
P ∨ (Q & −Q) a` P

Distributivity (Dist):
P & (Q ∨ R) a` (P & Q) ∨ (P & R)
P ∨ (Q & R) a` (P ∨ Q) & (P ∨ R)

Definition of Equivalence (Def↔):
P ↔ Q a` (P → Q) & (Q → P)

Equivalence (Equiv):
P ↔ Q a` (P & Q) ∨ (−P & −Q)

Negation of Implication (Neg→):
−(P → Q) a` P & −Q

Negation of Equivalence (Neg↔):
−(P ↔ Q) a` (P & −Q) ∨ (−P & Q)

Exportation (Exp):
(P & Q) → R a` P → (Q → R)

Distribution for → (Dist→):
P → (Q & R) a` (P → Q) & (P → R)
P → (Q ∨ R) a` (P → Q) ∨ (P → R)

3.6 Fallacy Alert!

A fallacy, loosely speaking, is simply some sort of error in reasoning. We fallible humans commit certain characteristic errors so often and so spontaneously that it has been found useful to give them names; they are almost like optical illusions of the mind. They are common precisely because, like optical illusions, they look believable if they are not examined closely. Many of these fallacies belong to informal logic or inductive logic, and they are discussed in books on rhetoric and critical thinking. (See [6, 16, 8] for good introductions to critical thinking and fallacy theory.) There are a few fallacies of deductive reasoning that occur rather often as well, and I will mention a few of them here. In Chapter 5 we will learn systematic techniques for diagnosing fallacious deductive reasoning.

3.6.1 A Common Misuse of vE

It is very common for students to attempt the following move:

1 (1) P ∨ Q     A
1 (2) P         1 vE (???)

This is a glaring error. You can convince yourself that this pattern of reasoning is invalid by considering the following example:

The number 3 is either even or odd. (This is true, of course.)
Therefore the number three is even. (False!)

The rule of vE does not work like &E!

This example also illustrates the fact that we can check the validity of a sequent by trying to find specific examples of it in which obviously true premisses lead to obviously false conclusions. If you can do this, the sequent must be invalid.

If you cannot find an example that invalidates a sequent, that doesn’t mean that the sequent is valid! It just means that you can’t think of an example to show that it is not. In Chapter 5 we learn general and reliable methods for checking the validity of propositional logic sequents and arguments.
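The kind of counterexample search just described can also be mechanized. Here is a minimal Python sketch of my own (not part of the text's apparatus) that brute-forces every truth-value assignment for the misuse of vE above and reports one that makes the premiss true and the conclusion false:

    from itertools import product

    # Look for an invalidating assignment for the sequent "P v Q, therefore P".
    for P, Q in product([True, False], repeat=2):
        premiss = P or Q        # P v Q
        conclusion = P
        if premiss and not conclusion:
            print("Counterexample: P =", P, ", Q =", Q)
            # Prints: Counterexample: P = False , Q = True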

3.6.2 Affirming the Consequent

Another very common fallacy is Affirming the Consequent:

P → Q,Q ` P (????) (3.8)

To see that this is invalid, consider the following example:

If today is Tuesday, we have a logic class.
We are having a logic class.
Therefore, today must be Tuesday.

Even if the premisses are correct, the conclusion could still be false, simply because logic might be held on other days of the week besides Tuesday. This error is seductive, however, since it looks so much like the valid rule MP.

3.6.3 Denying the Antecedent

This is an error which is very similar to Affirming the Consequent, except that it is more like a misuse of MT:

P → Q,−P ` −Q. (3.9)

A simple example demonstrates invalidity:


If today is Tuesday, we have a logic class.
Today is not Tuesday.
Therefore we don’t have logic.

Again, the problem is just that we might have logic on other days of the week.

3.6.4 Avoid Circularity!

From this point onward the student can take any of the derived sequents above and use them in the proof of further sequents. But you should not forget that these were proven in a certain order. For instance, de Morgan’s Laws are proven first here, and then used in the proof of the Implication rules. It is possible to prove Implication using only Lemmon’s Basic Rules as well. (See Sequent 35 in Lemmon.) It would also be possible to use Implication in the derivation of the de Morgan Laws — but you must not do that if you used de Morgan to get Implication! That would be a circular procedure, or “begging the question,” as it is sometimes called in informal logic. A circular proof is no proof at all, since it merely amounts (though perhaps in a roundabout way) to a restatement of its assumptions.

It would also be quite legitimate to prove Implication first from the Basic Rules and then use it in the derivation of de Morgan. (You might want to try this as an exercise.) But once you have chosen an order of proof you should stick to it, in order to avoid circularity.


3.7 Exercises on Chapter 3

A

1. Find natural deduction proofs for the following named sequents:

(a) P &P ` P (Idem)

(b) P ` P &P (Idem)

(c) P ` P ∨ P (Idem)

(d) P & (Q ∨ −Q) ` P (Dom)

(e) P ` P ∨ (Q& −Q) (Dom)

(f) Q& −Q ` P & (Q& −Q) (Dom)

(g) P & (Q& −Q) ` Q& −Q (Dom)

(h) Q ∨ −Q ` P ∨ (Q ∨ −Q) (Dom)

2. Find natural deduction proofs for the following miscellaneous sequents:

(a) P,Q ` P → Q

(b) Prove the following without the use of R. Your proof should be only three lines long.
P ` Q → P

(c) P → (Q ∨ R), −(Q ∨ R) ` −P

(d) P ∨ −Q, Q ` P

(e) (Q → R), −(Q → R) ` R

(f) (M ∨ N) → Q ` (M ∨ N) → (Q & (M ∨ N))

(g) P → (Q→ R), (Q→ R)→ T ` P → T

(h) −(P & −Q) ` −P ∨ Q

(i) −(P ∨ −(R → S)) ` −P & (R → S)

(j) (X ∨ Y) → Q ` (−X & −Y) ∨ Q

(k) −M ∨ N, M ` N

(l) −(X → Y) → −P, P ` (X → Y)

(m) P → (Q ∨ Q) ` −(P & −Q)

(n) P & (Q→ R) ` (P & −Q) ∨ (P &R)

(o) X → (P ↔ Q), (P & −Q) ∨ (−P &Q) ` −X

B

1. Find natural deduction proofs for the following named sequents:

(a) P → Q,R→ S,−Q ∨ −S ` −P ∨ −R (DD)

(b) −P ∨ Q ` P → Q (Imp) (Hint: use DS)

(c) P → Q ` −(P & −Q) (Imp)


(d) −(P & −Q) ` P → Q (Imp)

(e) −Q→ −P ` P → Q (Trans)

(f) P & (Q&R) ` (P &Q) &R (Assoc)

(g) P ∨ (Q&R) ` (P ∨ Q) & (P ∨ R) (Dist)

(h) (P ∨ Q) & (P ∨ R) ` P ∨ (Q & R) (Dist)
This sequent is proven in the text using CP. Try it using RAA.

(i) P → (Q→ R) ` (P &Q)→ R (Exp)

(j) P → (Q&R) ` (P → Q) & (P → R) (Dist→)

(k) (P → Q) & (P → R) ` P → (Q&R) (Dist→)

(l) P ∨ (Q ∨ −Q) ` Q ∨ −Q (Dom)

(m) −(P & −Q) ` P → Q (Neg→)

(n) (P → Q) ∨ (P → R) ` P → (Q ∨ R) (Dist→)

2. The following questions require you to find natural deduction proofs for a variety of sequents:

(a) P,Q ` P ↔ Q

(b) P ↔ Q,Q↔ R ` P ↔ R

(c) (P → R), (Q→ R) ` (P &Q)→ R

(d) −P ∨ R, −(Q & −S), −P → Q, −R ` S

(e) R, R ↔ P, −Q → −P ` Q

(f) P → Q, Q → R, R → S, −S ∨ −R ` −P

(g) P ↔ S, P ∨ Q, P ∨ R, −Q ∨ −R ` S

(h) P → (Q ∨ R), Q → S, R → T, −(S ∨ T) ` −P

(i) P → (Q ∨ R), (P → Q) → X, (P → R) → Y ` X ∨ Y

(j) P → R, −Q → S, −(R ∨ S) ` −(Q → P)

C

1. MT is a basic pattern of inference which is very important in scientific reasoning. Suppose we have a new scientific theory, and suppose this entire theory is expressed as a single proposition T. Suppose also the theory deductively entails a certain prediction P. Finally, suppose that (to our great disappointment) experimentation shows that P is false. What does MT imply about T? (Philosophers of science refer to this process as falsification.) Does this mean that we have to junk all of T and start all over again? If not, why not?

2. Find a natural deduction proof for the following sequent:

(a) P ∨ (Q ∨ R) ` (P ∨ Q) ∨ R


3. Suppose we had a system of Natural Deduction that was exactly the same as the one we use in this book, except that, in the place of Modus Ponens (MP), it used Disjunctive Syllogism (DS) (P ∨ Q, −Q ` P) as a Basic Deductive Rule.

(a) Show that in this new system we can derive MP using only DS and other Basic Rules, so long as we define P → Q as −P ∨ Q. (If we don’t have MP as a Basic Rule, we have to treat → as a defined symbol, since in our usual system → is defined implicitly by the way it behaves in MP.)

(b) Now suppose, instead, that we define P → Q as −(P & −Q), as is done in numerous logic texts. (See, e.g., Copi and Cohen [6].) This definition captures the intuition that to say P implies Q is to say that we never simultaneously have P and −Q. Show that in this system we can, as well, derive MP using Basic Rules. Hint: try RAA.


Chapter 4

Techniques and Applications of Propositional Natural Deduction

In this chapter we introduce several techniques that greatly simplify propositional natural deductions, and indicate some of their possible applications.

Don’t forget that in this chapter the propositional variables P, Q, R, etc., can be either atomic or molecular unless stated otherwise.

4.1 Recognition of Logical Form

One of the most important skills to develop in symbolic logic is the ability to recognize logical form. Suppose, for instance, you were asked to deduce a conclusion from the following seemingly complicated premisses: ((R ↔ S) & (S ↔ T)) ∨ (R → −S) and R & S. A little thought will convince you that this is merely an instance of DS, so that an immediate conclusion is (R ↔ S) & (S ↔ T). Formally, we could write this out as follows:

1    (1) ((R ↔ S) & (S ↔ T)) ∨ (R → −S)    A
2    (2) R & S                             A
2    (3) −(R → −S)                         2 Neg→
1,2  (4) (R ↔ S) & (S ↔ T)                 1,3 DS

Something that threatened to be forbiddingly complicated in fact turns out to be very simple.

Try to “chunk” complicated formulas together in your mind and see if they start to look like instances of simpler rules before you start blindly calculating.

4.1.1 Exercises

A

1. Identify the simple rule used in each of the following deductions:

(a) (X ∨ Y )→ (Y ∨ Z), (X ∨ Y ) ` (Y ∨ Z)

(b) ((X ∨ Y )→ Q)→ (P &Q),−(P &Q) ` −(((X ∨ Y )→ Q)→ (P &Q))

(c) P ` ((M ↔ N) ∨ (M ↔ (R&S))) ∨ P


(d) P → (Q&X), (Q&X)→ T ` P → T

(e) ((P → Q)→ R)→ (P ↔ (Q&R)) ` −((P → Q)→ R) ∨ (P ↔ (Q&R))

4.2 Combining Steps

As you become more experienced in doing proofs you will naturally want to combine steps when they seem to be obvious. For instance, you might do a fragment of a larger proof like this:

1,2 (16) −(−P ∨ Q)     (from previous lines)
1,2 (17) P & −Q        16 deM, DN

This is perfectly okay, so long as you don’t get ahead of yourself and make a mistake.

Just as you can use more than one rule to get a given line, you can use the same rule more than once. For instance, you could use Imp twice on the same line as follows:

1,2 (1) P → (P → Q)       (from previous lines)
1,2 (2) −P ∨ (−P ∨ Q)     1 Imp (×2)

If you have the slightest doubt about the steps you are taking, it is better painstakingly to write them all out one by one and thereby reduce the chance that you will make a mistake. Remember, too, that on tests and assignments the onus is on you to convince your instructor that you understand how the proof works!

4.2.1 Exercises

A

1. State the justifications for each of the following steps:

(a) (n)   P → (Q → P)           TI
    (n+1) −P ∨ (−Q ∨ P)         ?

(b) (n)   P → (Q & R)
    (n+1) (−P ∨ Q) & (−P ∨ R)   ?

(c) (n)   −P → (Q & R)
    (n+1) P ∨ (Q & R)           ?

(d) (n)   (P → Q) & (P → R)
    (n+1) −P ∨ (Q & R)          ?

(e) (n)   −M → N
    (n+1) −N
    (n+2) M                     ?

(f) (n)   (P ∨ −P) ∨ (Q ∨ −Q)
    (n+1) (−P ∨ Q) ∨ (−Q ∨ P)   ?


4.3 Theorems

In the last chapter we introduced the concept of the theorem, meaning a wff that can be proven with no assumptions left over. If P is a theorem, we assert

` P, (4.1)

(read as “Provable P” or “P is a theorem”), which means that P can stand on its own, without the support of any premisses. In the next chapter we will see that the theorems of propositional logic are just the same as its tautologies, which are the formulas that are T in all lines of their truth tables.

For now, I will give a few simple examples of theorems, to demonstrate typical techniques that can be used to prove them.

42 ` P → P

Here are two proofs of this elementary theorem. The first uses CP and R:

1 (1) P          A (CP)
1 (2) P          1 R
  (3) P → P      1–2 CP

We can also do this without using R:

1 (1) P          A
  (2) P → P      1–1 CP

This is no doubt the shortest possible application of CP!

If we want to derive a result that depends on no premisses from a set of premisses, we have to have a way of discharging assumptions, and that means that we are going to use either CP or RAA. (The rule vE also discharges assumptions, but only in a sub-proof.)

Here is an important theorem that we can prove using RAA:

43 ` P ∨ −P Law of Excluded Middle (ExM)

1 (1) −(P ∨ −P)     A (RAA)
1 (2) −P & −−P      1 deM
  (3) P ∨ −P        1–2 RAA

This illustrates the power of RAA: if you can’t think of any other way to get a result, assume its negation and try to derive a contradiction.

Here’s another way to get ExM:

1 (1) P          A (CP)
  (2) P → P      1–1 CP
  (3) P ∨ −P     2 Imp, Comm

This theorem is called the Law of Excluded Middle because it says that every proposition is either true or false, with nothing in between. This is sometimes expressed in Latin as tertium non datur — “a third is not given”.


Here is another example: Show that any given proposition P is implied by any other proposition Q. I am going to give five ways of proving this theorem, in order to demonstrate the variety of proof strategies that can be used for theorems.

44 ` P → (Q→ P )

1   (1) P               A (CP)
2   (2) Q               A (CP)
1   (3) P               1 R
1   (4) Q → P           2–3 CP
    (5) P → (Q → P)     1–4 CP

This is a straightforward use of CP. The use of R in line 3 is redundant; see if you can do it without using R.

1 (1) P               A (CP)
1 (2) −Q ∨ P          1 vI
1 (3) Q → P           2 Imp
  (4) P → (Q → P)     1–3 CP

The above proof reminds us of how we can use vI to bring new propositional letters into a proof.

1 (1) −(Q → P)            A (CP)
1 (2) Q & −P              1 Neg→
1 (3) −P                  2 &E
  (4) −(Q → P) → −P       1–3 CP
  (5) P → (Q → P)         4 Trans

The above proof illustrates the idea that if we want to prove a conditional, we can sometimes prove its transposed form first (assuming the negation of its consequent and deriving the negation of its antecedent) and then use Transposition to get the desired conditional.

1 (1) −(P → (Q → P))     A (RAA)
1 (2) P & −(Q → P)       1 Neg→
1 (3) P                  2 &E
1 (4) −(Q → P)           2 &E
1 (5) Q & −P             4 Neg→
1 (6) −P                 5 &E
1 (7) P & −P             3,6 &I
  (8) P → (Q → P)        1–7 RAA

The above proof is a bit longer than the others, but it illustrates the fact that if all else fails we can use RAA; that is, assume the negation of what you want to prove, and then derive a contradiction by brute force.


(1) P ∨ −P              ExM
(2) (P ∨ −P) ∨ −Q       1 vI
(3) −P ∨ (P ∨ −Q)       2 Comm, Assoc
(4) P → (P ∨ −Q)        3 Imp
(5) P → (Q → P)         4 Comm, Imp

In line 1 of the above proof we introduce a theorem, the Law of Excluded Middle. Theorem Introduction is discussed further below; the basic idea is that a theorem can be stated at any point in a proof without justification. We then expand line 1 using vI, rearrange the results, and apply Imp to get what we wanted. This is the deepest proof, since it builds directly from ExM. In fact, it would be possible to derive all theorems, and thereby all of deductive logic, directly from ExM if we wanted to.

4.3.1 The “Laws of Thought”

There is no limit to the number of propositional theorems that could be proved. However, three simple theorems are important enough that they have come to be known as the “laws of thought.”

The Law of Excluded Middle (ExM):
` (P ∨ −P)

The Law of Identity (Id):
` (P → P)

The Law of Non-Contradiction (NC):
` −(P & −P)

We have seen the proofs of the first two; the proof of NC is left as an A exercise.

4.4 Substitution

Under certain circumstances, it is valid to substitute one wff for the whole or part of another wff. There are two kinds of substitution, uniform and nonuniform.

4.4.1 Uniform Substitution

We will say that a wff P is substituted uniformly for a wff Q if every instance of Q in a larger formula is replaced by P. Example: Consider the formula (P & R) → (Q & R). A uniform substitution of (M ∨ N) for R would yield (P & (M ∨ N)) → (Q & (M ∨ N)).

4.4.2 Nonuniform Substitution

We will say that a wff P is substituted nonuniformly for a wff Q if some but not all instances of Q in a larger formula are replaced by P. Example: Consider again (P & R) → (Q & R). A nonuniform substitution of (M ∨ N) for R would yield either (P & (M ∨ N)) → (Q & R) or else (P & R) → (Q & (M ∨ N)).


4.5 Introduction Rules

Introduction Rules are metarules that allow us to introduce previously accepted results into a derivation in order to simplify it. In this section we state without formal justification (since that would be beyond the scope of this course) a number of Introduction Rules that will greatly facilitate proofs of sequents and theorems. We have already used all of these rules except Theorem Introduction in Chapter 3.

1. Definition Introduction (DefIntro): Any defined symbol can be replaced by its definition, or vice versa.
So far, the only symbol we are taking as defined in terms of the basic set of symbols is ↔. But there is no reason that we could not define other symbols as combinations of basic symbols or other previously defined symbols. For instance, suppose we stipulate that P ∗ Q := (P ∨ Q) → (P & Q). Then any occurrence of P ∗ Q could be substituted for by its definition, and any wff that was interderivable with its definition could be substituted for by P ∗ Q. Any such usage can be justified by citing the line number that was operated on and then putting down either DefIntro or Def∗.

2. Sequent Introduction (SI): Any uniform substitution instance of a proven sequent can be introduced in a proof.
We have already used SI many times in Chapter 3, when we applied sequents such as MT, HS, DS, and so forth in the proofs of further sequents. Do not forget that the substitution has to be uniform; only uniform substitution instances of sequents can be guaranteed to be valid. If the sequent you are using has a special name (such as MP or HS) then just cite the name in the justification column; otherwise, write SI and give a reference to where the sequent was previously proven (such as in an earlier exercise).

3. Equivalence Introduction (EqI): Given an equivalence theorem ` P ↔ Q, we may substitute P for Q (or the other way around) in any occurrences of Q in a valid sequent; this kind of substitution need not be done uniformly.
If the equivalence rule you are using has a special name (such as deM or Dist) then just cite the name in the justification column; otherwise, write EqI and give a reference to where the rule was previously proven (such as in an earlier exercise).

4. Theorem Introduction (TI): Any uniform substitution instance of a theorem may be written as a line anywhere in a proof. No assumptions will be listed in the assumption column (because theorems do not depend on assumptions) and the notation TI should be put in the justification column (together with the name of the theorem, if it has one, in brackets).
A theorem is a proposition that we have already shown can be proven with no assumptions left over. Hence, we could introduce it, or a substitution instance of it, at any point in a proof; in effect, doing so is just shorthand for a statement of the entire proof of the theorem. If the theorem you are using has a special name (such as ExM or Id) then just cite the name in the justification column; otherwise, write TI and give a reference to where the theorem was previously proven (such as in an earlier exercise).


Example of TI:

45 P → Q,−P → R ` Q ∨ R

1    (1) P → Q      A
2    (2) −P → R     A
     (3) P ∨ −P     ExM
1,2  (4) Q ∨ R      1,2,3 CD

4.5.1 Exercises

A

1. Prove the Law of Non-Contradiction, ` −(P & − P ).

B

1. Prove the following theorems:

(a) ` (P & − P )→ Q

(b) ` P → (−P → Q)

(c) ` P ∨ (P → Q)

(d) ` (P → Q) ∨ (Q→ P )

(e) ` (P → Q) ∨ (Q→ R)

(f) ` (−P → P )→ P

C

1. Prove ExM using only Basic Rules.

4.6 The Deduction Metatheorem

The following metatheorem is one of the most important results in deductive logic. It cannot be proven in all generality with the tools we use in this course. However, we will work out proofs for a few simple cases, and they should make apparent the plausibility of this centrally important theorem.

Deduction Theorem:
The sequent P1, P2, . . . , Pn ` Q is valid iff ` (P1 & P2 & . . . & Pn) → Q.

Example: Given P ` R, show ` P → R; and given ` P → R, show P ` R. The proofs of these results are very simple, but they nicely illustrate the application of SI and TI.

(a) Given P ` R, show ` P → R.

1 (1) P          A (CP)
1 (2) R          1 SI
  (3) P → R      1–2 CP


In line 2 we apply the given sequent P ` R. It doesn’t have a special name (like MT, HS, etc.), but we can substitute it into the proof as required. Note that we have to cite line 1, since this is the line that the introduced sequent is applied to.

(b) Given ` P → R, show P ` R.

1 (1) P          A
  (2) P → R      TI
1 (3) R          1,2 MP

On the assumption that P → R is a theorem, we can substitute it in wherever we want; and we do not have to cite any previous assumptions or lines since a theorem holds without qualification.

The Deduction Theorem gives us a better understanding of a very useful result that we used in the previous chapter:

P a` Q if and only if ` P ↔ Q.

Here is an informal argument to establish this result:

(a) Given P a` Q, show ` P ↔ Q.

The expression P a` Q simply means P ` Q and Q ` P. By the Deduction Theorem, this means that we have ` P → Q and ` Q → P. We can then prove ` P ↔ Q as follows:

(1) P → Q                  TI
(2) Q → P                  TI
(3) (P → Q) & (Q → P)      &I
(4) P ↔ Q                  3 Def↔

(b) Given ` P ↔ Q, show P a` Q.

From ` P ↔ Q we get ` P → Q and ` Q → P separately; by the Deduction Theorem this gives us P ` Q and Q ` P, which is just the same as P a` Q.

The main use of the Deduction Theorem, for the purposes of this introductory course, is simply the fact that whenever we have an interderivability result, we have an equivalence; and such equivalences can be used freely in transforming expressions into equivalent forms which may help us get to where we want to go in a deduction.

4.6.1 Exercises

B

1. Given P,Q ` R, show ` (P &Q)→ R.

2. Given ` (P & Q) → R, show P, Q ` R. (These two questions in combination show the Deduction Theorem for the case of two premisses.)


4.6.2 A Turnstile is not an Arrow!

There is a natural tendency to think of P → Q as saying just the same thing as P ` Q. This is a mistake, one which can sometimes be glossed over in an elementary course such as this, but which in a course involving metatheory would be serious. The symbol ` is a metalogical symbol: it expresses a claim about what can be derived. By contrast, → is a logical symbol, meaning (as far as propositional logic is concerned) that it denotes a relation between truth values.

Now, the Deduction Theorem assures us that given P ` Q, it is a theorem that P → Q. But P ` Q and P → Q are different claims. The first says that Q can be derived from P by the rules of natural deduction; and the Deduction Theorem says that this is the same thing as saying that P → Q can be derived with no assumptions left over. But P → Q only says that given P, Q is true. It so happens that elementary propositional logic has the property of completeness, meaning that every logically true statement can be proven within the system. Because of completeness, we can sometimes in propositional logic gloss over the fine distinction between “true” and “provable”. But (as Gödel showed) there are logical systems that are incomplete; therefore, once we move beyond elementary propositional logic the distinctions between what can be proven from a given set of rules, and what is true, have to be kept carefully in mind.

4.7 Boolean Algebra

We can greatly simplify many deductions that involve the use of equivalence relations by means of Boolean algebra. For our purposes, we will mean by “Boolean algebra” the manipulation of propositional formulas in a manner very similar to ordinary algebraic manipulations.

If all we are doing is trying to go from one expression to another which is equivalent to it, we can write our calculations in a much simpler and more direct way that does not require the full apparatus of natural deduction. This will seem very familiar to those who have some experience with the manipulation of algebraic identities in ordinary mathematics.

Here is a simple example:

−(P → Q) ↔ −(−P ∨ Q)     Imp
         ↔ −−P & −Q      deM
         ↔ P & −Q        DN

This establishes the theorem ` −(P → Q) ↔ (P & −Q).

As with regular natural deduction, we can combine steps:

−(P → Q) ↔ −(−P ∨ Q)     Imp
         ↔ P & −Q        deM, DN

Again, be careful that you don’t get ahead of yourself. If you aren’t quite sure of what you are doing, don’t combine steps.

We can substitute equivalent expressions for subformulas within a larger formula either uniformly or nonuniformly:

(−P ∨ Q) & (P → R) ↔ (P → Q) & (P → R)     Imp
                   ↔ P → (Q & R)           Dist→
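A chain of Boolean manipulations like the ones above can always be double-checked by comparing the two ends of the chain on every assignment, since equivalent formulas agree everywhere. The following rough Python sketch (my own illustration; the helper name implies is not part of the text's notation) verifies the first example, that −(P → Q) and P & −Q are equivalent:

    from itertools import product

    def implies(a, b):
        # Material implication: false only when a is true and b is false.
        return (not a) or b

    # Check that -(P -> Q) and P & -Q agree on every row of the truth table.
    assert all(
        (not implies(P, Q)) == (P and not Q)
        for P, Q in product([True, False], repeat=2)
    )
    print("-(P -> Q) and P & -Q are equivalent")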


4.7.1 Exercises

B

1. Use Boolean algebra to demonstrate the following equivalence theorems:

(a) ` (P ↔ Q)↔ (−(P → −Q) ∨ −(−Q→ P ))

(b) ` ((Q ∨ R)→ P )↔ ((Q→ P ) & (R→ P ))

(c) ` (P → (Q ∨ R))↔ ((P → Q) ∨ (P → R))

(d) ` (P → (Q&R))↔ ((P → Q) & (P → R))

(e) ` (P ∨ (Q→ R))↔ ((Q→ P ) ∨ (−P → R))

4.8 Filling in the Gaps in an Argument

It is unavoidable that most of this text has to be devoted to explaining the basic techniques of symbolic logic, and we can’t devote as much space as we should to its possible applications. This could lead an attentive student to wonder what is the point of going through all the pain of finding a deductive proof of a conclusion that has already been worked out — especially when the semantic techniques of the next two chapters allow us to check the validity of an argument quickly and efficiently without having to construct a proof at all.

The fact remains that when we use natural deduction in “real life” (and despite all appearances to the contrary, there actually are some people who attempt to reason their way through real-life problems) we do not have a conclusion worked out in advance. Rather, we are given some evidence (which we express in premisses) and we try to work out what that evidence implies. We may not know the conclusion in advance any more than we know the sum of a column of figures before we add them up. Often, we do have some idea of what requirements the conclusion should satisfy; for instance, a detective trying to solve a murder mystery wants to find out who dunnit. Or else we may have a fairly definite idea of the conclusion, but we may not have all of the premisses required to establish it deductively; we might have to work backwards to fill in the gaps in the argument.

In this section I will introduce a few exercises in which we have to work out natural deduction proofs in which part of the argument is missing.

Example: Suppose we know the following: either Bob or Alice paid the phone bill; if Bob paid the bill he would have the receipt; if Alice paid the bill then she would have the receipt; Bob does not have the receipt. Who paid the phone bill?

It is pretty obvious that Alice got stuck with the bill. But let’s go through the symbolization and proof to illustrate the general method of attack.

First, let’s define our symbols:

A := Alice pays the bill.
B := Bob pays the bill.
R := Bob has the receipt.
S := Alice has the receipt.

We can put down all of the premisses we have on hand, and try a few steps to see where they lead:


1      (1) B ∨ A     A
2      (2) B → R     A
3      (3) A → S     A
4      (4) −R        A
2,4    (5) −B        2,4 MT
1,2,4  (6) A         1,5 DS

We quickly come to the conclusion that it was Alice who paid the bill, and we stop there because this answers the question. We have therefore established the sequent B ∨ A, B → R, A → S, −R ` A. Remember, in this sort of problem we usually don’t know in advance what sequent we are going to be proving; rather, we explore a set of premisses to see where they will lead. Note, also, that one of the premisses (line 3) turned out to be irrelevant; and that, of course, is just like real life, where we often have to sort out the useful information from the noise.

Example: We are given the following facts: If either Paul or Peter did it then either Randa helped or Michael wasn’t entirely innocent. If the weapon was in Paul’s office he did it. On the other hand, if the weapon wasn’t in Paul’s office, then Michael wasn’t entirely innocent. Can we determine whether Randa was involved?

Symbolization:

P := Paul did it
Q := Peter did it
R := Randa helped
M := Michael was innocent
N := The weapon was found in Paul’s office

Write down the premisses:

1 (1) (P ∨ Q) → (R ∨ −M)    A
2 (2) N → P                 A
3 (3) −N → −M               A

There is no way we can draw a definite conclusion about the truth or falsity of R from this, since there is no way to extract it from its conditional. But now suppose that independent evidence shows that Michael has an alibi — he was out of town for a meeting at the time of the incident. Then we can continue:

4        (4)  M                                    A
3        (5)  M → N                                3 Trans
2,3      (6)  M → P                                2,5 HS
2,3,4    (7)  P                                    4,6 MP
1        (8)  (P → (R ∨ −M)) & (Q → (R ∨ −M))      1 SI (Exercise 4.7.1, 1(b))
1        (9)  P → (R ∨ −M)                         8 &E
1,2,3,4  (10) R ∨ −M                               7,9 MP
1,2,3,4  (11) R                                    4,10 DS


Chapter 5

Propositional Semantics: Truth Tables

5.1 Syntax versus Semantics

The method of natural deduction that we have learned so far is an instance of what are called syntactic procedures. These are procedures defined by rules that tell us the allowable combinations of symbols that may be written down starting from given wffs. For instance, MT says that given P → Q and −Q, we may (if it suits our purposes) then correctly write down −P. Such rules are concerned only with syntax. We commonly interpret the Ps and Qs of propositional logic as meaningful statements that can bear truth values, but these symbols could be given other interpretations as well — or no interpretation at all.

If you make a mistake when typing in commands to a computer program, it may tell you that you have committed a “syntax error.” This simply means that you have broken the rules that say which combinations of symbols are allowed. The computer has no notion whatsoever of the possible meanings of the symbols being used by the program.

In this chapter we introduce truth tables, the first of two main semantic techniques we will learn in this course. When we do semantics, we are concerned with the interpretation or meaning of the symbols we write down. In propositional logic, the most common interpretation of the propositional symbols is that they represent truth values of propositions; propositional logic, in effect, is a calculus of truth values.

In propositional logic we make the simplifying assumption that there are only two possible truth values, which we may represent as T (true) and F (false); this is called bivalence. (These are often symbolized as ⊤ and ⊥, or 1 and 0.) Bivalence is an idealization, since we know that in real life there are often various shades, modes, and degrees of truth. We also, at this stage at least, assume that every proposition has one and only one of these two possible truth values; i.e., that every proposition is either definitely T or definitely F. We will see later that this assumption also breaks down under certain interesting conditions. Nevertheless, the logic based on these two somewhat idealized assumptions turns out to be extremely powerful and useful, and is the basis for most or all further and more nuanced elaborations of logic.

Bivalence should not be confused with the theorem derived in the previous chapter called the Law of Excluded Middle, ` P ∨ −P. See [7] for an illuminating discussion of this point.


5.2 Truth Table Definitions of the Propositional Connectives

I reproduce here the basic truth tables from Chapter Two, for convenience:

Atomic Proposition:

P
T
F

Negation:

− P
F T
T F

Conjunction:

(P & Q)
 T T T
 T F F
 F F T
 F F F

Disjunction:

(P ∨ Q)
 T T T
 T T F
 F T T
 F F F

Material Implication:

(P → Q)
 T T T
 T F F
 F T T
 F T F

Biconditional:

(P ↔ Q)
 T T T
 T F F
 F F T
 F T F
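These defining tables translate directly into truth functions. The short Python sketch below is an illustration of my own (the function names are not part of the course notation); it encodes each connective as a function on the values True and False and reprints the table for material implication in the standard input order:

    from itertools import product

    def neg(p):            return not p
    def conj(p, q):        return p and q
    def disj(p, q):        return p or q
    def implies(p, q):     return (not p) or q
    def iff(p, q):         return p == q

    # Reproduce the table for material implication.
    for P, Q in product([True, False], repeat=2):
        print(P, Q, implies(P, Q))
    # Rows printed: T T T, T F F, F T T, F F T,
    # matching the defining table above.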

5.2.1 How Many Possible Truth Tables Are There?

The five familiar truth tables above are not the only possible truth tables for binary connectives. Consider the general truth table for any binary connective ∗:

(P ∗ Q)
 T ? T
 T ? F
 F ? T
 F ? F

There are two ways to fill each space in the centre column and four lines per table, meaning that the total number of ways to fill in the centre column will be just 2 × 2 × 2 × 2 = 2^4 = 16. In other words, there are 16 possible truth functions of two propositional variables. While the ones we define above are the ones most commonly used in elementary logic, others are possible, and some (such as exclusive OR) have practical applications in circuit theory.
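The counting argument can be checked by listing the tables themselves. The sketch below (a hypothetical illustration of mine) generates every way of filling in the centre column of the general table for ∗ and confirms that there are sixteen of them, with exclusive OR appearing as one of the possibilities:

    from itertools import product

    rows = list(product([True, False], repeat=2))   # the four (P, Q) input rows
    # Each binary truth function is one way of assigning an output to the 4 rows.
    tables = list(product([True, False], repeat=len(rows)))
    print(len(tables))      # 16

    # Exclusive OR corresponds to the centre column (F, T, T, F):
    xor_column = tuple(P != Q for P, Q in rows)
    print(xor_column in tables)    # True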

5.2.2 Expressive Completeness

Clearly, some of the truth functions we have defined are redundant, in the sense that they can be written as combinations of other symbols. For instance, we can show that P → Q is equivalent to −P ∨ Q; hence our use of the symbol → is largely for linguistic convenience. Some sets of symbols would not be sufficient to produce in combination all possible truth tables. However, it can be shown that combinations of {−, ∨} and {−, &} are sufficient to generate all possible tables. It is also possible to define a single connective that is expressively complete in this sense, and for this purpose one can use either the Sheffer stroke P | Q (defined as −P ∨ −Q) or the Nicod stroke P ↓ Q (defined as −P & −Q). We will examine some of the properties of these symbols later on.


5.2.3 Evaluating Complex Wffs

Evaluate wffs as follows:

1. Be sure you have written the input values for all the occurrences of each variable in the same order.

2. Evaluate all negations of atomic formulas first; then,

3. starting with the innermost brackets, work outward until you come to the main connective.

Arbitrarily complicated wffs can be evaluated mechanically by truth tables. Truth tables therefore constitute what computer scientists call an effective procedure — meaning simply a recipe that can be followed in a completely unambiguous manner that is guaranteed to produce a unique result. The practical drawback of truth tables is that the number of lines in a table for a formula containing n propositional variables is 2^n. This is a rapidly increasing function; recall the old fable about the wise man who demanded ∑_{i=1}^{64} 2^i grains of rice from his king in return for a favour. (See [3]. A quick estimation will show why the king may have been tempted to cut off the wise man’s head.) Large truth tables may therefore take impractically long times to evaluate.

Many of the truth table problems encountered in elementary propositional logic can be solved by means of short-cuts, and we will consider these below. There is no guarantee that a short cut exists for a given problem. On the other hand, if in an introductory course you find yourself faced with the task of evaluating a formula that would contain 8, 16, 32, or more lines if fully written out, take a careful look at it before beginning blindly to fill in the whole table. There is probably a short-cut!

Example: do the full truth table for the wff −(P → (−Q ∨ R)) & P.

Write out the formula as neatly as possible, with fairly wide, even spacing between the symbols. Then enter all the possible combinations of truth values for the three letters in the formula, P, Q, and R. (You can think of these as “inputs”.) There is nothing sacred about the order in which I have written the inputs below, but it is a standard order that is commonly used in logic, and it would be helpful to stick to this order. Since P appears twice in the formula you will have to enter its input values twice. Whenever a letter appears more than once in a table, it is essential that you write the values in its column of input values in the same order for each instance of the letter; otherwise, the truth table will be meaningless. There will be 8 = 2^3 lines in the table since there are three variables. Then proceed to evaluate the various subformulas in the order indicated above, ending with the main connective, which in this case is the & symbol.

− (P → (− Q ∨ R)) & P
F  T T  F T T T   F T
T  T F  F T F F   T T
F  T T  T F T T   F T
F  T T  T F T F   F T
F  F T  F T T T   F F
F  F T  F T F F   F F
F  F T  T F T T   F F
F  F T  T F T F   F F


We see that the formula is T only in the case in which P and Q are T and R is F.
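The same evaluation can be done mechanically. Here is a rough Python sketch of my own (the helper names implies and wff are not from the text) that evaluates −(P → (−Q ∨ R)) & P on all eight input rows and reports where it comes out T:

    from itertools import product

    def implies(a, b):
        return (not a) or b

    def wff(P, Q, R):
        # -(P -> (-Q v R)) & P
        return (not implies(P, (not Q) or R)) and P

    for P, Q, R in product([True, False], repeat=3):
        if wff(P, Q, R):
            print("T on the row P =", P, ", Q =", Q, ", R =", R)
    # Only the row P = True, Q = True, R = False is printed,
    # agreeing with the full table above.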

5.3 Semantic Classification of Single Wffs

Well-formed formulas of propositional logic can be classified according to their semantic properties:

5.3.1 Tautology

A wff is said to be tautologous, or to be a tautology, if and only if it comes out T under all possible combinations of input values. Tautologies can also be referred to as logical truths or necessary truths.

Example: Law of Excluded Middle
In the previous chapter we proved the Law of Excluded Middle as a theorem. The table shows that it is a tautology as well, as it should be.

P ∨ − P
T T F T
F T T F

Example: (P → (Q → P))
In the previous chapter we gave several proofs of this formula as a theorem. One can easily see that it is a tautology as well. The only way it could be F would be if P were T and (Q → P) were F. But if P is T, then (Q → P) is T and the whole wff is T. Hence, it is not possible for this wff to be false.

5.3.2 Inconsistency

A wff is said to be an inconsistency, or to be inconsistent, if it comes out F for all possible combinations of input values. Such a wff is sometimes loosely called a contradiction, but that term is properly reserved for an inconsistent wff such as P & −P containing only one propositional variable.

Example: Law of Non-Contradiction
The Law of Non-Contradiction states that it is never the case that a proposition is both true and false; i.e., that (P & −P) is always false. This is supported by the truth table:

P & − P
T F F T
F F T F

5.3.3 One-to-One Relation Between Inconsistencies and Tautologies

It is easily seen by de Morgan’s Law that the Law of Non-Contradiction is simply the negation of the Law of Excluded Middle. It can be shown in general that for every tautology there is an inconsistency, generated by taking the negation of the tautology, and vice versa.


5.3.4 Contingency

A wff is said to be contingent or to be a contingency iff it has occurrences of both T and F in its truth table. Its truth or falsity in a given situation therefore depends on the facts, not on logical form. The simplest example of a contingency is just P; other examples are P & Q, P ∨ Q, and P → Q.

5.3.5 Consistency

A wff is said to be consistent if it is T for at least some of its input values; in other words, a wff is consistent, or is a consistency, iff it is either a tautology or a contingency. Examples: P, P → Q, P → P.
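Each of these classifications depends only on the column of values a wff takes over all input rows, so they are easy to test mechanically. The sketch below is an illustrative helper of my own (the name classify is not the text's terminology); it classifies a wff given as a Python function of its variables:

    from itertools import product

    def classify(wff, n_vars):
        """Return 'tautology', 'inconsistency', or 'contingency' for a wff
        given as a function of n_vars Boolean arguments."""
        values = [wff(*row) for row in product([True, False], repeat=n_vars)]
        if all(values):
            return "tautology"
        if not any(values):
            return "inconsistency"
        return "contingency"

    print(classify(lambda P: P or not P, 1))      # tautology  (ExM)
    print(classify(lambda P: P and not P, 1))     # inconsistency  (the wff NC negates)
    print(classify(lambda P, Q: P and Q, 2))      # contingency

A wff counts as consistent, in the sense above, exactly when this function does not return "inconsistency".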

5.4 Relation of Syntax to Semantics

A logical system is said to be sound iff every theorem that it can generate is a tautology. A logical system is said to be complete iff every tautology can be proven as a theorem. One of the most important things done in metalogic (the logic of logic) is to investigate whether various possible logical systems are sound or complete.

We do not attempt to prove the soundness or completeness of propositional logic in this course. However, it can be shown that propositional logic is, in fact, both sound and complete. (See Lemmon [17], Sect. 2.4 and 2.5, for one way of doing this.) This fact has practical significance for us because it tells us that there is a nice one-to-one correspondence between theorems and tautologies: every theorem we can prove using natural deduction will turn out to be a tautology if we do truth table analysis on it, while every tautology can be proven as a theorem.

It may seem too “obvious” to bear mentioning that propositional logic is both complete and sound. In fact, it turns out that many important systems of logic that are more powerful than propositional logic (in the sense that they can express and derive results about logical or mathematical structure that is invisible to propositional logic) are incomplete. In 1930, Kurt Gödel shocked the mathematical world by proving that any system of logic that contains enough machinery to define the natural numbers is, if it is consistent, incomplete. (See [10, 19].) This tells us that there is no finite set of formal rules from which one can derive all the possible truths about the natural numbers; or to put it another way, there could never be a computer program that could by itself generate all possible truths about the numbers.

5.5 Semantic Classification of Pairs of Wffs

There are useful semantic properties that can be identified for pairs of wffs.

5.5.1 Implication

Proposition P is said to imply Q iff P → Q is a tautology.


5.5.2 Equivalence

One proposition P is equivalent to another, Q, iff P ↔ Q is a tautology.

5.5.3 Contrariety

A proposition P is contrary to Q if they are never both T. They could, however, both be false. Two propositions P and Q are contraries iff −(P & Q) is a tautology.

Examples: The propositions P & Q and P & −Q are contraries, since they are both F when P is F, but never both T. It is easily seen that the negation of their conjunction is a tautology.

5.5.4 Subcontrariety

Two propositions P and Q are subcontraries if they are never both false. They could, however, both be true. Formally, P and Q are subcontrary iff P ∨ Q is a tautology. The concept of subcontrariety is not widely used these days, though it was important in medieval logic.

Examples: The propositions P ∨ Q and P ∨ −Q are subcontraries since one of them is T even if the other is F; however, they are both T if P is T. It is also easily seen that their disjunction is a tautology.

5.5.5 Inconsistency

Two propositions P and Q are inconsistent iff they never agree in truth value. This is the same thing as saying that −(P ↔ Q) is a tautology.

5.5.6 Consistency

Two propositions P and Q are consistent iff their conjunction P &Q is consistent.

5.5.7 Independence

Two propositions are said to be independent iff none of the above relations apply to them. The simplest example of independent propositions is a pair of atomic propositions, say P and Q.
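These pairwise relations can likewise be read off a joint truth table. The following rough sketch (the function name and dictionary keys are my own, not the text's) reports which of the above relations hold between two wffs given as Python functions of the same variables:

    from itertools import product

    def pair_relations(f, g, n_vars):
        rows = [(f(*row), g(*row)) for row in product([True, False], repeat=n_vars)]
        return {
            "equivalent":   all(a == b for a, b in rows),
            "contrary":     not any(a and b for a, b in rows),       # never both T
            "subcontrary":  not any(not a and not b for a, b in rows),  # never both F
            "inconsistent": all(a != b for a, b in rows),            # never agree
            "consistent":   any(a and b for a, b in rows),           # conjunction satisfiable
        }

    # P & Q and P & -Q are contraries but not subcontraries:
    print(pair_relations(lambda P, Q: P and Q,
                         lambda P, Q: P and not Q, 2))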

5.6 Relations Between Semantic Properties of Wffs

It is possible to use natural deduction to establish relationships between the semantic properties of pairs of wffs.

Example: Show that if P and Q are contrary then one implies the negation of the other.

The given propositions are contrary if and only if −(P & Q) is a tautology. This suggests that to test the implications of contrariety, we assume that −(P & Q) is true and carry out a simple natural deduction:


1 (1) −(P & Q)     A
1 (2) −P ∨ −Q      1 deM
1 (3) P → −Q       2 Imp
1 (4) −Q ∨ −P      2 Comm
1 (5) Q → −P       4 Imp

Lines 3 and 5 express the required properties of P and Q.

5.7 Semantic Classification of Sets of Wffs

Some of the above properties of pairs of wffs can be usefully extended to sets of any number of wffs.

5.7.1 Equivalence

A set of wffs is said to be equivalent iff all its members are pair-wise equivalent.

5.7.2 Inconsistency

A set of wffs is said to be inconsistent iff its conjunction is always false; or, equivalently, iff the negation of its conjunction is a tautology. E.g.: {P, P → Q, −Q}.

5.7.3 Consistency

A set of wffs is said to be consistent iff it is not inconsistent; this is the same thing as saying that its conjunction is consistent; i.e., either a tautology or a contingency.

5.7.4 Independence

Two or more propositions are independent of each other iff they are pairwise independent.

5.8 Testing Validity of Sequents Using Truth Tables

Suppose you are asked to find a proof of a sequent such as P, Q, R ` S. If you can’t come up with a proof, that doesn’t mean that the sequent is invalid; it may simply mean that you are unable to think of a route from premisses to conclusion. If we can find a proof that makes correct use of the rules, then the sequent is valid; but if we can’t find a proof, that doesn’t mean that it is not valid. We need a general procedure for testing the validity of sequents.

One of the most important uses of truth tables is in testing the validity of argument forms. A sequent is said to be valid if and only if the conclusion is never false when the premisses are true. To check the validity of a sequent, therefore, one thing we can do is set up the truth table for the sequent, with separate columns for premisses and conclusion, and see if there is any line in the truth table in which the premisses are all T and the conclusion is F. If this is not the case, then the sequent is valid.

Here is an example of a valid sequent (MP):


P    P → Q    ‖  Q
T    T T T    ‖  T
T    T F F    ‖  F
F    F T T    ‖  T
F    F T F    ‖  F

(I use a double vertical line to separate the conclusion from the premisses.) In each line in which the conclusion is F, at least one of the premisses is F as well. This is enough to show that the sequent is valid.

By contrast, here is the table for Affirming the Consequent:

Q    P → Q    ‖  P
T    T T T    ‖  T
F    T F F    ‖  T
T    F T T    ‖  F
F    F T F    ‖  F

In the third line, both premisses are T while the conclusion is F. This proves that Affirming the Consequent is invalid. You need only find one line in which this happens in order to show invalidity. Note very carefully that we cannot conclude that the argument is valid just because there are some lines (such as the first) in which everything comes up true. That is just like an invalid computer program that occasionally outputs the correct answer. The program is valid only if it never makes a mistake, not if it is sometimes but not always right.
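This row-by-row check is exactly the kind of bookkeeping a short program does well. The sketch below (my own illustration; the function names are not from the text) tests a sequent by searching for a row with all premisses T and the conclusion F:

    from itertools import product

    def valid(premisses, conclusion, n_vars):
        """A sequent is valid iff no assignment makes every premiss T
        and the conclusion F."""
        for row in product([True, False], repeat=n_vars):
            if all(p(*row) for p in premisses) and not conclusion(*row):
                return False    # found an invalidating line
        return True

    def implies(a, b):
        return (not a) or b

    # MP: P, P -> Q |- Q is valid.
    print(valid([lambda P, Q: P, lambda P, Q: implies(P, Q)],
                lambda P, Q: Q, 2))          # True

    # Affirming the Consequent: P -> Q, Q |- P is invalid.
    print(valid([lambda P, Q: implies(P, Q), lambda P, Q: Q],
                lambda P, Q: P, 2))          # False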

Here is an example that may seem counter-intuitive. Is P, −P ` Q valid or invalid? This is just ExF, which we already know how to prove using RAA. But the truth table is instructive:

P    − P    ‖  Q
T    F T    ‖  T
T    F T    ‖  F
F    T F    ‖  T
F    T F    ‖  F

The sequent is valid since there is no line in which the conclusion is false and all the premisses are T. But the sequent can never be sound, meaning valid with all premisses true.

Suppose now we have a sequent in which the conclusion is a tautology, such as P → Q ` P → P. Here is the table:

P → Q    ‖  P → P
T T T    ‖  T T T
T F F    ‖  T T T
F T T    ‖  F T F
F T F    ‖  F T F

As before, there is no line with the premiss T and the conclusion F, and the sequent is valid. But we need hardly have done the whole table to see this, since the conclusion is a tautology and must always come out T.

We can summarize this as follows: suppose F is a contradictory proposition, T is a tautology, and X is any proposition whatsoever, true or false. Then any sequents of the following form are valid:

F ` X    (5.1)
X ` T    (5.2)

5.8.1 Deduction Theorem from the Semantic Viewpoint

From the semantic viewpoint, the Deduction Metatheorem can be expressed as follows:

The sequent P1, P2, . . . , Pn ` Q is valid iff the conditional (P1 & P2 & . . . & Pn) → Q is a tautology.

Our method for checking for validity is designed to take advantage of this fact.

5.8.2 Short-cuts

The truth-table check of validity is what logicians call an effective procedure, meaning that it is always guaranteed to produce a definite answer. However, as noted, the size of a truth table increases very rapidly as a function of the number of propositional variables involved; the table for a problem with 6 letters, for instance, will have 2^6 = 64 lines. It would be very good, therefore, to have short-cuts that would make it unnecessary for us to write out every line of the table in order to check the validity of a sequent.

The key idea of the short-cut method is to work backwards from the conclusion to the premisses. Recall that to show that a sequent is valid we have to show that there are no lines in the truth table where all the premisses are T and the conclusion F; conversely, to show that a sequent is invalid it is sufficient to find just one line in its table with all premisses T and the conclusion F. What we do, therefore, is set up the truth table for the sequent in the manner suggested above, with the conclusion set off to the right. Then set the conclusion to be F, and see whether the premisses can then all be made T. Some sequents can be checked in one line; some require more than one.

There can be no guarantee that a short-cut can be found for every problem; some may be irreducibly complex and will require a full or nearly full truth table for their analysis. But it is always a good idea to see if a short-cut can be found before starting to write out a huge truth table.

Here are a few examples to illustrate the method.

Example: show that MP is valid by the short-cut method.

P    P → Q    ‖  Q
T    T F F    ‖  F
F    F T F    ‖  F

The only lines that need concern us are the two lines in which the conclusion is F. However, we immediately see that in both of these lines one of the premisses is F; the sequent is therefore valid. Writing what I have shown here constitutes a complete solution to the problem of testing the validity of MP; it is not necessary to write out the whole table.


Example: show that Affirming the Consequent is invalid by the short-cut method.

Q    P → Q    ‖  P
T    F T T    ‖  F

Set the purported conclusion P to be F; this forces the premiss P → Q to be T regardless of the value of Q; and this means that we can choose the other premiss Q to be T. Therefore, there is a line with all premisses T and the conclusion F; therefore, the sequent is invalid.

Example: let’s check Constructive Dilemma, which is more complicated. This sequent has 4 propositional variables, and so its full table would have 2^4 = 16 lines.

P ∨ Q    P → R    Q → S    ‖  R ∨ S
F F F    F T F    F T F    ‖  F F F

This can be checked with just one line. Set the conclusion to be F. That forces both R and S to be F. The values of P and Q remain to be determined. But any possible invalidating line has to have P → R and Q → S come out T, which means that P and Q would have to be F. However, this makes the first premiss P ∨ Q F as well. Hence, there is no invalidating line, and the argument is valid.

There will sometimes be sequents in which you have to check more than one line.

Example: check (−P ∨ −Q) → R, −R ` P & Q.

(− P ∨ − Q) → R    − R    ‖  P & Q
F  T T  T F   F F    T F    ‖  T F F
T  F T  F T   F F    T F    ‖  F F T
T  F T  T F   F F    T F    ‖  F F F

It is only necessary to consider lines on which −R is T, which makes R false. However, there are three ways in which P & Q can be false. Plugging R = F into the first premiss, we see that all possible combinations of the truth values of P and Q which make the conclusion F also make the first premiss F. Therefore, the argument is valid.

5.9 Using Short-Cuts in Testing for Other Semantic Properties

Short-cut truth tables can be used sometimes to test for semantic properties other than validity as well.

It is impossible to run through all of the possible situations that might arise. Here is an example which will suggest the sort of approaches that one could take.

Example: Determine whether (P & −P) → (P → (Q ∨ R)) is contingent, contradictory, or tautologous.

The antecedent (P & −P) is always F, and therefore by the truth table for → the formula is always T — i.e., it is a tautology. The table need only show enough information to express this fact:


(P & − P) → (P → (Q ∨ R))
   F      T

The general approach to this sort of problem is to see what it would take to make the wff T and F. If it can be both T and F, then it is contingent; if it can only be F, it is contradictory; and if it can only be T, it is a tautology.

5.10 Evaluating Long Conjunctions or Disjunctions

By a “long” conjunction or disjunction I simply mean one containing more than two formulas. We accept conjunctions and disjunctions of more than two formulas as wffs. However, truth table analysis as we have set it up here (and as it is usually done in most elementary texts) deals only with & and ∨ as binary connectives. In order to evaluate a formula containing a long conjunction or disjunction, therefore, we have to bracket off the terms two at a time, and arbitrarily pick one of the connectives as the main connective. In fact, which one we pick makes no difference to the outcome, although we cannot prove that fact in all generality in this course.

Example: Evaluate P ∨ Q ∨ R when P and Q are T and R is F, and show that we get the same result regardless of the order in which we associate the terms.

P ∨ (Q ∨ R)      (P ∨ Q) ∨ R
T T  T T F       T T T  T F

(I’ve only shown two of the possible ways to associate this formula; the others can be found by applying commutation.) In the left-hand formula, the first ∨ is the main connective; in the right-hand formula, the last ∨ is the main connective.

In this course you will rarely if ever have to evaluate a conjunction or disjunction with more than two terms, so this is not something that you will frequently need to worry about.

5.11 Sheffer and Nicod Strokes

At the beginning of this chapter I mentioned that there are other possible truth tables for two propositional variables than the familiar ones we use in elementary logic. The latter are chosen (i) because they are expressively complete, and (ii) because they come reasonably close to capturing the sense of the logical connectives that we use in ordinary language. It is, however, possible to do the logic of propositions with only one symbol, and for this purpose one can use either the Sheffer or the Nicod strokes. As the practice examples at the end of the chapter will convince you, these connectives are almost entirely useless for practical calculations in logic. But their existence does tell us something about how logic works, and they have important applications in circuit theory.

The Sheffer stroke is defined as “not both P and Q”, and has the following truth table:


P | Q
T F T
T T F
F T T
F T F

In circuit theory, this is called a NAND gate (“not-and”).

The Nicod stroke is defined as “neither P nor Q,” and has the following table:

P ↓ Q
T F T
T F F
F F T
F T F

In circuit theory, this is called a NOR gate (“not-or”).
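The claim of Section 5.2.2 that a single stroke is expressively complete can be spot-checked by brute force. In the sketch below (my own, with hypothetical helper names), the usual connectives are rebuilt from NAND alone and compared with their direct definitions on every input row; the Nicod stroke could be treated in exactly the same way:

    from itertools import product

    def nand(p, q):    # Sheffer stroke: not both p and q
        return not (p and q)

    # The usual connectives, built from NAND alone.
    def neg(p):        return nand(p, p)
    def conj(p, q):    return nand(nand(p, q), nand(p, q))
    def disj(p, q):    return nand(nand(p, p), nand(q, q))
    def implies(p, q): return nand(p, nand(q, q))

    for p, q in product([True, False], repeat=2):
        assert neg(p) == (not p)
        assert conj(p, q) == (p and q)
        assert disj(p, q) == (p or q)
        assert implies(p, q) == ((not p) or q)
    print("NAND alone expresses -, &, v, and ->")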


5.12 Review Exercises on Chapter 5

A

1. Write the complete truth table for each of the following wffs:

(a) −(P ∨ Q) & − (P &Q)

(b) −(−(P → Q) & − (Q→ P ))

(c) −(P & (Q&R))

(d) P → (Q→ R)

2. Evaluate the following wffs, when P = T and Q = F .

(a) −(P → Q)

(b) −(P ↔ Q)

(c) (−P ↔ Q)

(d) (−P & −Q) ∨ (P &Q)

3. Use truth tables to characterize the following wffs as tautologous, inconsistent, or contingent.

(a) P → P

(b) −P → P

(c) P → (−P → P )

(d) (P ∨ −P )→ (P & − P )

4. Check the following sequents for validity. Use short-cuts where they are suitable.

(a) P ∨ Q,−Q ` P (DS)

(b) P → Q,−Q ` −P (MT)

(c) P → Q ` P → (P &Q) (Abs)

(d) P → Q,−P ` −Q (Denying the Antecedent)

B

1. Use truth tables to classify the following wffs as either tautologous, inconsistent, or contingent:

(a) P ∨ (Q&R)

(b) P → (R&S)

(c) P → (Q→ P )

(d) (P → R)→ (P → (P &R))

(e) (P → (Q→ P ))→ R

(f) (P ↔ Q)↔ (Q↔ −P )

(g) P → (R& −R)


(h) (P → R) → −P

(i) (P → (Q & R)) & (P → (Q & −R))

(j) −(P → Q) & − (Q→ P )

2. Use truth tables to check the following sequents for validity:

(a) P → Q,Q→ R ` P → R (HS)

(b) P ↔ Q,Q↔ R ` P ↔ R

(c) P → Q,R→ Q ` P → R

(d) P → Q ` −Q→ P

(e) P → Q,R→ S,−Q ∨ −S ` −P ∨ −R (DD)

(f) P ∨ Q,P → R,Q→ R ` R (Conv)

(g) P → Q ` P → (P ∨ Q)

(h) P → Q ` P → (P &Q)

(i) P ∨ R, P → (R ∨ S), −S ` −P

(j) P → (M & N), M → R, N → S, −(R ∨ S) ` −P ∨ M

3. The following questions deal with the analysis of semantic properties.

(a) Show that if P and Q are contrary then −P and −Q are subcontrary.

(b) Show that P and Q are equivalent if and only if P and −Q are contradictory.

(c) Show that −P and −Q are equivalent if and only if P and Q are equivalent.

(d) Show that two wffs are inconsistent iff they are both contraries and subcontraries.

(e) If P, Q ` R is a valid sequent, is it possible for R to be false? Explain your answer.

(f) If P, Q ` R is invalid, is it impossible for P, Q, and R to all be true? Explain your answer.

C

1. (a) Show how to write −P , P ∨ Q, P &Q, and P → Q in terms of the Sheffer stroke.

(b) Do the same for the Nicod stroke.

(c) Check the following sequent for validity by constructing its full truth table: (P |P ) | (Q|Q), P | (R|R) ` R.

(d) Translate the sequent in part (c) into ordinary propositional notation.


Chapter 6

Propositional Semantics: Truth Trees

Truth trees are one of the most efficient ways of checking the semantic properties of propositional formulas. In particular, they give a very easy way of checking the validity of sequents.

The basic idea of truth trees is that they give a graphic way of displaying whether or not a set of formulas is inconsistent.

6.1 Branches and Stacks

Truth trees are graphic objects built up out of combinations of two basic structures, stacks and branches. Stacks show the decomposition of conjunctions into their separate conjuncts, and branches show the decomposition of disjunctions into their separate disjuncts.

P & Q  ✓
P
Q

Basic Stack

P ∨ Q  ✓
 /   \
P     Q

Basic Branch

We place a checkmark beside every formula that has been decomposed.

Because any number of propositions can be conjoined, a stack can contain any number of propositions. For instance, we could start with the formula (P & Q) & −R & (P ∨ Q), and break it down as shown in Figure 6.1:


(P & Q) & −R & (P ∨ Q) ✓
(P & Q) ✓
−R
P
Q
P ∨ Q ✓
 /   \
P     Q

Figure 6.1: A Typical Stack

Similarly, because any number of propositions can be disjoined, we can have branches with any number of stems.

P ∨ Q ∨ R
 /    |    \
P     Q     R

Figure 6.2: A Triple Branch

However, we will rarely need branches with more than two stems in this book.

Stacks and branches can be linked together into trees of arbitrary size. For instance, the wff (P & Q) ∨ R can be decomposed as follows:

(P & Q) ∨ R ✓
    /       \
P & Q ✓      R
P
Q

Figure 6.3: A Typical Tree

Decomposition has gone as far as it can when all molecular formulas are broken down into atomic formulas or their negations (which are collectively called literals).

6.2 Paths

A route going from the initial set of formulas at the top down to one of the atomic formulas at the bottom, while choosing only one side of each branch as one goes, will be called here a path. (In some logic books these are called branches, but this invites confusion with the sense of branch as a forked structure splitting off from a disjunction, as noted above.) As each formula is broken down into its corresponding branch or stack, it is checked off as shown so that we don't forget that we have broken it down.


Here is another simple example:

P & Q ✓
P ∨ −Q ✓
P
Q
 /    \
P      −Q
        ×

To arrive at this we first broke down the conjunction (P & Q) into a stack, and then broke down the disjunction (P ∨ −Q) into a branch. It would have been perfectly valid to do the branch first, and then the stack. (Try it.) As a rule, though, it is usually more efficient to do as many stacks as one can first, and then do the branches, since this will minimize the number of branches to be drawn.

Truth trees are a kind of partially ordered set or poset. These terms will be explained later in this book.

My use of the term path here is analogous to the way it is used in DOS and UNIX, which possess hierarchical file structures. In the last tree above, the two paths could be notated in a form similar to the way we note paths in UNIX. That is, the two paths in the above tree could be written more abstractly as ((P & Q) & (P ∨ −Q)) \ (P & Q) \ P and ((P & Q) & (P ∨ −Q)) \ (P & Q) \ Q. However, it will not be especially useful to do it this way, since the main purpose of truth trees is to make the implications of a stack of propositions graphically explicit.

Notice that the path ending in −Q contains contradictory formulas, namely −Q itself and Q. Any path that contains a contradiction is marked off with an × at the end, and declared to be closed. Any path that is not closed is open.

If all paths on a tree are closed, then the tree itself is said to be closed. If just one path remains open after decomposition has gone as far as it can go, the tree itself is open.

Although we will usually have to break formulas all the way down to the literal level, we can stop decomposing formulas along a path as soon as a contradiction is found. For example, in the above tree, replace Q with (R → S). (Try it.) You will obviously get a closed path without having to break down (R → S) into its literal components, and you can stop working on that path at that point. In general, once a path is closed, we do not need to go any farther along it.

What we are doing, in effect, is treating all the formulas along a given path as if they were true, and seeing whether or not this will lead to a contradiction. Testing the branches that sprout from a disjunction is similar to vE in that we go exploring down each branch of the disjunction, one at a time, to see where it will lead. We can start from any number of initial formulas and break them down as far as we can. If all paths of the resulting tree are closed, the set of formulas is inconsistent.
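
What the tree test amounts to can also be checked by brute force. The following Python sketch is my own illustration (not the tree method itself): it represents each formula as a function from an assignment of truth values to a truth value, and simply searches all assignments.

    from itertools import product

    def consistent(formulas, atoms):
        # The set is consistent iff some assignment makes every formula true.
        # A tree that closes corresponds to the search finding no such assignment.
        for values in product([True, False], repeat=len(atoms)):
            v = dict(zip(atoms, values))
            if all(f(v) for f in formulas):
                return True
        return False

    # The stack from the example above: P & Q together with P v -Q.
    stack = [lambda v: v["P"] and v["Q"],
             lambda v: v["P"] or not v["Q"]]
    print(consistent(stack, ["P", "Q"]))   # True: the path ending in P stays open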

6.3 Stacking and Branching Rules

We need to know how to decompose other formulas besides simple conjunctions and disjunctions. The rules for doing this can be divided into stacking rules and branching rules.


P & Q ✓        −−P ✓        −(P ∨ Q) ✓        −(P → Q) ✓
P              P             −P                 P
Q                            −Q                 −Q

Stacking Rules

P ∨ Q ✓          P → Q ✓          −(P & Q) ✓
 /   \            /   \             /    \
P     Q         −P     Q          −P      −Q

P ↔ Q ✓          −(P ↔ Q) ✓
 /    \            /     \
P      −P         P       −P
Q      −Q        −Q        Q

Branching Rules

These rules follow from equivalence formulas (such as ` (P → Q) ↔ (−P ∨ Q)) which can be easily established by natural deduction, and checked by truth tables.

6.3.1 Substitution of Equivalent Expressions

To further simplify the use of trees, we can also make it a rule that we can replace formulas with their equivalents, either piecewise or as a whole, in stacks. Thus, a valid stack could look like this:

(P → Q) ✓
−P ∨ Q

Be sure to check off the formula you have replaced. It would also be a courtesy to our beleaguered marker to jot down, off to the side, the rule you are using (in this case Implication) that justified the replacement.

The stacking and branching rules cover all possible cases you could encounter in propositional logic, so strictly speaking replacement of equivalents like this is not necessary. However, in the spirit of the approach in this text we will allow any method that makes the work easier, which might be the case if, say, you have forgotten a stacking or branching rule!

6.4 Checking Semantic Properties of Formulas with Trees

Although the main purpose of truth trees is to reveal inconsistency, all of the major semantic properties of sets of propositional formulas can be checked using truth trees.

6.4.1 Tautology, Contingency, and Consistency

To check whether or not a wff is a tautology, we run the tree for its negation. The formula P is a tautology if and only if the tree for −P closes. For example, to show that (P → (Q → P)) is a tautology, we can run the following tree:

−(P → (Q → P)) ✓
P
−(Q → P) ✓
Q
−P
×


There is only one path, and since it is closed we see that the negation of the formula is inconsistent, meaning that the formula itself is tautologous.

If we ran a tree on the formula (P → (Q → P)) itself, we would get this:

(P → (Q → P)) ✓
   /        \
 −P       (Q → P) ✓
            /    \
          −Q      P

All paths on this tree are open. However, this does not show that the formula is a tautology, only that it is consistent; and we already know that if all we know about a formula is that it is consistent then it could be either a tautology or a contingency.

A set of formulas {P1, . . . , Pn} is inconsistent if and only if (P1 & . . . & Pn) is always false, which is the same thing as saying that −(P1 & . . . & Pn) is a tautology. To check whether a set of formulas is inconsistent, stack them up and run a tree. The set is inconsistent iff the tree closes.

To check whether a formula P is contingent, we have to run trees for both P and −P. The formula is contingent if and only if neither tree closes. For instance, the following trees show that (P → Q) is contingent:

P → Q ✓
 /    \
−P     Q

−(P → Q) ✓
P
−Q

In summary: a set of formulas is inconsistent iff its stack has a closed tree, and is consistent iff its stack has an open tree. A formula is contingent iff the trees for both the formula and its negation are open.
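
Along the same lines as the sketch at the end of section 6.2 (again only an illustration by brute force, not the tree procedure itself), the tests just summarized can be written out directly:

    from itertools import product

    def satisfiable(f, atoms):
        # True iff some assignment of truth values to the atoms makes f true
        return any(f(dict(zip(atoms, vals)))
                   for vals in product([True, False], repeat=len(atoms)))

    def tautology(f, atoms):
        # f is a tautology iff -f is unsatisfiable (the tree for -f closes)
        return not satisfiable(lambda v: not f(v), atoms)

    def contingent(f, atoms):
        # contingent iff f and -f are both satisfiable (both trees stay open)
        return satisfiable(f, atoms) and satisfiable(lambda v: not f(v), atoms)

    p_implies_q = lambda v: (not v["P"]) or v["Q"]                       # P -> Q
    p_q_p       = lambda v: (not v["P"]) or ((not v["Q"]) or v["P"])     # P -> (Q -> P)
    print(tautology(p_q_p, ["P", "Q"]))         # True
    print(contingent(p_implies_q, ["P", "Q"]))  # True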

6.4.2 Equivalence

To check whether or not two formulas P and Q are equivalent, we use the fact that P and Q are equivalent iff (P ↔ Q) is a tautology, and this is the same thing as saying that the tree for −(P ↔ Q) must close. For instance, to demonstrate that −(P & Q) ↔ (−P ∨ −Q) (one of de Morgan's Laws), we run the following tree:

−(−(P & Q) ↔ (−P ∨ −Q)) ✓
          /                  \
 −(P & Q) ✓                P & Q ✓
 −(−P ∨ −Q) ✓             −P ∨ −Q ✓
 P                          P
 Q                          Q
  /    \                     /    \
−P      −Q                 −P      −Q
 ×       ×                  ×       ×

Since all paths close, the formula is a tautology. This tree looks relatively complex, but it follows straightforwardly by the application of the branching and stacking rules given above.

Of course, we could have also demonstrated the same result by doing two trees, one for −(P & Q) ` (−P ∨ −Q), and one for (−P ∨ −Q) ` −(P & Q). (Try it.)

Checking a set of more than two formulas for equivalence can be a little more complicated, and is in fact not something that one often has to do. But suppose we have to check whether all formulas in the set {P1, P2, P3} are equivalent. We could check each possible pair for equivalence. However, we should keep in mind that equivalence is transitive, that is, P1 ↔ P2, P2 ↔ P3 ` P1 ↔ P3. (This sequent can be proven by an easy application of HS; make sure you try it!) Therefore, to check three formulas for equivalence, we only need to check two formulas for inconsistency: −(P1 ↔ P2) and −(P2 ↔ P3).

In many cases the formulas will be simple enough in structure that it will be obvious whether some pair of them is not equivalent. If that is the case, you are done, since to show that a whole set of formulas is not all equivalent it is sufficient to show that some two members of the set are inequivalent.

6.5 Checking Validity of Sequents

The most important application of truth trees is in checking the validity of sequents.

A sequent P1, . . . , Pn ` Q is valid if and only if (P1 & . . . & Pn & −Q) is inconsistent. This simply expresses the notion of validity, which is that if the argument (or the sequent that formally represents its propositional structure) is valid, then the premisses cannot all be true together with the negation of the conclusion.

We can check the validity of any propositional sequent P1, . . . , Pn ` Q simply by making a stack consisting of the premisses and the negation of the conclusion, and breaking down all of the formulas to see if we arrive at closed paths. The sequent is valid if and only if the whole tree is closed. In other words, if even one path is open, the sequent is invalid, since that would express the fact that it is possible to get from the premisses to the negation of the purported conclusion without contradiction.
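
The same test the tree performs, stacking the premisses with the negated conclusion and looking for a way to make them all true, can be sketched by brute force in Python; this is an illustration only, with formulas again represented as functions on assignments:

    from itertools import product

    def valid(premisses, conclusion, atoms):
        # A sequent is valid iff no assignment makes every premiss true and the
        # conclusion false; that is just the closed-tree condition on the stack.
        for vals in product([True, False], repeat=len(atoms)):
            v = dict(zip(atoms, vals))
            if all(p(v) for p in premisses) and not conclusion(v):
                return False
        return True

    # MP: P -> Q, P |- Q  is valid.
    print(valid([lambda v: (not v["P"]) or v["Q"], lambda v: v["P"]],
                lambda v: v["Q"], ["P", "Q"]))      # True
    # Affirming the Consequent: P -> Q, Q |- P  is invalid.
    print(valid([lambda v: (not v["P"]) or v["Q"], lambda v: v["Q"]],
                lambda v: v["P"], ["P", "Q"]))      # False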

To check the validity of an interderivability result such as (P → Q) a` (−P ∨ Q) we have to run two trees, one in each direction.

Here are some examples of trees for familiar valid sequents:

P → Q ✓
P
−Q
 /   \
−P    Q
 ×    ×
(MP)

P → Q ✓
Q → R ✓
−(P → R) ✓
P
−R
     /      \
   −Q        R
   /  \      ×
 −P    Q
  ×    ×
(HS)

These sequents are shown to be valid by the fact that all paths close.


Don't forget that the order in which we choose to break down the formulas in our initial stack does not make any difference to the validity of the result. However, one often arrives at a simpler tree structure by doing as much stacking as possible first. Also, don't forget that, as shown in the tree for HS, once a path is closed, no further breakdown is needed.

Here, for comparison, are the trees for some invalid sequents:

P → Q ✓
Q
−P
 /    \
−P     Q

P ∨ Q ✓
Q ∨ R ✓
−R
      /          \
     P            Q
   /   \        /   \
  Q     R      Q     R
        ×            ×

In the first example (which is just Affirming the Consequent), neither path closes. In the second example, two paths close, but two remain open, showing that the sequent is invalid. Also, in the second example, you will see that if you decompose Q ∨ R first you will get a slightly simpler tree structure.

To check whether or not a formula is a theorem is (by the soundness and completeness of propositional logic) the same thing as checking whether it is a tautology. Thus, for instance, the tree above that checks whether (P → (Q → P)) is a tautology shows that ` (P → (Q → P)).

6.6 Trees or Tables?

Truth trees are usually quicker and simpler to use than truth tables, especially when evaluating very complex arguments with several premisses. But there are cases in which truth tables are simpler to use. For example, let us check the sequent P → Q, Q → R ` R → P. It should be apparent that this sequent is invalid. To demonstrate this fact by means of a truth tree, we would have to construct the following tree:

P → Q ✓
Q → R ✓
−(R → P) ✓
R
−P
      /          \
    −Q            R
   /    \       /   \
 −P      Q    −P     Q
          ×

Since the tree is open, the sequent is invalid. Strictly speaking, one could stop drawing the tree as soon as even one open path was revealed, since that would be sufficient to demonstrate invalidity, but it is still a moderately complex tree.

We could also demonstrate that the above sequent is invalid by using a short-cut truth table, as follows:

P → Q     Q → R     R → P
F T T     T T T     T F F

While the full table for this sequent has eight lines, only one partial line is needed to demonstrate invalidity: assume the conclusion is false, and work backwards. We can see that it is possible to make both the premisses T when the conclusion is F, regardless of the value of Q. This is clearly less work than the tree. The lesson is, therefore, that both trees and tables should be in the logician's toolkit, and with experience the logician will get to know which is likely to be easier to try. If there are no more than (say) three or four premisses, and if the tree will involve a lot of branching, it may be easier to use a table.

6.7 Exercises

B

1. Use truth trees to check the following sequents for validity:

(a) P → Q,−P ` −Q (Denying the Antecedent)

(b) P ∨ Q,−P ` Q (Disjunctive Syllogism)

(c) P ∨ Q,P → R,Q→ S ` R ∨ S (Constructive Dilemma)

(d) P → R,Q→ S,−R ∨ −S ` −P ∨ −Q (Destructive Dilemma)

(e) P → Q ` P → (P &Q) (Absorption)

(f) ` (P → Q) ∨ (Q→ R)

(g) P ` (P → Q)

(h) ` −(P ∨ Q)↔ (−P & −Q) (de Morgan)

(i) ` P ∨ −P (Excluded Middle)

(j) P → (Q&R),−Q ` −P

6.8 Questions for Further Research

1. Set up a general rule for checking the equivalence of a set of any number of wffs.


Chapter 7

Predicate Logic: Symbolization and Translation

7.1 Why We Need Predicate Logic

Logic can be compared to a computer game in which there is a hierarchy of levels, such that once you have conquered a level you move onto the next higher level. In this chapter we move to a higher level. It builds upon what we have learned before but gives us tools with which to study the validity of a very large class of widely-used arguments that so far we do not know how to analyze.

Consider the following elementary argument:

All philosophers are absent-minded.
Kent is a philosopher.
∴ Kent is absent-minded.

Intuitively, this argument seems valid. (Whether or not it is sound is another question. I have, in my roughly twenty-year career, known two or three philosophers who were not absent-minded, so I think that in fairness to the profession I would have to say that the first premiss is false.) However, we have no way of representing the validity of this argument using the tools of propositional logic we have learned so far, although it does seem as if it might have something to do with MP.

Now consider a slight generalization of this argument:

All philosophers are absent-minded.
There are some philosophers.
∴ There are some who are absent-minded.

This again seems obviously valid, but, again, we have no tools with which to express its validity in any precise way. Arguments like these seem to have a structure suggesting some sort of generalization of the propositional sequents with which we are already familiar. (For instance, the above argument seems to have something in common with MP.) However, all that propositional logic as such could do for us would be to translate the sentences in these arguments into logically independent symbols, and so both arguments would have the generally invalid form P, Q ` R.


To represent the logical structure of these arguments we therefore need the following tools:

1. A way to symbolize properties of individual persons or things (so that we can say something like "Kent is a philosopher" or "Mars is a planet");

2. Ways to express the logic of quantifiers such as “all” and “some.”

7.2 Predicate Logic Notation

In the following I introduce the main types of logical structures that occur in predicate translation.

7.2.1 Universe of Discourse

Predicate notation is always assumed to refer to a domain or universe of discourse U. This could be the set of all entities whatsoever, the set of all people, the set of all students in Logic 2003A, the set of all natural numbers, or any other well-defined collection of objects, persons, or entities. Sometimes the universe of discourse will be understood implicitly; sometimes it will need to be specified explicitly; and sometimes it does not need to be mentioned at all. Usually we will assume that U is not empty.

7.2.2 Terms and Variables

We first need to introduce symbols to denote elements of the universe of discourse.

Proper names are letters which serve the same function as proper names such as "Susan" or "the planet Mars" in a natural language; that is, they denote unique individual entities within U. By convention, proper names are usually written using lower-case letters chosen from the beginning or middle of the alphabet. For instance, we could set k := "Kent" and m := "the planet Mars". If there are a large number of proper names, it may be useful to subscript them, as n1, n2, . . ..

Arbitrary constants, also called arbitrary names, are letters that behave much like proper names, or if you like as if they were proper names; but they are supposed to refer to individuals picked from U at random, and about which we know nothing to distinguish them from any other member of U. By contrast, if we use a letter as a proper name, it is assumed that we know something that distinguishes the individual to which the letter refers from all other members of U. Proper names denote a unique individual in the domain of discourse, while arbitrary constants could denote any one of several individuals in the domain who might share a certain property. Letters for arbitrary constants are, like proper names, usually taken from the beginning to the middle of the alphabet, and, again like proper names, can be subscripted like a1, a2, . . . if it is convenient to do so. Arbitrary constants are very important in predicate natural deduction, and, as we shall see in the next chapter, their use requires some care.

Proper names and arbitrary constants are collectively called terms.

Variables are letters which serve as place-holders where proper names or arbitrary constants can be inserted in a formula. They serve a function very similar to that served by variables in algebra. By convention, variables are usually written using lower-case letters from the end of the alphabet, such as x, y, z, t, . . .. As with constants and names, they can be subscripted if it is convenient to do so. For instance, we could have a set of variables of the form x1, x2, . . .. A variable is not a term, but rather a place-holder where a term can be inserted.

7.2.3 Monadic Predicates

Suppose we want to translate "Mars is a planet" into predicate notation. We now know that we can treat "Mars" as a proper name, and choose some letter for it such as, say, m. We also need to know how to say that a term has a property, such as the property of being a planet. To say things like this, we introduce predicate letters. These will usually be written as upper-case letters like A, B, . . . , F, G, . . . , P and so forth; and they can, if necessary, be subscripted as well. We define predicate letters by expressions such as

Px := “x is a planet”.

The terminology for properties that apply to single terms is diverse: they can be called predicates, monadic predicates, qualities, properties, attributes, adjectives, unary relations, or 1-place relations.

7.2.4 Relations

Predicates that apply to two or more terms, also called relations, polyadic relations, n-ary relations, or n-place relations, can easily be accommodated in the notation of predicate calculus.

Here are some examples:

Txy := "x is taller than y"
Bxyz := "y is between x and z"

Proper names or arbitrary constants can be substituted for variables in relation-expressions. For instance, if we take (as before) k := "Kent", and p := "Paul", then we can say Tpk, meaning "Paul is taller than Kent".

Note that the order in which the variables are expressed in a relation is usually crucial to the meaning of the proposition. For example, while Tpk happens to be true, the proposition Tkp is false.

In principle the notation can allow for relations with any number of arguments, although in this course we will rarely consider relations with more than two arguments. Predicate logic thus allows us to express much more complex relational concepts than can be conveniently expressed in a natural language. This way of expressing relational concepts seems to be very obvious, but its introduction was a major step in the history of logic.

There is a detailed theory of relations, which studies the properties of special classes of relations, such as symmetric relations, equivalence relations, and so forth. We will not do relation theory in this course (except in some C exercises); Lemmon's Beginning Logic [17] contains an excellent introduction to relation theory.


Alternate Notation

Some books write two-place relations Rxy as xRy, which conforms more closely to standard usage in English. (Unless we are Yoda of Star Wars, we would be more likely to say "x is taller than y" than "Taller is x than y.") However, we will stick with the notation in which the variables are to the right of the relation letter (call this Yoda-notation, if you like), since it is more easily generalized.

7.2.5 Propositions and Propositional Functions

Expressions such as Px or Txy are called open sentences or (less correctly) propositional functions. In themselves they have no truth value since they are, in effect, incomplete sentences. We can't evaluate "x is a philosopher" until we know who or what x is supposed to be. When we substitute proper names or arbitrary constants for the variables in open sentences then we get propositions which can be presumed to be either T or F, and which can be combined using the familiar connectives that we use in propositional logic. For example, if Gx := "x is a game-theorist", Px := "x is a philosopher", k := "Kent", and p := "Paul", then we can write sentences such as Pk & Gp ("Kent is a philosopher and Paul is a game-theorist"), Gk → −Pp ("Kent is a game-theorist only if Paul is not a philosopher"), and so forth.

Note carefully that an expression like Txa is an open sentence, not a proposition, since it contains one unassigned variable. Expressions like Txa or Px are often called propositional functions, especially in older literature in analytical philosophy. Strictly speaking, this is a misnomer, since a function is not an expression but a mapping from a domain to a range such that every set of input values from the domain maps into one and only one element of the range. Propositional functions therefore map elements of U into the set of all possible propositions; since propositions, in turn, map into truth values, propositional functions in effect map elements of U into truth values. Despite this complication, it is often convenient to refer to expressions like Px or Txy as propositional functions, just as it is convenient in ordinary mathematics to refer to an expression such as x² as a function of x, even though properly speaking it is just part of the function that takes real numbers into their squares.

7.2.6 Quantifiers

Quantifiers are powerful symbolic tools that allow us to refer to collections where we do not or cannot know all the members in advance. These could be large finite collections where it would be impractical to name all members even if we knew them, or (especially in mathematics) infinite collections where it would be impossible to name all the members even if we knew what they were.

The introduction of quantifiers (principally by Gottlob Frege) toward the end of the 19th century was one of the major innovations in the history of logic.

The Universal Quantifier

We always quantify over a domain or universe of discourse U. Suppose that U is the set of students in Logic 2003A, in the Spring Term of 2004. Define Sx := "x studies logic". If we had the class list, then we could express the proposition "Everyone (in U) studies logic" as

S(Lindsey) &S(Ben) & . . . &S(Craig) (7.1)


for all of the approximately 55 students in the class. A universal statement, which expresses the concept that everything in the domain of discourse has a certain property, is therefore equivalent to a conjunction over all members of the domain of discourse.

For a large class it is quite inconvenient to write a universal claim this way, and impossible if we do not know everyone's name, if the membership in the class is unknown or undetermined, or if it is infinite. We can get around these difficulties by introducing the universal quantifier ∀x . . ., and write

∀xSx. (7.2)

This can be read in several ways, the most common of which are the following:

Everyone studies logic.
For all x, Sx.
For any x, Sx.

It is important to remember that the meaning of a quantificational statement is always relative to a universe of discourse, although U does not always need to be mentioned. If we know in advance that U is a certain logic class, then we can write "Everyone studies logic" as in 7.2 above. If U were a larger set, such as the set of all students at a certain university, or the set of all people, then to express the same fact — that everyone in a certain class studies logic — we would have to write

∀x(Lx→ Sx), (7.3)

where Lx := "x is in Logic 2003A". The latter would be read "Everyone in Logic 2003A studies logic," or "If anyone is in Logic 2003A, then that person studies logic."

The Existential Quantifier

Suppose now we wanted to express the claim that someone in Logic 2003A is wearing blue jeans. Let Bx := "x wears blue jeans". If we know in advance that U is the membership of that particular class, then we could write

B(Lindsey) ∨ B(Ben) ∨ . . . ∨ B(Craig) (7.4)

As before, if the class is large it would be quite inconvenient to write this all out, and impossible if we do not know everyone's name, or if the membership in the class is unknown, undetermined, or infinite. We get around these difficulties by introducing the existential quantifier ∃x . . ., and write simply

∃xBx. (7.5)

This can be read in various ways:

There exists something such that it wears blue jeans.
Someone wears blue jeans.
At least one person wears blue jeans.


Note carefully: In logic, the word "some" means nothing more than "at least one" or (in the case of continuous quantities such as liquids) "an amount greater than zero." The statement "Some mammals are warm-blooded" is therefore consistent with the possibilities either that there is only one warm-blooded mammal or that all mammals are warm-blooded (and in fact the latter statement happens to be true, although as logicians we do not concern ourselves with the mere facts of biology).

If U were a larger set than the membership of Logic 2003A, then we would have to write

∃x(Lx&Bx), (7.6)

which can be read as “Someone in Logic 2003A wears blue jeans.”
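
For a finite universe of discourse these readings can be spelled out directly. The Python sketch below is illustrative only: the domain is invented (the name Dana is not from the class list), and the two printed claims have the same shapes as formulas 7.3 and 7.6, with Bx in place of Sx.

    # Universal claims behave like a big "and" over the domain, existential
    # claims like a big "or". The domain and memberships here are made up.
    U = ["Lindsey", "Ben", "Craig", "Dana"]
    in_logic_2003A = {"Lindsey", "Ben", "Craig"}
    wears_jeans    = {"Ben", "Dana"}

    L = lambda x: x in in_logic_2003A     # Lx
    B = lambda x: x in wears_jeans        # Bx

    # forall x (Lx -> Bx): everyone in the class wears blue jeans?
    print(all((not L(x)) or B(x) for x in U))   # False: Lindsey is a counterexample
    # exists x (Lx & Bx): someone in the class wears blue jeans?
    print(any(L(x) and B(x) for x in U))        # True: Ben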

Alternate Notation

Quantifiers are often written in enclosing brackets, such as (∀x)Fx. Many older logic books write the universal quantifier without the ∀ symbol, so that (x)Fx would be read as "for all x, Fx".

A few books (such as the widely-read Metalogic by G. Hunter [11]) use so-called Stanford notation, wherein universals are represented by ∧ and existentials by ∨. This is inspired by the fact that universals and existentials are generalized "ands" and "ors" respectively.

Bound and Free Variables

Variables in quantificational formulas are either bound or free. Bound variables are those that are linked to a quantifier, while free variables are those that are not bound by any quantifier. If a quantificational expression contains a free variable, it is an open sentence; it is a proposition only if all variables are bound.

Here are some examples:

In Px, the variable x is free, and the expression is an open sentence.
In ∀xPx and ∃yTy the variables are bound, and the expressions are propositions (which say, respectively, "Everything is P" and "Something is T").
In ∃yBayx, a is an arbitrary constant or name and y is bound; but x is unbound, so that the expression is an open sentence with no truth value.

Scope of Quantifiers

It is possible to give a formal recursive definition of a well-formed formula of predicate logic (see Lemmon [17], for instance), but we will not bother to do so here since it is generally obvious which formulas are properly formed and which are not.

Here are the main things to keep in mind: quantifiers, like negation signs, apply to the shortest possible string that follows them. Thus, for instance, in −(Pa & Qb), the whole expression (Pa & Qb) is negated, while in −Pa & Qb, only the sentence Pa is negated. Similarly, in ∃xFx & Gx, the quantifier binds only the x in Fx; this expression, therefore, is an open sentence. However, in ∃x(Fx & Gx), the quantifier applies to all of (Fx & Gx).

Also, a quantifier only binds variables within its scope, and variables can be used over again if there is no ambiguity about which quantifier they link with. Thus, ∃xFx & ∃xGx means exactly the same thing as ∃xFx & ∃yGy.


7.2.7 Order

The predicate logic we study in this book is said to be first-order. This means that the variables in the quantifiers apply only to elements of the universe of discourse U. In second-order logic we quantify over predicates and relations. This introduces further complications that are well beyond the scope of this course.

7.2.8 Categorical Forms

Universal affirmative (A): All A are B.

∀x(Ax→ Bx) (7.7)

Universal negative (E): No A are B.

∀x(Ax→ −Bx) (7.8)

Particular affirmative (I): Some A are B.

∃x(Ax&Bx) (7.9)

Particular negative (O): Some A are not B.

∃x(Ax& −Bx) (7.10)

Note carefully that a universal statement does not involve an existence claim. That is, ∀x(Ax → Bx) does not claim that there are any A which are B. It says merely that if there are any As, then they are B. (Thus, we can truly say "All orcs are nefarious" without, fortunately, there being any orcs in the real world.) By the truth table for the material conditional →, the conditional in the quantifier is true even if there are no As. Such universals are said to be vacuously true.
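
The point about vacuous truth can be checked concretely over a small invented domain; the predicates in this Python sketch are purely illustrative.

    # "All orcs are nefarious" over a domain with no orcs: the universal
    # conditional comes out true vacuously, because the antecedent never holds.
    U = ["Gandalf", "Frodo", "Shadowfax"]
    is_orc       = lambda x: False            # no orcs in this domain
    is_nefarious = lambda x: False            # never even consulted

    print(all((not is_orc(x)) or is_nefarious(x) for x in U))   # True, vacuously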

Existential claims such as "Some animals are mammals" do assert existence. The latter statement can also be read "There are (or exist) some animals which are mammals."

If we specifically wanted to express the true statement that "Some but not all animals are mammals," then we would have to write

∃x(Ax&Mx) & − ∀x(Ax→Mx), (7.11)

or equivalently

∃x(Ax&Mx) &∃x(Ax& −Mx), (7.12)

where Mx := "x is a mammal" and Ax := "x is an animal".

A negative existential such as "Some animals are not reptiles" also asserts existence, for it is translated as

∃x(Ax& −Rx). (7.13)


7.2.9 Don’t Be Fooled. . . !

Translation from natural languages such as English into predicate notation requires some skill and attention. The translations we have described here so far are not inflexible rules to be followed blindly, but rather guidelines to be used with good judgement. You have to be aware of the nuances of your language and of the intended logic of the statements you are translating.

Here is an example where "and" should not be translated as &: "All dogs and cats are carnivorous." Let Dx := "x is a dog", Cx := "x is a cat", and Mx := "x is carnivorous". We might be tempted to translate this as

∀x((Dx&Cx)→Mx). (7.14)

But this cannot be right, since it literally says, "If anything is both a dog and a cat, then it is carnivorous." And that is not quite what we wanted to say. In fact, we wanted to say "All dogs are carnivorous and all cats are carnivorous." This can be written as

∀x(Dx→Mx) &∀x(Cx→Mx). (7.15)

In the next chapter we will learn techniques that will enable us to show that this is equivalent to the following:

∀x((Dx ∨ Cx)→Mx). (7.16)

This can be read as "If anything is either a dog or a cat, then it is carnivorous"; and that is what we actually wanted to say.

7.2.10 Combinations of Properties

The basic categorical forms can be easily extended to describe situations involving combinations of properties. It is impossible to anticipate all possible cases, but here are a few typical examples that will serve as guides:

1. All philosophers at the University of Lethbridge are either brilliant or witty.
∀x((Px & Lx) → (Bx ∨ Wx)).

2. Some philosophers at the University of Lethbridge are brilliant but not witty.
∃x(Px & Lx & Bx & −Wx).

3. No philosopher at the University of Lethbridge is neither brilliant nor witty.
∀x((Px & Lx) → −−(Bx ∨ Wx)).
(By Double Negation, this is of course equivalent to ∀x((Px & Lx) → (Bx ∨ Wx)).)

7.2.11 Use of “Except”

Propositions of the general form "all A except B are C" are called exceptives. Not all logic texts agree on how they should be translated.

Suppose we want to translate into symbols the sentence "All dogs except chihuahuas like the cold," to borrow Lemmon's example (see his Chapter 3, §1). Using obvious notation for the predicates, he expresses this as

∀x((Dx& − Cx)→ Lx) (7.17)


Literally, this says that if anything is a dog but not a chihuahua, then it likes the cold. Unfortunately, this does not tell us what is the case if a dog is a chihuahua. Lemmon must have been aware of this, of course, and so I assume that his translation reflected a preference for logical minimalism — that is, translating into symbols in a way that requires as few commitments as possible. This may not be a bad practice in general. However, most logic texts now translate exceptives in such a way as to make the intention as explicit as possible. For instance, Lemmon's sentence above would now be translated as

∀x((Dx& − Cx)→ Lx) &∀x((Dx&Cx)→ −Lx). (7.18)

It can be shown that this is equivalent to

∀x(Dx→ (Cx↔ −Lx)). (7.19)

In words, to say that all dogs except chihuahuas like the cold is the same thing as saying that if anything is a dog, then it is a chihuahua if and only if it does not like the cold.
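
Because a conjunction of universals is equivalent to a universal of the conjunction (the rule Dist∀ in the next chapter), the equivalence of 7.18 and 7.19 comes down to their matrices agreeing for each individual. Here is a small Python sketch of that case-by-case check; it is an illustration, not a derivation in the system of Chapter 8.

    from itertools import product

    # For each way a single individual can be (dog? chihuahua? likes cold?),
    # the matrix of 7.18 and the matrix of 7.19 agree, so the two universal
    # translations of the exceptive are equivalent.
    for d, c, l in product([True, False], repeat=3):
        f18 = ((not (d and not c)) or l) and ((not (d and c)) or not l)
        f19 = (not d) or (c == (not l))
        assert f18 == f19
    print("7.18 and 7.19 agree in every case")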

Negative exceptives can be translated in a similar way. "No animal except the mongoose likes to eat cobras" could be translated as

∀x(Ax→ (Mx↔ Cx)). (7.20)

Examples such as exceptives illustrate the fact that the aim of symbolization is not to translate the natural language expression literally into symbols, but to analyze and express the implicit logical intention of the natural language expression — if one can be found (since in natural language we are often not very logically precise at all). In order to translate a sentence correctly, we have to figure out its logic, not merely transliterate words into symbols without thinking about their meanings.

7.2.12 Use of “Only”

The key to these is to recall that P → Q can be translated not only as "If P then Q," but also as "P only if Q" or "Only if Q, P." We would therefore translate "Only the mongoose likes eating cobras" as

∀x(Cx→Mx). (7.21)

where Cx := "x likes eating cobras" and Mx := "x is a mongoose".

7.2.13 Variants of Categorical Forms Using Relations

We can construct innumerable variations of the basic categorical forms using relations. These will often involve multiple quantification.

Here are a few illustrative examples, which are, again, in no way meant to exhaust all the possibilities.

1. Every A has relation R to some B:
∀x(Ax → ∃y(By & Rxy)).

2. Any cat is smarter than any dog.
∀x∀y((Cx & Dy) → Sxy).


3. Some dogs are smarter than some people.
∃x∃y(Dx & Py & Sxy).

Note that order of quantifiers is crucial to the meaning. For instance, let U be the set of all people, and Pxy := "x is the parent of y". Then ∀x∃yPxy says that everyone is the parent of someone, while ∃x∀yPxy says that at least one person is the parent of everyone — obviously a very different claim.
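
A tiny finite model makes the difference vivid; the names and the parent relation in this Python sketch are invented for illustration.

    # Invented domain and relation: each person is the parent of someone,
    # but no one is the parent of everyone.
    U = ["ann", "bob", "cam"]
    parent_of = {("ann", "bob"), ("bob", "cam"), ("cam", "ann")}
    P = lambda x, y: (x, y) in parent_of

    print(all(any(P(x, y) for y in U) for x in U))   # forall x exists y Pxy: True
    print(any(all(P(x, y) for y in U) for x in U))   # exists x forall y Pxy: False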

7.2.14 The Indefinite Article

In English the indefinite articles "a" and "an" indicate that the object called for could be one among possibly many objects of a similar nature. If I say, "Please bring me a book from that shelf," I mean that any book from that shelf (although not any book in the universe) will do. By contrast, the definite article "the" implies uniqueness. I might say, for instance, "Please bring me the book from that shelf," and this makes sense only if there is just one book on that shelf.

An indefinite article phrase is an expression such as "an aardvark," "a brown bear," etc. Such phrases are not propositions by themselves but they can be combined with verbs or verb phrases in a variety of ways to make propositions. In predicate logic we can easily symbolize indefinite article phrases. Since ∃xFx can be read, "There exists an F," "There exists an object that is F," or "There exists an x such that Fx," we can get an indefinite article phrase simply by removing the quantifier ∃ from ∃xFx to get xFx. The latter can be read "an F," "an object that is F," or "an x such that Fx". The word "an" itself means "at least one."

The variable x, when written this way in front of a propositional function Fx, can be called an indefinite article operator. If Fx is a truth-functional combination of other propositional functions, we can use brackets to indicate the scope of the indefinite article operator. Suppose ∃x(Ux & Bx) means "There exist some brown bears". Then x(Ux & Bx) means simply "a brown bear". In such indefinite article phrases, the variable x in the propositional function Fx is bound by the article operator in the same way that it is bound by the quantifier in a quantificational proposition.

In summary:

• To symbolize the phrase "an F," just remove the ∃ from the proposition ∃xFx ("There is an F"), to get xFx.

• To define a term letter that applies to any object in U that is an F, but only the objects in U that are Fs, pick a term letter that has not been used in any undischarged assumption, say a, and write a := xFx. This can be read as "Let a be an F", or "Let a stand for any item in U that is F."

• Note carefully that a := xFx is not an assumption because it is not even a proposition at all; instead, it is an instruction to use a symbol in a certain way.

– To make this point clear, note the difference between f := xHx and Hf. The first could read "let 'Frodo' be a hobbit," and the second could read "Frodo is a hobbit." The first formula is merely a stipulation that a letter will be used in a certain way, while the second is a substantive (false) claim.


This notational device will prove to be very useful in certain parts of predicate natural deduction, to be introduced in the next chapter. It is also possible to symbolize the definite article, but in order to do that we need the concept of identity, which will appear in Chapter 9.


7.3 Exercises

In these exercises, use the following symbolizations:

Ax := "x is an aardvark"
Vx := "x is from the Transvaal"
Fx := "x is an ant"
Sx := "x likes to sleep in the sun"
Cx := "x is a cobra"
Dx := "x has a strange diet"
Tx := "x is a termite"
Lx := "x is an animal"
Mx := "x is a mongoose"
Exy := "x likes to eat y"
Txy := "x wins the Lou Marsh trophy in y"
m := "Mike Weir"
n := "2003"

1. Translate the following sentences into predicate notation:

(a) All aardvarks like eating any ants.

(b) Only mongooses like to eat cobras.

(c) All aardvarks except those from the Transvaal like to sleep in the sun.

(d) No animals except aardvarks like to eat termites.

(e) No mongoose is from the Transvaal.

(f) Mike Weir won the Lou Marsh Trophy in 2003.

(g) a mongoose

(h) a mongoose that does not have a strange diet

2. Translate the following formulas into sensible English:

(a) ∀x((Mx&V x)→ Dx)

(b) ∃x∃y(Cx&My&Sx&Sy)

(c) ∀x((Mx ∨ Ax)→ ∀y(Fy → Exy))

(d) ∀x((Cx ∨ Mx)→ −V x)

(e) x(Mx&∀y(Cy → −Exy))


Chapter 8

Predicate Logic: Natural Deduction

Natural deduction for first-order predicate logic consists of ordinary propositional natural deduction plus four special rules that allow us to take off quantifiers and put them back on again.

8.1 Duality Relations

Before presenting the introduction and elimination rules for quantifiers, it is useful to state the so-called Duality Relations which relate existentials and universals, because these help us to understand the meaning of the quantifiers. Later in this chapter we will see how to prove these relations formally.

∀xFx ↔ −∃x−Fx
∃xFx ↔ −∀x−Fx
∀x−Fx ↔ −∃xFx
∃x−Fx ↔ −∀xFx

When used to justify a move in a deduction, an instance of any one of these rules can be referred to simply as Duality. Examples will be given below, once we review the introduction and elimination rules for quantifiers.
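
On any finite domain the Duality Relations are just the generalization of de Morgan's Laws to "all" and "any". The following Python sketch (with an invented domain and predicate) checks the first two relations mechanically; it is an illustration, not part of the deductive system.

    # Check the first two Duality Relations on a small domain.
    U = range(10)
    F = lambda x: x % 2 == 0      # illustrative predicate: "x is even"

    # forall x Fx  <->  -(exists x -Fx)
    print(all(F(x) for x in U) == (not any(not F(x) for x in U)))   # True
    # exists x Fx  <->  -(forall x -Fx)
    print(any(F(x) for x in U) == (not all(not F(x) for x in U)))   # True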

8.2 Introduction and Elimination Rules

We will introduce the rules for the introduction and elimination of quantifier symbols in order of their difficulty.

8.2.1 Universal Elimination (UE)

From ∀xFx we may infer Fa, where Fx is any propositional function of x (possibly including other bound quantifiers) and a is any term. This rule simply expresses what is naturally implied by a universal claim, which is that it applies to anything in the universe of discourse.


(UE is sometimes called "Universal Instantiation," because we use it to write an instance of a universal claim.)

Example: From the statement "All philosophers are witty" and the statement "Paul is a philosopher", we can validly conclude that Paul is witty:

1     (1) ∀x(Px → Wx)      A
2     (2) Pp                A
1     (3) Pp → Wp           1 UE
1,2   (4) Wp                2,3 MP

8.2.2 Universal Introduction (UI)

This rule allows us to move from a proposition of the form Fa, where a is an arbitrary constant, to a universal ∀xFx. This is valid, however, only if a is genuinely arbitrary, in the sense that it is a name which could be applied to any element in the universe of discourse U. Think of arbitrary constants such as a as being temporary names assigned to items picked entirely at random from U, so that there is nothing special about a — whatever we say about it can be said about anything in U. We can assure ourselves of this if we make sure that a was not mentioned in the premisses, an undischarged temporary assumption, or a definition. (Remember, in deductive logic we have no access to any information that is not explicitly given in the premisses.) UI is called "Universal Generalization" in many logic texts.

Here is a valid use of UI.

Example: All horses are mammals, all mammals are warm-blooded; therefore, all horses are warm-blooded.

Using obvious definitions of the predicate symbols, we can symbolize this as follows:

46 ∀x(Hx→Mx), ∀x(Mx→Wx) ` ∀x(Hx→Wx)

1     (1) ∀x(Hx → Mx)      A
2     (2) ∀x(Mx → Wx)      A
1     (3) Ha → Ma           1 UE
2     (4) Ma → Wa           2 UE
1,2   (5) Ha → Wa           3,4 HS
1,2   (6) ∀x(Hx → Wx)      5 UI

The arbitrary constant a introduced in line 3 does not appear in the premisses. The use of UI in moving from line 5 to 6 is therefore valid.

Those who have studied categorical logic will recognize this sequent as the familiar syllogism in Barbara: all A's are B's, all B's are C's; therefore, all A's are C's. (See Copi [6].)

Restriction on UI

We cannot use UI on a term that was introduced in an undischarged assumption or a definition.


8.2.3 Fallacy Alert!

If we attempt to universalize on a term appearing in a premiss, we risk committing the fallacy of Hasty Generalization. For instance, from the fact that Paul is bald, we cannot validly infer that everyone is bald:

1     (1) Bp          A
1     (2) ∀yBy        1 UI (?!)

This is invalid because the term p appears in an assumption.

8.2.4 Existential Introduction (EI)

Existential Introduction, like Universal Elimination, is a very safe rule to use, since there is only one simple restriction on the terms involved.

EI states that from Fa, where a is any term (proper name or arbitrary constant), we may infer ∃xFx.

For example, from the statement that Paul is a witty philosopher, we can validly conclude that something is a witty philosopher (or, in other words, that there exists a witty philosopher):

1     (1) Pp & Wp             A
1     (2) ∃x(Px & Wx)        1 EI

Restriction on EI

The only restriction on EI is that we can only introduce an existential quantifier on one distinct term letter at a time. Thus, for instance, it is valid to go from Fa & Ga to ∃x(Fx & Gx), but invalid to try to go from Fa & Gb to ∃x(Fx & Gx).

8.2.5 Existential Elimination (EE)

The general pattern of EE is that we infer a conclusion P from an existential claim ∃xFx (possibly together with other premisses). Just as an existentially quantified statement is a generalization of an OR statement, Existential Elimination is a generalization of vE. In vE we have to examine each possible inferential path, starting by temporarily assuming each disjunct and seeing if it leads to the desired conclusion; in EE, we examine a typical inferential path defined by introducing an arbitrary constant that can stand for anything that has the property F.

Here is a simple example of the use of EE. Suppose we want to establish the validity of the following argument: All philosophers are wise; there are some philosophers; therefore, something is wise. We proceed as follows:

47 ∀x(Px→Wx), ∃xPx ` ∃xWx

1     (1) ∀x(Px → Wx)      A
2     (2) ∃xPx              A
      (3) a := xPx           Df.
2     (4) Pa                 2,3 EE
1     (5) Pa → Wa           1 UE
1,2   (6) Wa                 4,5 MP
1,2   (7) ∃xWx              6 EI

The object is to prove the desired conclusion from the existential in line 2, in combination with the assumption at line 1. In 3 we introduce an arbitrary constant a. It is crucial (as we shall see) that the arbitrary constant introduced here be a new constant that has not appeared in any earlier lines (because that would, in effect, remove its arbitrariness). We are saying that we will suppose that a is the name of the entity referred to in line 2.

Note very carefully that line 3 is not an assumption; in fact, it is not a proposition at all. Rather, it is an instruction to use a certain symbol in a certain way. Hence it does not have to be discharged.¹ However, we do cite the definition on line 3 in the justification column for line 4, because we use it to derive line 4.

In line 5 we can instantiate on line 1 with UE in terms of a because UE can be applied to any term whatsoever. We then do a little elementary propositional logic, and then at line 7 we get the desired conclusion by using EI to go from line 6.

What we are doing in EE is actually a very simple pattern of reasoning that we often use in day-to-day discussions. Here is the same argument cast into informal language:

Suppose we accept that all philosophers are wise and that there are some philosophers. Suppose also that a is a typical philosopher. Then a must be wise; therefore, someone is wise.

That’s all there is to it!

Restrictions on EE

There are three restrictions on the use of EE.

1. The arbitrary constant cannot appear in a premiss of the argument.

2. The arbitrary constant cannot be one that appears in any previous assumptions unless they were discharged.

3. The arbitrary constant cannot have been used in a previous definition.

These rules can be summarized by saying that when introducing a term by means of a definition, always use a new letter. Also, don't forget that we are not allowed to do Universal Introduction on any term introduced by a definition.

¹Many logic texts (such as [17]) use a much more complicated version of EE in which the proposition Pa is introduced as the leading assumption of a subproof. Once the desired conclusion ∃xWx is reached, it is necessary to back out of the subproof, discharging the initial assumption along the way. The use of the article operator, as in line 3 above, eliminates the need for all of this complexity. Some elementary logic texts, on the other hand, would move directly from lines 1 and 2 above to line 4, simply calling the move 'Existential Elimination' or 'Existential Instantiation.' This move is of questionable validity, since one does not really know what a is. If a term is selected at random from U, that does not guarantee that it will belong to the subset of U containing all but only those items whose existence is claimed by the existential premiss we are working from.


8.2.6 Fallacy Alert!

There are several types of fallacies that we can fall into if we do not obey the restrictions on the use of the arbitrary constant.

Fallacies from Constants in a Premiss

Here is a fallacy that results from running an EE with a constant that appears in a premiss:

1     (1) ∃xFx               A
2     (2) Ga                  A
      (3) a := xFx            Df.
1     (4) Fa                  1,3 EE (??)
1,2   (5) Fa & Ga             2,4 &I
1,2   (6) ∃x(Fx & Gx) (?!)   5 EI

The mistake here is to use the arbitrary constant a from line 2 in the definition in line 3. The use of EI in going from 5 to 6 is in itself correct.

Intuitively, this "argument" says that from the fact that something is F, and that a is G, we can conclude that the same thing is both F and G. It is as if we argued that, from the fact that there exists a man and the fact that our friend Susan is a woman, it follows that Susan is both a woman and a man.

By contrast, here is a slightly different sequent that is, in fact, valid:

48 ∃xFx,Ga ` ∃xFx&∃xGx

1     (1) ∃xFx                 A
2     (2) Ga                    A
2     (3) ∃xGx                 2 EI
1,2   (4) ∃xFx & ∃xGx         1,3 &I

Here is an instance of this sequent in words: from the fact that there exists a man, and from the fact that Susan is a woman, we can validly conclude that there exists a man and that there exists a woman.

Fallacies From Using a Previously Defined Constant

1     (1) ∃xFx               A
2     (2) ∃xGx               A
      (3) a := xFx            Df.
      (4) a := xGx            Df. (??)
1     (5) Fa                  1,3 EE
2     (6) Ga                  2,4 EE
1,2   (7) Fa & Ga             5,6 &I
1,2   (8) ∃x(Fx & Gx)        7 EI

This argument would apparently allow one to infer, from the fact that there are some odd numbers and some even numbers, that there are some numbers which are both odd and even. The mistake was using the letter a in line 4 when it had already been defined in a different way in line 3.


Fallacies Combining EE with UI

The example given in the "Fallacy Alert!" for UI, involving Hasty Generalization, is obviously fallacious. But slightly more subtle fallacies are possible if we are not careful to obey the prohibition against using UI on constants that we have defined in order to do an EE. Consider the following "derivation":

1     (1) ∃xFx        A
      (2) a := xFx     Df.
1     (3) Fa           1,2 EE
1     (4) ∀xFx        3 UI (??)

From the assumption that something is F we seem again to have arrived at the conclusion that everything is F! In fact, the use of EE at line 3 is correct. The fallacy here is in line 4, where we universalized on the name introduced in line 2. Never universalize on a constant defined in terms of an article phrase.

8.3 Summary of Introduction and Elimination Rules for Quantifiers

Universal Elimination (UE):
From ∀xFx, we can infer Fa, where a is any term.
Restrictions: none.

Universal Introduction (UI):
From Fa, where a is an arbitrary constant, we can infer ∀xFx.
Restrictions:

• The term a cannot have been mentioned in a premiss.

• The term a cannot appear in an undischarged assumption.

• The term a cannot have already been defined using an article phrase.

Existential Introduction (EI):
From Fa, where a is a term, we can infer ∃xFx.
Restrictions:

• We can only do EI on all occurrences in a formula of one term letter; never do EI on more than one term letter.

Existential Elimination (EE):
From ∃xFx we can infer Fa, so long as a := xFx.
Restrictions:

• The arbitrary constant a cannot appear in a premiss.

• The arbitrary constant a cannot appear in a previous undischarged assumption.

• The arbitrary constant a cannot appear in a previous definition.


8.4 Examples

In this section I develop a number of typical quantificational arguments to show the application of the Introduction and Elimination Rules.

49 ∀x(Hx→Mx),∀x(Mx→ −Cx) ` ∀x(Hx→ −Cx)

1     (1) ∀x(Hx → Mx)       A
2     (2) ∀x(Mx → −Cx)      A
1     (3) Ha → Ma            1 UE
2     (4) Ma → −Ca           2 UE
1,2   (5) Ha → −Ca           3,4 HS
1,2   (6) ∀x(Hx → −Cx)      5 UI

Here is an instance of this sequent in words:

All horses are mammals;
no mammals are chitinous;
∴ no horses are chitinous.

50 ∀x((Dx ∨ Fx)→ Cx),−Cb ` −Db

1     (1) ∀x((Dx ∨ Fx) → Cx)     A
2     (2) −Cb                      A
1     (3) (Db ∨ Fb) → Cb          1 UE
1,2   (4) −(Db ∨ Fb)              2,3 MT
1,2   (5) −Db & −Fb                4 deM
1,2   (6) −Db                      5 &E

An instance in words:

All dogs and cats are carnivorous;
Bessy is not carnivorous;
∴ Bessy is not a dog.

51 ∀x(Rx→ V x), ∃xRx ` ∃xV x

1     (1) ∀x(Rx → Vx)       A
2     (2) ∃xRx               A
      (3) a := xRx            Df.
2     (4) Ra                  2,3 EE
1     (5) Ra → Va            1 UE
1,2   (6) Va                  4,5 MP
1,2   (7) ∃xVx               6 EI

Here is an instance in words:


Whoever wears red has a vivid colour sense;
someone is wearing red;
∴ someone has a vivid colour sense.

52 ∀x(Rx→ (V x& Ix)), ∃xRx ` ∃xIx

1     (1) ∀x(Rx → (Vx & Ix))     A
2     (2) ∃xRx                     A
      (3) a := xRx                  Df.
2     (4) Ra                        2,3 EE
1     (5) Ra → (Va & Ia)           1 UE
1,2   (6) Va & Ia                   4,5 MP
1,2   (7) Ia                        6 &E
1,2   (8) ∃xIx                     7 EI

Here is an instance in words:

Whoever wears red has a vivid colour sense and imagination;
someone here is wearing red;
∴ someone here has imagination.

53 ∀x(Rx→ (V x& Ix)),∀x(Ix→ Gx), ∃xRx ` ∃xGx

1       (1) ∀x(Rx → (Vx & Ix))     A
2       (2) ∀x(Ix → Gx)            A
3       (3) ∃xRx                    A
        (4) a := xRx                 Df.
3       (5) Ra                       3,4 EE
1       (6) Ra → (Va & Ia)          1 UE
1,3     (7) Va & Ia                  5,6 MP
1,3     (8) Ia                       7 &E
2       (9) Ia → Ga                 2 UE
1,2,3   (10) Ga                      8,9 MP
1,2,3   (11) ∃xGx                   10 EI

Instance in words:

Whoever wears red has a vivid colour sense and imagination;
whoever has imagination will go far in life;
someone is wearing red;
∴ someone will go far in life.

8.5 Distribution Rules for Quantifiers

There are a number of sequents which govern the distribution of quantifiers over formulas. Some require care in their application.


54 ∀x(Fx & Gx) a` ∀xFx & ∀xGx
The proof of this interderivability result is fairly straightforward, and is left as an exercise (see Review Exercises). Note, however, that the following, which is very similar, is invalid:

∀x(Fx ∨ Gx) ` ∀xFx ∨ ∀xGx

A simple example shows that this is invalid: it is true that all natural numbers are either even or odd, but it is false that either all natural numbers are even or all natural numbers are odd.
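
The even/odd counterexample can be written out over a small initial segment of the natural numbers; the following Python sketch is only an illustration of the failure.

    # Counterexample to  forall x (Fx v Gx) |- forall x Fx  v  forall x Gx.
    U = range(10)
    even = lambda x: x % 2 == 0
    odd  = lambda x: x % 2 == 1

    premiss    = all(even(x) or odd(x) for x in U)                    # True
    conclusion = all(even(x) for x in U) or all(odd(x) for x in U)    # False
    print(premiss, conclusion)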

The converse, however, is valid:

55 ∀xFx ∨ ∀xGx ` ∀x(Fx ∨ Gx)
If everything is red or everything is green, then everything is red or green. (See Review Exercises.)

56 ∃x(Fx ∨ Gx) a` ∃xFx ∨ ∃xGx
(See Review Exercises.)

57 ∃x(Fx & Gx) ` ∃xFx & ∃xGx
(See Review Exercises.) The converse is invalid, however; from the fact that there are some numbers which are even and some numbers which are odd, we cannot conclude that there are numbers which are both even and odd.

8.5.1 Summary of Distribution Rules

As described above, quantifiers can be distributed over their scopes sometimes, but not in all cases; one has to be careful in using these rules. Here are the ones that are valid, together with convenient short-form names that can be used when they are cited in a justification line:

∀x(Fx & Gx) a` ∀xFx & ∀xGx        Dist∀
∀xFx ∨ ∀xGx ` ∀x(Fx ∨ Gx)        Dist∀
∃x(Fx ∨ Gx) a` ∃xFx ∨ ∃xGx        Dist∃
∃x(Fx & Gx) ` ∃xFx & ∃xGx          Dist∃

Note carefully that some of these results are interderivabilities, while some go only one way. Quantifiers can in some cases be "distributed" over → but the results are more complicated and are best dealt with on an individual basis. (See the Review Exercises.)

8.6 Proofs of Duality Rules

The Duality Rules make intuitive sense from the meaning of quantificational symbols. But they can be proven using the introduction and elimination rules for predicate logic. I will prove the first one here; the remaining proofs are covered in the Review Exercises. Some of these derivations, like many proofs of propositions that seem to be "obvious," are rather subtle.


58 ∀xFx a` −∃x− Fx

(a) ∀xFx ` −∃x−Fx
1     (1) ∀xFx              A
2     (2) ∃x−Fx             A (RAA)
      (3) a := x(−Fx)        Df.
2     (4) −Fa                2,3 EE
1     (5) Fa                 1 UE
1,2   (6) Fa & −Fa           4,5 &I
1     (7) −∃x−Fx            2–6 RAA

It is pretty clear what is going on in this proof: assuming the opposite of the desired conclusion leads to a contradiction with the starting assumption.

(b) −∃x−Fx ` ∀xFx
1     (1) −∃x−Fx                    A
2     (2) −Fa                         A (RAA)
2     (3) ∃x−Fx                      2 EI
1,2   (4) ∃x−Fx & −∃x−Fx           1,3 &I
1     (5) Fa                          2–4 RAA
1     (6) ∀xFx                       5 UI

In line 2, a is an arbitrary name which could be applied to any member of the domain of discourse. What this proof says is that given the starting premiss −∃x−Fx, we get a contradiction if we assume −Fa for any a in the domain.

59 ∃xFx a` −∀x− Fx

60 ∀x− Fx a` −∃xFx

61 ∃x− Fx a` −∀xFx

8.7 Rules of Passage

There are a number of quantificational formulas that are known as the Rules of Passage. In the following, P is any proposition whatsoever, provided it does not contain the variable x free; it could, for instance, be a predicate formula itself containing quantifiers and bound variables. The rules themselves are called “Rules of Passage” because they allow us to pass P in or out of the scope of a quantifier.

62 ∃x(P &Fx) a` P &∃xFx

63 ∀x(P &Fx) a` P &∀xFx

64 ∃x(P ∨ Fx) a` P ∨ ∃xFx

65 ∀x(P ∨ Fx) a` P ∨ ∀xFx

Page 117: Symbolic Logic Text

8.8. INTERDERIVABILITY ⇔ EQUIVALENCE THEOREM 105

66 ∃x(P → Fx) a` P → ∃xFx

67 ∀x(P → Fx) a` P → ∀xFx

68 ∃x(Fx→ P ) a` ∀xFx→ P

69 ∀x(Fx→ P ) a` ∃xFx→ P

Note the switch between universals and existentials in the last two formulas. I will give proofs of the last formula here; the rest of the Rules of Passage are left as Review Exercises.

(a) ∀x(Fx→ P ) ` ∃xFx→ P

1 (1) ∀x(Fx → P)       A
2 (2) ∃xFx             A (CP)
  (3) a := xFx         Df.
2 (4) Fa               2,3 EE
1 (5) Fa → P           1 UE
1,2 (6) P              4,5 MP
1 (7) ∃xFx → P         2–6 CP

(b) ∃xFx → P ` ∀x(Fx → P)

1 (1) ∃xFx → P             A
2 (2) −∀x(Fx → P)          A (RAA)
2 (3) ∃x−(Fx → P)          2 Duality
2 (4) ∃x(Fx & −P)          3 Neg→
  (5) b := x(Fx & −P)      Df.
2 (6) Fb & −P              4,5 EE
2 (7) Fb                   6 &E
2 (8) −P                   6 &E
2 (9) ∃xFx                 7 EI
1,2 (10) P                 1,9 MP
1,2 (11) P & −P            8,10 &I
1 (12) ∀x(Fx → P)          2–11 RAA

Any of the sequents 62–69 can be cited simply as “Passage” in the justifications column of a proof.

8.8 Interderivability ⇔ Equivalence Theorem

Don’t forget that any interderivability result in predicate logic, such as the Duality Rules or the Rules of Passage, is translatable into a theorem stating an equivalence, and any such equivalences can be used for substitutions just as with propositional logic.

For instance, from the first Rule of Passage listed above (62), we have

` ∃x(P &Fx)↔ (P &∃xFx)
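So, for instance, a subformula of the form ∃x(P & Fx) occurring in a line of a proof may be exchanged for P & ∃xFx (or vice versa), with “Passage” cited in the justification column, just as a propositional equivalence such as deM or Imp would be. Schematically (an illustration added here, not one of the text’s numbered proofs):

(n) −∃x(P & Fx)
(n+1) −(P & ∃xFx)    n Passage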


8.9 What to Memorize

I do not expect you to memorize the Rules of Passage for this course. You should know the Duality Rules, however, since they depend on a very basic understanding of the meaning of the quantificational symbols. And of course you should be able to use UE, UI, EE, and EI.

8.10 A Shortcut

It is essential to master the use of the four introduction and elimination rules, because they allow for general procedures that can be applied to any first-order quantificational argument. Under some circumstances to be specified below, however, we can skip the use of these rules and do Boolean operations directly on propositional functions within the scope of quantifiers. This can greatly simplify many derivations.

Here are a few simple examples:

∀x∃y − (Fx→ Rxy) ↔ ∀x∃y(Fx& −Rxy) Neg→

∀x∀y(Fx→ (Gx→ Hy)) ↔ ∀x∀y((Fx&Gx)→ Hy) Exp

∀x(Fx→ (Gx&Hx)) ↔ ∀x((Fx→ Gx) & (Fx→ Hx)) Dist→

In other words, propositional equivalence relations can be applied to propositional functions so long as they are entirely within the scope of the quantifiers that govern them.

The proof of the general validity of this rule requires metalogical techniques that are beyond the scope of this course, although individual examples can be proven using the techniques in this chapter.

8.11 Fallacy Alert!

A rather common error is the following:

(n) −∀xFx
(n+1) −Fa    n UE

Line (n) is (by Duality) equivalent to ∃x(−Fx). So there is something that is −F, but it might not be the individual a! If we want to use the statement −Fa (as we might, for instance, if we were going to try an EE), we would have to assume it, and then (we hope) discharge it if the EE is successful. In other words, the statement −∀xFx is not really a universal statement at all, and hence we cannot apply UE to it.
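By contrast, one legitimate way to get −Fa onto the page is to go through Duality first, using the EE convention of this chapter (a schematic sketch only, added here for illustration):

(n) −∀xFx
(n+1) ∃x−Fx          n Duality
(n+2) a := x(−Fx)    Df.
(n+3) −Fa            n+1, n+2 EE

Here a is the name introduced by the definition on line (n+2), not an individual chosen in advance.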

8.12 Fallacy Alert!

It is not possible to derive a particular existential from a universal. For instance, from the statement that all hobbits live in the Shire, we cannot infer that there are some hobbits living in the Shire. The following inference is invalid:

1 (1) ∀x(Hx → Sx)      A
1 (2) ∃x(Hx & Sx)      1 ???


This is essentially a consequence of the fact that a universal claim is hypothetical and does not assert existence, while an existential claim does.

By comparison, the following pattern of inference is valid:

1 (1) ∀x(Hx → Sx)      A
1 (2) Ha → Sa          1 UE
1 (3) ∃y(Hy → Sy)      2 EI

However, line (3) does not assert the existence of hobbits, whether living in the Shire or not. It merely says that there is something which, if it is a hobbit, lives in the Shire. This is like saying that if I were a hobbit, I would live in the Shire; and this in no way asserts the existence of hobbits!

8.13 Review Exercises for Chapter 8

Use the following symbols for individuals, predicates, and relations:

p := Plato
s := Socrates
h := Heidegger
r := Rufus
k := The Yukon
c := the Republic
t := Thumper
b := Being and Time
Lx := x is a rabbit
Bx := x is brown
Dx := x is a dialogue
Ux := x is a bear
Hx := x is a person
Nx := x loves honey
Ax := x loves asparagus
Rx := x loves broccoli
Mx := x is a mammal
Wx := x is warm-blooded
Lxy := x loves y
Cxy := x came from y
Txy := x takes y seriously
Axy := x is the author of y

A

1. Translate the following arguments into symbols, and prove them valid:

(a) Rufus loves honey.
∴ something loves honey.

(b) Plato was the author of The Republic.
∴ Plato was the author of something.

(c) All bears love honey.
Rufus is a bear.
∴ Rufus loves honey.

(d) No bears love broccoli.
Thumper loves broccoli.
∴ Thumper is not a bear.

(e) All bears are mammals.
All mammals are warm-blooded.
Rufus is a bear.
∴ Rufus is warm-blooded.


2. Prove the validity of the following sequents:

(a) Fa & Fb ` ∃xFx
(b) ∀x(Mx → Wx), Ma & Ra ` Wa
(c) ∀x(Fx → Gx), ∃xFx ` ∃xGx
(d) ∀x(Fx → −Gx), ∃xGx ` ∃x−Fx
(e) ∀x−(Fx ∨ Gx) ` −∃x(Fx ∨ Gx)

B

1. Translate the following arguments into symbols and prove them valid:

(a) No one takes anything written by Heidegger seriously.
Being and Time was written by Heidegger.
∴ Socrates does not take Being and Time seriously.

(b) All bears love honey.
Rufus is a brown bear.
∴ Something brown loves honey.

(c) Plato was the author of Republic.
Republic is a dialogue.
∴ Plato wrote some dialogues.

(d) All bears love honey.
Some bears are brown.
All bears are mammals.
∴ Some brown mammals love honey.

(e) All bears and rabbits are mammals.
Bears do not like asparagus.
Some bears come from the Yukon.
∴ some mammals do not like asparagus.

2. Prove the Rules of Passage, Sequents 62 through to 69.

3. Prove the validity of the following sequents:

(a) Dist∀ (all of them)

(b) Dist∃ (all of them)

(c) ∀x(Fx → Gx) ` ∀xFx → ∀xGx
(d) ∃x(Fx → Gx) ` ∀xFx → ∃xGx

4. It can be very instructive to try to make up your own examples as well!

C

1. Prove the Duality relations that were not proven in the Chapter; that is, Sequents 59, 60, and 61.


Chapter 9

Predicate Logic With Identity

In this chapter we introduce the use of two simple rules concerning the relation of identity. These rules permit a powerful extension of predicate logic which enables us to express the logical structure of a much larger class of quantificational propositions with a very minimal increase in formal machinery.

By “identity” we simply mean a two-term relation which says that two proper names or constants point to the same object in the universe of discourse. We could write this as

Ixy := x is identical to y

but it is easier, and more in keeping with standard mathematical usage, to express identity as follows:

(x = y) iff x and y point to the same member of U .

We will also allow the use of the following obvious notational convenience:

(a ≠ b) := −(a = b)

To analyze the meaning of identity is, in fact, a tricky philosophical problem. The definition I have given here is perilously close to being circular, since we probably would have to use the identity relation in order to express the fact that two symbols single out the same individual. For our purposes, however, we will define identity simply to be a relation that obeys the following introduction and elimination rules:

Identity Introduction (=I):
For any term a the formula a = a is a theorem, and may be introduced at any point in a proof. In symbols: ` a = a.

Identity Elimination (=E):
If s and t are terms, if s = t, and given a proposition A(s) (that is, a proposition containing s), we can derive A(t) by substituting one or more occurrences of s in A(s) with t.

The second rule, =E, is the one we will use most often. It simply allows the substitution of one term symbol for another when they are equal, similarly to the way we can substitute equals for equals in mathematics.
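For example (a simple illustration of the rule as just stated, not one of the numbered proofs below): given s = t and the proposition Fs & Gs, =E licenses any of Ft & Gs, Fs & Gt, or Ft & Gt, depending on which occurrences of s we choose to replace.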


9.1 Examples

Here are a few elementary examples which illustrate the use of these rules, and demonstrate some basic properties of the identity relation:

70 a = b ` b = a

1 (1) a = b      A
  (2) a = a      =I
1 (3) b = a      1,2 =E

In line (3) we cite two lines, one of which is the statement of the identity we are using, while the other is the line into which we make a substitution: we substitute b for the first occurrence of a in line (2). This proof demonstrates that identity is a symmetric relation.

71 a = b& b = c ` a = c

1 (1) a = b & b = c      A
1 (2) a = b              1 &E
1 (3) b = c              1 &E
1 (4) a = c              2,3 =E

This proof demonstrates that identity is a transitive relation.

The next proof demonstrates a triviality: to say that something is Paul, and that thing loves curry, is equivalent to saying that Paul loves curry. The main interest in this proof is that it demonstrates the use of =E.

72 Fp a` ∃x(x = p & Fx)

I’ll prove one part of it; the converse is left as an exercise.

1 (1) ∃x(x = p & Fx)        A
  (2) b := x(x = p & Fx)    Df.
1 (3) b = p & Fb            1,2 EE
1 (4) b = p                 3 &E
1 (5) Fb                    3 &E
1 (6) Fp                    4,5 =E

The next proof shows that the way we use identity here corresponds to our basic notion of identity, which is that things that are presumed identical should have all the same properties.

73 a = b ` Fa↔ Fb

1 (1) a = b                       A
2 (2) Fa                          A (CP)
1,2 (3) Fb                        1,2 =E
1 (4) Fa → Fb                     2–3 CP
5 (5) Fb                          A (CP)
1,5 (6) Fa                        1,5 =E
1 (7) Fb → Fa                     5–6 CP
1 (8) (Fa → Fb) & (Fb → Fa)       4,7 &I
1 (9) Fa ↔ Fb                     8 Def↔

9.2 Fallacy Alert!

As shown above, the following pattern of reasoning is correct:

a = b ` Fa↔ Fb. (9.1)

This simply expresses what we mean by identity. But do not attempt the following move:

Fa↔ Fb ` a = b. (????) (9.2)

This is invalid because the two individuals a and b do not necessarily have to be the same in order for Fa to be true iff Fb is true. Suppose Fx is defined as “x is a philosopher.” The statement “F(Socrates) ↔ F(Plato)” is true, and yet Socrates was a distinct individual from Plato.

Remember, to say a = b is to say that the symbols a and b point to the same individual in the universe of discourse. To say Fa ↔ Fb is to say that the propositions Fa and Fb have the same truth value.

9.3 Expressing Numerical Relations

We can use the identity relation to express simple numerical relationships. Here are some examples:

At most one person knows the secret.
∀x∀y((Kx & Ky) → (x = y))

Here it is in words that are half-way between logic and ordinary English: For all x and y, if x and y know the secret, then x and y are the same person.

Exactly (or only) one person knows the secret.
∃x(Kx & ∀y(Ky → (y = x))).

In half-logic: Someone knows the secret, and anyone else who knows the secret is that person.

Exactly two people know the secret.
∃x∃y(Kx & Ky & (x ≠ y) & ∀z(Kz → ((z = x) ∨ (z = y)))).

In half-logic: there are two distinct people who know the secret, and anyone else who knows the secret is identical to one or the other of those two people.
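Along the same lines (an example added here, not one of the text’s three), “At least two people know the secret” can be written

∃x∃y(Kx & Ky & (x ≠ y)).

In half-logic: there are two distinct people each of whom knows the secret.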


9.4 Superlatives

It is possible to express simple superlatives using identity. Here’s an example of such a translation, using hopefully obvious notation:

Smith is the tallest person in Logic class.
Ls & ∀y((Ly & (y ≠ s)) → Tsy)

In half-logic: Smith is in the Logic class and if there is anyone else in the Logic class who is not Smith then Smith is taller than that person.

9.5 Uniqueness

The identity relation can be used to express the concept of uniqueness. Suppose we want to say that there is a unique individual who is Prime Minister of Canada. Let Px := x is a Prime Minister of Canada. To express the concept that there is only one person who is the Prime Minister of Canada, we can write

∃x(Px & ∀y(Py → (x = y))).

In words: there exists at least one Prime Minister of Canada, and any such individuals are identical. This could also be read as saying that there is one and only one Prime Minister of Canada, or that there is a unique Prime Minister of Canada.

It is sometimes convenient to have a special symbol to express the concept of unique existence, and this is commonly done as follows:

∃!xPx := ∃x(Px & ∀y(Py → (x = y)))

Bertrand Russell is said to have read this as “E-shriek! x. . . ”, and it can be read as “there exists one and only one P” or “there exists a unique P.”

9.6 Numerically Definite Quantifiers

The notion of unique existence can be generalized to express the existence of any definite number n of objects. Define the symbol ∃1xFx as follows:

∃1xFx := ∃!xFx.

Then we can write

∃2xFx := ∃x∃y(Fx & Fy & (x ≠ y) & ∀z(Fz ↔ (z = x ∨ z = y)))

This can be extended to any n. (See Exercises.)
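To illustrate the pattern, here is the next case (a sketch only; the general recursive definition is asked for in Review Exercise C1 at the end of this chapter):

∃3xFx := ∃x∃y∃z(Fx & Fy & Fz & (x ≠ y) & (x ≠ z) & (y ≠ z) & ∀w(Fw ↔ ((w = x) ∨ (w = y) ∨ (w = z))))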

9.7 Definite Descriptions

Phrases such as “the present King of France,” “the man in the Moon,” “the last person to cross the finish line,” or “the first African woman to win the Nobel Peace Prize,” are called definite descriptions. They are widely used in ordinary language, and yet it might not be immediately obvious how they should be analyzed logically. To see why this is important, consider the following sentences:


1. “France presently has a bald King.”

2. “The present King of France is bald.”

The first sentence is false. The second sentence seems very suspicious, but it is not immediately obvious how to assign it a truth value. Some philosophers have gone so far as to declare sentences such as this, in which a definite description points to a nonexistent object, to be meaningless. But that doesn’t seem quite right either, since we make deductions using definite descriptions all the time, whether or not the objects mentioned are known to exist. Consider, for instance, the following elementary argument:

Rudolph is the lead reindeer on Santa’s sleigh.
Blitzen is not Rudolph.
∴ Blitzen is not the lead reindeer on Santa’s sleigh.

Intuitively this seems to be valid, so we wouldn’t want to say that definite descriptions have no logic or are even meaningless. And yet we still have to say how to represent its validity formally. As before, we motivate the need for new types of symbols by showing that there are arguments that are clearly valid, but whose validity we do not yet know how to demonstrate.

What makes this argument work is clearly the fact that Rudolph is supposed to be the lead reindeer, not just any reindeer. In 1905, Bertrand Russell suggested that we should analyse propositions using definite descriptions as combinations of existence claims with uniqueness claims. Thus, for instance, we could express “Harper is the Prime Minister” as

Ph&∀y(Py → (y = h))

In words, “Harper is the Prime Minister and if anyone is the Prime Minister, that person is Harper”. This statement needs to be distinguished carefully from the claim that “Harper is a Prime Minister,” which would be symbolized simply as Ph. The latter statement makes sense, since there are, after all, several countries that have Prime Ministers, and several former Prime Ministers of Canada.

To say “The Prime Minister is a former Finance Minister,” we can write

∃x(Px&Fx&∀y(Py → (y = x)))

And to say “The present King of France is bald,” we can write

∃x(Kx & Bx & ∀y(Ky → (y = x)))

In other words, this simply says, “There exists a unique bald King of France,” and, at the time of this writing, this statement is false.

We can also express the identity of two descriptions. “The author of Hamlet is the author of Lear” can be expressed as

∃x(Hx & Lx & ∀y((Hy & Ly) → (y = x)))

As these examples suggest, a definite description such as “the Prime Minister of Canada” functions grammatically very much like a proper name such as “Mr. Stephen Harper.”

What about negations of statements using definite descriptions? For instance, how would we translate “Layton is not the Prime Minister?” Intuitively, it seems that to deny this statement should amount to saying that either Layton is not a Prime Minister, or that someone other than Layton is a Prime Minister; either or both of those possibilities would be sufficient to negate the claim that Layton is the Prime Minister.

A quick bit of predicate natural deduction shows that this analysis is correct:

−(Pl & ∀y(Py → (y = l)))
   ↔ −Pl ∨ −∀y(Py → (y = l))     deM
   ↔ −Pl ∨ ∃y−(Py → (y = l))     Duality
   ↔ −Pl ∨ ∃y(Py & (y ≠ l))      Neg→

Russell’s method of analysing the logic of definite descriptions has generated a lot of controversy in the literature on philosophy of language. However, it provides a workable logical analysis of definite descriptive claims and, as we shall show next, allows us to demonstrate the validity of arguments involving definite descriptions which should, indeed, turn out to be valid. Russell’s method also illustrates a point that we noted earlier, which is that the grammatical structure of a sentence in a particular natural language such as English is not always a good indication of its underlying logical form.

In his later years Russell defended his approach against criticism by P. F. Strawson:

My theory of descriptions was never intended as an analysis of the state of mind of those who utter sentences containing descriptions. Mr Strawson gives the name ‘S’ to the sentence ‘The King of France is wise’, and he says of me ‘The way in which he [Russell] arrived at the analysis was clearly by asking himself what would be the circumstances in which we would say that anyone who uttered the sentence S had made a true assertion’. This does not seem to me a correct account of what I was doing. . . . I was concerned to find a more accurate and analysed thought to replace the somewhat confused thoughts which most people at most times have in their heads. [21, p. 243]

Russell, and indeed many of those who contributed to modern symbolic logic, believed that the job of the logician is not only to describe the logic of ordinary reasoning in a precise way, but to devise techniques that can help us to reason more effectively.

9.7.1 Example

Consider the following argument:

Harper is the Prime Minister.
Layton is not Harper.
∴ Layton is not the Prime Minister.

Using obvious definitions for the term and predicate symbols, we can prove this argument valid as follows:

1 (1) Ph & ∀x(Px → (x = h))       A
2 (2) l ≠ h                       A
1 (3) Ph                          1 &E
1 (4) ∀x(Px → (x = h))            1 &E
1 (5) Pl → (l = h)                4 UE
1,2 (6) −Pl                       2,5 MT
1,2 (7) −Pl ∨ ∃y(Py & (y ≠ l))    6 vI


Note, again, that line (6) only says that Layton is not a Prime Minister. We need to “or” on the extra clause that states that someone else might be Prime Minister as well as Layton.

A Fine Point

Many logic texts write statements such as “Harper is the Prime Minister” in the following more complicated form:

∃x(Px&∀y(Py → (y = h)) &Ph) (9.3)

This is equivalent to the simpler form we use here, which can be used whenever we know the proper name of “the such-and-such”. This can be proven from the fact that

∃x(Px& (x = h)) a` Ph. (9.4)

This is proven in the Exercises, below.

9.8 The Definite Article

(This section was revised Dec. 3/06.) Two chapters ago we showed how to express the indefinite article “a” or “an” in predicate notation. For example,

a := xFx

reads “let a be an object x such that Fx;” or more simply, “let a be an F.” The notation is inspired by the fact that we can read ∃xFx as “there exists an F,” so that if we just remove the ∃ we get “an F.”

We can symbolize the definite article “the” in much the same way. Consider the following expression:

∃x(Fx&∀y(Fy → (x = y))).

This reads “the F exists” or “there exists a unique F .” Take off the ∃, and we get

x(Fx & ∀y(Fy → (x = y))),

which can be read as “the F” or “a unique F.”

This can be easily extended to descriptions involving two or more predicates. For instance, if

∃x(Kx&Bx&∀y(Ky → (x = y)))

means “The present King of France is bald,” then

x(Kx&Bx&∀y(Ky → (x = y)))

reads “the present bald King of France.”

Here is a simple example of a natural deduction using definite descriptions and the definite article operator. Demonstrate the following argument:

The author of Hamlet is the author of Macbeth.
Shakespeare wrote Hamlet.
∴ Shakespeare wrote Macbeth.


This is a very simple piece of reasoning, and it should have a very simple proof. First, we need to define some symbols:

Axy := x is an author of y
h := Hamlet
m := Macbeth
s := Shakespeare.

We begin the proof by defining proper names for the author of Hamlet and the author of Macbeth; we then write our two premisses in terms of those proper names, and make a simple substitution:

  (1) a := x(Axh & ∀y(Ayh → (y = x)))    Df.
  (2) b := x(Axm & ∀y(Aym → (y = x)))    Df.
3 (3) a = b                              A
4 (4) s = a                              A
3,4 (5) s = b                            3,4 =E

This is an enormous simplification over the methods for proving such arguments that will be found in most recent logic texts. What makes it work is the ability to define the “temporary” proper names a and b; with these in hand, very simple facts about the nature of identity can be applied in ways that are as obvious as the argument itself. Behind the notation is the logical (or in part grammatical) fact that because definite descriptions such as “the author of Hamlet” pick out a unique object, they function exactly the same way as do proper names (which are labels for unique objects); we can therefore represent their logic quite faithfully if we simply assign to the definite description a proper name (if we do not already know a name of the object being described). The only restriction is, as usual, that we have to use a term letter that has not been used before.


9.9 Review Exercises

Use the following symbol definitions:

a := Aristotle
s := The Stagirite¹
p := Plato
e := The Ethics
r := The Republic
t := The Timaeus
Kx := x knew (knows) The Good
Axy := x was (is) an author of y

A

1. Translate into symbols:

(a) Aristotle was not Plato.

(b) Aristotle was The Stagirite.

(c) Either Aristotle or Plato wrote the Republic.

(d) Plato wrote both the Republic and the Timaeus.

2. Translate into good, idiomatic English:

(a) p ≠ s

(b) r ≠ t

(c) Apr ∨ −Apr

3. Translate and prove:

(a) Aristotle wrote the Ethics.
Aristotle was The Stagirite.
∴ The Stagirite wrote the Ethics.

(b) If Aristotle wrote the Republic then Aristotle was Plato.
However, Aristotle was not Plato.
∴ Aristotle did not write the Republic.

4. Prove the following sequents:

(a) Fa&Ga, (a = b) ` Fa&Gb

(b) Fa→ Ga, (a = b) ` Fb→ Ga

(c) ∀x(Fax→ Gbx), a = b ` ∀x(Fax→ Gax)

(d) Faa → (a ≠ a) ` −Faa

¹Aristotle came from a small town called Stagira, and so he was often called The Stagirite.


B

1. Translate into symbols:

(a) Plato was the author of the Republic.

(b) Aristotle was not the author of the Republic.

2. Translate into good, idiomatic English:

(a) ∃x((Axr&Axt) &∀y(Ayr → (x = y)))

(b) Apr&Apt&∀y(Ayr → (y = p))

3. Prove the following sequent:
∃x(Px & (x = l)) a` Pl

4. Translate the following argument into symbols, and prove the resulting sequent:

Plato knew the Good.
Aristotle was not Plato.
At most one person knew the Good.
∴ Aristotle did not know the Good.

C

1. Find a general recursive expression for the numerically definite quantifier ∃nxFx for any n.


Chapter 10

Solutions to Selected Exercises

Chapter 2

Exercises 2.2.8

A

1. (a) T &C or C &T

(c) T → −C

(e) −B → A

(g) B → C

(i) −C → −T

(k) A→ −B

(m) T ∨ −C or −C ∨ T

2. (a) If Bob goes to the store then Alice does not.
or
Bob goes to the store only if Alice does not.

(c) Either Bob doesn’t go to the store or Cheryl goes to work.

(e) If Alice does not go to the store then Bob does.
or
Alice does not go to the store only if Bob does.

(g) Either Alice does not go to the store or Tad does not stay home.

(i) Tad stays home only if Bob goes to the store.


Exercises 2.3.1

A

1. (a) F

(c) T

(e) T

2. (a) T

(c) F

(e) T

Exercises 2.6.1

A

1. (a) (B&T )→ (A→ C)

(c) C ↔ T

(e) −T → (A ∨ B)

2. (a) If Alice and Bob go to the store and Cheryl goes to work, then Tad doesn’t stay home.

(c) If Alice goes to the store then if Bob goes to the store Cheryl goes to work.
or:
If Alice goes to the store then Bob goes to the store only if Cheryl goes to work.

(e) If Alice goes to the store only if Bob goes too, then Cheryl goes to work only if Tad stays home.

Exercises 2.7.1

A

1. (b) −C ∨ −T

(d) −(C → T )

2. (b) Bob going to the store does not imply either Cheryl not going to work or Tad staying home.

(d) Alice goes to the store if and only if Bob does not.


Solutions for Chapter 3

Exercises 3.3.1

A

1. (a) P, P → −Q ` −Q1 (1) P A2 (2) P → −Q A1,2 (3) −Q 1,2 MP

(c) P &Q,P → R ` R1 (1) P &Q A2 (2) P → R A1 (3) P 1 &E1,2 (4) R 2,3 MP

(e) −− P ` P ∨ R1 (1) −− P A1 (2) P 1 DN1 (3) P ∨ R 2 vI

B

1. (a) P → (Q&R), P ` R ∨ S1 (1) P → (Q&R) A2 (2) P A1,2 (3) Q&R 1,2 MP1,2 (4) R 3 &E1,2 (5) R ∨ S 4 vI

(c) P ∨ Q,Q→ R ` P ∨ R1 (1) P ∨ Q A2 (2) Q→ R A3 (3) P A (vE)3 (4) P ∨ R 3 vI5 (5) Q A (vE)2,5 (6) R 2,5 MP2,5 (7) P ∨ R 6 vI1,2 (8) P ∨ R 1,3–4,5–7 vE


(e) P → Q,Q→ R,−R ` −P

1 (1) P → Q A2 (2) Q→ R A3 (3) −R A4 (4) P A (RAA)1,4 (5) Q 1,4 MP1,2,4 (6) R 2,5 MP1,2,3,4 (7) R& −R 3,6 &I1,2,3 (8) −P 4–7 RAA

Exercises 3.7

A

2. (a) P,Q ` P → Q

1 (1) P A2 (2) Q A2 (3) P → Q 1–2 CP

(c) P → (Q ∨ R),−(Q ∨ R) ` −P

1 (1) P → (Q ∨ R) A2 (2) −(Q ∨ R) A1,2 (3) −P 1,2 MT

(e) (Q→ R),−(Q→ R) ` R

1 (1) Q→ R A2 (2) −(Q→ R) A1,2 (3) R 1,2 ExF

(g) P → (Q→ R), (Q→ R)→ T ` P → T

1 (1) P → (Q→ R) A2 (2) (Q→ R)→ T A1,2 (3) P → T 1,2 HS

(h) −(P & −Q) ` −P ∨ Q

1 (1) −(P & −Q) A1 (2) −P ∨ −−Q 1 deM1 (3) −P ∨ Q 2 DN


(j) (X ∨ Y )→ Q ` (−X & − Y ) ∨ Q1 (1) (X ∨ Y )→ Q A1 (2) −(X ∨ Y ) ∨ Q 1 Imp1 (3) (−X & − Y ) ∨ Q 2 deM

(l) −(X → Y )→ −P, P ` (X → Y )

1 (1) −(X → Y )→ −P A2 (2) P A1 (3) P → (X → Y ) 1 Trans1,2 (4) (X → Y ) 2,3 MP

(n) P & (Q→ R) ` (P & −Q) ∨ (P &R)

1 (1) P & (Q→ R) A1 (2) P & (−Q ∨ R) 1 Imp1 (3) (P & −Q) ∨ (P &R) 2 Dist

B

1. (a) P → Q,R→ S,−Q ∨ −S ` −P ∨ −R (DD)

1 (1) P → Q A2 (2) R→ S A3 (3) −Q ∨ −S A4 (4) −Q A (vE)1,4 (5) −P 1,4 MT1,4 (6) −P ∨ −R 5 vI7 (7) −S A (vE)2,7 (8) −R 2,7 MT2,7 (9) −P ∨ −R 8 vI1,2,3 (10) −P ∨ −R 3,4–6,7–9 vE

(c) P → Q ` −(P & −Q)

1 (1) P → Q A1 (2) −P ∨ Q 1 Imp1 (3) −P ∨ −−Q 2 DN1 (4) −(P & −Q) 3 deM

(d) −(P & −Q) ` P → Q

1 (1) −(P & −Q) A1 (2) −P ∨ −−Q 1 deM1 (3) −P ∨ Q 2 DN1 (4) P → Q 3 Imp


(g) P ∨ (Q&R) ` (P ∨ Q) & (P ∨ R)

1 (1) P ∨ (Q&R) A2 (2) P A (vE)2 (3) P ∨ Q 2 vI2 (4) P ∨ R 2 vI2 (5) (P ∨ Q) & (P ∨ R) 3,4 &I6 (6) Q&R A (vE)6 (7) Q 6 &E6 (8) R 6 &E6 (9) P ∨ Q 7 vI6 (10) P ∨ R 8 vI6 (11) (P ∨ Q) & (P ∨ R) 9,10 &I1 (12) (P ∨ Q) & (P ∨ R) 1,2–5,6–11 vE

(i) P → (Q→ R) ` (P &Q)→ R (Exp)

1 (1) P → (Q→ R) A2 (2) P &Q A (CP)2 (3) P 2 &E1,2 (4) Q→ R 1,3 MP1 (5) Q 2 &E1,2 (6) R 4,5 MP1 (7) (P &Q)→ R 2–6 CP

(k) (P → Q) & (P → R) ` P → (Q&R)

1 (1) (P → Q) & (P → R) A2 (2) P A (CP)1 (3) P → Q 1 &E1 (4) P → R 1 &E1,2 (5) Q 2,3 MP1,2 (6) R 2,4 MP1,2 (7) Q&R 5,6 &I1 (8) P → (Q&R) 2–7 CP

(m) −(P & −Q) ` P → Q

1 (1) −(P & −Q) A1 (2) −P ∨ −−Q 1 deM1 (3) −P ∨ Q 2 DN1 (4) P → Q 3 Imp

2. (a) P,Q ` P ↔ Q

1 (1) P A


2 (2) Q A2 (3) P → Q 1–2 CP1 (4) Q→ P 1–2 CP1,2 (5) (P → Q) & (Q→ P ) 3,4 &I1,2 (6) P ↔ Q 5 Def↔

(c) (P → R), (Q→ R) ` (P &Q)→ R

1 (1) P → R A2 (2) Q→ R A3 (3) P &Q A (CP)3 (4) P 3 &E1,3 (5) R 1,4 MP1 (6) (P &Q)→ R 3–5 CP

(e) R,R↔ P,−Q→ −P ` Q1 (1) R A2 (2) R↔ P A3 (3) −Q→ −P A3 (4) P → Q 3 Trans2 (5) (R→ P ) & (P → R) 2 Def↔2 (6) R→ P 5 &E2,3 (7) R→ Q 4,6 HS1,2,3 (8) Q 1,7 MP

(g) P ↔ S, P ∨ Q,P ∨ R,−Q ∨ −R ` S1 (1) P ↔ S A2 (2) P ∨ Q A3 (3) P ∨ R A4 (4) −Q ∨ −R A5 (5) −Q A (vE)2,5 (6) P 2,5 DS7 (7) −R A (vE)3,7 (8) P 3,7 DS2,3,4 (9) P 4,5–6,7–8 (vE)1 (10) (P → S) & (S → P ) 1 Def↔1 (11) P → S 10 &E1,2,3,4 (12) S 9,11 MP

(i) P → (Q ∨ R), (P → Q)→ X, (P → R)→ Y ` X ∨ Y1 (1) P → (Q ∨ R) A2 (2) (P → Q)→ X A


3 (3) (P → R)→ Y A1 (4) (P → Q) ∨ (P → R) 1 Dist↔5 (5) P → Q A (vE)2,5 (6) X 2,5 MP2,5 (7) X ∨ Y 6 (vI)8 (8) P → R A (vE)3,8 (9) Y 3,8 MP3,8 (10) X ∨ Y 6 vI1,2,3 (11) X ∨ Y 1,5–7,8–10 vE

Solutions for Chapter 4

Exercise 4.1.1 (A)

1. Identify the rule used in each of the following deductive steps.

(a) MP

(b) MT

(c) vI

(d) HS

(e) Imp

Exercise 4.2.1 (A)

1. State the justifications for each of the following steps.

(a) n Imp(×2)

(b) n Imp, Dist

(c) n Imp, DN

(d) n Imp, Dist

(e) n, n+1 MT, DN

(f) n Assoc, Comm.

Exercise 4.5.1 (B)

1. (a) ` (P & − P )→ Q

1 (1) P & − P A (CP)1 (2) Q 1 ExF

(3) (P & − P )→ Q 1–2 CP

(b) ` P → (−P → Q)


1 (1) P A (CP)2 (2) −P A (CP)1,2 (3) P & − P 1,2 &I1,2 (4) Q 3 ExF1 (5) −P → Q 2–4 CP

(6) P → (−P → Q) 1–5 CP

Alternate proof:

(1) P ∨ −P ExM(2) (P ∨ −P ) ∨ Q 1 vI(3) −P ∨ (P ∨ Q) 2 Assoc, Comm(4) P → (−− P ∨ Q) 3 Imp, DN(5) P → (−P → Q) 4 Imp

(c) ` P ∨ (P → Q)(1) −P → (P → Q) TI (Ques. 1(a))(2) −− P ∨ (P → Q) 1 Imp(3) P ∨ (P → Q) 2 DN

Alternate proof:

(1) P ∨ −P ExM(2) (P ∨ −P ) ∨ Q 1 vI(3) P ∨ (−P ∨ Q) 2 Assoc(4) P ∨ (P → Q) 3 Imp

(d) ` (P → Q) ∨ (Q→ P )(1) P ∨ −P ExM(2) (P ∨ −P ) ∨ (Q ∨ −Q) 1 vI(3) (−P ∨ Q) ∨ (−Q ∨ P ) 2 Assoc (×2), Comm (×3)(4) (P → Q) ∨ (Q→ P ) 3 Imp (×2)

(e) ` (P → Q) ∨ (Q→ R)(1) Q ∨ −Q ExM(2) −P ∨ (Q ∨ −Q) 1 vI(3) (−P ∨ (Q ∨ −Q)) ∨ R 2 vI(4) (−P ∨ Q) ∨ (−Q ∨ R) 3 Assoc (×2)(5) (P → Q) ∨ (Q→ R) 4 Imp (×2)

(f) ` (−P → P )→ P

1 (1) −P → P A (CP)2 (2) −P Q (RAA)1,2 (3) P 1,2 MP1,2 (4) P & − P 2,3 &I1 (5) P 2–4 RAA, DN

(6) (−P → P )→ P 1–5 CP


Alternate proof:

1 (1) −P → P A (CP)1 (2) −− P ∨ P 1 Imp1 (3) P ∨ P 2 DN1 (4) P 3 Idem

(5) (−P → P )→ P 1–4 CP

Exercise 4.5.1 (C)

1. (a) ` P ∨ −P1 (1) −(P ∨ −P ) A (RAA)2 (2) P A (RAA)2 (3) P ∨ −P 2 vI1,2 (4) (P ∨ −P ) & − (P ∨ −P ) 1,3 &I1 (5) −P 2–4 RAA1 (6) P ∨ −P 5 vI1 (7) (P ∨ −P ) & − (P ∨ −P ) 1,6 &I

(8) P ∨ −P 1–7 RAA, DN

This elegant proof is due to Lemmon (his Sequent 44) [17].

Exercise 4.6.1 (B)

1. (a) Given P,Q ` R, show ` (P &Q)→ R

1 (1) (P &Q) A (CP)1 (2) P 1 &E1 (3) Q 1 &E1 (4) R 2,3 SI

(5) (P &Q)→ R 1–4 CP

(b) Given ` (P &Q)→ R, show P,Q ` R1 (1) P A2 (2) Q A1,2 (3) P &Q 1,2 &I

(4) (P &Q)→ R TI1,2 (5) R 3,4 MP

Exercise 4.7.1 (B)

1. (a) ` (P ↔ Q)↔ (−(P → −Q) ∨ −(−Q→ P ))


(P ↔ Q) ↔ ((P &Q) ∨ (−P & −Q)) Equiv↔ ((P & −−Q) ∨ (−Q& − P )) DN, Comm↔ (−(P → −Q) ∨ −(−Q→ P )) Neg→ (×2)

(b) ` ((Q ∨ R)→ P )↔ ((Q→ P ) & (R→ P ))

((Q ∨ R)→ P ) ↔ (−(Q ∨ R) ∨ P ) Imp↔ ((−Q& −R) ∨ P ) deM↔ ((−Q ∨ P ) & (−R ∨ P )) Dist↔ ((Q→ P ) & (R→ P )) Imp(×2)

(c) ` (P → (Q ∨ R))↔ ((P → Q) ∨ (P → R))

(P → (Q ∨ R)) ↔ (−P ∨ (Q ∨ R)) Imp↔ ((−P ∨ −P ) ∨ (Q ∨ R)) Idem↔ (−P ∨ Q) ∨ (−P ∨ R)) Assoc, Comm↔ (P → Q) ∨ (P → R) Imp

(d) ` (P → (Q&R))↔ ((P → Q) & (P → R))

(P → (Q&R)) ↔ (−P ∨ (Q&R)) Imp↔ ((−P ∨ Q) & (−P ∨ R)) Dist↔ ((P → Q) & (P → R)) Imp(×2)

(e) ` (P ∨ (Q→ R))↔ ((Q→ P ) ∨ (−P → R))

(P ∨ (Q→ R)) ↔ (P ∨ (−Q ∨ R) Imp↔ ((P ∨ P ) ∨ (−Q ∨ R)) Idem↔ ((−Q ∨ P ) ∨ (−− P ∨ R)) Comm, Assoc, DN↔ ((Q→ P ) ∨ (−P → R)) Imp(×2)

Solutions for Chapter 5

Review Exercises on Chapter 5

A

1. Write the complete truth table for each of the following wffs:

(a)

− (P ∨ Q) & − (P & Q)F T T T F F T T TF T T F F T T F FF F T T F T F F TT F F F T T F F F


(b)

− (− (P → Q) & − (Q → P ))T F T T T F F T T TT T T F F F F F T TT F F T T F T T F FT F F T F F F F T F

(c)

− (P & (Q & R))F T T T T TT T F T F FT T F F F TT T F F F FT F F T T TT F F T F FT F F F F TT F F F F F

(d)

P → (Q → R)T T T T TT F T F FT T F T TT T F T FF T T T TF T T F FF T F T TF T F T F

2. Evaluate the following wffs, when P = T and Q = F .

(a)

− (P → Q)T T F F

(b)

− (P ↔ Q)T T F F

(c)

(− P ↔ Q)F T T F


(d)

(− P & − Q) ∨ (P & Q)F T F F T T T T TF T F T F F T F FT F F F T F F F TT F T T F T F F F

3. Use truth tables to characterize the following wffs as tautologous, inconsistent, orcontingent.

(a)

P → P

T T TF T F

Tautology

(b)

− P → P

F T T TT F F F

Contin-

gent

(c)

P → (− P → P )T T F T T TF T T F F F

Tautology

(d)

(P ∨ −P ) → (P & − P )T F F

Contradiction

4. Check the following sequents for validity. Use short-cuts where they are suitable.

(a)

P ∨ Q − Q P

F F F T F F

This is valid, since making the conclusion F forces a premiss to be F.

(b)

P → Q − Q − P

T F F T F F T

Valid


(c)

P → Q P → (P & Q)T F F T F T F F

Valid

(d)

P → Q − P − Q

F T T T F F T

Invalid, since there is a line in the truth table in which the premisses are T and the conclusion F.

B

1. Use truth tables to classify the following wffs as either tautologous, inconsistent, orcontingent.

(a)

P ∨ (Q & R)F F F F FT T T T T

Contingent; to demonstrate this, all we have to do is show that there is at least one line that comes out T, and one that comes out F.

(b)

P → (R & S)T F F FT T T T T

Contingent

(c)

P → (Q → P )T T T T

Tautology

(d)

(P → R) → (P → (P & R))T F F T T F T F F

Tautology, since the only way to make the consequent F makes the antecedent F as well.


(e)

(P → (Q→ P )) → R

T T TT F F

Contingent; we know by (c) that the antecedent is always T, so the value depends on R, which can be T or F.

(f)

(P ↔ Q) ↔ (Q ↔ − P )T T T F T F F TT F F F F T F TF F T F T T T FF T F F F F T F

Contradiction

(g)

P → (R& −R)T F FF T F

Contingent

(h)

(P → R) → − P

T T T F F TF T T T F

Contingent

(i)

(P → (Q & R)) & (P → (Q & − R))F T T F TT T F F F

Contingent

(j)

− (P → Q) & − (Q → P )F T T T F F T T TT T F F F F F T TF F T T F T T F FF F T F F F F T F

Contradiction


2. Use truth tables to check the following sequents for validity.

(a)

P → Q Q → R P → R

T F F F T F T F F

Valid

(b)

P ↔ Q Q ↔ R P ↔ R

T F F F T F T F FF F T T T T F F T

Valid

(c)

P → Q R → Q P → R

T T T F T T T F F

Invalid

(d)

P → Q − Q → P

F T F T F F F

Invalid

(e)

P → Q R → S − Q ∨ − S − P ∨ − R

T T T T T T F T F F T F T F F T

Valid

(f)

P ∨ Q P → R Q → R R

F F F F T F F T F F

Valid

(g)

P → Q P → (P ∨ Q)F T T T T T

Valid, since the conclusion is a theorem.


(h)

P → Q P → (P & Q)T F F T F T F F

Valid

(i)

P ∨ R P → (R ∨ S) − S − P

T T T T T T T F T F F T

Invalid

(j)

P → (M & N) M → R N → S − (R ∨ S) − P ∨ M

T F F F F F T F F T F T F F F F T F F

Valid


Solutions for Chapter 6

Exercise 6.7

1(b) P ∨ QX−P−Q�� @@

The tree closes; therefore the sequent is valid.

1(c) P ∨ QXP → RXQ→ SX−(R ∨ S) X−R−S�� @@

−Q S×�� @@

−P R×�� @@

The tree closes; therefore the sequent is valid.

1(e) P → QX−(P → (P &Q)) X

P−(P &Q) X

�� @@−P×

−Q�� @@

−P×

The tree closes; therefore the sequent is valid.

1(g) P−(P → Q) X

P−Q

The tree is open; therefore the sequent is invalid.

1(j) P → (Q&R) X−QP�� @@

−P×

Q&RXQR×

The tree is closed; therefore the sequent is valid.


Solutions for Chapter 7

Section 7.3

1.(a) ∀x∀y((Ax&Fy)→ Exy)

(b) ∀x∀y(Cy → (Exy →Mx))

OR

∀x∀y((Cy&Exy)→Mx)

(c) ∀x(Ax→ (Sx↔ −V x))

(d) ∀x∀y((Lx&Ty)→ (Exy ↔ Ax))

(e) ∀x(Mx→ −V x)

(f) Tmn

(g) xMx

(h) x(Mx& −Dx)

2.(a) Any mongoose from the Transvaal has a strange diet.

(b) Some cobras and mongooses like to sleep in the sun.

(c) All mongooses and aardvarks like to eat ants.

(d) No cobra or mongoose is from the Transvaal.

(e) a mongoose that doesn’t like to eat any cobras.


Solutions for Chapter 8

Section 8.14

A

1. (a) 1 (1) Nr A2 (2) ∃xNx 1 EI

(b) 1 (1) Apc A1 (2) ∃xApx 1 EI

(c) 1 (1) ∀x(Ux→ Nx) A2 (2) Ur A1 (3) Ur → Nr 1 UE1,2 (4) Nr 2,3 MP

(d) 1 (1) ∀x(Ux→ −Rx) A2 (2) Rt A1 (3) Ut→ −Rt 1 UE1,2 (4) −Ut 2,3 MT

(e) 1 (1) ∀x(Ux→Mx) A2 (2) ∀x(Mx→Wx) A3 (3) Ur A1 (4) Ur →Mr 1 UE2 (5) Mr →Wr 2 UE1,2 (6) Ur →Wr 4,5 HS1,2,3 (7) Wr 3,6 MP

2. (a) 1 (1) Fa&Fb A1 (2) Fa 1 &E1 (3) ∃xFx 2 EI

(b) 1 (1) ∀x(Mx→Wx) A2 (2) Ma&Ra A1 (3) Ma→Wa 1 UE2 (4) Ma 2 &E1,2 (5) Wa 3,4 MP

(c) 1 (1) ∀x(Fx→ Gx) A2 (2) ∃xFx A

(3) a := xFx Df.2 (4) Fa 2,3 EE1 (5) Fa→ Ga 1 UE1,2 (6) Ga 4,5 MP1,2 (7) ∃xGx 6 EI

(d) 1 (1) ∀x(Fx→ −Gx) A2 (2) ∃xGx A

(3) a := xGx Df.2 (4) Ga 2,3 EE1 (5) Fa→ −Ga 1 UE


1,2 (6) −Fa 4,5 MT1,2 (7) ∃x− Fx 6 EI

(e) 1 (1) ∀x− (Fx ∨ Gx) A1 (2) −∃x(Fx ∨ Gx) 1 Duality

B

1. (a) 1 (1) ∀x(Ahx→ −∃yTyx) A2 (2) Ahb A1 (3) Ahb→ −∃yTyb 1 UE1,2 (4) −∃yTyb 2,3 MP1,2 (5) ∀y − Tyb 4 Duality1,2 (6) −Tsb 5 UE

(b) 1 (1) ∀x(Ux→ Nx) A2 (2) Ur&Br A2 (3) Ur 2 &E2 (4) Br 2 &E1 (5) Ur → Nr 1 UE1,2 (6) Nr 3,5 MP1,2 (7) Br&Nr 4,6 &I1,2 (8) ∃x(Bx&Nx) 7 EI

(c) 1 (1) Apc A2 (2) Dc A1,2 (3) Apc&Dc 1,2 &I1,2 (4) ∃x(Apx&Dx) 3 EI

(d) 1 (1) ∀x(Ux→ Nx) A2 (2) ∃x(Bx&Ux) A3 (3) ∀x(Ux→Mx) A

(4) a := x(Bx&Ux) Df.2 (5) Ba&Ua 2,4 EE2 (6) Ba 5 &E2 (7) Ua 5 &E3 (8) Ua→Ma 3 UE2,3 (9) Ma 7,8 MP1 (10) Ua→ Na 1 UE1,2 (11) Na 7,10 MP1,2,3 (12) Ma&Ba&Na 6,9,11 &I1,2,3 (13) ∃x(Mx&Bx&Nx) 12 EI

(e) 1 (1) ∀x(Ux→Mx) &∀x(Lx→Mx) A2 (2) ∀x(Ux→ −Ax) A3 (3) ∃x(Ux&Cxk) A

(4) a := x(Ux&Cxk) Df.3 (5) Ua&Cak 3,4 EE3 (6) Ua 5 &E2 (7) Ua→ −Aa 2 UE


2,3 (8) −Aa 6,7 MP1 (9) ∀x(Ux→Mx) 1 &E1 (10) Ua→Ma 9 UE1,3 (11) Ma 6,10 MP1,2,3 (12) Ma& −Aa 8,11 &I1,2,3 (13) ∃x(Mx& −Ax) 12 EI

2. (a) Sequent 62

i. ∃x(P &Fx) ` P &∃xFx1 (1) ∃x(P &Fx) A

(2) a := x(P &Fx) Df.1 (3) P &Fa 1,2 EE1 (4) P 3 &E1 (5) Fa 3 &E1 (6) ∃xFx 5 EI1 (7) P &∃xFx 4,6 &I

ii. This could be a good exam question!

(b) Sequent 63

i. 1 (1) ∀x(P &Fx) A1 (2) P &Fa 1 UE1 (3) P 2 &E1 (4) Fa 2 &E1 (5) ∀xFx 4 UI1 (6) P &∀xFx 3,5 &I

ii. This could be a good exam question!

(c) Sequent 64

i. 1 (1) ∃x(P ∨ Fx) A(2) a := x(P ∨ Fa) Df.

1 (3) P ∨ Fa 1,2 EE4 (4) P A (vE)4 (5) P ∨ ∃xFx 4 vI6 (6) Fa A (vE)6 (7) ∃xFx 6 EI6 (8) P ∨ ∃xFx 7 vI1 (9) P ∨ ∃xFx 3,4–5,6–8 vE

ii. This could be a really good question for the exam! Hint: it is basically the reverse of the above, with EE’s inside vE subproofs.

(d) Sequent 65

i. 1 (1) ∀x(P ∨ Fx) A1 (2) P ∨ Fa 1 UE3 (3) P A (vE)3 (4) P ∨ ∀xFx 3 vI5 (5) Fa A (vE)5 (6) ∀xFx 5 UI


5 (7) P ∨ ∀xFx 6 vI1 (8) P ∨ ∀xFx 2,3–4,5–7 vE

This proof illustrates a subtle point. It might seem that the use of UI in line 6 is illegitimate, since it is done on the term letter a that appears in the assumption on line 5. The catch is that a was actually introduced on line 2, where it was treated as an entirely arbitrary constant to which line 1 is applied. In other words, a was not actually introduced in an assumption! It is the idea that a is F that is assumed in line 5.

ii. Another possible exam question!
(e) Sequent 66

i. 1 (1) ∃x(P → Fx) A2 (2) P A

(3) a := x(P → Fx) Df.1 (4) P → Fa 1,3 EE1,2 (5) Fa 2,4 MP1,2 (6) ∃xFx 5 EI1 (7) P → ∃xFx 2–6 CP

ii. 1 (1) P → ∃xFx A2 (2) −∃x(P → Fx) A (RAA)2 (3) ∀x− (P → Fx) 2 Duality2 (4) −(P → Fa) 3 UE2 (5) P & − Fa 4 Neg→2 (6) P 5 &E2 (7) −Fa 5 &E2 (8) ∀x− Fx 7 UI2 (9) −∃xFx 8 Duality1,2 (10) ∃xFx 1,6 MP1,2 (11) ∃xFx& − ∃xFx 9,10 &I1 (12) ∃x(P → Fx) 2–11 RAA

(f) Sequent 67

i. 1 (1) ∀x(P → Fx) A2 (2) P A (CP)1 (3) P → Fa 1 UE1,2 (4) Fa 2,3 MP1,2 (5) ∀xFx 4 UI1 (6) P → ∀xFx 2–5 CP

ii. The converse is left as an exercise.
(g) Sequent 68

i. 1 (1) ∃x(Fx → P)        A
   2 (2) ∀xFx              A (CP)
     (3) a := x(Fx → P)    Df.
   1 (4) Fa → P            1,3 EE
   2 (5) Fa                2 UE
   1,2 (6) P               4,5 MP
   1 (7) ∀xFx → P          2–6 CP


ii. 1 (1) ∀xFx→ P A2 (2) −∃x(Fx→ P ) A (RAA)2 (3) ∀x− (Fx→ P ) 2 Duality2 (4) ∀x(Fx& − P ) 3 Neg→2 (5) Fa& − P 4 UE2 (6) Fa 5 &E2 (7) −P 5 &E2 (8) ∀xFx 6 UI1,2 (9) P 1,8 MP1,2 (10) P & − P 7,9 &I1 (11) ∃x(Fx→ P ) 2–10 RAA

3. (a) Sequent 54 is a straightforward application of UI; it is left as an exercise. Sequent 55 is a fairly easy vE; use UE and UI within each subproof.

(b) Prove ∃x(Fx ∨ Gx) a` ∃xFx ∨ ∃xGx
i. 1 (1) ∃x(Fx ∨ Gx) A

(2) a := x(Fx ∨ Gx) Df.1 (3) Fa ∨ Ga 1,2 EE4 (4) Fa A (vE)4 (5) ∃xFx 4 EI4 (6) ∃xFx ∨ ∃xGx 5 vI7 (7) Ga A (vE)7 (8) ∃xGx 7 EI7 (9) ∃xFx ∨ ∃xGx 8 vI1 (10) ∃xFx ∨ ∃xGx 3,4–6,7–9 vE

ii. The converse proof is very similar, except that you do EEs inside each vEsubproof.

Prove ∃x(Fx & Gx) ` ∃xFx & ∃xGx
(Note: this is one that you really should be able to do for the exam.)

1 (1) ∃x(Fx&Gx) A(2) a := x(Fx&Gx) Df.

1 (3) Fa&Ga 1,2 EE1 (4) Fa 3 &E1 (5) Ga 3 &E1 (6) ∃xFx 4 EI1 (7) ∃xGx 5 EI1 (8) ∃xFx&∃xGx 6,7 &I

(c) 1 (1) ∀x(Fx→ Gx) A2 (2) ∀xFx A (CP)1 (3) Fa→ Ga 1 UE2 (4) Fa 2 UE1,2 (5) Ga 3,4 MP1,2 (6) ∀xGx 5 UI1 (7) ∀xFx→ ∀xGx 2–6 CP

Note: a question similar to this could easily be on the exam!


(d) 1 (1) ∃x(Fx→ Gx) A1 (2) ∃x(−Fx ∨ Gx) 1 Imp1 (3) ∃x− Fx ∨ ∃xGx 2 Dist∃ (Sequent 56)1 (4) −∀xFx ∨ ∃xGx 3 Duality1 (5) ∀xFx→ ∃xGx 4 Imp

C

1. Prove ∃xFx a` −∀x− Fx

(a) 1 (1) ∃xFx            A
    2 (2) ∀x−Fx           A (RAA)
      (3) a := xFx        Df.
    1 (4) Fa              1,3 EE
    2 (5) −Fa             2 UE
    1,2 (6) Fa & −Fa      4,5 &I
    1 (7) −∀x−Fx          2–6 RAA

(b) 1 (1) −∀x− Fx A2 (2) −∃xFx A (RAA)3 (3) Fa A (RAA)3 (4) ∃xFx 3 EI2,3 (5) ∃xFx& − ∃xFx 2,4 &I2 (6) −Fa 3–5 RAA2 (7) ∀x− Fx 6 UI1,2 (8) ∀x− Fx& − ∀x− Fx 1,7 &I1 (9) ∃xFx 2–8 RAA

2. Prove ∀x− Fx a` −∃xFx

(a) 1 (1) ∀x− Fx A2 (2) ∃xFx A

(3) a := xFx Df.2 (4) Fa 2,3 EE1 (5) −Fa 1 UE1,2 (6) Fa& − Fa 4,5 &I2 (7) −∀xFx 1–6 RAA1,2 (8) ∀xFx& − ∀xFx 1,7 &I1 (9) −∃xFx 2–8 RAA

Note the double RAA in this elegant proof, whose essential structure is due to Lemmon.

(b) 1 (1) −∃xFx A2 (2) Fa A2 (3) ∃xFx 2 EI1,2 (4) ∃xFx& − ∃xFx 1,3 &I1 (5) −Fa 2–4 RAA1 (6) ∀x− Fx 5 UI

3. Prove ∃x− Fx a` −∀xFx


(a) 1 (1) ∃x− Fx A2 (2) ∀xFx A (RAA)

(3) a := x(−Fx) Df.1 (4) −Fa 1,3 EE2 (5) Fa 2 UE1,2 (6) Fa& − Fa 4,5 &I1 (7) −∀xFx 2–6 RAA

(b) 1 (1) −∀xFx A2 (2) −∃x− Fx A3 (3) −Fa A3 (4) ∃x− Fx 3 EI2,3 (5) ∃x− Fx& − ∃x− Fx 2,4 &I2 (6) Fa 3–5 RAA2 (7) ∀xFx 6 UI1,2 (8) ∀xFx& − ∀xFx 1,7 &I1 (9) ∃x− Fx 2–8 RAA


Solutions for Chapter 9

A

1. (a) p ≠ a

(b) a = s

(c) Aar ∨ Apr
(d) Apr & Apt

2. (a) Plato was not The Stagirite.

(b) The Republic is not the Timaeus.

(c) Plato either did or did not write Republic.

3. (a) 1 (1) Aae          A
       2 (2) a = s        A
       1,2 (3) Ase        1,2 =E

   (b) 1 (1) Aar → (a = p)    A
       2 (2) a ≠ p            A
       1,2 (3) −Aar           1,2 MT

4. (a) 1 (1) Fa & Ga        A
       2 (2) a = b          A
       1,2 (3) Fa & Gb      1,2 =E

   (b) 1 (1) Fa → Ga        A
       2 (2) a = b          A
       1,2 (3) Fb → Ga      1,2 =E

   (c) 1 (1) ∀x(Fax → Gbx)      A
       2 (2) a = b              A
       1,2 (3) ∀x(Fax → Gax)    1,2 =E

   (d) 1 (1) Faa → (a ≠ a)      A
         (2) a = a              =I
       1 (3) −Faa               1,2 MT

B

1. (a) Apr&∀y(Ayr → (y = p))

(b) −(Aar & ∀y(Ayr → (y = a)))
This is equivalent to
−Aar ∨ −∀y(Ayr → (y = a))
and this is equivalent to
−Aar ∨ ∃y(Ayr & (y ≠ a)).

2. (a) The author of the Republic was the author of the Timaeus.

(b) Plato was the author of the Republic and the Timaeus.


3. 1 (1) ∃x(Px & (x = l))         A
     (2) a := x(Px & (x = l))     Df.
   1 (3) Pa & (a = l)             1,2 EE
   1 (4) Pa                       3 &E
   1 (5) a = l                    3 &E
   1 (6) Pl                       4,5 =E

   1 (1) Pl                       A
     (2) l = l                    =I
   1 (3) Pl & (l = l)             1,2 &I
   1 (4) ∃x(Px & (x = l))         3 EI

4. 1 (1) Kp                              A
   2 (2) a ≠ p                           A
   3 (3) ∀x∀y((Kx & Ky) → (x = y))       A
   3 (4) (Kp & Ka) → (p = a)             3 UE (× 2)
   2,3 (5) −(Kp & Ka)                    2,4 MT
   2,3 (6) −Kp ∨ −Ka                     5 deM
   1,2,3 (7) −Ka                         1,6 DS


Bibliography

[1] Benacerraf, Paul, and Putnam, Hilary (eds.) Philosophy of Mathematics: Selected Readings. Englewood Cliffs, NJ: Prentice-Hall, 1964.

[2] Bergmann, M., Moor, J., and Nelson, J. The Logic Book. Second Edition. New York: McGraw Hill, 1990.

[3] Birch, David. Illustrations by Devis Grebu. The King’s Chessboard. New York: Penguin, 1988.

[4] Boolos, George, and Jeffrey, Richard. Computability and Logic. Second Edition. Cambridge: Cambridge University Press, 1980.

[5] Brown, George S. The Laws of Form. London: George Allen and Unwin, 1969.

[6] Copi, I. M., and Cohen, C. Introduction to Logic. Tenth Edition. Upper Saddle River, NJ: Prentice-Hall, 1998.

[7] DeVidi, David, and Solomon, Graham, “On Confusions About Bivalence and Excluded Middle”, Dialogue 38(4), 1999, 785–799.

[8] Govier, T. A Practical Study of Argument. Third Edition. Belmont, CA: Wadsworth, 1992.

[9] Halmos, P. R. Naive Set Theory. New York: Van Nostrand Reinhold, 1960.

[10] Hofstadter, Douglas. Godel, Escher, Bach: An Eternal Golden Braid. New York: Basic Books, 1979.

[11] Hunter, Geoffrey. Metalogic. London: Macmillan, 1971.

[12] Hughes, R. I. G. “Quantum Logic,” Scientific American 245(4) (October 1981), 202–213.

[13] Jacquette, Dale. Symbolic Logic. Belmont, CA: Wadsworth, 2001.

[14] Jeffrey, R. Formal Logic, Its Scope and Limits. Third Edition. New York: McGraw-Hill, 1991.

[15] Klemke, E. Theory of Sets. New York: Dover, 1950.

[16] LeBlanc, J. Thinking Clearly: A Guide to Critical Reasoning. New York: W. W. Norton, 1998.

[17] Lemmon, E. J. Beginning Logic. Indianapolis, IN: Hackett, 1978. First published 1965, Van Nostrand Reinhold.

[18] Mates, Benson. Elementary Logic. Second Edition. New York: Oxford University Press, 1972.

[19] Rucker, Rudy. Infinity and the Mind: The Science and Philosophy of the Infinite. New York: Bantam Books, 1983.

[20] Russell, Bertrand. Introduction to Mathematical Philosophy. London: Allen & Un-win, 1918.

[21] Russell, Bertrand. My Philosophical Development. London: Allen & Unwin, 1959.

[22] Russell, Bertrand, and Whitehead, A. N. Principia Mathematica. Second Edition.Cambridge: Cambridge University Press, 1927.

[23] Schumm, G. F. A Teaching Companion to Lemmon’s Beginning Logic. Indianapolis, IN: Hackett, 1979.

[24] Smullyan, Raymond. What is the Name of This Book? Englewood Cliffs, NJ: Prentice-Hall, 1978.