Frontiers of Logical Theory
© 2011 by Paul Herrick
Let us begin with some definitions. When a huge, overarching new development occurs in an academic field, it is usual to call the older theory the “traditional theory” and to call the new theory “modern.” In the third century B.C., the Stoic logicians probably thought of their theory as “modern logic” and saw the slightly earlier Aristotelian theory as the “traditional” logic that theirs had eclipsed. Today, when logicians use the term traditional logic they usually mean the categorical logic developed by Aristotle and his successors.
The term modern logic is usually reserved for the system developed by Frege, Russell, and their successors—the logic of sentences, operators, predicates, and quantifiers presented here. Modern logic is also called “symbolic logic” because of its extensive use of symbols. It is also called “mathematical logic” due to its close relationship to mathematics. Confusion arises, however, when modern logic is called “classical logic” (with “classical” meaning “standard” rather than “original”) since some logicians call Aristotle’s original system of logic “classical” (meaning “original”).
Beginning early in the twentieth century, a number of revolutionary developments occurred in logical theory, resulting in the construction and development of a large number of post-classical systems of logic, also called “nonclassical systems,” each with its own highly developed formal language, axioms, rules of deduction, and theorems. Some of these systems were intended merely as additions or supplements to classical or standard logic; others were offered as total replacements. Some are downright bizarre (from a commonsense standpoint); others are intriguing, to say the least.
In the previous section we examined one nonclassical system of logic: modal logic. Meant as an addition to classical logic, not a replacement, modal logic was actually pioneered by Aristotle (as we have seen), and significant developments occurred in the Middle Ages as well. However, most of the advanced theoretical developments in the subject occurred only after Saul Kripke formulated a new semantical theory for modal logic in the late 1950s. Modal logic is the most highly developed of the nonclassical systems of logic; it is also the most widely employed system of logic in contemporary philosophy.
The rest of this section introduces several other nonclassical logics, each born in the twentieth century. This “cook’s tour” of prominent nonclassical systems is not meant to be complete or in-depth. You will get a taste of each system and no more. But hopefully it will be taste enough to whet your appetite if you wish to go beyond classical logic to what some have called the “frontiers” of the subject.
One axiom, or principle, common to all systems of classical logic is the law of noncontradiction, a principle (as you know) first formulated by the founder of logic, Aristotle himself, at the dawn of the subject. We saw that in truth-functional logic, a self-contradiction (a statement of the form P &~P) logically implies any and every sentence and that if a natural deduction proof begins with a self-contradiction, then any and every sentence can be derived using the rules of inference. A sentence is called “explosive” if its truth implies the truth of any and every sentence. Thus, in classical truth-functional logic, a self-contradiction is an “explosive sentence.” Here is the argument:
Let P and Q be any declarative sentences.
- 1. P & ~P
- 2. P (Simp 1)
- 3. ~P (Simp 1)
- 4. P v Q (Add 2)
- 5. Q (DS 3, 4)
Let us call this the “explosion argument.” Classical logic is an explosive system in the following sense: given the way validity, implication, and the rules of inference are defined in classical logic, a self-contradiction is explosive: it implies the truth of any and every sentence, and every sentence can be derived from it. (But contradictions are not the only sentences that are explosive. Consider the following sentence: “Every sentence is true.” Can you explain why this noncontradictory sentence is also explosive?)
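The explosion argument can also be checked mechanically. Here is a minimal Python sketch (the function and variable names are my own) that verifies, by brute-force truth tables, that a contradiction classically entails any sentence whatsoever:

```python
from itertools import product

def entails(premise, conclusion):
    """Classically, A entails B iff no assignment of truth values
    makes A true and B false."""
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if premise(p, q))

# The premise P & ~P is true in no row, so the entailment holds
# vacuously, whatever sentence we pick as the conclusion:
contradiction = lambda p, q: p and (not p)
print(entails(contradiction, lambda p, q: q))      # True: P & ~P entails Q
print(entails(contradiction, lambda p, q: not q))  # True: and also ~Q
```

Because no row of the truth table makes P & ~P true, the check succeeds no matter what conclusion is supplied; this is the semantic face of explosion.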
One problem with any explosive sentence is that it logically implies triviality, i.e., the condition wherein every sentence is true. For obvious reasons, few logicians are eager to embrace trivialism.
A paraconsistent system of logic is one that is not explosive. Thus, in paraconsistent (“inconsistency tolerant”) logic, a contradiction does not imply the truth of any and every sentence. The rules of inference are such that not every statement logically follows from a contradiction, and it is not the case that any sentence can be proved true using the rules of inference of the system in a proof that starts with a contradiction. Paraconsistent logic thus gives up the classical “principle of explosion.” The motivation behind paraconsistent logic is the feeling that it ought to be possible to reason in a logical, discriminating way, starting with a contradictory claim or theory, without ending up in triviality. Paraconsistent systems of logic thus allow the possibility of contradictory theories that do not reduce to triviality.
However, problems abound. For instance, any paraconsistent logic must reject as invalid at least one classical inference rule, even though each rule seems obviously and certainly valid. To nullify the explosion argument, for example, a paraconsistent logic must give up disjunctive syllogism, addition, or simplification. But each of these rules seems clearly valid and can be proved valid by a truth table.
Dialetheism is the philosophical claim that at least some contradictions are true. (Dialetheism is a claim, not a formal system of nonclassical logic.) A dialetheia, if one were to exist, would be “a true contradiction.” Thus, according to dialetheism, for at least one sentence P, both P and ~P are true at the same time! Dialetheism therefore rejects the classic “law of noncontradiction”—a principle that is foundational for all of classical logic.
Since classical logic is explosive, if one accepts dialetheism and classical logic, then one is logically committed to trivialism. The dialetheic classicist must be a trivialist. The dialetheist who wants to avoid trivialism (and thus to maintain some degree of reasonableness) must therefore reject classical logic and adopt in its place a paraconsistent logic (since paraconsistent logic is not explosive). Few dialetheists are trivialists. Most avoid it by using a paraconsistent logic. However, although most dialetheists are also paraconsistentists (those who adopt a paraconsistent logic), not all paraconsistentists are dialetheists, for not all paraconsistentists believe in the existence of dialetheia, and paraconsistent logic does not commit one to their existence.
Thus, there are dialetheic paraconsistentists and anti-dialetheic paraconsistentists.
A dialetheist who also accepts classical logic must maintain that all contradictions are true (not surprisingly, for as a trivialist, he maintains that all sentences are true). In contrast, the dialetheic paraconsistentist maintains only that some contradictions are true (for his paraconsistent logic allows him to avoid the slippery and explosive slope down to trivialism). Why anyone in their right mind, let alone a logician, would want to say that one contradiction is true but that another one is false, is a complicated question we shall not enter here. Perhaps it has no rational answer.
The Liar Paradox: An Argument for Dialetheia?
If trivialism is true, then of course infinitely many true contradictions exist. However, short of trivialism, do at least some dialetheia exist? One argument for the existence of dialetheia rests on an analysis of the famous “Liar Paradox”—a conundrum first raised and debated in ancient Greece as early as the fifth century B.C. The Megarian logician Eubulides of Miletus stated the paradox in these words: “A man says that he is lying; is what he says true or false?” But it is more helpful to think about the issue in purely sentential terms. Consider the following sentence, labeled S:
S: This sentence is false.
Is sentence S true, or is it false? If S is true, then it is false. If S is false, then it is true. Either way we turn, we seem to have a contradiction on our hands! The sentence appears to be declarative, yet every declarative sentence, it would seem, is either true or false. What are we to make of a sentence like this, a sentence asserting its own falsity? (Such sentences are said to be “self-referential” since they refer to themselves.) Most logicians have rejected, as somehow logically illegitimate, self-referential sentences like S, on the grounds that self-contradictions and other paradoxes would result if we accept them (and self-contradictions are impossible).
The dialetheist claims to have a perfectly reasonable solution to the Liar Paradox, a solution that takes the Liar sentence seriously and does not reject it as ill-formed or logically illegitimate. What’s wrong, asks the dialetheist, with simply “biting the bullet” and calling this sentence both true and false at the same time? Thus, although it may be a bitter pill to swallow, dialetheia exist!
Of course, classical logicians, taking their stand on the law of noncontradiction, reply that there is plenty wrong with accepting the existence of even one dialetheia. If we grant the existence of even one self-contradiction, then we are logically committed to trivialism, since (a) the standard inference rules are all clearly logically valid and cannot reasonably be rejected, and (b) using the standard inference rules every sentence follows from a self-contradiction. But trivialism is absurd. Therefore, dialetheism lands us in absurdity.
Dialetheists attempt to rebut this argument, of course; they do so by arguing that if we refuse to acknowledge the existence of dialetheia, then we generate other logical paradoxes, and these are even worse! Thus the battle between the dialetheists and the anti-dialetheists shifts to a weighing of paradoxes. Which side has the least bad paradoxes? Which side has the most palatable paradoxical consequences? The arguments pro and con are intriguing, to be sure, but going down this road any further would take us well beyond an introduction to logical theory. Classical logicians shudder at the thought of paraconsistent systems of logic, and the prospect of a dialetheia gives them nightmares. The current “high priest” of dialetheism is the philosopher Graham Priest (University of Melbourne).
Fuzzy logic drops the classical law of bivalence (the principle that there are only two truth-values, truth and falsity) and allows an infinity of truth-values between 0 and 1. Suppose Pete is 80 and Fred is 90 and someone says, “Pete is old,” and someone else says, “Fred is old.” In fuzzy logic, one of these sentences will be assigned a higher degree of truth than the other. Suppose Pat has belonged to the club for 10 years and Joe just joined yesterday. In fuzzy logic, the sentence “Pat belongs to the club” may be considered truer than the sentence “Joe belongs to the club.” There are many things one can do in fuzzy logic. For example, once truth-values between 0 and 1 are assigned to individual sentences, fuzzy truth-functions can be defined over them, fuzzy truth-tables can be constructed, and so on. The motivation for fuzzy systems of logic is a desire to reason rigorously with vague terms.
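To make the idea of fuzzy truth-functions concrete, here is a minimal Python sketch using the common min/max/complement definitions of the connectives (the particular degree assignments are my own illustration, not taken from the text):

```python
# Common fuzzy connectives: a truth value is any real in [0, 1].
# Negation is the complement; "and" is minimum; "or" is maximum.
def f_not(x): return 1.0 - x
def f_and(x, y): return min(x, y)
def f_or(x, y): return max(x, y)

pete_is_old = 0.7   # Pete is 80: "Pete is old" gets a fairly high degree
fred_is_old = 0.9   # Fred is 90: a higher degree of truth still

print(f_and(pete_is_old, fred_is_old))   # 0.7
print(f_or(pete_is_old, fred_is_old))    # 0.9
```

With these definitions, a fuzzy truth-table is just the same computation run over a grid of input degrees rather than over {true, false}.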
Fuzzy logic is one of a large number of “many-valued logics”—logical systems that have more than two truth-values. Scholars are divided on the issue, but Aristotle actually may be the founder of many-valued logic. In contemplating the logic of “future contingents”—statements about contingent future events—he said things that suggest he believed that statements about contingent future events are neither true nor false before the event occurs (or does not occur) and only become true or false when the event occurs (or does not occur). If so, then there would seem to be at least three truth-values for declarative sentences: true, false, and indeterminate. Thus, in Chapter 9 of De Interpretatione, Aristotle considered the sentence, “A sea battle will occur tomorrow.” Is the sentence true or false? (Notice that to ask this is not the same as asking, Does anyone know if the sentence is true or false?) Aristotle considered this option: The sentence is true today if indeed a sea battle will occur tomorrow, and the sentence is false today if indeed a sea battle will not occur tomorrow.
There are problems with this option, argued Aristotle. For one thing, it seems to imply fatalism, the claim that the future course of events is fixed ahead of time, so that nobody can ever do anything that would affect the (already fixed) course of future events, in which case, it would seem, nobody has free will. For example, suppose it is true right now, although I do not know it, that I will eat a burrito for lunch tomorrow. It would seem to follow that tomorrow, when I eat the burrito, I must eat the burrito; I can do nothing else but eat the burrito, since the truth cannot be changed. But it is part of our moral common sense that if someone does something at a moment in time, and they could not have done anything else, then they do not act of their own free will and they are not responsible for what they do.
Thus it seems that if future contingents are right now either true or false, then nobody has free will. Aristotle apparently favored a different solution: Statements about future events are neither true nor false before the future events occur or do not occur; they only become true (if the event occurs) or false (if it does not occur). If “future contingents” are neither true nor false ahead of time, then perhaps there is a third truth-value, one that holds while a proposition is “waiting” to become true or false: “undetermined.”
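A three-valued scheme of this kind can be sketched in a few lines of Python. The version below uses the strong Kleene connectives, with 0.5 playing the role of “undetermined”; this is one standard way to model the idea, not Aristotle’s own system, and the example sentence is mine:

```python
# Strong Kleene three-valued connectives, with 0.5 standing for
# "undetermined" -- the value a future contingent holds while
# waiting to become true (1.0) or false (0.0).
T, U, F = 1.0, 0.5, 0.0

def k_not(x): return 1.0 - x
def k_and(x, y): return min(x, y)
def k_or(x, y): return max(x, y)

sea_battle = U   # "A sea battle will occur tomorrow"
# Note: even the classical tautology P v ~P comes out undetermined:
print(k_or(sea_battle, k_not(sea_battle)))   # 0.5
```

The last line illustrates the price of the third value: excluded middle, a law of classical logic, is no longer guaranteed, which is exactly the kind of revision Aristotle’s sea-battle discussion seems to invite.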
Recall the “paradoxes of strict implication”: A necessarily false proposition implies any and every proposition, and a necessary truth is implied by any and every proposition:
- ☐~P → (P → Q)
- ☐Q → (P → Q)
Recall the paradoxes associated with the truth-table for the horseshoe, the “paradoxes of material implication”: A conditional (“If-then sentence”) is true whenever its antecedent is false, and it is also true whenever its consequent is true, even when the antecedent and consequent have nothing whatsoever to do with each other. In symbols:
- ~P ⊃ (P ⊃ Q)
- Q ⊃ (P ⊃ Q)
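Both schemas can be confirmed to be tautologies by enumerating the truth table, as in this short Python sketch (the names are my own):

```python
from itertools import product

def implies(a, b):
    # Material conditional: false only when a is true and b is false.
    return (not a) or b

rows = list(product([True, False], repeat=2))
# A false antecedent makes the conditional true: ~P implies (P implies Q).
false_antecedent = all(implies(not p, implies(p, q)) for p, q in rows)
# A true consequent makes the conditional true: Q implies (P implies Q).
true_consequent = all(implies(q, implies(p, q)) for p, q in rows)
print(false_antecedent, true_consequent)   # True True
```

Nothing in the computation asks whether P and Q have anything to do with each other; that is precisely what strikes many as paradoxical.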
Classical logicians are bothered by the paradoxes of both strict and material implication; the paradoxes are an integral part of classical logic, yet they seem counterintuitive. A system of relevance logic banishes these paradoxes by redefining validity and by formulating a new semantics for the “if, then” operator in such a way that the paradoxes no longer arise in the system.
Roughly, in relevance logic, in order for a conclusion to follow from a set of premises, the premises must have something to do with the conclusion; they must be “relevant” to the conclusion. And in order for a conditional sentence to be true, the antecedent must be relevant to the consequent; it must have something to do with the consequent. However, formulating such a system is not as easy as it sounds. The challenge lies in defining relevance in a rigorous and systematic way, one suitable for a system of logic. This is what relevance logic attempts to do. It is also what relevance logic has so far not succeeded in doing: nobody has yet formulated a relevance-based definition of validity and implication that (a) avoids the paradoxes of implication and (b) all logicians find acceptable.
In addition, relevance logic faces major internal problems of its own. It turns out that although the new, relevance-based definitions of validity and implication remove the classical paradoxes from logical theory, they generate new paradoxes of their own, paradoxical consequences that follow from the new (“relevance”) definitions and bedevil relevance logic itself, paradoxes that are worse, classical logicians argue, than the classical ones.
Recall the counterintuitive nature of the truth-table for the horseshoe. If P and Q are both false, then P ⊃ Q is true; if P is false and Q is true, then P ⊃ Q is true; and so on. Many students find the truth-table for the horseshoe to be illogical. It certainly does not correspond to everyday English usage. Conditional logics reject the truth-table for material implication (the horseshoe) and try to capture the truth conditions for “if, then” statements in a different way—in a way that avoids counterintuitive results. There are many different theories of the conditional, each proposing a different semantical theory for the conditional sentence, and some of these “conditional logics” are very, very complex indeed. But here’s the truly fascinating thing about the different systems that have been created in an attempt to capture the logic of the conditional: All this immensely complicated logical machinery has to do with the meaning of one little word, the word if.
Epistemic logic attempts to capture, in formal terms, the logic of knowledge and belief. By treating “knows that” and “believes that” as dyadic predicates, and by representing propositions as objects of belief and knowledge, a whole system of logical relations governing belief and knowledge can be rigorously specified. Some logicians claim the resulting system illuminates the nature of human inquiry. For a taste of this nonclassical logic, let “Bap” abbreviate “agent a believes proposition p,” and let “Kap” abbreviate “agent a knows that proposition p is true.” The following would seem to be reasonable axiomatic starting points for any system of epistemic logic. First:
Kap → Bap
This says, “If agent a knows that p, then a believes that p.” Surely to know something is also to believe it. Knowledge presupposes belief. However, the converse would surely not be true:
Bap → Kap
This says, “If agent a believes that p, then a knows that p.” Surely belief does not imply knowledge. Someone could believe that p without knowing that p (for instance, if p were to be a false proposition). That someone believes that p does not imply that he knows that p.
Next, the following would also seem to be an axiom of any sound epistemic logic:
Kap → p
This says, “If a knows that p, then p is true.” So far, so good. But difficult questions arise when formulating a logic of knowledge and belief. For instance, should the following formula be an axiom?
Kap → KaKap
This says, “If a knows that p, then a knows that a knows that p.” Does knowing that p imply knowing that one knows that p? Epistemic logic is one of the more interesting of the nonclassical systems.
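One standard way to give these axioms semantic content is a possible-worlds model of knowledge, in which Kap holds at a world w just in case p holds at every world the agent cannot distinguish from w. The tiny Python sketch below illustrates the idea; the model, worlds, and names are my own illustrative assumptions, not part of the text:

```python
# A toy possible-worlds model of Kap: a knows that p at world w iff
# p holds at every world a cannot distinguish from w.
# access[w] lists the worlds indistinguishable from w (including w itself).
access = {"w1": {"w1", "w2"}, "w2": {"w1", "w2"}, "w3": {"w3"}}
p_holds = {"w1": True, "w2": True, "w3": False}

def K(w):
    """Kap evaluated at world w."""
    return all(p_holds[v] for v in access[w])

print(K("w1"))   # True: p holds at both worlds a confuses with w1
print(K("w3"))   # False: p fails at w3 itself
# Factivity (Kap -> p) holds because each world can access itself:
print(all((not K(w)) or p_holds[w] for w in access))   # True
```

In models of this kind, the axiom Kap → p comes out valid whenever the indistinguishability relation is reflexive, and the controversial Kap → KaKap corresponds to the further assumption that the relation is transitive.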
Logic is so abstract it sometimes just does not know what to do with real existence. Aristotle’s categorical logic was predicated on the assumption that the subject terms of all categorical statements must refer to existing objects; empty terms are not allowed. Modern categorical logic drops Aristotle’s “existential assumption” partway: the subject terms of universal categorical statements may be empty terms, but the subject terms of particular categoricals must, of course, refer to existing objects. The semantics of modern predicate logic contains a different plank regarding existence: it is assumed that at least one thing exists. These existential assumptions are all made in order to prevent paradoxical consequences that would otherwise arise.
Free logic gets its name from the fact that it goes further than classical logic in “freeing” itself from assumptions about the existence of things. In free logic, singular terms do not have to refer to really existing things in the domain of quantification. The machinery of the system can handle “Santa Claus has a white beard” as easily as it handles “The President is a Democrat.” Universally free logic goes one step further and drops the classical assumption that the domain of quantification contains at least one existing member. With a completely empty domain, universally free logics take the final step to existence-independent reasoning. What that means, and the changes in logical theory entailed by the loss of these existence assumptions, are surprising indeed.
Deontic logic is the logic of obligation, permission, and related concepts. This is not morally obligatory, but for a taste of this nonclassical branch of our subject, let us examine a couple axioms for deontic logic proposed by Nicholas Rescher.
Let O(p/c) abbreviate “Action p is obligatory in circumstance c.”
Let F(p/c) abbreviate “Action p is forbidden in circumstance c.”
Let P(p/c) abbreviate “Action p is permitted in circumstance c.”
For example, drinking water is permitted in a public park, taking someone’s life for sport is forbidden in any circumstance, and paying one’s debt is obligatory if one has borrowed money. Among the axioms Rescher proposes for correct deontic inference we find the following:
P( p v ~p/c)
This says, “In any circumstance, either an action p or its negation is permitted.” For example, in any circumstance, either it is permitted that I sing or it is permitted that I not sing. This sounds reasonable.
P(p v q/c) ⊃ P(p/c) v P(q/c)
This says, “If the action p-or-q is permitted in circumstance c, then either p is permitted in c or q is permitted in c.”
(p ⊃ q) ⊃ [P(p/c) ⊃ P(q/c)]
This says, “If p implies q, then if p is permitted in c, then q is permitted in c.” For instance, if saying that I will pay back the loan implies that I have made a promise, then if it is permitted that I say I will pay back the loan, then it is permitted that I make a promise.
The axioms may seem trivial, but the theorems derived (from the complete set) are not trivial at all. Recall that some of Euclid’s five axioms for geometry also seem quite trivial, for instance this one: All right angles are equal to each other. Or this one: Through any two points only one line can be drawn. However, the theorems Euclid proves, on the basis of just five axioms, are anything but obvious. Deontic logic: recommended, but not morally required.
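The first two axioms can be given a simple model and checked mechanically. The semantics sketched below is my own illustrative assumption (not Rescher’s): a circumstance fixes a nonempty set of permitted outcomes, and an action-sentence counts as permitted iff it is true in at least one permitted outcome.

```python
# Toy permission semantics (an illustrative assumption, not Rescher's):
# a circumstance fixes a nonempty set of permitted outcomes; an
# action-sentence is permitted iff true in some permitted outcome.
permitted = [{"p": True, "q": False}, {"p": False, "q": False}]

def P(formula):
    """P(formula/c) for the fixed circumstance modeled above."""
    return any(formula(o) for o in permitted)

# Axiom P(p v ~p / c): holds whenever some outcome is permitted at all.
print(P(lambda o: o["p"] or not o["p"]))   # True
# Axiom P(p v q / c) -> P(p/c) v P(q/c):
p_or_q = P(lambda o: o["p"] or o["q"])
print((not p_or_q) or P(lambda o: o["p"]) or P(lambda o: o["q"]))   # True
```

Under this semantics both axioms come out valid: a disjunctive action can only be true in a permitted outcome if one of its disjuncts is.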
We have time for one more nonclassical system of logic. We already met the Megarian logician Diodorus Cronus (ca. 340–280 B.C.), one of the first to propose a theory of the conditional. Diodorus was also one of the first logicians to explore the relationship between time, necessity, and possibility and thus to bring time and modality into one logical framework. In the twentieth century, inspired by the work of this ancient logician, Arthur Prior brought the modern tools of logic to bear on the area, and the result was tense logic: the logical theory of statements about time.
Let Gp abbreviate “It is going to be the case that p.”
Let Wp abbreviate “It was the case that p.”
Let Hp abbreviate “It has always been the case that p.”
Let Ap abbreviate “It always will be the case that p.”
Among the candidates proposed by Prior as axioms for this new field of logic we find the following:
Ap ⊃ Gp
If it will always be the case that p, then it is going to be the case that p.
A(p ⊃ q) ⊃ (Ap ⊃ Aq)
If p will always imply q, then if p always will be the case, then q always will be the case.
~Gp ⊃ G~Gp
If it will never be the case that p, then it is going to be the case that it will never be the case that p.
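The tense operators can be given a concrete reading over a discrete timeline, and the first axiom checked directly. The Python sketch below is an illustrative model of my own (the timeline and the chosen present moment are assumptions, not from the text):

```python
# Evaluating the tense operators over a finite, discrete timeline.
# timeline[t] is the truth value of the atomic sentence p at time t.
timeline = [False, True, True, True, True]
now = 1   # index of the present moment (illustrative)

def G(t): return any(timeline[u] for u in range(t + 1, len(timeline)))  # going to be
def A(t): return all(timeline[u] for u in range(t + 1, len(timeline)))  # always will be
def W(t): return any(timeline[u] for u in range(t))                     # was
def H(t): return all(timeline[u] for u in range(t))                     # has always been

# Axiom Ap -> Gp, checked at the present moment. (On a finite timeline
# it can fail vacuously at the very last moment, when no later times
# exist -- one reason formal tense logic assumes time has no end.)
print((not A(now)) or G(now))   # True
```

Full tense-logic semantics evaluates such formulas at every moment of an unbounded ordering of times, but even this toy model conveys how the operators quantify over past and future moments.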
Again, the axioms may seem trivial, but the theorems derived on the basis of the axioms are anything but. This concludes our tour of some of the many interesting systems on the frontiers of logic grouped under the heading “nonclassical logic.”