## Logics: More Than One Way to Skin a Cat…

Interview by Richard Marshall.

Heinrich Wansing is a philosopher of philosophical logic, modal logic, non-classical logic and epistemology. Here he discusses why logic is important to philosophy, what kinds of logic he’s interested in, the work of Gentzen and Belnap, classical and non-classical logics, why it makes sense to talk about a correct logic, what proofs are, the proof-theoretic understanding of validity, intuitionistic logic, Dummett and Kripke, paraconsistent logic, dialetheism, Nelson’s paraconsistent logic and computer science, and the connexive logic of Aristotle and Boethius.

**3:AM:** What made you become a philosopher?

**Heinrich Wansing:** As a philosopher, in trying to answer this question I am tempted to address the question “What is a philosopher?” However, I suspect that for present purposes it is better to avoid that question and to take an understanding of “philosopher” for granted. What I can say is that after grammar school, I decided to study philosophy although I had no crystal clear conception of what exactly philosophy is all about. In 1983 I enrolled at the University of Düsseldorf and soon realized that the more interesting characters among the professors in philosophy taught certain exacting and sophisticated subjects, including the philosophy of language, epistemology, and logic. Thus, it was certain people, including Michael Sukale, Hartmut Brands, Axel Bühler, and Gabriel Falkenberg, who set me on track. At a certain point I decided to follow my intellectual interests, to become a philosopher, and to concentrate on philosophical logic. After two years in Düsseldorf, I moved to West-Berlin, where I had a great time and studied philosophy at the Free University, FU. I started attending seminars by David Pearce, who used Neil Tennant’s *Natural Logic* as an introductory logic text, who got me interested in modal logic, and who brought the Lvov-Warsaw school of anti-irrationalism and logic to my attention. Kazimierz Ajdukiewicz’s “Die syntaktische Konnexität” aroused my interest in categorial grammar and the idea of syntactical parsing as logical deduction, themes that led me to what is now called “substructural logic”.

At that time David Pearce became interested in constructive logics with strong negation, and since then an important part of my research has been dealing with the concept of negation. I benefited a lot from David’s international connections to leading philosophers and logicians, from Poland, Sweden, the Netherlands, and other countries. Moreover, David established a co-operation with Wolfgang Rautenberg at the mathematics department of FU, who had some bright students, including Marcus Kracht and Frank Wolter. After completing my master’s thesis, I went to Amsterdam, where Johan van Benthem became a co-supervisor of my doctoral thesis about systems of substructural logic with strong negation. Studying in Amsterdam meant studying modal logic, and I got interested in proof systems for modal logics, realizing that this at that time underdeveloped field called for generalizations of Gerhard Gentzen’s sequent calculus.

I remember a train ride to Konstanz in 1990 during which Peter Schroeder-Heister mentioned Nuel Belnap’s display logic to me. I read Belnap’s seminal paper, visited him in Pittsburgh, and invited him to a workshop in Germany. Eventually, I modified his treatment of modal operators in display logic, which turned out to get several people interested in the modal display calculus. In 1991 I took up a one-year position at the Free University of Berlin and have been working as a philosopher since then. So, what made me a philosopher? In hindsight I would say that it was my interest in linguistic meaning and in capturing it in terms of semantical models as well as formal proof systems.

**3:AM:** Well now you’re an expert in logic. Can you say why logic is so important to philosophy and in particular, what kinds of logic you are interested in?

**HW:** What is, among other things, fascinating about logic is that it is at home in more than one discipline. There is mathematical logic, philosophical logic, logic in computer science, and logic in certain areas of linguistics, including natural language semantics. I am neither an expert in mathematical or computational logic nor am I a linguist, but I have the privilege to co-operate with colleagues from mathematics, linguistics and computer science. Logic is a central discipline in general, and it is so for several reasons. Why is it important to philosophy? First of all, philosophical logic is part of philosophy. The notions of consequence, inference, validity, proof, truth, and falsity are fundamental notions of intrinsic philosophical interest. And so are the notions logicians try to capture by connectives and quantifiers in formal languages: negation (“not”), conjunction (“and”), disjunction (“or”), implication (“if …, then …”), universal quantification (“for all”), particular quantification (“for some”), etc.

I am interested in modal logics, in non-classical logics that can be used to model information processing, and in combinations of such systems. Among the modal notions are concepts of immense philosophical relevance, such as necessity, possibility, conceivability, knowledge, belief, obligation, permission, and agency. Information processing inevitably leads one to consider systems of non-classical logic. Information, understood as propositional content, something like Fregean thoughts, can be lacking or overabundant. It may happen that neither the information that p is available nor the information that ~p is available, where p is an atomic proposition and ~p, the negation of p, expresses that p is false. It may also happen that both the information that p and the information that ~p are available. It may even happen that the information that both of the latter scenarios arise is available. Such situations can be encoded by assigning a so-called “higher-order truth value” to p. One may wonder which higher-order truth value then emerges for more complex compound propositions. In any case there is no reason to assume that the information that p and ~p provides any information whatsoever in the sense of entailing each and every proposition, or that the information that p or ~p is always available.

A basic logic for information processing is the so-called first-degree entailment logic, FDE. The system FDE is paraconsistent: it violates the classically valid principle ex contradictione sequitur quodlibet. First-degree entailment logic is also paracomplete insofar as the law of the excluded third does not hold, thereby accounting for incomplete information. However, FDE lacks a genuine implication operation, usually denoted as “→”, and if one expands FDE by the constructive implication of intuitionistic logic, one ends up with a constructive paraconsistent logic that was suggested by David Nelson (together with a co-author of Nelson’s who is often left unmentioned, Ahmad Almukdad).
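These properties of FDE can be checked mechanically with its four-valued tables. Here is a minimal sketch; the encoding of the four Belnap-Dunn values as pairs (told-true, told-false) is standard, while the function names are merely illustrative:

```python
from itertools import product

# The four Belnap-Dunn values as pairs (told_true, told_false):
# T = (True, False), F = (False, True), B = (True, True), N = (False, False)
VALUES = [(t, f) for t in (True, False) for f in (True, False)]

def neg(a):     t, f = a; return (f, t)
def conj(a, b): return (a[0] and b[0], a[1] or b[1])
def disj(a, b): return (a[0] or b[0], a[1] and b[1])

def entails(premise, conclusion, arity):
    # FDE entailment: under every assignment of values to the atoms,
    # if the premise is told true, so is the conclusion.
    return all(conclusion(*vs)[0]
               for vs in product(VALUES, repeat=arity)
               if premise(*vs)[0])

# conjunction elimination holds:
print(entails(lambda a, b: conj(a, b), lambda a, b: a, 2))        # True
# ex contradictione sequitur quodlibet fails (paraconsistency):
print(entails(lambda a, q: conj(a, neg(a)), lambda a, q: q, 2))   # False
# the law of the excluded third fails (paracompleteness):
print(entails(lambda a, b: a, lambda a, b: disj(b, neg(b)), 2))   # False
```

With A assigned the value B = (True, True), the contradiction A&~A is told true, yet an unrelated q can remain told neither true nor false, which is why quodlibet fails.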

Secondly, logic is of great instrumental value; it helps us to clarify conceptual relationships and recognize consequences of assumptions we are operating upon. Unfortunately, even nowadays philosophers do not always define the notions they are investigating, which makes it difficult to apply logic.

**3:AM:** Two names, “Gentzen” and “Belnap”, appear large in your work and I think the general reader will not be aware of them. So to start with can you introduce us to Gentzen. Can you say why he’s important to logic and your work?

**HW:** Gerhard Gentzen was a German mathematician and logician. He was born in 1909 and defended his PhD thesis in 1933 at the University of Göttingen. Gentzen was a student of Paul Bernays, but when Bernays was forbidden to teach by the Nazi authorities, Hermann Weyl formally acted as Gentzen’s thesis adviser. In 1935 Gentzen became an assistant to David Hilbert, and in 1943, Gentzen took up a position at the German University in Prague. Shortly after World War II, in August 1945, Gerhard Gentzen died of malnutrition in prison at Charles Square in Prague.

Gentzen is often said to have been a genius, and his work on the foundations of mathematics and proof theory has indeed been pathbreaking. He is known for his proof of the consistency of Peano Arithmetic, but within philosophy and logic, he is especially famous for his invention of two kinds of formal proof systems: natural deduction and the sequent calculus. A brief passage from the published version of his thesis is often presented as suggesting an operational, rule-based account of the meaning of the logical operations, giving rise to what is now called “proof-theoretic semantics”. Whereas systems of natural deduction contain rules that manipulate formulas, Gentzen’s sequent calculus rules manipulate derivability statements. Gentzen showed that for classical and so-called intuitionistic logic, a certain rule that expresses the use of lemmata in inferences, the cut rule, although it cannot be derived, need not be postulated as primitive. This is the content of his celebrated Hauptsatz (main theorem). His proof that applications of the cut rule can be eliminated from inferences has several important applications, for example, as a means of showing consistency. All this is important to my work because I have explored to what extent Gentzen’s sequent calculi can be generalized to cover modal logics. Moreover, I took up work by the German philosopher Franz von Kutschera to show that all logical connectives that can be captured in certain Gentzen-style sequent calculi for logics with strong negation can be defined from a given small set of such connectives.

**3:AM:** And Belnap, what makes him significant?

**HW:** Nuel D. Belnap is, in my opinion, among the most important philosophers and logicians of the 20th century. He was born in 1930 and from the 1950s onwards, Alan Ross Anderson and Nuel Belnap at the University of Pittsburgh developed systems of so-called “relevance logic”. Relevance logic is a branch of non-classical logic to which subsequently several influential logicians have made contributions, including Robert K. Meyer, Richard Routley (later Sylvan), J. Michael Dunn, Larisa Maksimova, Alastair Urquhart, Kit Fine, Graham Priest, Edwin Mares, and Greg Restall. A guiding idea of relevance logic is that in a valid implication (A → B), there should be some kind of relevance connection between A and B. This has led to the so-called variable-sharing requirement, saying that in a valid implication (A→ B), the formulas A and B should share some syntactic component; in propositional logic they should share at least one propositional variable. The aforementioned first-degree entailment logic, FDE, is due to Michael Dunn and Nuel Belnap. It is an implication-free relevance logic and has been presented by Belnap as a “useful four-valued logic” of “how a computer should think”.
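The variable-sharing requirement just described is a purely syntactic necessary condition, so it is easy to sketch as code. The naive tokenizer below, which simply treats lowercase identifiers as propositional variables, is my own simplification:

```python
import re

def variables(formula):
    # naive tokenizer: any lowercase identifier counts as a
    # propositional variable (an illustrative simplification)
    return set(re.findall(r"[a-z]\w*", formula))

def shares_variable(antecedent, consequent):
    # the variable-sharing requirement: a relevantly valid
    # implication A -> B must share at least one variable
    return bool(variables(antecedent) & variables(consequent))

# classically valid "p & ~p -> q" fails the requirement:
print(shares_variable("p & ~p", "q"))   # False
# "p & q -> p" passes the (merely necessary) condition:
print(shares_variable("p & q", "p"))    # True
```

The check is only necessary, not sufficient: an implication may share a variable and still fail to be a theorem of a relevance logic.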

Nuel Belnap’s work has been very significant for my own research. I have, for example, just edited a special issue of the journal *Studia Logica* on “40 Years of FDE” together with Hitoshi Omori, and Hitoshi and I are editing a volume titled *New Essays on Belnap-Dunn Logic*. The journal special issue contains, among other contributions, a joint paper written by Sergei Odintsov in Novosibirsk and myself on FDE-based modal logics, and Sergei and I are about to work on a bilateral German/Russian research project on exactly that topic. [By the way, this co-operation exemplifies what I said in reply to your first question. Sergei is a mathematical logician working at the Sobolev Institute of Mathematics of the Russian Academy of Sciences, whereas I am a philosophical logician working in a philosophy department.] I already mentioned Nuel Belnap’s work on a certain generalization of Gentzen’s sequent calculus, but there is yet another Belnapian achievement that inspired me a lot, namely the seeing-to-it-that theory (stit-theory) of concrete agency, on which Nuel Belnap, Michael Perloff, and Ming Xu had been working in Pittsburgh since the late 1980s. This stit-theory is now one of the most widely applied formal theories of concrete agency (as compared to generic agency, exemplified by, for example, computer programs and other action types). Recently, I have been making use of stit-theory in a very special kind of modal logic, namely the logic of imagination reports. Imagination is a peculiar kind of mental attitude towards a proposition. Whereas belief is widely held to be something involuntary insofar as it seems to many that, as a matter of psychological fact, subjects cannot decide to believe, imagination (or at least a prominent kind of imagination) clearly is under voluntary control.

**3:AM:** One of the things that immediately emerges from all your work is that there seem to be many different kinds of logic. How is that so?

**HW:** There is a logic that is nowadays called “classical logic”. It is based on a number of assumptions, including the assumption that a domain of individuals the language of classical logic is used to talk about is never empty, and that in a given situation every sentence is either true or false: never neither, and never both. Historically, this logic is rather young and goes back to work by George Boole and Gottlob Frege in the second half of the 19th century. The first textbook on classical first-order logic appeared in 1927. It may be debated whether the classicality of what is now called “classical logic” is a historical coincidence or whether classical logic is classical for some deeper reasons. In any case there have been quite different systems of logic dating back to antiquity. [Sharing Belnap’s indeterminist conception of time as branching towards the future, I doubt that the history of logic is developing linearly. I have my doubts when it comes to linear models of social history as well.]

From a more technical perspective, there is a notion of a logical system that used to be widely accepted, and that is the notion of a consequence relation due to Alfred Tarski. According to that notion, a logic is a relation between sets of formulas (the assumptions or premises) and a single formula (the conclusion). Of course, not just any such relation is deemed a logic; a logical consequence relation has to satisfy a number of conditions. I am not sure whether I should list these conditions here, but I think I should remark that there are uncountably many such consequence relations. It is known that there are uncountably many logics already located between intuitionistic propositional logic (one of the most prominent non-classical logics) and classical propositional logic. These so-called intermediate logics may be seen to relax standards of constructiveness when moving from intuitionistic to classical logic. But there are indeed logics of quite different kinds. Many-valued logics give up the idea that every sentence is exactly either true or false, and paraconsistent logics, as already mentioned, give up the idea that a contradiction entails any proposition whatsoever.
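The conditions usually imposed on a Tarskian consequence operator Cn are reflexivity (Γ ⊆ Cn(Γ)), monotonicity, and idempotence (Cn(Cn(Γ)) = Cn(Γ)). A toy check in code, with a deliberately invented one-rule closure operator on three atoms:

```python
from itertools import combinations

# Toy universe of atomic "formulas" and a single invented inference rule.
UNIVERSE = frozenset({"p", "q", "r"})
RULES = {frozenset({"p", "q"}): "r"}   # from {p, q} infer r

def cn(gamma):
    # consequence operator: close gamma under the rules
    out = set(gamma)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES.items():
            if premises <= out and conclusion not in out:
                out.add(conclusion)
                changed = True
    return frozenset(out)

def subsets(s):
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

GAMMAS = subsets(UNIVERSE)
print(all(g <= cn(g) for g in GAMMAS))                       # reflexivity: True
print(all(cn(g1) <= cn(g2)
          for g1 in GAMMAS for g2 in GAMMAS if g1 <= g2))    # monotonicity: True
print(all(cn(cn(g)) == cn(g) for g in GAMMAS))               # idempotence: True
```

Substructural logics, mentioned earlier in the interview, arise precisely by weakening some of these structural assumptions.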

**3:AM:** Does it make sense to ask whether there is a right logic? If it does, is classical logic the one?

**HW:** I think it does make sense to ask whether there is a right logic, provided the question is made more precise. A logical system requires a language, and there are different kinds of languages with different logical vocabularies. Classical logic in its standard vocabulary is certainly not the single correct logic simply because its rather expressive logical vocabulary is not expressive enough. One may wonder whether there is an all-encompassing logical vocabulary, but it is very easy to make your logical vocabulary extremely complex, so that at least for practical purposes it makes a lot of sense to work with a restricted vocabulary that is adjusted to certain applications. If you do not intend to talk about necessity, there is little need to use a modal necessity operator saying “it is necessary that”. But suppose you have fixed your logical vocabulary, and you want to reason about, say, necessity. Then you will quickly realize that there are different notions of necessity, for example, logical necessity, physical necessity, or even metaphysical necessity. Of course you can use different modal operators in your language to represent the different notions of necessity. Whether or not there is the correct logic of logical necessity, it is clear that the modal logic called “S5” comes much closer to capturing the notion of logical necessity than other systems of modal logic. Thus, I would say that it makes sense to ask whether a logic is right for certain applications, and trying to answer that question may result in substantial insights. What seems to be clear, for example, is that classical logic is not the right logic of information processing, and that it is not the right logic for constructive reasoning either. It is clear that a paraconsistent logic like FDE is much better suited to reasoning about information than classical logic, and that Nelson’s constructive paraconsistent logic is much better suited to constructive reasoning about information than FDE.

**3:AM:** Proofs and their relation to linguistic meaning is a theme running through much of your work isn’t it? Can you sketch how proofs help us with meaning?

**HW:** In general proof theory it is assumed that proofs are abstract objects. A proof has a finite number of premises and a conclusion. The premises and the conclusion are sentences of some language (or maybe derivability statements expressing that a conclusion is derivable from a finite number of premises). Proofs are constructed by means of applying certain rules to premises and to axioms that are taken for granted. If the proof rules were such that any sentence whatsoever is provable, then this would trivialize our theories; the proof system would not help to draw any distinctions between the formulas of the language under consideration. We are therefore interested in non-trivializing proof systems in which some formulas are indeed unprovable. Provability can then be seen as imposing some constraints on linguistic meaning: the conclusion is provable from the premises in virtue of the meaning of the expressions occurring in the premises and the conclusion, and a proof thus provides information about linguistic meaning. Let me give a very simple example. If we have a theory about species that comprises certain statements about mammals and whales, and if we can present a proof from these statements showing that every whale is a mammal, then the proof may be seen to establish a fact about the meaning of the expressions “mammal” and “whale”. The rules of the proof system that underlies our theories may be seen to provide information about the meaning of the logical operations.

**3:AM:** You began by analyzing meaning in terms of proofs and grounds for asserting the conclusions of inferences didn’t you? Can you say how this works and perhaps give an example?

**HW:** Well, so far I have not been talking about validity, and the proof-theoretic understanding of validity is a delicate matter. Let me first briefly highlight that in so-called model-theoretic semantics, a proof reveals the validity of a sentence if the proof system is model-theoretically sound, which means that it guarantees that for every model M (every representation of some part of an assumed reality) the proven formula is true in M. In this type of semantics, it is assumed that the truth conditions and the falsity conditions of a formula in models specify the meaning of the formula. Proof-theoretic semantics assumes that the rules of a proof system specify linguistic meaning. A proof-theoretic notion of validity of proofs was introduced by Dag Prawitz in the 1970s with respect to Gentzen’s natural deduction proof systems. These proof systems contain introduction and elimination rules for logical operations. The introduction rules for a logical operation symbol # lay down how a formula with # as its main operation symbol is canonically derived. And finally here is a translation of the almost unavoidable quote from Gentzen’s “*Untersuchungen über das logische Schließen*” (1934/35):

> The introductions represent, as it were, the ‘definitions’ of the symbols concerned, and the eliminations are no more, in the final analysis, than the consequences of these definitions. This fact may be expressed as follows: In eliminating a symbol, we may use the formula with whose terminal symbol we are dealing only ‘in the sense afforded it by the introduction of that symbol’. (Gentzen, 1934/35, p. 80)

The introduction rules may indeed be viewed as describing grounds for asserting the conclusions of inferences. Suppose that Π and Π’ are proofs of A and B respectively. The rule (&I) of conjunction introduction, for example, states that from these two proofs we can obtain a proof of the conjunction (A&B) (“A and B”) by combining the two proofs Π and Π’ and deriving (A&B) from their conclusions.

The definition of Prawitz’s notion of validity of proofs refers to a reduction procedure that transforms proofs into what Prawitz calls “canonical proofs”, that is proofs ending with an introduction step. According to Prawitz, a canonical proof without assumptions is valid if its immediate subproofs are valid, whereas a non-canonical proof without assumptions is valid if it reduces to a valid canonical proof without assumptions. Roughly speaking, the reductions in question remove detours from proofs. If detours in proofs can always be eliminated by iterated applications of reduction steps, the proof system enjoys normalization, and every proof without assumptions can be transformed into a canonical proof ending in an application of an introduction rule. A formula may then be said to be proof-theoretically valid with respect to a given proof system and set of reductions if it has a valid proof in that system and the system enjoys normalization. The theorems of intuitionistic propositional logic are proof-theoretically valid because Gentzen’s natural deduction proof-system for this logic enjoys normalization with respect to the standard reductions.
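For the conjunction fragment, these reductions are simple to mechanize. The following toy encoding, with class names of my own choosing and restricted to &I and the left &E projection, removes the detour of an elimination step applied directly to an introduction step:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Assume:            # an open assumption of a formula
    formula: str

@dataclass(frozen=True)
class AndI:              # &I: from proofs of A and of B, a proof of (A&B)
    left: object
    right: object

@dataclass(frozen=True)
class AndE1:             # left &E: from a proof of (A&B), a proof of A
    sub: object

def reduce_proof(p):
    # remove detours bottom-up: an &E step applied directly
    # to an &I step is replaced by the relevant subproof
    if isinstance(p, AndI):
        return AndI(reduce_proof(p.left), reduce_proof(p.right))
    if isinstance(p, AndE1):
        sub = reduce_proof(p.sub)
        if isinstance(sub, AndI):
            return sub.left      # the detour: (A&B) was just introduced
        return AndE1(sub)
    return p

def is_canonical(p):
    # a proof is canonical if it ends with an introduction step
    return isinstance(p, AndI)

# proving A by the detour of introducing and then eliminating (A&B):
detour = AndE1(AndI(Assume("A"), Assume("B")))
print(reduce_proof(detour))      # Assume(formula='A')
print(is_canonical(detour))      # False
```

Iterating such reductions until none applies is exactly what normalization amounts to in this tiny fragment.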

**3:AM:** How do you understand intuitionist propositional logic? Are you in disagreement with figures like Dummett and Kripke in this?

**HW:** I understand intuitionistic logic as an attempt to formalize constructive reasoning, in particular constructive reasoning in mathematics, and I consider that attempt to be successful as far as reasoning about provability in a language without negation is concerned. I believe that intuitionistic logic does not formalize constructive reasoning as far as negative information and disprovability are concerned. A constructive feature of intuitionistic logic is, for example, that in this logic a disjunction “A or B” is provable if and only if A is provable or B is provable. Nelson’s constructive four-valued logic with strong negation enjoys the disjunction property as well, but in addition it also satisfies what is called the “constructible falsity property”: The negation of (A&B) is provable if and only if the negation of A is provable or the negation of B is provable, which we may understand as saying that a conjunction is disprovable if and only if at least one of the conjuncts is disprovable. I think I am in disagreement with Dummett as far as falsification and its role in proof-theoretic semantics is concerned. I am certainly not in disagreement with Kripke concerning intuitionistic logic. His model-theoretic semantics of intuitionistic logic in terms of information states that are preordered by a relation of possible expansion of such states is really appealing and can be generalized to obtain a semantics for Nelson’s paraconsistent logic.

**3:AM:** Paraconsistent logics are another set of logics that veer away from classical logic. Priest and his dialetheist logic is a version that tolerates true contradictions. Can you say how and why logicians find it useful to have logics that tolerate inconsistencies – after all, it may seem strange that it can be logical to be inconsistent?

**HW:** Thank you for this question; it gives me an opportunity to address a possible misunderstanding regarding dialetheism, the view that some statements are indeed both true and false. My dear friend Graham Priest is a dialetheist. He believes that some sentences, including the liar sentence that says of itself that it is false, are dialetheias, that they are both true and false. A defender of (the usefulness of) a paraconsistent logic need not be a dialetheist, and Graham is well aware of this. Rejecting the idea that any proposition whatsoever follows from a contradiction does not mean that one is committed to dialetheism. In the Kripke-style semantics for Nelson’s paraconsistent logic, for example, an information state s may both support the truth of a formula A and support the truth of ~A, the negation of A. But this does not mean that both A and ~A are indeed true at s. Intuitively, it may happen that the state s supports the truth of A and supports the truth of ~A because both A and ~A are being told true at s. Logics that reject the idea of ex contradictione sequitur quodlibet are useful because we are encountering information overflow and nevertheless do not want to infer anything whatsoever from contradictory information.

(We have to be careful here. There are quite a few readings of the word “contradiction”. By a contradiction I here mean a formula of the form (A&~A).) We receive the information that Donald Trump does not believe that the Russian government tried to influence the election in 2016, and we receive the information that Donald Trump believes that the Russian government tried to influence the election in 2016. Nevertheless, we are still not inclined to infer, for example, that 16 is a prime number. It is rational to use a logic with a paraconsistent negation because this allows us to say “goodbye” to ex contradictione sequitur quodlibet. Moreover, it is rational to use a paraconsistent logic also because it enables us to draw distinctions between contradictions (A&~A) and (B&~B). Recently, Michael Dunn has argued that contradictory information that (A&~A) can be more useful than no information concerning A, and I believe he is right. The information that (A&~A) may bring A and ~A to our attention, and that can be helpful if we have been unaware of A and of ~A. Receiving contradictory information about Trump, for example, may prompt a person to collect further information and learn interesting things about the U.S. political system.

When I claim that sometimes it is better to have contradictory information than no information, I should emphasize that I clearly draw a distinction between having the information that A and believing that A. (Actually, the useful four-valued logic FDE has sometimes been motivated slightly misleadingly in terms of belief-related vocabulary. Nuel Belnap and I have written a short paper in which we stress that the four values are indeed informational “told” values.) Note also that one can go beyond paraconsistency. Most paraconsistent logics are negation consistent: They do not have pairs of formulas A and ~A as theorems. But there are also non-trivial and negation inconsistent paraconsistent logics. Richard Routley called negation inconsistent logics “dialectical”.

**3:AM:** What’s Nelson’s paraconsistent logic and why is it important for computer science?

**HW:** I have already made a number of comments on Nelson’s paraconsistent logic, nowadays also called “N4”, in my replies to earlier questions, so maybe I can be brief here. From my favorite point of view, N4 can be described as expanding the four-valued logic FDE by a constructive implication, so that the system formalizes a kind of constructive paraconsistent reasoning about information. If this reasoning is resource-sensitive, if it makes a difference whether a piece of information is available n times or n+1 times, then one has to consider substructural subsystems of N4. One central aspect of computer science is knowledge representation, and knowledge representation is closely related to information representation. There is an area in knowledge representation that is called “description logic”, and I have been told that description logics now play a role for the semantic web. Description logics are modal logics in disguise and are therefore a bridge between philosophical and mathematical modal logic on the one hand and computer science logic on the other. Back in 2002, I think, Sergei Odintsov and I gave a talk about paraconsistent, inconsistency-tolerant description logics to the group of Franz Baader in the computer science department at TU Dresden. (Franz Baader is one of the leading researchers in description logic worldwide.) At that time Franz was rather skeptical about description logics based on a paraconsistent logic. In the meantime, paraconsistent description logics have become less exotic, although it is still very prominent in computer science to investigate mechanisms for avoiding inconsistencies and retaining classical logic instead of working with an inconsistency-tolerant logic.

**3:AM:** What are models and model-theoretic semantics and what role do they play in treating paraconsistency?

**HW:** For many if not most philosophers, formal semantics are model-theoretic semantics. Model-theoretic semantics exemplify realistic conceptions of linguistic meaning insofar as the meaning of an expression is, in one way or another, related to parts of reality or their representation. A proper name, for example, is taken to refer to an individual from a given domain of individuals, and the reference may be assumed to possibly vary from possible world to possible world, an assumption that is quite natural for many applications. Thus, for many philosophers semantics is about a relation between language and representations of parts of reality. Although model-theoretic semantics is the dominating paradigm, it is only one out of several paradigms within semantics. However, even for proof-theoretic semanticists, who are usually classified as being anti-realists, model-theoretic semantics is useful. If a model-theoretic semantics is sound with respect to a given proof-system, finding a countermodel to a formula establishes the unprovability of that formula. Therefore, semantical models are very important in general, and they are important for treating paraconsistency as well. If one accepts that a model cannot satisfy a contradiction (A&~A), inconsistency-tolerant reasoning mechanisms must incorporate ways of eliminating contradictions so as to make room for applying classical logic or another inconsistency-intolerant logic. Model-theoretic semantics of paraconsistent logics must make sense of satisfying contradictions. As remarked above, in Nelson’s paraconsistent system N4, satisfying a contradiction (A&~A) at a given information state can be seen as saying that the state supports the truth and supports the falsity of A, i.e., supports the truth of A and the truth of ~A. (If this is expressed by saying that the state both verifies and falsifies A, verification and falsification are nevertheless to be understood as support of truth and support of falsity.)

**3:AM:** Computer science uses an extension of classical logic to express temporal reasoning. Why combine that with extensions of paraconsistent and intuitionist logics?

**HW:** Computer scientists are interested in temporal reasoning for reasons that are often very applications driven and different from, say, considerations about the ontology of time that are of interest to philosophers. If we want to formalize constructive reasoning with possibly inconsistent information, it is quite natural to investigate systems of temporal logic based on logics that are paraconsistent or constructive, or both.

**3:AM:** What’s connexive logic and what paradoxes does it help to solve?

**HW:** Systems of connexive logic validate what are called Aristotle’s and Boethius’ Theses: ~(A → ~A), ~(~A → A), (A → B) → ~(A → ~B), (A → ~B) → ~(A → B). These formulas are non-theorems of classical propositional logic, and in order to avoid trivialization, i.e., provability of all formulas, in systems of connexive logic, some theorems of classical logic must be given up. This is very different from what is the case with other non-classical logics, like intuitionistic logic, which are subsystems of classical logic in their shared syntax. Connexive logics are orthogonal to classical logic and are therefore also referred to as “contra-classical logics”. They have been studied not with the aim of resolving or avoiding certain paradoxes, but because they were, as it seems, endorsed by prominent ancient philosophers, namely Aristotle and Boethius (which brings us back to thinking about the classicality of so-called “classical logic”). Indeed, connexive first-order logic can be used to translate Aristotle’s syllogistic into the language of first-order logic. Another reason for studying connexive logics is that the understanding of negated implications expressed by Aristotle’s and Boethius’ Theses seems to be attested by empirical research among people who have not been exposed to courses in classical logic.

A straightforward and conceptually clear road to connexivity consists of requiring suitable falsity conditions for implications. This approach is particularly natural for expansions of first-degree entailment logic, such as Nelson’s paraconsistent logic, because FDE is a basic and simple four-valued logic which clearly separates truth and falsity from each other as two independent semantical dimensions. I have recently taken a great interest in connexive logics, and I have suggested a connexive logic, C, that suitably modifies the falsity condition of implications in Nelson’s paraconsistent logic (see, for example, here and here). Whilst in Nelson’s logic, as in classical logic and most other logics, an implication “if A, then B” is false just in case A is true and B is false, in C, “if A, then B” is false just in case A implies ~B. The system is very non-standard. It is non-trivial but inconsistent and validates, for example, ~((A&~A) → ~(A&~A)) as an instance of one of Aristotle’s Theses as well as (A&~A) → ~(A&~A), where validating a formula means that every state of every model supports the truth of that formula. The latter formula emerges as valid quite naturally. A state supports the truth of ~(A&~A) just in case it supports the truth of ~A or the truth of ~~A. A state supports the truth of ~~A just in case it supports the truth of A. But if a state (and every expansion of it) supports the truth of (A&~A), it certainly supports the truth of both conjuncts separately.
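The effect of the connexive falsity condition can be checked mechanically at the four-valued level. The sketch below (my illustration; it uses the truth-functional “material” connexive variant over FDE values, not the full Kripke-style system C) takes the usual truth condition for implication but makes “A → B” false just in case “A → ~B” is true, and then verifies Aristotle’s and Boethius’ Theses by brute force over all valuations.

```python
from itertools import product

# FDE truth values: subsets of {'t', 'f'}.
VALUES = [frozenset(s) for s in ({'t', 'f'}, {'t'}, {'f'}, set())]

def neg(a):
    return frozenset(({'t'} if 'f' in a else set()) | ({'f'} if 't' in a else set()))

def conj(a, b):
    return frozenset(({'t'} if 't' in a and 't' in b else set()) |
                     ({'f'} if 'f' in a or 'f' in b else set()))

def impl(a, b):
    # Truth condition as usual; the falsity condition is connexive:
    # "A -> B" is false just in case "A -> ~B" is true.
    return frozenset(({'t'} if 't' not in a or 't' in b else set()) |
                     ({'f'} if 't' not in a or 'f' in b else set()))

def valid(formula):
    # A formula (as a function of the values of A and B) is valid
    # iff it is true on every four-valued assignment.
    return all('t' in formula(a, b) for a, b in product(VALUES, repeat=2))

aristotle1 = lambda a, b: neg(impl(a, neg(a)))            # ~(A -> ~A)
aristotle2 = lambda a, b: neg(impl(neg(a), a))            # ~(~A -> A)
boethius1 = lambda a, b: impl(impl(a, b), neg(impl(a, neg(b))))
boethius2 = lambda a, b: impl(impl(a, neg(b)), neg(impl(a, b)))
assert all(map(valid, [aristotle1, aristotle2, boethius1, boethius2]))

# Inconsistent but non-trivial: a formula and its negation are both valid...
assert valid(lambda a, b: impl(conj(a, neg(a)), neg(conj(a, neg(a)))))
assert valid(lambda a, b: neg(impl(conj(a, neg(a)), neg(conj(a, neg(a))))))
# ...yet not every formula is valid (a bare atom is not).
assert not valid(lambda a, b: a)
```

The same pattern of results — Aristotle’s and Boethius’ Theses valid, the system inconsistent but non-trivial — is what the text reports for the Kripke-style system C.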

**3:AM:** And for the readers of 3:AM, are there five books (other than your own) you can recommend to take us into your philosophical world?

**HW:** That’s a difficult question: my philosophical world is accessible to a large extent through journal papers, and any choice of five books is bound to leave out quite a few equally recommendable works. But here we go:

• Katalin Bimbó (ed.), *J. Michael Dunn on Information Based Logics*, Springer, 2016.

This is a book dedicated to J. Michael Dunn. As the title suggests, the volume has a focus on logics for reasoning about information.

• Graham Priest, *An Introduction to Non-Classical Logic: From If to Is*, Second Edition, CUP, 2008.

There are several things that are special and particularly nice about this textbook. It covers an amazingly broad spectrum of non-classical logics, including paraconsistent and even connexive systems, and it is written with great expository care and competence, taking into account that not everybody in this world is a native speaker of English.

• Sara Negri and Jan von Plato, *Structural Proof Theory*, CUP, 2010.

If someone asked me for an excellent and readable book about structural proof theory, this is one of the three monographs that immediately come to my mind.

• Nuel D. Belnap, Michael Perloff, and Ming Xu, *Facing the Future: Agents and Choices in Our Indeterminist World*, Oxford UP, 2001.

This volume summarizes the work by Belnap, Perloff and Xu on stit-theory up to the year 2000. It contains rather conceptual, accessible, and philosophically interesting parts, dealing with the nature of agency, indeterminism, and the conception of time as a tree branching towards the future. The book also contains very technical work which the general reader may safely skip.

• Lloyd Humberstone, *The Connectives*, MIT Press, 2011.

I take this to be Lloyd Humberstone’s magnum opus. If you want to see one of the most remarkable and impressive logic books ever, have a look at it. Almost certainly you will not read all or even most of it, but you will get an impression of the depth and mathematical sophistication of non-classical logic.

**ABOUT THE INTERVIEWER**

**Richard Marshall** is still biding his time.

Buy his new book here or his first book here to keep him biding!

First published in 3:AM Magazine: Saturday, December 2nd, 2017.