$$\def\Dab{{\rm Dab}}$$

## Course Description

Adaptive logics offer a framework for the formal modeling of defeasible reasoning. When we reason defeasibly, our conclusions do not follow from our premises with necessity, but only plausibly or for the most part. One of the first applications of adaptive logics was reasoning on the basis of inconsistencies. The usefulness of Classical Logic is limited for this task since it allows one to derive just any formula when confronted with an inconsistent premise set. Inconsistency-adaptive logics, on the other hand, adapt themselves to a premise set in the following sense: they isolate inconsistencies and at the same time allow for the full power of Classical Logic for consistent parts of the premise set. The idea of letting a logic 'adapt itself to the premise set' has since been applied to many other forms of defeasible reasoning such as abductive reasoning (deriving explanations: think of Sherlock Holmes), inductive generalizations (to derive from 'some X are A' that 'all X are A'), normative reasoning in view of conflicting norms ('You shall not kill, but if you do, do it gently.'), default reasoning ('Tweety is a bird, thus she flies.'), and many others. In this seminar we will study adaptive logics, their dynamic proof theory, their semantics, and some of their applications. If time allows we will also investigate links to other central formal frameworks for defeasible reasoning. The literature will be announced in the first meeting. A basic understanding of propositional logic is advisable.

## Participants

Nachname Vorname
Wüste Lisa
Chechelnizki Georg
Bartels Steven
Liechti Tobias Alexander

## General Organisational Information

• Organised by: Christian Straßer, GA 3/39, 0234/3224721
• Speaking hours: Wednesdays from 10:00 to 12:00
• When: every Tuesday from 10:15 to 11:45, Summer term 2015
• Where: GABF 04 / 358

### Credit Points and Marks

Credit points in this course can be earned on an individual basis, depending on your personal preferences.

The following options are available:

1. Giving a talk on a relevant topic. (★★ / ★)
2. Writing an essay / a paper on a relevant topic. (★★ / ★)
3. Writing a protocol of a session. (★)
4. Examination: usually this is a combination of a written and an oral exam, but this can be adjusted according to personal preference. (★★ / ★)
5. Doing research on a relevant topic. (★★)

★★ means that you can get a mark and credit points for this option; ★ means that you can get credit points for this option.

For all options, please talk with me about your plans first: this is important for choosing a relevant topic.

Option 5 is for students who feel very confident in the subject matter. It can, e.g., serve as preparation for a master's thesis.

### Exercises

There will be exercises for some sessions of the course. You are not obliged to do them, and I will not keep written track of who did what. However, I strongly recommend working on the exercises for your own sake: it will greatly improve your understanding.

### Questions and Interaction

I strongly encourage you to ask questions during the course. Try not to be afraid of asking 'stupid/silly' questions: often the questions that seem naive are the very best ones. Here are signs that should trigger immediate questions from your side:

• 'I don't know the meaning of this symbol on the slides/blackboard.'
• 'You mentioned notion X: I have no idea what that word means.'
• 'I think you made a mistake there.'
• 'I think I got lost somehow.'
• 'This seems interesting: is there some literature I could take a look at?'

Please also feel free to come to my speaking hours, especially when you are preparing something to earn credit points. I also encourage you to write me emails with questions.

## Slides

• v.1 [2015-04-16 Thu]
• inconsistency adaptive logics [2015-07-06 Mon]

## Exercises

Open the exercises in a dedicated window: here


### Session 2: Introducing Hilbert Style Proof Systems [2015-04-14 Tue]

To be discussed at [2015-04-21 Tue].

Recall the Hilbert Style axiomatisation of classical propositional logic from the slides:

• $$(A{\supset}1)$$  $$A\supset(B\supset A)$$
• $$(A{\supset}2)$$  $$(A\supset(B\supset C))\supset((A\supset B)\supset(A\supset C))$$
• $$(A{\supset}3)$$  $$((A\supset B)\supset A)\supset A$$
• $$(A{\wedge}1)$$  $$(A\wedge B)\supset A$$
• $$(A{\wedge}2)$$  $$(A\wedge B)\supset B$$
• $$(A{\wedge}3)$$  $$A\supset(B\supset(A\wedge B))$$
• $$(A{\vee}1)$$  $$A\supset(A\vee B)$$
• $$(A{\vee}2)$$  $$B\supset(A\vee B)$$
• $$(A{\vee}3)$$  $$(A\supset C)\supset((B\supset C)\supset((A\vee B)\supset C))$$
• $$(A{\equiv}1)$$  $$(A\equiv B)\supset(A\supset B)$$
• $$(A{\equiv}2)$$  $$(A\equiv B)\supset(B\supset A)$$
• $$(A{\equiv}3)$$  $$(A\supset B)\supset((B\supset A)\supset(A\equiv B))$$
• $$(A{\neg}1)$$  $$(A\supset\neg A)\supset\neg A$$
• $$(A{\neg}2)$$  $$A\supset(\neg A\supset B)$$
• $$(A{\neg}3)$$  $$A \vee \neg A$$

• Discussed on [2015-04-21 Tue].

Prove: $$A \supset A$$.

Tip: you only need $$(A{\supset}1)$$ and $$(A{\supset}2)$$.

| # | Formula | Justification |
|---|---------|---------------|
| 1 | $$(A \supset ((A \supset A) \supset A)) \supset ((A \supset (A\supset A)) \supset (A \supset A))$$ | $$(A{\supset}2)$$ where $$A:=A$$, $$B := A \supset A$$, $$C:=A$$ |
| 2 | $$A \supset ((A\supset A) \supset A)$$ | $$(A{\supset}1)$$ where $$A:=A$$, $$B:=A\supset A$$ |
| 3 | $$(A \supset (A\supset A)) \supset (A \supset A)$$ | 1,2; MP |
| 4 | $$A \supset (A \supset A)$$ | $$(A{\supset}1)$$ where $$A:=A$$, $$B:=A$$ |
| 5 | $$A \supset A$$ | 3,4; MP |

• Discussed on [2015-04-21 Tue].

Prove: $$(B \vee C) \supset (\neg B \supset C)$$.

| # | Formula | Justification |
|---|---------|---------------|
| 1 | $$B \supset (\neg B \supset C)$$ | $$(A{\neg}2)$$ |
| 2 | $$C \supset (\neg B \supset C)$$ | $$(A{\supset}1)$$ |
| 3 | $$(B \supset (\neg B \supset C)) \supset ((C \supset (\neg B \supset C)) \supset ((B \vee C) \supset (\neg B \supset C)))$$ | $$(A{\vee}3)$$ |
| 4 | $$(C \supset (\neg B \supset C)) \supset ((B \vee C) \supset (\neg B \supset C))$$ | 1,3; MP |
| 5 | $$(B \vee C) \supset (\neg B \supset C)$$ | 2,4; MP |

• Discussed on [2015-04-21 Tue].

Prove: $$A \supset B, B \supset C \vdash A \supset C$$.

| # | Formula | Justification |
|---|---------|---------------|
| 1 | $$(A \supset (B \supset C)) \supset ((A \supset B) \supset (A \supset C))$$ | $$(A{\supset}2)$$ |
| 2 | $$B \supset C$$ | PREM |
| 3 | $$(B \supset C) \supset (A \supset (B \supset C))$$ | $$(A{\supset}1)$$ |
| 4 | $$A \supset (B \supset C)$$ | 2,3; MP |
| 5 | $$(A \supset B) \supset (A \supset C)$$ | 1,4; MP |
| 6 | $$A \supset B$$ | PREM |
| 7 | $$A \supset C$$ | 5,6; MP |

• Discussed on [2015-04-21 Tue].

Prove: $$\neg \neg A \supset A$$.

| # | Formula | Justification |
|---|---------|---------------|
| 1 | $$A \supset (\neg \neg A \supset A)$$ | $$(A{\supset}1)$$ |
| 2 | $$\neg A \supset (\neg \neg A \supset A)$$ | $$(A{\neg}2)$$ where $$A := \neg A$$, $$B := A$$ |
| 3 | $$(A \supset (\neg \neg A \supset A)) \supset ((\neg A \supset (\neg \neg A \supset A)) \supset ((A \vee \neg A) \supset (\neg \neg A \supset A)))$$ | $$(A{\vee}3)$$ |
| 4 | $$(\neg A \supset (\neg \neg A \supset A)) \supset ((A \vee \neg A) \supset (\neg \neg A \supset A))$$ | 1,3; MP |
| 5 | $$(A \vee \neg A) \supset (\neg \neg A \supset A)$$ | 2,4; MP |
| 6 | $$A \vee \neg A$$ | $$(A{\neg}3)$$ |
| 7 | $$\neg \neg A \supset A$$ | 5,6; MP |

Prove: $$A \supset \neg \neg A$$.

##### Solution
| # | Formula | Justification |
|---|---------|---------------|
| 1 | $$(\neg A \supset \neg \neg A) \supset \neg \neg A$$ | $$(A{\neg}1)$$ |
| 2 | $$((\neg A \supset \neg \neg A) \supset \neg \neg A) \supset (A \supset ((\neg A \supset \neg \neg A) \supset \neg \neg A))$$ | $$(A{\supset}1)$$ |
| 3 | $$A \supset ((\neg A \supset \neg \neg A) \supset \neg \neg A)$$ | 1,2; MP |
| 4 | $$A \supset (\neg A \supset \neg \neg A)$$ | $$(A{\neg}2)$$ |
| 5 | $$(A \supset ((\neg A \supset \neg \neg A) \supset \neg \neg A)) \supset ((A \supset (\neg A \supset \neg \neg A)) \supset (A \supset \neg \neg A))$$ | $$(A{\supset}2)$$ |
| 6 | $$(A \supset (\neg A \supset \neg \neg A)) \supset (A \supset \neg \neg A)$$ | 3,5; MP |
| 7 | $$A \supset \neg \neg A$$ | 4,6; MP |

• Discussed on [2015-04-21 Tue].

Show $$\{A, \neg A\} \vdash B$$.

Tip: use $$(A{\neg}2)$$ and MP.

| # | Formula | Justification |
|---|---------|---------------|
| 1 | $$A$$ | PREM |
| 2 | $$\neg A$$ | PREM |
| 3 | $$A \supset (\neg A \supset B)$$ | $$(A{\neg}2)$$ |
| 4 | $$\neg A \supset B$$ | 1,3; MP |
| 5 | $$B$$ | 2,4; MP |
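Checking such Hilbert-style proofs is entirely mechanical, which makes them a nice target for a small program. The following Python sketch is my own illustration (not part of the course materials): it covers only the schemas $$(A{\supset}1)$$, $$(A{\supset}2)$$ and the rule MP, and represents formulas as nested tuples, e.g. `('imp', 'p', ('imp', 'q', 'p'))` for $$p \supset (q \supset p)$$.

```python
# A minimal checker for Hilbert-style proofs (illustrative sketch).
# Formulas are nested tuples; metavariables in axiom schemas are strings.

def match(schema, formula, subst):
    """Match an axiom schema against a formula, extending `subst`."""
    if isinstance(schema, str):                    # metavariable
        if schema in subst:
            return subst[schema] == formula
        subst[schema] = formula
        return True
    if isinstance(formula, str) or schema[0] != formula[0] \
            or len(schema) != len(formula):
        return False
    return all(match(s, f, subst) for s, f in zip(schema[1:], formula[1:]))

def is_axiom(formula, schemas):
    return any(match(s, formula, {}) for s in schemas)

# The schemas (A⊃1) and (A⊃2):
SCHEMAS = [
    ('imp', 'A', ('imp', 'B', 'A')),
    ('imp', ('imp', 'A', ('imp', 'B', 'C')),
            ('imp', ('imp', 'A', 'B'), ('imp', 'A', 'C'))),
]

def check_proof(lines, schemas):
    """Each line is ('ax', formula) or ('mp', i, j, formula), where line i
    (1-based) is the implication whose antecedent is line j's formula and
    whose consequent is `formula`."""
    derived = []
    for line in lines:
        if line[0] == 'ax':
            assert is_axiom(line[1], schemas), ('not an axiom', line)
            derived.append(line[1])
        else:
            _, i, j, f = line
            assert derived[i - 1] == ('imp', derived[j - 1], f), ('bad MP', line)
            derived.append(f)
    return derived[-1]

# The five-line proof of p ⊃ p from the first task:
pp = ('imp', 'p', 'p')
l2 = ('imp', 'p', ('imp', pp, 'p'))
l1 = ('imp', l2, ('imp', ('imp', 'p', pp), pp))
l4 = ('imp', 'p', pp)
proof = [('ax', l1), ('ax', l2), ('mp', 1, 2, ('imp', l4, pp)),
         ('ax', l4), ('mp', 3, 4, pp)]
print(check_proof(proof, SCHEMAS))  # ('imp', 'p', 'p')
```

Extending the sketch with the remaining schemas and a PREM rule is a routine exercise.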

### Session 3: Mathematical Induction

Don't forget to take a look at the solutions of Session 2 and prepare Task 5 of Session 2 (we didn't discuss that example yet).

Recall from the course that mathematical induction works as follows:

You want to prove a statement $$S(n)$$ for all natural numbers $$n \in \mathbb{N}$$ (or for a subset of natural numbers $$N$$). E.g., in task one $$S(n)$$ reads $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$.

Induction Base: You start by proving it for the first natural number (in your set of interest $$N$$). For $$\mathbb{N}$$ this is 1 (unless you begin counting at 0). In our example you first prove $$1 = \frac{1 \cdot (1+1)}{2}$$.

Induction Step: Then you suppose that you have shown your statement $$S(\cdot)$$ for an arbitrary number $$n$$ (this supposition is called the induction hypothesis or the inductive hypothesis) and on this basis show that your statement also holds for $$n+1$$. In our first example you suppose that $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$ holds and then you show that $$\sum_{i=1}^{n+1} i = \frac{(n+1)\cdot(n+2)}{2}$$ holds.

There is a variant of the proof method called strong mathematical induction. In the induction step you assume that $$S(\cdot)$$ holds for all $$m \le n$$ for an arbitrary $$n$$ and on this basis demonstrate that $$S(n+1)$$ holds.

Prove via mathematical induction that $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$.

• Base Case ($$n=1$$): $$\sum_{i = 1}^{n} i = 1$$ and $$\frac{n \cdot (n+1)}{2} = \frac{1 \cdot 2}{2} = 1$$. Thus, $$\sum_{i = 1}^{n} i = \frac{n \cdot (n+1)}{2}$$.
• Induction step: The inductive hypothesis is $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$. We need to show that $$\sum_{i=1}^{n+1} i = \frac{(n+1)\cdot (n+2)}{2}$$. We have:
1. $$\sum_{i=1}^{n+1} i = \left(\sum_{i=1}^n i \right) + (n+1)$$
2. By the inductive hypothesis, $$\left(\sum_{i=1}^n i \right) + (n+1) = \frac{n \cdot (n+1)}{2} + (n+1)$$
3. Note that $$\frac{n \cdot (n+1)}{2} + (n+1) = \frac{n \cdot (n+1)}{2} + \frac{2 \cdot (n+1)}{2} = \frac{n \cdot (n+1) + 2 \cdot (n+1)}{2} = \frac{(n+2) \cdot (n+1)}{2} = \frac{(n+1) \cdot (n+2)}{2}$$.
4. By 2 and 3, $$\sum_{i=1}^{n+1} i = \frac{(n+1) \cdot (n+2)}{2}$$.

Prove via mathematical induction that $$T(n) = 2^n - 1$$ where $$T(n) = \left\{ \begin{array}{cc} 1 & n=1 \\ 2 \cdot T(n-1) + 1 & n > 1 \end{array} \right.$$

• Base Case ($$n=1$$): $$T(1) = 1 = 2^1 - 1$$.
• Inductive step: The inductive hypothesis is that $$T(n) = 2^n - 1$$. We need to show that $$T(n+1) = 2^{n+1} - 1$$.
1. $$T(n + 1) = 2 \cdot T(n + 1 - 1) + 1 = 2 \cdot T(n) + 1$$.
2. By the inductive hypothesis and 1, $$T(n + 1) = 2 \cdot (2^n - 1) + 1$$.
3. Note that $$2 \cdot (2^n - 1) + 1 = (2^{n+1} - 2) + 1 = 2^{n+1} - 1$$.
4. By 2 and 3, $$T(n+1) = 2^{n+1} - 1$$.
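A statement proved by induction can also be sanity-checked numerically for small values; here is a quick Python check of both identities above (an illustration only, with the function name `T` taken from the task):

```python
def T(n):
    # The recursively defined function from the second task.
    return 1 if n == 1 else 2 * T(n - 1) + 1

# Check both identities for n = 1..29.
for n in range(1, 30):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
    assert T(n) == 2 ** n - 1
print("both identities check out for n = 1..29")
```

Of course, such a check is no substitute for the inductive proof: it only covers finitely many cases.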

### Session 4: Some More Proofs, the Deduction Theorem and Transitivity

Don't forget that in tasks you can rely on results obtained in previous tasks.

#### Task 1: Reasoning by Cases

Prove: $$A \vee B, A \supset C, B \supset C \vdash C$$.

##### Solution
1. $$A \supset C$$ by Premise introduction.
2. $$(A \supset C) \supset ((B\supset C) \supset ((A \vee B) \supset C))$$ by $$(A\vee 3)$$
3. $$(B\supset C) \supset ((A \vee B) \supset C)$$ by MP.
4. $$B\supset C$$ by premise introduction.
5. $$(A \vee B) \supset C$$ by MP.
6. $$A \vee B$$ by premise introduction.
7. $$C$$ by MP.

Show that if $$\Gamma \vdash A$$ for all $$A \in \Gamma'$$ and $$\Gamma' \vdash B$$, then also $$\Gamma \vdash B$$.

##### Solution
• Suppose $$\Gamma \vdash A$$ for all $$A \in \Gamma'$$ and suppose $$\Gamma' \vdash B$$.
• Hence, there is a proof of $$B$$ from $$\Gamma'$$.
• Since proofs are finite, there is a finite subset $$\Gamma'_f = \{B_{1}, \ldots, B_{n}\}$$ of $$\Gamma'$$ such that there is a proof $$\mathcal{P}'$$ from $$\Gamma'_f$$ of $$B$$.
• For each $$B_i$$ ($$i \in \{1, \ldots, n\}$$) we know that there is a proof $$\mathcal{P}_i$$ of $$B_i$$ from $$\Gamma$$.
• We now concatenate the proofs $$\mathcal{P}_1, \ldots, \mathcal{P}_n$$ (renumbering the proof lines) resulting in a proof $$\mathcal{P}^{\star}$$.
• Let $$\mathcal{P}''$$ be the result of
• removing each line in $$\mathcal{P}'$$ where $$B_i$$ ($$i \in \{1, \ldots, n\}$$) is introduced by Premise introduction
• and replacing, in the justifications of the remaining lines, each reference to a line of $$\mathcal{P}'$$ at which some $$B_i$$ ($$i \in \{1, \ldots, n\}$$) was introduced by premise introduction with a reference to the line of $$\mathcal{P}^{\star}$$ at which $$B_i$$ is derived.
• Now we concatenate $$\mathcal{P}^{\star}$$ and $$\mathcal{P}''$$.
• This is a proof of $$B$$ from $$\Gamma$$.
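The renumbering step used in this construction can be made concrete in code. The following Python sketch is my own illustration, under my own encoding of proofs as lists of (formula, justification, cited lines) triples:

```python
# Concatenating proofs with renumbered line references, as in the
# construction above. A proof is a list of (formula, justification, refs)
# triples with 1-based line references; formulas are schematic strings here.

def shift_refs(proof, offset):
    """Shift every cited line number in a proof by `offset`."""
    return [(f, j, [r + offset for r in refs]) for f, j, refs in proof]

def concatenate(*proofs):
    """Append proofs one after another, renumbering their references."""
    result = []
    for p in proofs:
        result += shift_refs(p, len(result))
    return result

p1 = [('A', 'PREM', [])]
p2 = [('A', 'PREM', []), ('A>B', 'PREM', []), ('B', 'MP', [1, 2])]
print(concatenate(p1, p2))
```

The remaining step, replacing the premise lines of $$\mathcal{P}'$$ by references into $$\mathcal{P}^{\star}$$, works on the same representation.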

#### Remark

Recall that last session we proved the deduction theorem. This can drastically simplify proofs.

Here's one example of how it can be used in a proof.

Prove: $$\vdash (A \supset B) \supset ((B \supset C) \supset (A \supset C))$$.

1. Suppose $$A \supset B$$.
1. Suppose $$B \supset C$$.
1. Suppose $$A$$.
• Then by MP and 1, $$B$$.
• By MP and 1.1, $$C$$.
2. We have shown that $$A \supset B, B \supset C, A \vdash C$$.
3. By the deduction theorem, $$A \supset B, B \supset C \vdash A \supset C$$.
2. By the deduction theorem, $$A \supset B \vdash (B \supset C) \supset (A \supset C)$$.
2. By the deduction theorem, $$\vdash (A \supset B) \supset ((B \supset C) \supset (A \supset C))$$.

Show $$\vdash (A \supset B) \supset (\neg A \vee B)$$.

##### Solution
1. Suppose $$A \supset B$$.
1. Suppose $$A$$.
• By 1 and MP, $$B$$.
• By $$(A\vee 2)$$, $$B \supset (\neg A \vee B)$$.
• By MP, $$\neg A \vee B$$.
2. We have shown that $$A \supset B, A \vdash \neg A \vee B$$.
3. By the deduction theorem, $$A \supset B \vdash A \supset (\neg A \vee B)$$.
4. Suppose $$\neg A$$.
1. By $$(A\vee 1)$$, $$\neg A \supset (\neg A \vee B)$$.
2. By MP, $$\neg A \vee B$$.
5. We have shown that $$A \supset B, \neg A \vdash \neg A \vee B$$.
6. By the deduction theorem, $$A \supset B \vdash \neg A \supset (\neg A \vee B)$$.
7. Note that by $$(A \neg 3)$$, also $$A \supset B \vdash A \vee \neg A$$.
8. By lines 3, 6, 7, Task 1 and Task 2, also $$A \supset B \vdash \neg A \vee B$$. (Do you see why?)
2. By the deduction theorem $$\vdash (A \supset B) \supset (\neg A \vee B)$$.

Show $$\vdash (\neg A \vee B) \supset (A \supset B)$$.

##### Solution
1. Suppose $$\neg A \vee B$$.
1. By $$(A \supset 1)$$, $$B \supset (A \supset B)$$.
2. Suppose $$\neg A$$.
1. Suppose $$A$$.
1. By a previous task, $$B$$.
2. By the deduction theorem, $$\neg A \vee B, \neg A \vdash A \supset B$$.
3. By the deduction theorem, $$\neg A \vee B \vdash \neg A \supset (A \supset B)$$.
4. Hence, by 1, 3, Task 1 and Task 2, also $$A \supset B$$. (Do you see why?)
2. We have shown that $$\neg A \vee B \vdash A \supset B$$.
3. By the deduction theorem, $$\vdash (\neg A \vee B) \supset (A \supset B)$$.
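Since the axiom system is sound and complete for classical propositional logic, the equivalence derived syntactically in the last two tasks can be cross-checked semantically with truth tables. A quick Python check (an illustration only):

```python
from itertools import product

def imp(a, b):
    # Classical material implication.
    return (not a) or b

# Both (A ⊃ B) ⊃ (¬A ∨ B) and (¬A ∨ B) ⊃ (A ⊃ B) are tautologies:
for a, b in product([True, False], repeat=2):
    assert imp(imp(a, b), (not a) or b)
    assert imp((not a) or b, imp(a, b))
print("both directions are tautologies")
```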

### Session 5: Dynamic Proofs and Adaptive Strategies

Note that in tasks you can always use results you obtained in previous tasks.

Show that: if $$\Gamma \vdash A \vee \Dab(\Delta)$$ (where $$\vdash$$ is the derivability relation of the lower limit logic) then $$A$$ is derivable on a (non-empty) condition $$\Delta$$ in an adaptive proof from the premise set $$\Gamma$$.

For the courageous: Show the other direction. (You will need an inductive argument over the length of the dynamic proof of $$A \vee \Dab(\Delta)$$.)

##### Solution

($$\Rightarrow$$)

1. Let $$\Gamma \vdash A \vee \Dab(\Delta)$$.
2. Hence, there is a proof in LLL of $$A \vee \Dab(\Delta)$$ from $$\Gamma$$.
3. Hence, there is a finite subset $$\{A_1, \ldots, A_n\}$$ of $$\Gamma$$ from which $$A \vee \Dab(\Delta)$$ is provable.
4. We now construct a dynamic proof of $$A$$ from $$\Gamma$$ as follows:
| # | Formula | Justification | Condition |
|---|---------|---------------|-----------|
| 1 | $$A_1$$ | PREM | $$\emptyset$$ |
| 2 | $$A_2$$ | PREM | $$\emptyset$$ |
| $$\vdots$$ | $$\vdots$$ | PREM | $$\emptyset$$ |
| $$n$$ | $$A_n$$ | PREM | $$\emptyset$$ |
| $$n{+}1$$ | $$A$$ | $$1,\ldots,n$$; RC | $$\Delta$$ |

($$\Leftarrow$$)
We prove this by induction over the length of the proof of $$A$$ on the condition $$\Delta$$. For simplification we let '$$\vee \Dab(\emptyset)$$' denote the empty string.

Base case (line 1): $$B$$ is introduced as a premise on the condition $$\emptyset$$. Hence, $$B \in \Gamma$$, thus $$\Gamma \vdash B$$, and hence $$\Gamma \vdash B \vee \Dab(\emptyset)$$.

Inductive Step (line $$n{+}1$$): The inductive hypothesis is that, where $$C$$ is derived on the condition $$\Theta$$ at a line $$m \leq n$$, then $$\Gamma \vdash C \vee \Dab(\Theta)$$.

Suppose $$B$$ is derived at line $$n{+}1$$ on the condition $$\Lambda$$ with the justification J.

1. Case J = PREM: The argument is analogous to the base case.
2. Case J = $$l_1, \ldots, l_m$$; RU:
1. Suppose $$A_i$$ is derived on the condition $$\Delta_i$$ at line $$l_{i}$$ (where $$1 \le i \le m$$).
2. Hence, $$A_1, \ldots, A_m \vdash B$$ (by the definition of the rule RU) and $$\Lambda = \Delta_1 \cup \ldots \cup \Delta_m$$.
3. For all $$i \in \{1, \ldots, m\}$$ we have:
1. By the inductive hypothesis, $$\Gamma \vdash A_i \vee \Dab(\Delta_i)$$.
2. Hence, $$\Gamma \vdash A_i \vee \neg\neg \Dab(\Delta_i)$$.
3. Hence, $$\Gamma \vdash \neg\neg \Dab(\Delta_i) \vee A_i$$. (Recall: $$A \vee B \vdash B \vee A$$.)
4. Thus, $$\Gamma \vdash \neg\Dab(\Delta_i) \supset A_i$$ (Recall: $$\neg A \vee B \vdash A \supset B$$.)
5. Thus, $$\Gamma \cup \{\neg \Dab(\Delta_i)\} \vdash A_i$$. (Resolution Theorem)
6. Thus, $$\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash A_i$$ (Monotonicity)
4. Hence, $$\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash A_1 \wedge \ldots \wedge A_m$$ (Recall that $$A \vdash B_1$$ and $$A \vdash B_2$$ implies $$A \vdash B_1 \wedge B_2$$.)
5. By 2.2 also $$A_1 \wedge \ldots \wedge A_m \vdash B$$ (Follows by transitivity.)
6. By 2.4 and 2.5, $$\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash B$$ (this is transitivity).
7. Thus, $$\Gamma \vdash B \vee \Dab(\Delta_1) \vee \Dab(\Delta_2) \vee \ldots \vee \Dab(\Delta_m)$$. (By the deduction theorem.)
8. Hence, $$\Gamma \vdash B \vee \Dab(\Delta_1 \cup \ldots \cup \Delta_m)$$, i.e., $$\Gamma \vdash B \vee \Dab(\Lambda)$$.
3. Case J = $$l_1, \ldots, l_m$$; RC: this can be shown in a similar way as the previous case.

Show that:

1. Whenever $$A$$ is derivable on a condition $$\Delta \cup \{B\}$$ in a dynamic proof from $$\Gamma$$ then also $$A \vee B$$ is derivable from $$\Gamma$$ on the condition $$\Delta$$.
2. Whenever $$A$$ is derivable on a condition $$\Delta$$ and $$\neg A$$ is derivable on the condition $$\emptyset$$ in a dynamic proof from $$\Gamma$$, then also $$\Dab(\Delta)$$ is derivable from $$\Gamma$$ on the condition $$\emptyset$$.

##### Solution

1. Suppose $$A$$ is derived on the condition $$\Delta \cup \{B\}$$ from $$\Gamma$$.
2. By Task 1, $$\Gamma \vdash A \vee \Dab(\Delta \cup \{B\})$$.
3. Hence, $$\Gamma \vdash A \vee \Dab(\Delta) \vee B$$ and thus $$\Gamma \vdash (A \vee B) \vee \Dab(\Delta)$$. (Recall that $$\vee$$ is associative ($$A \vee (B \vee C) \vdash (A \vee B) \vee C$$) and commutative ($$A \vee B \vdash B \vee A$$).)
4. Again, by Task 1, $$A \vee B$$ is derivable from $$\Gamma$$ on the condition $$\Delta$$.

1. Suppose $$A$$ is derivable on the condition $$\Delta$$ and $$\neg A$$ is derivable on the condition $$\emptyset$$ in a proof from $$\Gamma$$.
2. By the first part of Task 2, $$\Gamma \vdash A \vee \Dab(\Delta)$$ and $$\Gamma \vdash \neg A$$.
3. Thus, $$\Gamma \vdash \Dab(\Delta)$$ (disjunctive syllogism).

Task 2 shows that the following generic rules are derived rules for dynamic proofs in adaptive logics:

$$\begin{array}{rl} A & \Delta \cup \Delta' \\ \hline A \vee \Dab(\Delta) & \Delta' \end{array} ~~~ [\rm RD1]$$

$$\begin{array}{rl} A & \Delta \\ \neg A & \emptyset \\\hline \Dab(\Delta) & \emptyset \end{array} ~~~[\rm RD2]$$

Indicate mistakes in the following proof fragment from $$\Gamma = \{(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q), (\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q), \circ p, \circ q\}$$ where the reliability strategy is used:

| # | Formula | Justification | Condition | Marked |
|---|---------|---------------|-----------|--------|
| 1 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | |
| 2 | $$(\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | |
| 3 | $$\circ p \wedge \neg p$$ | 1,2; RU | $$\emptyset$$ | |
| 4 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 5 | $$p$$ | 4; RU | $$\{\circ p \wedge \neg p\}$$ | |
| 6 | $$\circ q$$ | RU | $$\emptyset$$ | |
| 7 | $$q$$ | 6; RC | $$\{\circ q \wedge \neg q\}$$ | $$\checkmark$$ |
##### Solution
| # | Formula | Justification | Condition | Correction |
|---|---------|---------------|-----------|------------|
| 1 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | |
| 2 | $$(\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | |
| 3 | $$\circ p \wedge \neg p$$ | 1,2; RU | $$\emptyset$$ | |
| 4 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 5 | $$p$$ | 4; RU | $$\{\circ p \wedge \neg p\}$$ | RU $$\to$$ RC; moreover, the line gets marked |
| 6 | $$\circ q$$ | RU | $$\emptyset$$ | RU $$\to$$ PREM |
| 7 | $$q$$ | 6; RC | $$\{\circ q \wedge \neg q\}$$ | no marking: the $$\checkmark$$ is incorrect |
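The reliability marking applied in this task can also be described algorithmically. The following Python sketch is my own simplification (not course code): Dab-formulas derived on the empty condition are represented as sets of abnormalities (strings with schematic names), the unreliable formulas are those occurring in a minimal Dab-formula, and a line is marked iff its condition contains an unreliable abnormality.

```python
# Reliability-strategy marking, sketched for illustration.
# Abnormalities are strings; a Dab-formula is the set of its disjuncts.

def minimal_dabs(dabs):
    """Keep only the subset-minimal Dab-formulas."""
    return [d for d in dabs if not any(e < d for e in dabs)]

def unreliable(dabs):
    """Union of all abnormalities occurring in minimal Dab-formulas."""
    u = set()
    for d in minimal_dabs(dabs):
        u |= d
    return u

def marked_lines(conditions, dabs):
    """conditions: {line number: condition (a set of abnormalities)}.
    A line is marked iff its condition overlaps the unreliable set."""
    u = unreliable(dabs)
    return {n for n, cond in conditions.items() if cond & u}

# The proof fragment above: line 3 derives the abnormality ∘p∧¬p on the
# empty condition, so only that abnormality is unreliable.
dabs = [{'op&-p'}]                       # names are schematic placeholders
conditions = {5: {'op&-p'}, 7: {'oq&-q'}}
print(marked_lines(conditions, dabs))    # only line 5 is marked
```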

Let $$\Gamma = \{\circ p, \circ( p \supset q), \neg q\}$$. Extend the following proof so that line 2 gets marked (where the reliability strategy is used).

| # | Formula | Justification | Condition |
|---|---------|---------------|-----------|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |
##### Solution
| # | Formula | Justification | Condition | Marked |
|---|---------|---------------|-----------|--------|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ | $$\checkmark$$ |
| 3 | $$\circ(p \supset q)$$ | PREM | $$\emptyset$$ | |
| 4 | $$\neg q$$ | PREM | $$\emptyset$$ | |
| 5 | $$p \supset q$$ | 3; RC | $$\{\circ(p \supset q) \wedge \neg (p \supset q)\}$$ | |
| 6 | $$q$$ | 2,5; RU | $$\{\circ p \wedge \neg p, \circ(p \supset q) \wedge \neg (p \supset q)\}$$ | |
| 7 | $$(\circ p \wedge \neg p) \vee (\circ(p \supset q) \wedge \neg (p \supset q))$$ | 4,6; RD2 | $$\emptyset$$ | |

Let $$\Gamma = \{\circ p, \circ q, \neg q, \neg(p \wedge q)\}$$. Extend the following proof in such a way that $$p$$ is finally derived at line 2 (where the reliability strategy is used).

| # | Formula | Justification | Condition |
|---|---------|---------------|-----------|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |
| 3 | $$\circ q$$ | PREM | $$\emptyset$$ |
| 4 | $$\neg(p \wedge q)$$ | PREM | $$\emptyset$$ |
| 5 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | 1,3,4; RU | $$\emptyset$$ |
##### Solution
| # | Formula | Justification | Condition |
|---|---------|---------------|-----------|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |
| 3 | $$\circ q$$ | PREM | $$\emptyset$$ |
| 4 | $$\neg(p \wedge q)$$ | PREM | $$\emptyset$$ |
| 5 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | 1,3,4; RU | $$\emptyset$$ |
| 6 | $$\neg q$$ | PREM | $$\emptyset$$ |
| 7 | $$\circ q \wedge \neg q$$ | 3,6; RU | $$\emptyset$$ |

Let $$\Gamma = \{\circ p, \circ q, \neg p \vee \neg q \vee \neg s\}$$. Finally derive $$\neg s$$ (where the reliability strategy is used).

##### Solution
| # | Formula | Justification | Condition |
|---|---------|---------------|-----------|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |
| 3 | $$\circ q$$ | PREM | $$\emptyset$$ |
| 4 | $$q$$ | 3; RC | $$\{\circ q \wedge \neg q\}$$ |
| 5 | $$\neg p \vee \neg q \vee \neg s$$ | PREM | $$\emptyset$$ |
| 6 | $$\neg q \vee \neg s$$ | 2,5; RU | $$\{\circ p \wedge \neg p\}$$ |
| 7 | $$\neg s$$ | 4,6; RU | $$\{\circ p \wedge \neg p, \circ q \wedge \neg q\}$$ |

Let $$\Gamma = \{\circ p, \circ q, \neg p \vee \neg q \vee \neg s, \circ s\}$$.

1. Is $$\neg s$$ still finally derivable?
2. Finally derive $$p \vee q \vee s$$ with the minimal abnormality strategy.
##### Solution

Ad 1: No. Here's a demonstration:

| # | Formula | Justification | Condition | Marked |
|---|---------|---------------|-----------|--------|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 2 | $$\circ q$$ | PREM | $$\emptyset$$ | |
| 3 | $$\circ s$$ | PREM | $$\emptyset$$ | |
| 4 | $$\neg p \vee \neg q \vee \neg s$$ | PREM | $$\emptyset$$ | |
| 5 | $$\neg s$$ | 1,2,4; RC | $$\{\circ p \wedge \neg p, \circ q \wedge \neg q\}$$ | $$\checkmark_6$$ |
| 6 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q) \vee (\circ s \wedge \neg s)$$ | 1,2,3,4; RU | $$\emptyset$$ | |

Ad 2:

| # | Formula | Justification | Condition | Marked |
|---|---------|---------------|-----------|--------|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 2 | $$\circ q$$ | PREM | $$\emptyset$$ | |
| 3 | $$\circ s$$ | PREM | $$\emptyset$$ | |
| 4 | $$\neg p \vee \neg q \vee \neg s$$ | PREM | $$\emptyset$$ | |
| 5 | $$\neg s$$ | 1,2,4; RC | $$\{\circ p \wedge \neg p, \circ q \wedge \neg q\}$$ | $$\checkmark_6$$ |
| 6 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q) \vee (\circ s \wedge \neg s)$$ | 1,2,3,4; RU | $$\emptyset$$ | |
| 7 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ | |
| 8 | $$p \vee q \vee s$$ | 7; RU | $$\{\circ p \wedge \neg p\}$$ | |
| 9 | $$p \vee q \vee s$$ | 2; RC | $$\{\circ q \wedge \neg q\}$$ | |
| 10 | $$p \vee q \vee s$$ | 3; RC | $$\{\circ s \wedge \neg s\}$$ | |

Note that $$\Phi_{10}(\Gamma) = \bigl\{ \{\circ p \wedge \neg p\}, \{\circ q \wedge \neg q\}, \{\circ s \wedge \neg s\} \bigr\}$$. Hence, lines 8–10 are not marked at this point according to the marking definition of minimal abnormality.

### Session 7: Smoothness and Well-Foundedness

Let in the following $$\mathbb{N} = \{1, 2, 3, \ldots \}$$ be the set of natural numbers.

#### Smoothness and Well-foundedness

Let $${\prec} \subseteq X \times X$$.

• An element $$x \in X$$ is $$\prec$$-minimal iff there is no $$y \in X$$ such that $$y \prec x$$.
• $$\prec$$ is smooth (or stoppered) iff for each $$x \in X$$, either $$x$$ is $$\prec$$-minimal or there is a $$\prec$$-minimal $$y \in X$$ such that $$y \prec x$$.
• $$\prec$$ is well-founded iff there is no infinite descending sequence $$\ldots \prec x_1 \prec x_{0}$$.
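For finite carrier sets, minimality and smoothness can be checked mechanically. Here is a small illustrative Python sketch (my own; the instructive examples in the tasks below live on infinite sets, where smoothness and well-foundedness come apart):

```python
# Check ≺-minimality and smoothness for a finite strict relation, given
# as a set of pairs (x, y) meaning x ≺ y, over a finite carrier set X.

def minimal_elements(X, prec):
    """The ≺-minimal elements: nothing in X lies strictly below them."""
    return {x for x in X if not any((y, x) in prec for y in X)}

def is_smooth(X, prec):
    """Each element is minimal or has a minimal element strictly below it."""
    mins = minimal_elements(X, prec)
    return all(x in mins or any((m, x) in prec for m in mins) for x in X)

# Without transitivity, an element can sit above a chain without sitting
# above a minimal element, so smoothness can fail even on finite sets:
X = {1, 2, 3}
print(is_smooth(X, {(1, 2), (1, 3)}))  # True: 1 is minimal and below 2, 3
print(is_smooth(X, {(1, 2), (2, 3)}))  # False: no minimal element below 3
```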

Take $$<$$ to be the natural ordering on the natural numbers $$\mathbb{N}$$ (e.g., $$4<6$$, etc.).

1. Is $$<$$ smooth?
2. Is $$<$$ well-founded?
##### Solution
1. Yes. For any $$x \in \mathbb{N}$$, $$x =1$$ or $$1 < x$$. Clearly, $$1$$ is minimal.
2. Yes. If we fix a starting point $$x_{0}$$ there are always finitely many steps down to $$1$$.

Let $${\prec} \subseteq \mathbb{N} \times \mathbb{N}$$ be defined by $$x \prec y$$ iff $$y < x$$.

1. Is $$\prec$$ smooth?
2. Is $$\prec$$ well-founded?
##### Solution
1. No. There are no $$\prec$$-minimal elements because there are no $$<$$-maximal elements in $$\mathbb{N}$$.
2. No. If it is not smooth, it cannot be well-founded. (Try to see why!)

Let $$<$$ be the natural ordering on $$\mathbb{N} \cup \{\infty\}$$ where $$x < \infty$$ for all $$x \in \mathbb{N}$$.

1. Is $$<$$ smooth?
2. Is $$<$$ well-founded?
##### Solution
1. Yes, same reason as in task 1.
2. Yes. A descending sequence can contain $$\infty$$ only as its first element; after that it descends within $$\mathbb{N}$$ and must therefore be finite.

Let $${\prec} \subseteq (\mathbb{N} \cup \{\infty\}) \times (\mathbb{N} \cup \{\infty\})$$ be defined by $$x \prec y$$ iff $$y < x$$ (where $$<$$ is defined in Task 3).

1. Is $$\prec$$ smooth?
2. Is $$\prec$$ well-founded?
##### Solution
1. Yes. Now we have the $$\prec$$-minimal element $$\infty$$.
2. No. Starting from any $$x_0 \in \mathbb{N}$$ we obtain the infinite descending sequence $$\ldots \prec x_0{+}2 \prec x_0{+}1 \prec x_0$$, which never reaches $$\infty$$.

Suppose our first order language contains the relational symbol $$<$$. Our premise set $$\Gamma$$ consists of

1. $$\forall x \exists y (x < y)$$
2. $$\forall x \forall y ((x < y) \rightarrow \neg(x = y))$$
3. $$\forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$

Recall that a model $$M$$ comes with a domain $$D$$ and it interprets $$<$$ by giving it an extension $$\prec_M \subseteq (D \times D)$$. Every model interprets $$=$$ in the same way relative to the domain of $$M$$, namely by $${=_M} = \{(d,d) \mid d \in D\}$$.

One model $$M$$ of $$\Gamma$$ is for instance the one with domain $$\mathbb{N}$$ where $$<_M$$ is simply the natural ordering on $$\mathbb{N}$$ (so $$1 < 2< 3 <4 \ldots$$).

Formulas are then evaluated as expected, e.g.,

• $$M \models \forall x \exists y (x < y)$$ iff for all $$a \in D_M$$ there is a $$b \in D_M$$ for which $$(a,b) \in <_M$$.
##### Question 1

Suppose we define $${\sf ab}(M) = \{ (a,b) \mid (a,b) \in {<_{M}} \}$$ and order our models by $$M \prec M'$$ iff $$D_M = D_{M'}$$ and $${\sf ab}(M) \subset {\sf ab}(M')$$.

Let $$\mathcal{M}(\Gamma)$$ be the set of all models of $$\Gamma$$.

Is the order $$\prec$$ on $$\mathcal{M}(\Gamma)$$ smooth?

##### Solution

No! Take models with domain $$\mathbb{N}$$. Assume some such model $$M$$ is $$\prec$$-minimal in $$\mathcal{M}(\Gamma)$$. Note that there is an $$a \in \mathbb{N}$$ for which there is a $$b \in \mathbb{N}$$ such that $$(b,a) \in {<_M}$$ since $$M \models \forall x \exists y (x < y)$$ and $$\mathbb{N} \neq \emptyset$$. Let $$M_a$$ be just like $$M$$, except that $${<_{M_a}} = \{ (c,d) \in <_M \mid d \neq a\}$$. Note that $${\sf ab}(M_a) = {\sf ab}(M) \setminus \{ (c,d) \in <_M \mid d = a \}$$.

If $$M_a \in \mathcal{M}(\Gamma)$$ we have reached a contradiction to the $$\prec$$-minimality of $$M$$, since $${\sf ab}(M_a) \subset {\sf ab}(M)$$.

We now show that $$M_a \models \forall x \exists y (x < y)$$. Let $$c \in \mathbb{N}$$. Since $$M \in \mathcal{M}(\Gamma)$$, there is a $$d \in \mathbb{N}$$ for which $$(c,d) \in <_M$$. If $$d \neq a$$ then also $$(c,d) \in <_{M_a}$$. Suppose $$d = a$$. Then, there is a $$d' \in \mathbb{N}$$ such that $$(a,d') \in <_M$$ (since $$M \models \forall x \exists y (x < y)$$). Since $$M \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$ also $$(c,d') \in <_M$$. Since $$M \models \forall x \forall y ((x < y) \rightarrow (x \neq y))$$, $$a \neq d'$$. Hence, $$(c,d') \in <_{M_a}$$.

We now show that $$M_a \models \forall x \forall y ((x < y) \rightarrow (x \neq y))$$. Let therefore $$c,d \in \mathbb{N}$$ such that $$(c,d) \in <_{M_a}$$. Hence, $$(c,d) \in <_M$$. Since $$M \models \forall x \forall y ((x < y) \rightarrow (x \neq y))$$, $$(c,d) \notin =_M$$ and thus $$(c,d) \notin =_{M_a}$$.

We now show that $$M_a \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$. Let therefore $$c,d,e \in \mathbb{N}$$ such that $$(c,d) \in <_{M_a}$$ and $$(d,e) \in <_{M_a}$$. Then also $$(c,d) \in <_M$$ and $$(d,e) \in <_M$$. Since $$M \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$, also $$(c,e) \in <_M$$. By the definition of $$<_{M_a}$$, the only way for $$(c,e)$$ not to be in $$<_{M_a}$$ is if $$e = a$$. However, this is impossible since in that case $$(d,e)$$ were not in $$<_{M_a}$$ either. Hence, $$(c,e) \in <_{M_a}$$.

##### Question 2

Suppose now we define $$\Gamma'$$ adding the following to $$\Gamma$$:

1. $$\forall x \forall y (( x < y) \rightarrow \neg( y < x))$$
2. $$\forall x \forall y ((x < y) \vee (y < x))$$

Is the order $$\prec$$ on $$\mathcal{M}(\Gamma')$$ smooth?

##### Solution

Yes, now it is. To see this, suppose there are two models $$M, M' \in \mathcal{M}(\Gamma')$$ for which $$M \prec M'$$. That means that $${\sf ab}(M) \subset {\sf ab}(M')$$ and $$D_M = D_{M'}$$. Hence, there are $$a,b \in D_M$$ for which $$(a,b) \in <_{M'} \setminus <_M$$. Since $$M \models \forall x \forall y ((x < y) \vee (y < x))$$ this means that $$(b,a) \in <_M$$. Since $$M \prec M'$$, also $$(b,a) \in <_{M'}$$. However, $$M' \models \forall x \forall y (( x < y) \rightarrow \neg( y < x))$$ and hence, since $$(a,b) \in <_{M'}$$, $$(b,a) \notin <_{M'}$$, a contradiction.

This shows that in fact every model of $$\Gamma'$$ is $$\prec$$-minimal!

### Session 8: Some Properties

#### Recall

• Deduction Theorem: $$\Gamma \cup \{A\} \vdash B$$ implies $$\Gamma \vdash A \supset B$$.
• Resolution Theorem: $$\Gamma \vdash A \supset B$$ implies $$\Gamma \cup \{A\} \vdash B$$.

Does the resolution theorem hold for the reliability strategy?

Consider $$\Gamma = \{\circ p, \neg(\circ q \wedge \neg q) \vee \neg p\}$$.

Can you construct a counter-example to the resolution theorem on the basis of $$\Gamma$$?

##### Solution
• To show this take $$\Gamma$$.
• This premise set is normal, i.e., there are models of $$\Gamma$$ with an empty abnormal part.
• Clearly, $$\Gamma \vdash^r p$$. Thus also $$\Gamma \vdash^r (\circ q \wedge \neg q) \supset p$$.
• On the other hand, we do not have $$\Gamma \cup \{\circ q \wedge \neg q \} \vdash^r p$$.
• The reason is that $$\Gamma \cup \{\circ q \wedge \neg q\} \vdash_{CL_{\circ}} \neg p$$.

Does the resolution theorem hold for minimal abnormality?

If you found a counter-example in Task 1, can it be reused for minimal abnormality?

##### Solution

The argument is analogous to Task 1.

Does the deduction theorem hold for the reliability strategy?

Consider $$\Gamma = \{(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q), \circ p, \circ q\}$$.

Can you construct on the basis of $$\Gamma$$ a counter-example for the deduction theorem?

##### Solution
• $$\Gamma \cup \{\circ q \wedge \neg q\} \vdash^r p$$, while
• $$\Gamma \nvdash^r (\circ q \wedge \neg q) \supset p$$.
• The reason is that there is a reliable model $$M$$ of $$\Gamma$$ for which $$M \models \neg p$$ and $$M \models \neg q$$.
• In this model we have $$M \not\models p$$ and $$M \models \circ q \wedge \neg q$$.

Can you prove that the deduction theorem holds for the semantic consequence relation based on minimal abnormality? In other words, prove:

• $$\Gamma \cup \{A\} \Vdash^m B$$ implies $$\Gamma \Vdash^m A \supset B$$
##### Solution
• Suppose $$\Gamma \cup \{A\} \Vdash^m B$$.
• Let $$M$$ be a minimally abnormal model of $$\Gamma$$.
• Assume that $$M \not\models A \supset B$$.
• Then $$M \models A$$ and $$M \models \neg B$$.
• Since $$M \models A$$, $$M \in \mathcal{M}(\Gamma \cup \{A\})$$; and since $$\mathcal{M}(\Gamma \cup \{A\}) \subseteq \mathcal{M}(\Gamma)$$ and $$M$$ is minimally abnormal in $$\mathcal{M}(\Gamma)$$, also $$M \in \mathcal{M}^m(\Gamma \cup \{A\})$$.
• However, this is a contradiction: by the supposition $$M \models B$$, while $$M \models \neg B$$.
• Thus, $$M \models A \supset B$$.

#### Task 5

Prove: If $$\Gamma \Vdash^m A$$ for all $$A \in \Delta$$ then $$\mathcal{M}^m(\Gamma) = \mathcal{M}^m(\Gamma \cup \Delta)$$.

##### Solution
• Suppose $$\Gamma \Vdash^m A$$ for all $$A \in \Delta$$.
• Let $$M \in \mathcal{M}^m(\Gamma)$$.
• By the supposition, $$M \in \mathcal{M}(\Gamma \cup \Delta)$$.
• Suppose there is an $$M' \in \mathcal{M}(\Gamma \cup \Delta)$$ such that $${\sf Ab}(M') \subset {\sf Ab}(M)$$.
• However, since $$M' \in \mathcal{M}(\Gamma)$$, this contradicts the fact that $$M \in \mathcal{M}^m(\Gamma)$$.
• Thus, $$M \in \mathcal{M}^m(\Gamma \cup \Delta)$$.
• Let now $$M \in \mathcal{M}^m(\Gamma \cup \Delta)$$; clearly $$M \in \mathcal{M}(\Gamma)$$.
• Suppose there is an $$M' \in \mathcal{M}(\Gamma)$$ such that $${\sf Ab}(M') \subset {\sf Ab}(M)$$; by strong reassurance we may choose $$M' \in \mathcal{M}^m(\Gamma)$$.
• By the main supposition, $$M' \in \mathcal{M}(\Gamma \cup \Delta)$$.
• But then $${\sf Ab}(M') \subset {\sf Ab}(M)$$ contradicts $$M \in \mathcal{M}^m(\Gamma \cup \Delta)$$.
• Thus, $$M \in \mathcal{M}^m(\Gamma)$$.

#### Recall: CLuN

##### The semantics of CLuN

Let $$\mathcal{W}$$ be the set of well-formed formulas, and $$\mathcal{A}$$ the set of propositional atoms.

• take an assignment $$v : {\cal W} \rightarrow \{0,1\}$$ (note that the assignment assigns truth-values to all well-formed formulas, not just to atoms as in classical propositional logic)
• define the valuation $$v_M$$ associated with a model $$M$$ as follows:
• where $$A \in {\cal A}$$: $$v_M(A) = v(A)$$ (where $${\cal A}$$ is the set of atoms: this is as usual in classical models)
• where $$A \in {\cal W}$$: $$v_M({\sim} A) = \max( 1- v_M(A), v({\sim} A))$$ (this is the unusual part for the paraconsistent negation)
• $$v_M(A \vee B) = \max(v_M(A), v_M(B))$$
• $$v_M(A \wedge B) = \min(v_M(A), v_M(B))$$
• $$v_M(\neg A) = 1 - v_M(A)$$ (this is the classical negation)
• and either define $$A \supset B$$ by $$\neg A \vee B$$ or use $$v_M(A \supset B) = \max(1- v_M(A) , v_M(B))$$ (this is the classical material implication)
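The clauses above can be prototyped in a few lines. The following is a sketch of ours (not part of the course material); the tuple representation of formulas and the convention that the assignment defaults to 0 on formulas it does not mention are our own modelling choices:

```python
# Formulas as nested tuples: ('atom','p'); ('para', A) for ~A;
# ('neg', A) for classical ¬A; ('or', A, B); ('and', A, B); ('imp', A, B).
# The assignment v maps formulas to 0/1; unmentioned formulas default to 0.

def v_M(formula, v):
    """Valuation associated with the assignment v (CLuN semantics)."""
    op = formula[0]
    if op == 'atom':
        return v.get(formula, 0)                      # atoms: as in CL
    if op == 'para':                                  # paraconsistent ~
        return max(1 - v_M(formula[1], v), v.get(formula, 0))
    if op == 'neg':                                   # classical ¬
        return 1 - v_M(formula[1], v)
    if op == 'or':
        return max(v_M(formula[1], v), v_M(formula[2], v))
    if op == 'and':
        return min(v_M(formula[1], v), v_M(formula[2], v))
    if op == 'imp':                                   # material ⊃
        return max(1 - v_M(formula[1], v), v_M(formula[2], v))
    raise ValueError(op)

p = ('atom', 'p')
# a gluttier assignment: both p and ~p receive 1
v = {p: 1, ('para', p): 1}
assert v_M(('and', p, ('para', p)), v) == 1   # the glut p ∧ ~p holds
assert v_M(('neg', p), v) == 0                # classical ¬ stays classical
```

The independent assignment clause for `('para', A)` is exactly what makes $${\sim}$$ paraconsistent: a glut never propagates through $$\neg$$.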
##### The Axiomatisation of CLuN
• $$(A\supset1)$$: $$A\supset(B\supset A)$$
• $$(A\supset2)$$: $$(A\supset(B\supset C))\supset((A\supset B)\supset(A\supset C))$$
• $$(A\supset3)$$: $$((A\supset B)\supset A)\supset A$$
• $$(A\wedge1)$$: $$(A\wedge B)\supset A$$
• $$(A\wedge2)$$: $$(A\wedge B)\supset B$$
• $$(A\wedge3)$$: $$A\supset(B\supset(A\wedge B))$$
• $$(A\vee1)$$: $$A\supset(A\vee B)$$
• $$(A\vee2)$$: $$B\supset(A\vee B)$$
• $$(A\vee3)$$: $$(A\supset C)\supset((B\supset C)\supset((A\vee B)\supset C))$$
• $$(A\equiv1)$$: $$(A\equiv B)\supset(A\supset B)$$
• $$(A\equiv2)$$: $$(A\equiv B)\supset(B\supset A)$$
• $$(A\equiv3)$$: $$(A\supset B)\supset((B\supset A)\supset(A\equiv B))$$
• $$(A\neg1)$$: $$(A\supset\neg A)\supset\neg A$$
• $$(A\neg2)$$: $$A\supset(\neg A\supset B)$$
• $$(A\neg3)$$: $$A \vee \neg A$$
• $$(A{\sim}3)$$: $$A \vee {\sim} A$$

#### Task 1: A paraconsistent implication

We can define an implication $$\rightarrow$$ by $$A \rightarrow B =_{\rm df} {\sim}A \vee B$$.

Show that

1. Modus Ponens is not valid for $$\rightarrow$$ by showing that $$p, p \rightarrow q \nvdash_{\bf CLuN} q$$.
2. Contraposition is not valid for $$\rightarrow$$ by showing that $$p \rightarrow q \nvdash_{\bf CLuN} {\sim}q \rightarrow {\sim}p$$.

You can show each claim by giving an assignment $$v$$ under which, e.g. for claim 1, $$v_{M}(p) = v_{M}(p \rightarrow q) = 1$$ but $$v_{M}(q) = 0$$.

##### Solution

Let $$v_{M}$$ be based on an assignment $$v$$ for which $$v(p) = v({\sim}p) = 1$$ and $$v(q) = 0$$. Then

• $$M \models p$$
• $$M \models p \rightarrow q$$ since $$M \models {\sim}p \vee q$$ since $$M \models {\sim} p$$
• $$M \not\models q$$.

Let $$v_M$$ be based on an assignment $$v$$ for which $$v(p) = v(q) = v({\sim}q) = 1$$ and $$v({\sim}p) = v({\sim}{\sim}q) = 0$$. Then

• $$M \models p \rightarrow q$$ since $$M \models q$$
• $$M \not\models {\sim}q \rightarrow {\sim}p$$ since $$M \not\models {\sim}{\sim} q$$ and $$M \not\models {\sim}p$$.

#### Task 2: $${\sim}$$ is weak!

Show:

• $$p \nvdash_{\bf CLuN} {\sim}{\sim} p$$
• $${\sim}{\sim} p \nvdash_{\bf CLuN} p$$
• $${\sim}(p \vee q) \nvdash_{\bf CLuN} {\sim} p \wedge {\sim} q$$
• $${\sim} p \wedge {\sim} q \nvdash_{\bf CLuN} {\sim}(p \vee q)$$
##### Solution
• take $$v(p) = v({\sim}p) = 1$$ and $$v({\sim}{\sim}p) = 0$$ (note that $$v({\sim}p) = 1$$ is needed: otherwise $$v_M({\sim}p) = 0$$ and hence $$v_M({\sim}{\sim}p) = 1$$)
• take $$v({\sim}{\sim}p) = v({\sim}p) = 1$$ and $$v(p) = 0$$
• take $$v({\sim}(p \vee q)) = v(p) = v(q) = 1$$ and $$v({\sim}p) = v({\sim}q) = 0$$
• take $$v({\sim}p) = v({\sim}q) = v(p) = v(q) = 1$$ and $$v({\sim}(p \vee q)) = 0$$.

#### Task 3: what do you think?

Does the following hold?

• $$\neg{\sim} p \vdash_{\bf CLuN} p$$
• $${\sim}\neg p \vdash_{\bf CLuN} p$$
##### Solution
• Yes: $$v_M(\neg{\sim}p) = 1$$ iff $$v_M({\sim}p) = 0$$, which implies that $$v_M(p) = 1$$.
• No: take $$v(p) = 0$$ and $$v({\sim}\neg p) = 1$$. Then $$v_M(p) = 0$$ and thus $$v_M(\neg p) = 1$$, while $$v_M({\sim}\neg p) = 1$$.

#### Task 4: The Flip-Flop problem

Recall that CLuNs, the strengthening of CLuN by the De Morgan rules

• $${\sim}(A \wedge B) \vdash {\sim} A \vee {\sim} B$$
• $${\sim} A \vee {\sim} B \vdash {\sim}(A \wedge B)$$
• $${\sim}(A \vee B) \vdash {\sim} A \wedge {\sim} B$$
• $${\sim} A \wedge {\sim} B \vdash {\sim}(A \vee B)$$

and double negation introduction and elimination

• $${\sim}{\sim}A \vdash A$$
• $$A \vdash {\sim}{\sim} A$$

suffers from the so-called Flip-Flop problem. An adaptive logic suffers from this problem if the conditions of defeasible inferences get involved in minimal disjunctions of abnormalities although, intuitively speaking, they are not problematic. In other words, abnormalities spread in uncontrolled ways.

Demonstrate this problem for the adaptive logic based on CLuNs and the abnormalities $$A \wedge {\sim}A$$ where $$A$$ is any well-formed formula on the basis of the premise set $$\Gamma = \{p \wedge {\sim}p, {\sim}{\sim}q, {\sim}q \vee r\}$$.

Tip: we would expect $$r$$ to be derivable from $${\sim}{\sim} q$$ and $${\sim}q \vee r$$. Show that this is not the case, since the condition on which $$r$$ is derived is involved in a minimal disjunction of abnormalities.

##### Solution
| # | Formula | Rule | Condition |
|---|---------|------|-----------|
| 1 | $${\sim}{\sim} q$$ | PREM | $$\emptyset$$ |
| 2 | $$p \wedge {\sim}p$$ | PREM | $$\emptyset$$ |
| 3 | $$q \vee ({\sim} q \wedge {\sim}{\sim}q)$$ | 1;RU | $$\emptyset$$ |
| 4 | $$q$$ | 3;RU | $$\emptyset$$ |
| 5 | $${\sim}q \vee r$$ | PREM | $$\emptyset$$ |
| 6 | $$r$$ | 4,5;RC | $$\{q \wedge {\sim}q\}$$ |
| 7 | $$p \wedge r$$ | 2,6;RU | $$\{q \wedge {\sim}q\}$$ |
| 8 | $${\sim}p \vee {\sim}r$$ | 2;RU | $$\emptyset$$ |
| 9 | $${\sim}(p \wedge r)$$ | 8;RU | $$\emptyset$$ |
| 10 | $$(p\wedge r) \wedge {\sim}(p\wedge r)$$ | 7,9;RU | $$\{q \wedge {\sim}q\}$$ |
| 11 | $$[(p\wedge r) \wedge {\sim}(p\wedge r)] \vee [q \wedge {\sim}q]$$ | 10;RA | $$\emptyset$$ |

At this point line 6 is irrevocably marked (the disjunction at line 11 cannot be shortened).

#### Task 5: The incomparability of $${\bf CLuN}^{r}$$ and $${\bf CLuNs}^r$$

Give two examples that demonstrate that

1. the consequence relation based on $${\bf CLuN}^r$$ is not stronger than the one based on $${\bf CLuNs}^r$$, and
2. the consequence relation based on $${\bf CLuNs}^r$$ is not stronger than the one based on $${\bf CLuN}^r$$.

### Session 10: more paraconsistency

Take the adaptive logic based on CLuNs, the set of abnormalities $$\Omega = \{A \wedge {\sim}A \mid A \in {\rm Atoms}\}$$, and the reliability strategy.

Show: where $$\Gamma = \{p \wedge {\sim}p, {\sim}{\sim}q, {\sim}q \vee r\}$$, $$\Gamma \vdash_{{\bf CLuNs}^r} r$$.
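A brute-force semantic check of this claim (our own sketch; it leans on the adequacy of $${\bf CLuNs}^r$$ with respect to its semantics). A CLuNs model is determined here by choosing, for each atom $$A$$, the pair of values of $$A$$ and $${\sim}A$$, where $$(0,0)$$ is excluded by the axiom $$A \vee {\sim}A$$; $${\sim}{\sim}q$$ reduces to $$q$$ by double negation:

```python
from itertools import combinations, product

ATOMS = ('p', 'q', 'r')
# Possible value pairs (A, ~A): true-only, false-only, or glut.
STATES = [(1, 0), (0, 1), (1, 1)]

def models():
    """All CLuNs models (over p, q, r) of Γ = {p ∧ ~p, ~~q, ~q ∨ r}."""
    for combo in product(STATES, repeat=3):
        m = dict(zip(ATOMS, combo))
        t = {a: m[a][0] for a in ATOMS}    # v_M(a)
        tn = {a: m[a][1] for a in ATOMS}   # v_M(~a)
        # premises: p ∧ ~p;  ~~q (≡ q in CLuNs);  ~q ∨ r
        if t['p'] and tn['p'] and t['q'] and (tn['q'] or t['r']):
            yield m

def ab(m):
    """Abnormal part w.r.t. Ω = {a ∧ ~a | a an atom}."""
    return {a for a, (t, tn) in m.items() if t and tn}

ms = list(models())
dabs = [set(d) for n in (1, 2, 3) for d in combinations(ATOMS, n)
        if all(ab(m) & set(d) for m in ms)]
U = set().union(*[d for d in dabs if not any(e < d for e in dabs)])

print(U)  # {'p'}: only p is unreliable
print(all(m['r'][0] for m in ms if ab(m) <= U))  # True: Γ ⊩ʳ r
```

Since only $$p \wedge {\sim}p$$ is a minimal Dab-consequence, $$q$$ is reliable, so every reliable model verifies $$q$$, falsifies $${\sim}q$$, and hence verifies $$r$$.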

Let $$\Gamma = \{p, q, {\sim}q \vee {\sim}p, {\sim}q \vee r, {\sim}p \vee r\}$$.

1. $$\Gamma \vdash_{\bf CLuN}^r r$$?
2. $$\Gamma \vdash_{\bf CLuN}^m r$$?

Created: 2015-07-07 Tue 18:37
