# Course: Adaptive Logics

## Table of Contents

- Course Description
- Participants
- General Organisational Information
- Slides
- Exercises

\(\def\Dab{{\rm Dab}}\)

## Course Description

Adaptive logics offer a framework for the formal modeling of defeasible reasoning. When we reason defeasibly, our conclusions do not follow necessarily from our premises, but only plausibly or typically. One of the first applications of adaptive logics was reasoning on the basis of inconsistent information. The usefulness of Classical Logic is limited for this task, since it allows one to derive any formula whatsoever from an inconsistent premise set. Inconsistency-adaptive logics, on the other hand, adapt themselves to a premise set in the following sense: they isolate inconsistencies while at the same time allowing for the full power of Classical Logic on consistent parts of the premise set. The idea of letting a logic 'adapt itself to the premise set' has since been applied to many other forms of defeasible reasoning, such as abductive reasoning (deriving explanations: think of Sherlock Holmes), inductive generalisation (deriving from 'some X are A' that 'all X are A'), normative reasoning in view of conflicting norms ('You shall not kill, but if you do, do it gently.'), default reasoning ('Tweety is a bird, thus she flies.'), and many others. In this seminar we will study adaptive logics, their dynamic proof theory, their semantics, and some of their applications. If time allows, we will also investigate links to other central formal frameworks for defeasible reasoning. The literature will be announced in the first meeting. A basic understanding of propositional logic is advisable.

## Participants

Surname | First name |
---|---|
Wüste | Lisa |
Chechelnizki | Georg |
Roussel | Adam |
Bartels | Steven |
Liechti | Tobias Alexander |

## General Organisational Information

**Organised by:** Christian Straßer, GA 3/39, 0234/3224721

**Office hours:** Wednesdays from 10:00 to 12:00

**When:** every Tuesday from 10:15 to 11:45, summer term 2015

**Where:** GABF 04 / 358

### Credit Points and Marks

Credit points in this course can be earned on an individual basis, depending on your personal preferences.

The following options are available:

- Giving a talk on a relevant topic. (★★ / ★)
- Writing an essay / a paper on a relevant topic. (★★ / ★)
- Writing a protocol of a session. (★)
- Examination: usually a combination of a written and an oral exam, but this can be adjusted according to personal preference. (★★ / ★)
- Doing research on a relevant topic. (★★)

★★ means that you can earn both a mark and credit points with this option; ★ means that you can earn credit points only.

For all options, please talk with me about your plans first: this is important for choosing a relevant topic.

Option 5 is for students who feel very confident in the subject matter. It can serve, e.g., as preparation for a master's thesis.

### Exercises

There will be exercises for some sessions of the course. You are not obliged to do them, and I will not keep written track of who did what. However, I strongly recommend that you work on the exercises for your own sake: it will greatly improve your understanding.

### Questions and Interaction

I strongly encourage you to ask questions during the course. Do not be afraid to ask 'stupid/silly' questions: questions that seem naive are often the very best ones. Here are signs that should trigger immediate questions from your side:

- I don't know the meaning of this symbol on the slides/blackboard
- I cannot read your hand-writing
- you mentioned 'notion X': I have no idea what that word means
- I think you made a mistake there
- I think I got lost somehow
- this seems interesting, is there some interesting literature I could take a look at?

Please also feel free to come to office hours, especially when you are preparing something to earn credit points. I also encourage you to email me your questions.

## Slides

- v.1

- pdf/AL-course-slides.pdf
- printer-friendly: pdf/AL-course-slides-pdfjam.pdf

## Exercises


### Session 2: Introducing Hilbert Style Proof Systems

Recall the Hilbert-style axiomatisation of classical propositional logic from the slides:

Axiom | Schema |
---|---|
(A \(\supset\) 1) | \(A\supset(B\supset A)\) |
(A \(\supset\) 2) | \((A\supset(B\supset C))\supset((A\supset B)\supset(A\supset C))\) |
(A \(\supset\) 3) | \(((A\supset B)\supset A)\supset A\) |
(A \(\wedge\) 1) | \((A\wedge B)\supset A\) |
(A \(\wedge\) 2) | \((A\wedge B)\supset B\) |
(A \(\wedge\) 3) | \(A\supset(B\supset(A\wedge B))\) |
(A \(\vee\) 1) | \(A\supset(A\vee B)\) |
(A \(\vee\) 2) | \(B\supset(A\vee B)\) |
(A \(\vee\) 3) | \((A\supset C)\supset((B\supset C)\supset((A\vee B)\supset C))\) |
(A \(\equiv\) 1) | \((A\equiv B)\supset(A\supset B)\) |
(A \(\equiv\) 2) | \((A\equiv B)\supset(B\supset A)\) |
(A \(\equiv\) 3) | \((A\supset B)\supset((B\supset A)\supset(A\equiv B))\) |
(A \(\neg\) 1) | \((A\supset\neg A)\supset\neg A\) |
(A \(\neg\) 2) | \(A\supset(\neg A\supset B)\) |
(A \(\neg\) 3) | \(A \vee \neg A\) |
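
Each axiom schema above is a classical tautology. As an informal sanity check (not part of the course material), a few of the schemas can be verified by brute force over all valuations; the dictionary keys and the helper `imp` below are my own labels:

```python
from itertools import product

# Material implication '⊃' as a truth function.
def imp(a, b):
    return (not a) or b

# A selection of the axiom schemas, written as truth functions of
# their metavariables A, B, C (keys are informal labels).
axioms = {
    "A⊃1": lambda A, B, C: imp(A, imp(B, A)),
    "A⊃2": lambda A, B, C: imp(imp(A, imp(B, C)), imp(imp(A, B), imp(A, C))),
    "A⊃3": lambda A, B, C: imp(imp(imp(A, B), A), A),  # Peirce's law
    "A¬2": lambda A, B, C: imp(A, imp(not A, B)),
    "A¬3": lambda A, B, C: A or not A,
}

# A schema is a tautology iff it is true under every valuation.
def tautology(f):
    return all(f(A, B, C) for A, B, C in product([True, False], repeat=3))

for name, f in axioms.items():
    print(name, tautology(f))
```

The same brute-force check works for any of the remaining schemas.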

#### Task 1


Prove \(A \supset A\).

Tip: you only need \((A{\supset}1)\) and \((A{\supset}2)\).

Line | Formula | Justification |
---|---|---|
1 | \((A \supset ((A \supset A) \supset A)) \supset ((A \supset (A\supset A)) \supset (A \supset A))\) | \((A{\supset}2)\) where \(A:=A, B := A \supset A, C:=A\) |
2 | \(A \supset ((A\supset A) \supset A)\) | \((A{\supset}1)\) where \(A:=A, B:=A\supset A\) |
3 | \((A \supset (A\supset A)) \supset (A \supset A)\) | 1,2; MP |
4 | \(A \supset (A \supset A)\) | \((A{\supset}1)\) where \(A:= A, B:= A\) |
5 | \(A \supset A\) | 3,4; MP |

#### Task 2


Prove \((B \vee C) \supset (\neg B \supset C)\).

Line | Formula | Justification |
---|---|---|
1 | \(B \supset ( \neg B \supset C)\) | \((A{\neg}2)\) |
2 | \(C \supset (\neg B \supset C)\) | \((A{\supset}1)\) |
3 | \((B \supset (\neg B \supset C)) \supset ( (C \supset (\neg B \supset C)) \supset ((B \vee C) \supset (\neg B \supset C)))\) | \((A{\vee}3)\) |
4 | \((C \supset (\neg B \supset C)) \supset ((B \vee C) \supset (\neg B \supset C))\) | 1,3; MP |
5 | \((B \vee C) \supset (\neg B \supset C)\) | 2,4; MP |

#### Task 3


Prove \(A \supset B, B \supset C \vdash A \supset C\).

Line | Formula | Justification |
---|---|---|
1 | \((A \supset (B \supset C)) \supset ((A \supset B) \supset ( A \supset C))\) | \((A{\supset}2)\) |
2 | \(B \supset C\) | PREM |
3 | \((B \supset C) \supset (A \supset (B \supset C))\) | \((A{\supset}1)\) |
4 | \(A \supset (B \supset C)\) | 2,3; MP |
5 | \((A \supset B) \supset ( A \supset C)\) | 1,4; MP |
6 | \(A \supset B\) | PREM |
7 | \(A \supset C\) | 5,6; MP |

#### Task 4


Prove \(\neg \neg A \supset A\).

Line | Formula | Justification |
---|---|---|
1 | \(A \supset (\neg \neg A \supset A)\) | \((A{\supset}1)\) |
2 | \(\neg A \supset (\neg \neg A \supset A)\) | \((A{\neg}2)\) where \(A := \neg A, B := A\) |
3 | \((A \supset (\neg \neg A \supset A)) \supset ((\neg A \supset (\neg \neg A \supset A)) \supset ((A \vee \neg A) \supset (\neg \neg A \supset A)))\) | \((A{\vee}3)\) |
4 | \((\neg A \supset (\neg \neg A \supset A)) \supset ((A \vee \neg A) \supset (\neg \neg A \supset A))\) | 1,3; MP |
5 | \((A \vee \neg A) \supset (\neg \neg A \supset A)\) | 2,4; MP |
6 | \(A \vee \neg A\) | \((A{\neg}3)\) |
7 | \(\neg \neg A \supset A\) | 5,6; MP |

#### Task 5

Prove \(A \supset \neg \neg A\).

##### Solution

Line | Formula | Justification |
---|---|---|
1 | \((\neg A \supset \neg \neg A) \supset \neg \neg A\) | \((A{\neg}1)\) |
2 | \(((\neg A \supset \neg \neg A) \supset \neg \neg A) \supset (A \supset ((\neg A \supset \neg \neg A) \supset \neg \neg A))\) | \((A{\supset}1)\) |
3 | \(A \supset ((\neg A \supset \neg \neg A) \supset \neg \neg A)\) | 1,2; MP |
4 | \(A \supset (\neg A \supset \neg \neg A)\) | \((A{\neg}2)\) |
5 | \((A \supset ((\neg A \supset \neg \neg A) \supset \neg \neg A)) \supset ((A \supset (\neg A \supset \neg \neg A)) \supset (A \supset \neg \neg A))\) | \((A{\supset}2)\) |
6 | \((A \supset (\neg A \supset \neg \neg A)) \supset (A \supset \neg \neg A)\) | 3,5; MP |
7 | \(A \supset \neg \neg A\) | 4,6; MP |

#### Task 6


Show \(\{A, \neg A\} \vdash B\).

Tip: use \((A{\neg}2)\) and MP.

Line | Formula | Justification |
---|---|---|
1 | \(A\) | PREM |
2 | \(\neg A\) | PREM |
3 | \(A \supset (\neg A \supset B)\) | \((A{\neg}2)\) |
4 | \(\neg A \supset B\) | 1,3; MP |
5 | \(B\) | 2,4; MP |
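
Semantically, the same explosion phenomenon is immediate: no valuation satisfies both \(A\) and \(\neg A\), so the entailment holds vacuously. Here is a small Python sketch of that check (the helper `entails` is my own, not course notation):

```python
from itertools import product

# Semantic counterpart of Task 6: premises entail a conclusion iff every
# valuation that satisfies all premises also satisfies the conclusion.
def entails(premises, conclusion, nvars=2):
    return all(
        conclusion(*v)
        for v in product([True, False], repeat=nvars)
        if all(p(*v) for p in premises)
    )

# {A, ¬A} entails B vacuously: no valuation satisfies both premises.
print(entails([lambda a, b: a, lambda a, b: not a], lambda a, b: b))  # True
```

This is exactly why Classical Logic is 'explosive' on inconsistent premise sets, as mentioned in the course description.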

### Session 3: Mathematical Induction

Don't forget to take a look at the solutions of Session 2 and prepare Task 5 of Session 2 (we didn't discuss that example yet).

Recall from the course that mathematical induction works as follows:

You want to prove a statement \(S(n)\) for all natural numbers \(n \in \mathbb{N}\) (or for a subset of natural numbers \(N\)). E.g., in task one \(S(n)\) reads \(\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}\).

**Induction Base**: You start by proving it for the first natural number (in your set of interest \(N\)). For \(\mathbb{N}\) this is 1 (except you begin to count at 0). In our example you first prove \(1 = \frac{1 \cdot (1+1)}{2}\).

**Induction Step**: Then you suppose that you have shown your statement \(S(\cdot)\) for an arbitrary number \(n\) (this assumption is called the **induction hypothesis** or the **inductive hypothesis**) and on this basis show that your statement also holds for \(n+1\). In our first example you suppose that \(\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}\) holds and then you show that \(\sum_{i=1}^{n+1} i = \frac{(n+1)\cdot(n+2)}{2}\) holds.

There is a variant of this proof method called **strong mathematical induction**: in the induction step you assume that \(S(\cdot)\) holds for all \(m \le n\) for an arbitrary \(n\), and on this basis demonstrate that \(S(n+1)\) holds.

#### Task 1

Prove via mathematical induction that \(\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}\).

- Base Case (\(n=1\)): \(\sum_{i = 1}^{n} i = 1\) and \(\frac{n \cdot (n+1)}{2} = \frac{1 \cdot 2}{2} = 1\). Thus, \(\sum_{i = 1}^{n} i = \frac{n \cdot (n+1)}{2}\).
- Induction step: The inductive hypothesis is \(\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}\). We need to show that \(\sum_{i=1}^{n+1} i = \frac{(n+1)\cdot (n+2)}{2}\). We have:
- \(\sum_{i=1}^{n+1} i = \left(\sum_{i=1}^n i \right) + (n+1)\)
- By the inductive hypothesis, \(\left(\sum_{i=1}^n i \right) + (n+1) = \frac{n \cdot (n+1)}{2} + (n+1)\)
- Note that \(\frac{n \cdot (n+1)}{2} + (n+1) = \frac{n \cdot (n+1)}{2} + \frac{2 \cdot (n+1)}{2} = \frac{n \cdot (n+1) + 2 \cdot (n+1)}{2} = \frac{(n+2) \cdot (n+1)}{2} = \frac{(n+1) \cdot (n+2)}{2}\).
- By 2 and 3, \(\sum_{i=1}^{n+1} i = \frac{(n+1) \cdot (n+2)}{2}\).
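
The closed form can also be spot-checked numerically. The following Python snippet is an informal check for small \(n\); it complements, but of course does not replace, the inductive proof above:

```python
# Spot-check the identity sum_{i=1}^n i = n(n+1)/2 for n = 1..100.
for n in range(1, 101):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
print("checked for n = 1, ..., 100")
```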

#### Task 2

Prove via mathematical induction that \(T(n) = 2^n - 1\) where $$T(n) = \left\{ \begin{array}{cc} 1 & n=1 \\ 2 \cdot T(n-1) + 1 & n > 1 \end{array} \right.$$

- Base Case (\(n=1\)): \(T(1) = 1 = 2^1 - 1\).
- Inductive step: The inductive hypothesis is that \(T(n) = 2^n - 1\). We need to show that \(T(n+1) = 2^{n+1} - 1\).
- \(T(n + 1) = 2 \cdot T(n + 1 - 1) + 1 = 2 \cdot T(n) + 1\).
- By the inductive hypothesis and 1, \(T(n + 1) = 2 \cdot (2^n - 1) + 1\).
- Note that \(2 \cdot (2^n - 1) + 1 = (2^{n+1} - 2) + 1 = 2^{n+1} - 1\).
- By 2 and 3, \(T(n+1) = 2^{n+1} - 1\).
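
Again, the recurrence can be spot-checked against the closed form for small \(n\) (an informal check only, not a proof):

```python
# The recurrence from the task: T(1) = 1, T(n) = 2*T(n-1) + 1 for n > 1.
def T(n):
    return 1 if n == 1 else 2 * T(n - 1) + 1

# Compare against the closed form 2^n - 1 proved above.
for n in range(1, 31):
    assert T(n) == 2**n - 1
print("checked for n = 1, ..., 30")
```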

### Session 4: Some More Proofs, the Deduction Theorem and Transitivity

*Don't forget that in tasks you can rely on results obtained in previous tasks.*

#### Task 1: Reasoning by Cases

Prove: \(A \vee B, A \supset C, B \supset C \vdash C\).

##### Solution

1. \(A \supset C\) by premise introduction.
2. \((A \supset C) \supset ((B\supset C) \supset ((A \vee B) \supset C))\) by \((A\vee 3)\).
3. \((B\supset C) \supset ((A \vee B) \supset C)\) by 1, 2 and MP.
4. \(B\supset C\) by premise introduction.
5. \((A \vee B) \supset C\) by 3, 4 and MP.
6. \(A \vee B\) by premise introduction.
7. \(C\) by 5, 6 and MP.

#### Task 2: Transitivity

Show that if \(\Gamma \vdash A\) for all \(A \in \Gamma'\) and \(\Gamma' \vdash B\), then also \(\Gamma \vdash B\).

##### Solution

- Suppose \(\Gamma \vdash A\) for all \(A \in \Gamma'\) and suppose \(\Gamma' \vdash B\).
- Hence, there is a proof of \(B\) from \(\Gamma'\).
- Since proofs are finite, there is a finite subset \(\Gamma'_f = \{B_{1}, \ldots, B_{n}\}\) of \(\Gamma'\) such that there is a proof \(\mathcal{P}'\) of \(B\) from \(\Gamma'_f\).
- For each \(B_i\) (\(i \in \{1, \ldots, n\}\)) we know that there is a proof \(\mathcal{P}_i\) of \(B_i\) from \(\Gamma\).
- We now concatenate the proofs \(\mathcal{P}_1, \ldots, \mathcal{P}_n\) (renumbering the proof lines) into a single proof \(\mathcal{P}^{\star}\).
- Let \(\mathcal{P}''\) be the result of
  - removing each line of \(\mathcal{P}'\) at which some \(B_i\) (\(i \in \{1, \ldots, n\}\)) is introduced by premise introduction,
  - and replacing, in the justifications of the remaining lines, each reference to such a line by a reference to the line of \(\mathcal{P}^{\star}\) at which \(B_i\) is derived.
- Now we concatenate \(\mathcal{P}^{\star}\) and \(\mathcal{P}''\) (again renumbering lines and the references to them).
- The result is a proof of \(B\) from \(\Gamma\).

#### Remark

Recall that last session we proved the deduction theorem. It can drastically simplify proofs.

Here is one example of how it can be used in a proof.

Prove: \(\vdash (A \supset B) \supset ((B \supset C) \supset (A \supset C))\).

1. Suppose \(A \supset B\).
2. Suppose \(B \supset C\).
3. Suppose \(A\).
4. By 1, 3 and MP, \(B\).
5. By 2, 4 and MP, \(C\).
6. We have shown that \(A \supset B, B \supset C, A \vdash C\).
7. By the deduction theorem, \(A \supset B, B \supset C \vdash A \supset C\).
8. By the deduction theorem again, \(A \supset B \vdash (B \supset C) \supset (A \supset C)\).
9. By the deduction theorem once more, \(\vdash (A \supset B) \supset ((B \supset C) \supset (A \supset C))\).

#### Task 3

Show \(\vdash (A \supset B) \supset (\neg A \vee B)\).

##### Solution

1. Suppose \(A \supset B\).
2. Suppose \(A\).
3. By 1, 2 and MP, \(B\).
4. By \((A\vee 2)\), \(B \supset (\neg A \vee B)\).
5. By 3, 4 and MP, \(\neg A \vee B\).
6. We have shown that \(A \supset B, A \vdash \neg A \vee B\); by the deduction theorem, \(A \supset B \vdash A \supset (\neg A \vee B)\).
7. Suppose \(\neg A\).
8. By \((A\vee 1)\), \(\neg A \supset (\neg A \vee B)\).
9. By 7, 8 and MP, \(\neg A \vee B\).
10. We have shown that \(A \supset B, \neg A \vdash \neg A \vee B\); by the deduction theorem, \(A \supset B \vdash \neg A \supset (\neg A \vee B)\).
11. Note that by \((A \neg 3)\), also \(A \supset B \vdash A \vee \neg A\).
12. By 6, 10, 11, Task 1 (reasoning by cases) and Task 2 (transitivity), also \(A \supset B \vdash \neg A \vee B\). (Do you see why?)
13. We have shown that \(A \supset B \vdash \neg A \vee B\); by the deduction theorem, \(\vdash (A \supset B) \supset (\neg A \vee B)\).

#### Task 4

Show \(\vdash (\neg A \vee B) \supset (A \supset B)\).

##### Solution

1. Suppose \(\neg A \vee B\).
2. By \((A \supset 1)\), \(B \supset (A \supset B)\).
3. Suppose \(\neg A\).
4. Suppose \(A\).
5. By a previous task (Session 2, Task 6: \(\{A, \neg A\} \vdash B\)), \(B\).
6. We have shown that \(\neg A \vee B, \neg A, A \vdash B\); by the deduction theorem, \(\neg A \vee B, \neg A \vdash A \supset B\).
7. By the deduction theorem again, \(\neg A \vee B \vdash \neg A \supset (A \supset B)\).
8. Hence, by 1, 2, 7, Task 1 (reasoning by cases) and Task 2 (transitivity), also \(\neg A \vee B \vdash A \supset B\). (Do you see why?)
9. We have shown that \(\neg A \vee B \vdash A \supset B\); by the deduction theorem, \(\vdash (\neg A \vee B) \supset (A \supset B)\).

### Session 5: Dynamic Proofs and Adaptive Strategies

Note that in tasks you can always use results you obtained in previous tasks.

#### Task 1

Show that: if \(\Gamma \vdash A \vee \Dab(\Delta)\) (where \(\vdash\) is the derivability relation of the lower limit logic) then \(A\) is derivable on a (non-empty) condition \(\Delta\) in an adaptive proof from the premise set \(\Gamma\).

For the courageous: Show the other direction. (You will need an inductive argument over the length of the dynamic proof in which \(A\) is derived on the condition \(\Delta\).)

##### Solution

(\(\Rightarrow\))

- Let \(\Gamma \vdash A \vee \Dab(\Delta)\).
- Hence, there is a proof in LLL of \(A \vee \Dab(\Delta)\) from \(\Gamma\).
- Hence, there is a finite subset \(\{A_1, \ldots, A_n\}\) of \(\Gamma\) from which \(A \vee \Dab(\Delta)\) is provable.
- We now construct a dynamic proof of \(A\) from \(\Gamma\) as follows:

Line | Formula | Justification | Condition |
---|---|---|---|
1 | \(A_1\) | PREM | \(\emptyset\) |
2 | \(A_2\) | PREM | \(\emptyset\) |
\(\vdots\) | \(\vdots\) | PREM | \(\emptyset\) |
\(n\) | \(A_n\) | PREM | \(\emptyset\) |
\(n{+}1\) | \(A\) | \(1,\ldots,n\); RC | \(\Delta\) |

(\(\Leftarrow\))

We prove this by induction over the length of the proof of \(A\) on the condition \(\Delta\). For simplification we let '\(\vee \Dab(\emptyset)\)' denote the empty string.

*Base case (line 1)*: \(B\) is introduced as a premise on the condition \(\emptyset\). Hence, \(B \in \Gamma\), thus \(\Gamma \vdash B\), and hence \(\Gamma \vdash B \vee \Dab(\emptyset)\).

*Inductive Step (line \(n{+}1\))*: The inductive hypothesis is: if \(C\) is derived on the condition \(\Theta\) at a line \(m \leq n\), then \(\Gamma \vdash C \vee \Dab(\Theta)\).

Suppose \(B\) is derived at line \(n{+}1\) on the condition \(\Theta\) with the justification J.

- Case J = PREM: The argument is analogous to the base case.
- Case J = \(l_1, \ldots, l_m\); RU:
- Suppose \(A_i\) is derived on the condition \(\Delta_i\) at line \(l_{i}\) (where \(1 \le i \le m\)).
- Hence, \(A_1, \ldots, A_m \vdash B\) (by the definition of the rule RU) and \(\Theta = \Delta_1 \cup \ldots \cup \Delta_m\).
- For all \(i \in \{1, \ldots, m\}\) we have:
- By the inductive hypothesis, \(\Gamma \vdash A_i \vee \Dab(\Delta_i)\).
- Hence, \(\Gamma \vdash A_i \vee \neg\neg \Dab(\Delta_i)\).
- Hence, \(\Gamma \vdash \neg\neg \Dab(\Delta_i) \vee A_i\). (Recall: \(A \vee B \vdash B \vee A\).)
- Thus, \(\Gamma \vdash \neg\Dab(\Delta_i) \supset A_i\) (Recall: \(\neg A \vee B \vdash A \supset B\).)
- Thus, \(\Gamma \cup \{\neg \Dab(\Delta_i)\} \vdash A_i\). (Resolution Theorem)
- Thus, \(\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash A_i\) (Monotonicity)

- Hence, \(\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash A_1 \wedge \ldots \wedge A_m\). (Recall that \(A \vdash B_1\) and \(A \vdash B_2\) imply \(A \vdash B_1 \wedge B_2\).)
- Since \(A_1, \ldots, A_m \vdash B\), also \(A_1 \wedge \ldots \wedge A_m \vdash B\).
- By the previous two items and transitivity, \(\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash B\).
- Thus, \(\Gamma \vdash B \vee \Dab(\Delta_1) \vee \Dab(\Delta_2) \vee \ldots \vee \Dab(\Delta_m)\). (By repeated application of the deduction theorem.)
- Hence, \(\Gamma \vdash B \vee \Dab(\Delta_1 \cup \ldots \cup \Delta_m)\), i.e., \(\Gamma \vdash B \vee \Dab(\Theta)\).

- Case J = \(l_1, \ldots, l_m\); RC: this can be shown in a similar way as the previous case.

#### Task 2

Show that:

- Whenever \(A\) is derivable on a condition \(\Delta \cup \{B\}\) in a dynamic proof from \(\Gamma\) then also \(A \vee B\) is derivable from \(\Gamma\) on the condition \(\Delta\).
- Whenever \(A\) is derivable on a condition \(\Delta\) and \(\neg A\) is derivable on the condition \(\emptyset\) in a dynamic proof from \(\Gamma\), then also \(\Dab(\Delta)\) is derivable from \(\Gamma\) on the condition \(\emptyset\).

(Use task 1).

##### Solution

Ad 1.:

- Suppose \(A\) is derived on the condition \(\Delta \cup \{B\}\) from \(\Gamma\).
- By Task 1, \(\Gamma \vdash A \vee \Dab(\Delta \cup \{B\})\).
- Hence, \(\Gamma \vdash A \vee \Dab(\Delta) \vee B\) and thus \(\Gamma \vdash (A \vee B) \vee \Dab(\Delta)\). (Recall that \(\vee\) is associative (\(A \vee (B \vee C) \vdash (A \vee B) \vee C\)) and commutative (\(A \vee B \vdash B \vee A\)).)
- Again, by Task 1, \(A \vee B\) is derivable from \(\Gamma\) on the condition \(\Delta\).

Ad 2.:

- Suppose \(A\) is derivable on the condition \(\Delta\) and \(\neg A\) is derivable on the condition \(\emptyset\) in a proof from \(\Gamma\).
- By Task 1, \(\Gamma \vdash A \vee \Dab(\Delta)\) and \(\Gamma \vdash \neg A\).
- Thus, \(\Gamma \vdash \Dab(\Delta)\) (disjunctive syllogism).

#### Remark on Task 2

Task 2 shows that the following generic rules are derived rules for dynamic proofs in adaptive logics:

$$\begin{array}{rl} A & \Delta \cup \Delta' \\ \hline A \vee \Dab(\Delta) & \Delta' \end{array} ~~~ [\rm RD1]$$

$$\begin{array}{rl} A & \Delta \\ \neg A & \emptyset \\\hline \Dab(\Delta) & \emptyset \end{array} ~~~[\rm RD2]$$

#### Task 3

Indicate the mistakes in the following proof fragment from \(\Gamma = \{(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q), (\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q), \circ p, \circ q\}\), where the reliability strategy is used:

Line | Formula | Justification | Condition | Mark |
---|---|---|---|---|
1 | \((\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)\) | PREM | \(\emptyset\) | |
2 | \((\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q)\) | PREM | \(\emptyset\) | |
3 | \(\circ p \wedge \neg p\) | 1,2; RU | \(\emptyset\) | |
4 | \(\circ p\) | PREM | \(\emptyset\) | |
5 | \(p\) | 4; RU | \(\{\circ p \wedge \neg p\}\) | |
6 | \(\circ q\) | RU | \(\emptyset\) | |
7 | \(q\) | 6; RC | \(\{\circ q \wedge \neg q\}\) | \(\checkmark\) |

##### Solution

Line | Formula | Justification | Condition | Mark | Correction |
---|---|---|---|---|---|
1 | \((\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)\) | PREM | \(\emptyset\) | | |
2 | \((\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q)\) | PREM | \(\emptyset\) | | |
3 | \(\circ p \wedge \neg p\) | 1,2; RU | \(\emptyset\) | | |
4 | \(\circ p\) | PREM | \(\emptyset\) | | |
5 | \(p\) | 4; RU | \(\{\circ p \wedge \neg p\}\) | | RU → RC; the line gets marked |
6 | \(\circ q\) | RU | \(\emptyset\) | | RU → PREM |
7 | \(q\) | 6; RC | \(\{\circ q \wedge \neg q\}\) | \(\checkmark\) | no marking |

#### Task 4

Let \(\Gamma = \{\circ p, \circ( p \supset q), \neg q\}\). Extend the following proof so that line 2 gets marked (where the reliability strategy is used).

Line | Formula | Justification | Condition |
---|---|---|---|
1 | \(\circ p\) | PREM | \(\emptyset\) |
2 | \(p\) | 1; RC | \(\{\circ p \wedge \neg p\}\) |

##### Solution

Line | Formula | Justification | Condition | Mark |
---|---|---|---|---|
1 | \(\circ p\) | PREM | \(\emptyset\) | |
2 | \(p\) | 1; RC | \(\{\circ p \wedge \neg p\}\) | \(\checkmark\) |
3 | \(\circ(p \supset q)\) | PREM | \(\emptyset\) | |
4 | \(\neg q\) | PREM | \(\emptyset\) | |
5 | \(p \supset q\) | 3; RC | \(\{\circ(p \supset q) \wedge \neg (p \supset q)\}\) | |
6 | \(q\) | 2,5; RU | \(\{\circ p \wedge \neg p,\circ(p \supset q) \wedge \neg (p \supset q)\}\) | |
7 | \((\circ p \wedge \neg p) \vee (\circ(p \supset q) \wedge \neg (p \supset q))\) | 4,6; RD2 | \(\emptyset\) | |

#### Task 5

Let \(\Gamma = \{\circ p, \circ q, \neg q, \neg(p \wedge q)\}\). Extend the following proof in such a way that \(p\) is finally derived at line 2 (where the reliability strategy is used).

Line | Formula | Justification | Condition |
---|---|---|---|
1 | \(\circ p\) | PREM | \(\emptyset\) |
2 | \(p\) | 1; RC | \(\{\circ p \wedge \neg p\}\) |
3 | \(\circ q\) | PREM | \(\emptyset\) |
4 | \(\neg(p \wedge q)\) | PREM | \(\emptyset\) |
5 | \((\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)\) | 1,3,4; RU | \(\emptyset\) |

##### Solution

Line | Formula | Justification | Condition |
---|---|---|---|
1 | \(\circ p\) | PREM | \(\emptyset\) |
2 | \(p\) | 1; RC | \(\{\circ p \wedge \neg p\}\) |
3 | \(\circ q\) | PREM | \(\emptyset\) |
4 | \(\neg(p \wedge q)\) | PREM | \(\emptyset\) |
5 | \((\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)\) | 1,3,4; RU | \(\emptyset\) |
6 | \(\neg q\) | PREM | \(\emptyset\) |
7 | \(\circ q \wedge \neg q\) | 3,6; RU | \(\emptyset\) |

#### Task 6

Let \(\Gamma = \{\circ p, \circ q, \neg p \vee \neg q \vee \neg s\}\). Finally derive \(\neg s\) (where the reliability strategy is used).

##### Solution

Line | Formula | Justification | Condition |
---|---|---|---|
1 | \(\circ p\) | PREM | \(\emptyset\) |
2 | \(p\) | 1; RC | \(\{\circ p \wedge \neg p\}\) |
3 | \(\circ q\) | PREM | \(\emptyset\) |
4 | \(q\) | 3; RC | \(\{\circ q \wedge \neg q\}\) |
5 | \(\neg p \vee \neg q \vee \neg s\) | PREM | \(\emptyset\) |
6 | \(\neg q \vee \neg s\) | 2,5; RU | \(\{\circ p \wedge \neg p\}\) |
7 | \(\neg s\) | 4,6; RU | \(\{\circ q \wedge \neg q\}\) |

#### Task 7

Let \(\Gamma = \{\circ p, \circ q, \neg p \vee \neg q \vee \neg s, \circ s\}\).

- Is \(\neg s\) still finally derivable?
- Finally derive \(p \vee q \vee s\) with the minimal abnormality strategy.

##### Solution

Ad 1: No. Here is a demonstration:

Line | Formula | Justification | Condition | Mark |
---|---|---|---|---|
1 | \(\circ p\) | PREM | \(\emptyset\) | |
2 | \(\circ q\) | PREM | \(\emptyset\) | |
3 | \(\circ s\) | PREM | \(\emptyset\) | |
4 | \(\neg p \vee \neg q \vee \neg s\) | PREM | \(\emptyset\) | |
5 | \(\neg s\) | 1,2,4; RC | \(\{\circ p \wedge \neg p, \circ q \wedge \neg q\}\) | \(\checkmark_6\) |
6 | \((\circ p \wedge \neg p) \vee (\circ q \wedge \neg q) \vee (\circ s \wedge \neg s)\) | 1,2,3,4; RU | \(\emptyset\) | |

Ad 2:

Line | Formula | Justification | Condition | Mark |
---|---|---|---|---|
1 | \(\circ p\) | PREM | \(\emptyset\) | |
2 | \(\circ q\) | PREM | \(\emptyset\) | |
3 | \(\circ s\) | PREM | \(\emptyset\) | |
4 | \(\neg p \vee \neg q \vee \neg s\) | PREM | \(\emptyset\) | |
5 | \(\neg s\) | 1,2,4; RC | \(\{\circ p \wedge \neg p, \circ q \wedge \neg q\}\) | \(\checkmark_6\) |
6 | \((\circ p \wedge \neg p) \vee (\circ q \wedge \neg q) \vee (\circ s \wedge \neg s)\) | 1,2,3,4; RU | \(\emptyset\) | |
7 | \(p\) | 1; RC | \(\{\circ p \wedge \neg p\}\) | |
8 | \(p \vee q \vee s\) | 7; RU | \(\{\circ p \wedge \neg p\}\) | |
9 | \(p \vee q \vee s\) | 2; RC | \(\{\circ q \wedge \neg q\}\) | |
10 | \(p \vee q \vee s\) | 3; RC | \(\{\circ s \wedge \neg s\}\) | |

Note that \(\Phi_{10}(\Gamma) = \bigl\{ \{\circ p \wedge \neg p\}, \{\circ q \wedge \neg q\}, \{\circ s \wedge \neg s\} \bigr\}\). Hence, lines 8–10 are not marked at this point according to the marking definition of minimal abnormality.

### Session 7: Smoothness and Well-Foundedness

Let in the following \(\mathbb{N} = \{1, 2, 3, \ldots \}\) be the set of natural numbers.

#### Smoothness and Well-foundedness

Let \({\prec} \subseteq X \times X\).

- An element \(x \in X\) is **\(\prec\)-minimal** iff there is no \(y \in X\) such that \(y \prec x\).
- \(\prec\) is **smooth** (or **stoppered**) iff for each \(x \in X\), either \(x\) is \(\prec\)-minimal or there is a \(\prec\)-minimal \(y \in X\) such that \(y \prec x\).
- \(\prec\) is **well-founded** iff there is no infinite descending sequence \(\ldots \prec x_1 \prec x_{0}\).
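
On finite examples these definitions can be checked mechanically. The following Python sketch encodes a relation as a set of pairs \((y, x)\) meaning \(y \prec x\); the helper names `minimal` and `smooth` are my own:

```python
# Minimality and smoothness for a finite relation, following the
# definitions above. `prec` is a set of pairs (y, x) meaning y ≺ x.
def minimal(x, X, prec):
    return not any((y, x) in prec for y in X)

def smooth(X, prec):
    return all(
        minimal(x, X, prec)
        or any(minimal(y, X, prec) and (y, x) in prec for y in X)
        for x in X
    )

# The natural order on {1, 2, 3, 4}: 1 is minimal and below everything.
X = {1, 2, 3, 4}
lt = {(a, b) for a in X for b in X if a < b}
print(smooth(X, lt))  # True

# A two-element cycle: neither element is minimal, so smoothness fails.
print(smooth({1, 2}, {(1, 2), (2, 1)}))  # False
```

Of course, the interesting cases in the tasks below involve infinite domains, where no such mechanical check is possible.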

#### Task 1

Take \(<\) to be the natural ordering on the natural numbers \(\mathbb{N}\) (e.g., \(4<6\), etc.).

- Is \(<\) smooth?
- Is \(<\) well-founded?

##### Solution

- Yes. For any \(x \in \mathbb{N}\), \(x =1\) or \(1 < x\). Clearly, \(1\) is minimal.
- Yes. If we fix a starting point \(x_{0}\) there are always finitely many steps down to \(1\).

#### Task 2

Let \({\prec} \subseteq \mathbb{N} \times \mathbb{N}\) be defined by \(x \prec y\) iff \(y < x\).

- Is \(\prec\) smooth?
- Is \(\prec\) well-founded?

##### Solution

- No. There are no \(\prec\)-minimal elements, because there are no \(<\)-maximal elements in \(\mathbb{N}\).
- No. If it is not smooth, it cannot be well-founded. (Try to see why!)

#### Task 3

Let \(<\) be the natural ordering on \(\mathbb{N} \cup \{\infty\}\) where \(x < \infty\) for all \(x \in \mathbb{N}\).

- Is \(<\) smooth?
- Is \(<\) well-founded?

##### Solution

- Yes, for the same reason as in Task 1.
- Yes. Any descending sequence starting at \(\infty\) drops into \(\mathbb{N}\) after one step, and every strictly descending sequence of natural numbers is finite. So although there are arbitrarily long finite descents from \(\infty\), there is no infinite one.

#### Task 4

Let \({\prec} \subseteq (\mathbb{N} \cup \{\infty\}) \times (\mathbb{N} \cup \{\infty\})\) be defined by \(x \prec y\) iff \(y < x\) (where \(<\) is as defined in Task 3).

- Is \(\prec\) smooth?
- Is \(\prec\) well-founded?

##### Solution

- Yes. Now we have the \(\prec\)-minimal element \(\infty\).
- No. Given any starting point \(x_0\) there are infinitely many steps '\(\prec\)-down to' \(\infty\).

#### Task 5

Suppose our first order language contains the relational symbol \(<\). Our premise set \(\Gamma\) consists of

- \(\forall x \exists y (x < y)\)
- \(\forall x \forall y ((x < y) \rightarrow \neg(x = y))\)
- \(\forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))\)

Recall that a model \(M\) comes with a domain \(D\) and it interprets \(<\) by giving it an extension \(\prec_M \subseteq (D \times D)\). Every model interprets \(=\) in the same way relative to the domain of \(M\), namely by \({=_M} = \{(d,d) \mid d \in D\}\).

One model \(M\) of \(\Gamma\) is for instance the one with domain \(\mathbb{N}\) where \(<_M\) is simply the natural ordering on \(\mathbb{N}\) (so \(1 < 2< 3 <4 \ldots\)).

Formulas are then evaluated as expected, e.g.,

- \(M \models \forall x \exists y (x < y)\) iff for all \(a \in D_M\) there is a \(b \in D_M\) for which \((a,b) \in <_M\).

##### Question 1

Suppose we define \({\sf ab}(M) = \{ (a,b) \mid (a,b) \in {<_{M}} \}\) and order our models by \(M \prec M'\) iff \(D_M = D_{M'}\) and \({\sf ab}(M) \subset {\sf ab}(M')\).

Let \(\mathcal{M}(\Gamma)\) be the set of all models of \(\Gamma\).

Is the order \(\prec\) on \(\mathcal{M}(\Gamma)\) smooth?

##### Solution

No! Take models with domain \(\mathbb{N}\). Assume some such model \(M\) is \(\prec\)-minimal in \(\mathcal{M}(\Gamma)\). Note that there is an \(a \in \mathbb{N}\) for which there is a \(b \in \mathbb{N}\) such that \((b,a) \in {<_M}\), since \(M \models \forall x \exists y (x < y)\) and \(\mathbb{N} \neq \emptyset\). Let \(M_a\) be just like \(M\), except that \({<_{M_a}} = \{ (c,d) \in {<_M} \mid d \neq a\}\). Note that \({\sf ab}(M_a) = {\sf ab}(M) \setminus \{ (c,d) \in {<_M} \mid d = a \}\).

If \(M_a \in \mathcal{M}(\Gamma)\) we have reached a contradiction to the \(\prec\)-minimality of \(M\).

We now show that \(M_a \models \forall x \exists y (x < y)\). Let \(c \in \mathbb{N}\). Since \(M \in \mathcal{M}(\Gamma)\), there is a \(d \in \mathbb{N}\) for which \((c,d) \in <_M\). If \(d \neq a\) then also \((c,d) \in <_{M_a}\). Suppose \(d = a\). Then, there is a \(d' \in \mathbb{N}\) such that \((a,d') \in <_M\) (since \(M \models \forall x \exists y (x < y)\)). Since \(M \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))\) also \((c,d') \in <_M\). Since \(M \models \forall x \forall y ((x < y) \rightarrow (x \neq y))\), \(a \neq d'\). Hence, \((c,d') \in <_{M_a}\).

We now show that \(M_a \models \forall x \forall y ((x < y) \rightarrow (x \neq y))\). Let therefore \(c,d \in \mathbb{N}\) such that \((c,d) \in <_{M_a}\). Hence, \((c,d) \in <_M\). Since \(M \models \forall x \forall y ((x < y) \rightarrow (x \neq y))\), \((c,d) \notin =_M\) and thus \((c,d) \notin =_{M_a}\).

We now show that \(M_a \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))\). Let therefore \(c,d,e \in \mathbb{N}\) such that \((c,d) \in <_{M_a}\) and \((d,e) \in <_{M_a}\). Then also \((c,d) \in <_M\) and \((d,e) \in <_M\). Since \(M \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))\), also \((c,e) \in <_M\). By the definition of \(<_{M_a}\), the only way for \((c,e)\) not to be in \(<_{M_a}\) is if \(e = a\). However, this is impossible since in that case \((d,e)\) were not in \(<_{M_a}\) either. Hence, \((c,e) \in <_{M_a}\).

##### Question 2

Suppose now we define \(\Gamma'\) adding the following to \(\Gamma\):

- \(\forall x \forall y (( x < y) \rightarrow \neg( y < x))\)
- \(\forall x \forall y ((x < y) \vee (y < x))\)

Is the order \(\prec\) on \(\mathcal{M}(\Gamma')\) smooth?

##### Solution

Yes, now it is. To see this, suppose there are two models \(M, M' \in \mathcal{M}(\Gamma')\) for which \(M \prec M'\). That means that \({\sf ab}(M) \subset {\sf ab}(M')\) and \(D_M = D_{M'}\). Hence, there are \(a,b \in D_M\) for which \((a,b) \in {<_{M'}} \setminus {<_M}\). Since \(M \models \forall x \forall y ((x < y) \vee (y < x))\), this means that \((b,a) \in {<_M}\). Since \(M \prec M'\), also \((b,a) \in {<_{M'}}\). However, \(M' \models \forall x \forall y (( x < y) \rightarrow \neg( y < x))\), and hence, since \((a,b) \in {<_{M'}}\), \((b,a) \notin {<_{M'}}\): a contradiction.

This shows that in fact every model of \(\Gamma'\) is \(\prec\)-minimal!

### Session 8: Some Properties

#### Recall

- Deduction Theorem
- \(\Gamma \cup \{A\} \vdash B\) implies \(\Gamma \vdash A \supset B\).
- Resolution Theorem
- \(\Gamma \vdash A \supset B\) implies \(\Gamma \cup \{A\} \vdash B\).

#### Task 1

Does the resolution theorem hold for the reliability strategy?

Consider \(\Gamma_r = \{\circ p, \neg(\circ q \wedge \neg q) \vee \neg p\}\).

Can you construct a counter-example to the resolution theorem on the basis of \(\Gamma_r\)?

#### Task 2

Does the resolution theorem hold for minimal abnormality?

If you found a counter-example in Task 1, can it be reused for minimal abnormality?

#### Task 3

Does the deduction theorem hold for the reliability strategy?

Consider \(\Gamma = \{(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q), \circ p, \circ q\}\).

Can you construct, on the basis of \(\Gamma\), a counter-example to the deduction theorem?

#### Task 4

Can you prove that the deduction theorem holds for the semantic consequence relation based on minimal abnormality? In other words, prove:

- \(\Gamma \cup \{A\} \Vdash^m B\) implies \(\Gamma \Vdash^m A \supset B\)

#### Task 5

Prove: If \(\Gamma \Vdash^m A\) for all \(A \in \Delta\) then \(\mathcal{M}^m(\Gamma) = \mathcal{M}^m(\Gamma \cup \Delta)\).