
$$\def\Dab{{\rm Dab}}$$

## General Info

### Change Log

• Session 2 added [2015-04-16 Thu]

### Course

• Course: Adaptive Logics
• Organised by: Christian Straßer, GA 3/39, 0234/3224721
• Speaking hours: Wednesdays from 10:00–12:00
• When: every Tuesday from 10:00 to 12:00, Summer term 2015
• Where: GABF 04/358

## Session 2: Introducing Hilbert Style Proof Systems [2015-04-14 Tue]

To be discussed on [2015-04-21 Tue].

Recall the Hilbert Style axiomatisation of classical propositional logic from the slides:

| Axiom | Schema |
|---|---|
| $$(A{\supset}1)$$ | $$A\supset(B\supset A)$$ |
| $$(A{\supset}2)$$ | $$(A\supset(B\supset C))\supset((A\supset B)\supset(A\supset C))$$ |
| $$(A{\supset}3)$$ | $$((A\supset B)\supset A)\supset A$$ |
| $$(A{\wedge}1)$$ | $$(A\wedge B)\supset A$$ |
| $$(A{\wedge}2)$$ | $$(A\wedge B)\supset B$$ |
| $$(A{\wedge}3)$$ | $$A\supset(B\supset(A\wedge B))$$ |
| $$(A{\vee}1)$$ | $$A\supset(A\vee B)$$ |
| $$(A{\vee}2)$$ | $$B\supset(A\vee B)$$ |
| $$(A{\vee}3)$$ | $$(A\supset C)\supset((B\supset C)\supset((A\vee B)\supset C))$$ |
| $$(A{\equiv}1)$$ | $$(A\equiv B)\supset(A\supset B)$$ |
| $$(A{\equiv}2)$$ | $$(A\equiv B)\supset(B\supset A)$$ |
| $$(A{\equiv}3)$$ | $$(A\supset B)\supset((B\supset A)\supset(A\equiv B))$$ |
| $$(A{\neg}1)$$ | $$(A\supset\neg A)\supset\neg A$$ |
| $$(A{\neg}2)$$ | $$A\supset(\neg A\supset B)$$ |
| $$(A{\neg}3)$$ | $$A \vee \neg A$$ |
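These schemas can also be machine-checked for classical validity by brute force. The following sketch (not part of the course material; the encoding of the connectives as Python functions is an assumption) encodes a representative selection of the schemas, reading $$\supset$$ as material implication, and evaluates each under all assignments to the schematic letters:

```python
from itertools import product

# Material implication as a truth function
imp = lambda a, b: (not a) or b

# A selection of the axiom schemas as functions of their schematic letters
axioms = {
    "A⊃1": lambda A, B, C: imp(A, imp(B, A)),
    "A⊃2": lambda A, B, C: imp(imp(A, imp(B, C)), imp(imp(A, B), imp(A, C))),
    "A⊃3": lambda A, B, C: imp(imp(imp(A, B), A), A),
    "A∧3": lambda A, B, C: imp(A, imp(B, A and B)),
    "A∨3": lambda A, B, C: imp(imp(A, C), imp(imp(B, C), imp(A or B, C))),
    "A¬2": lambda A, B, C: imp(A, imp(not A, B)),
    "A¬3": lambda A, B, C: A or (not A),
}

# Every schema must evaluate to True under all assignments to A, B, C
for name, f in axioms.items():
    assert all(f(*v) for v in product([True, False], repeat=3)), name
print("all encoded schemas are tautologies")
```

The remaining schemas can be encoded in exactly the same way.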

• Discussed on [2015-04-21 Tue].

Proof: $$A \supset A$$.

Tip: you only need $$(A{\supset}1)$$ and $$(A{\supset}2)$$.

| # | Formula | Justification |
|---|---|---|
| 1 | $$(A \supset ((A \supset A) \supset A)) \supset ((A \supset (A\supset A)) \supset (A \supset A))$$ | $$(A{\supset}2)$$ where $$A:=A$$, $$B := A \supset A$$, $$C:=A$$ |
| 2 | $$A \supset ((A\supset A) \supset A)$$ | $$(A{\supset}1)$$ where $$A:=A$$, $$B:=A\supset A$$ |
| 3 | $$(A \supset (A\supset A)) \supset (A \supset A)$$ | 1,2; MP |
| 4 | $$A \supset (A \supset A)$$ | $$(A{\supset}1)$$ where $$A:= A$$, $$B:= A$$ |
| 5 | $$A \supset A$$ | 3,4; MP |

• Discussed on [2015-04-21 Tue].

Proof: $$(B \vee C) \supset (\neg B \supset C)$$.

| # | Formula | Justification |
|---|---|---|
| 1 | $$B \supset (\neg B \supset C)$$ | $$(A{\neg}2)$$ |
| 2 | $$C \supset (\neg B \supset C)$$ | $$(A{\supset}1)$$ |
| 3 | $$(B \supset (\neg B \supset C)) \supset ((C \supset (\neg B \supset C)) \supset ((B \vee C) \supset (\neg B \supset C)))$$ | $$(A{\vee}3)$$ |
| 4 | $$(C \supset (\neg B \supset C)) \supset ((B \vee C) \supset (\neg B \supset C))$$ | 1,3; MP |
| 5 | $$(B \vee C) \supset (\neg B \supset C)$$ | 2,4; MP |

• Discussed on [2015-04-21 Tue].

Proof: $$A \supset B, B \supset C \vdash A \supset C$$.

| # | Formula | Justification |
|---|---|---|
| 1 | $$(A \supset (B \supset C)) \supset ((A \supset B) \supset (A \supset C))$$ | $$(A{\supset}2)$$ |
| 2 | $$B \supset C$$ | PREM |
| 3 | $$(B \supset C) \supset (A \supset (B \supset C))$$ | $$(A{\supset}1)$$ |
| 4 | $$A \supset (B \supset C)$$ | 2,3; MP |
| 5 | $$(A \supset B) \supset (A \supset C)$$ | 1,4; MP |
| 6 | $$A \supset B$$ | PREM |
| 7 | $$A \supset C$$ | 5,6; MP |

• Discussed on [2015-04-21 Tue].

Proof: $$\neg \neg A \supset A$$.

| # | Formula | Justification |
|---|---|---|
| 1 | $$A \supset (\neg\neg A \supset A)$$ | $$(A{\supset}1)$$ |
| 2 | $$\neg A \supset (\neg\neg A \supset A)$$ | $$(A{\neg}2)$$ |
| 3 | $$(A \supset (\neg\neg A \supset A)) \supset ((\neg A \supset (\neg\neg A \supset A)) \supset ((A \vee \neg A) \supset (\neg\neg A \supset A)))$$ | $$(A{\vee}3)$$ |
| 4 | $$(\neg A \supset (\neg\neg A \supset A)) \supset ((A \vee \neg A) \supset (\neg\neg A \supset A))$$ | 1,3; MP |
| 5 | $$(A \vee \neg A) \supset (\neg\neg A \supset A)$$ | 2,4; MP |
| 6 | $$A \vee \neg A$$ | $$(A{\neg}3)$$ |
| 7 | $$\neg\neg A \supset A$$ | 5,6; MP |

Proof: $$A \supset \neg \neg A$$.

#### Solution

| # | Formula | Justification |
|---|---|---|
| 1 | $$(\neg A \supset \neg\neg A) \supset \neg\neg A$$ | $$(A{\neg}1)$$ |
| 2 | $$((\neg A \supset \neg\neg A) \supset \neg\neg A) \supset (A \supset ((\neg A \supset \neg\neg A) \supset \neg\neg A))$$ | $$(A{\supset}1)$$ |
| 3 | $$A \supset ((\neg A \supset \neg\neg A) \supset \neg\neg A)$$ | 1,2; MP |
| 4 | $$A \supset (\neg A \supset \neg\neg A)$$ | $$(A{\neg}2)$$ |
| 5 | $$(A \supset ((\neg A \supset \neg\neg A) \supset \neg\neg A)) \supset ((A \supset (\neg A \supset \neg\neg A)) \supset (A \supset \neg\neg A))$$ | $$(A{\supset}2)$$ |
| 6 | $$(A \supset (\neg A \supset \neg\neg A)) \supset (A \supset \neg\neg A)$$ | 3,5; MP |
| 7 | $$A \supset \neg\neg A$$ | 4,6; MP |

• Discussed on [2015-04-21 Tue].

Show $$\{A, \neg A\} \vdash B$$.

Tip: use $$(A{\neg}2)$$ and MP.

| # | Formula | Justification |
|---|---|---|
| 1 | $$A$$ | PREM |
| 2 | $$\neg A$$ | PREM |
| 3 | $$A \supset (\neg A \supset B)$$ | $$(A{\neg}2)$$ |
| 4 | $$\neg A \supset B$$ | 1,3; MP |
| 5 | $$B$$ | 2,4; MP |
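Semantically, this task has an easy counterpart: no valuation makes both $$A$$ and $$\neg A$$ true, so the entailment $$\{A, \neg A\} \models B$$ holds vacuously. A quick truth-table check (a sketch, not from the course material):

```python
from itertools import product

# Material implication as a truth function
imp = lambda a, b: (not a) or b

# No valuation satisfies both A and ¬A, so any B is entailed vacuously
for A, B in product([True, False], repeat=2):
    if A and (not A):
        assert B  # never reached: the premises are jointly unsatisfiable
print("entailment holds (vacuously)")
```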

## Session 3: Mathematical Induction

Don't forget to take a look at the solutions of Session 2 and prepare Task 5 of Session 2 (we didn't discuss that example yet).

Recall from the course that mathematical induction works as follows:

You want to prove a statement $$S(n)$$ for all natural numbers $$n \in \mathbb{N}$$ (or for a subset $$N$$ of the natural numbers). E.g., in Task 1, $$S(n)$$ reads $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$.

Induction Base: You start by proving the statement for the first natural number in your set of interest $$N$$. For $$\mathbb{N}$$ this is $$1$$ (unless you start counting at $$0$$). In our example you first prove $$1 = \frac{1 \cdot (1+1)}{2}$$.

Induction Step: Then you suppose that you have shown your statement $$S(\cdot)$$ for an arbitrary number $$n$$ (this supposition is called the induction hypothesis or the inductive hypothesis) and on this basis show that your statement also holds for $$n+1$$. In our first example you suppose that $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$ holds and then you show that $$\sum_{i=1}^{n+1} i = \frac{(n+1)\cdot(n+2)}{2}$$ holds.

There is a variant of the proof method called strong mathematical induction. In the induction step you assume that $$S(\cdot)$$ holds for all $$m \le n$$ for an arbitrary $$n$$ and on this basis demonstrate that $$S(n+1)$$ holds.

Prove via mathematical induction that $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$.

• Base Case ($$n=1$$): $$\sum_{i = 1}^{n} i = 1$$ and $$\frac{n \cdot (n+1)}{2} = \frac{1 \cdot 2}{2} = 1$$. Thus, $$\sum_{i = 1}^{n} i = \frac{n \cdot (n+1)}{2}$$.
• Induction step: The inductive hypothesis is $$\sum_{i=1}^n i = \frac{n \cdot (n+1)}{2}$$. We need to show that $$\sum_{i=1}^{n+1} i = \frac{(n+1)\cdot (n+2)}{2}$$. We have:
1. $$\sum_{i=1}^{n+1} i = \left(\sum_{i=1}^n i \right) + (n+1)$$
2. By the inductive hypothesis, $$\left(\sum_{i=1}^n i \right) + (n+1) = \frac{n \cdot (n+1)}{2} + (n+1)$$
3. Note that $$\frac{n \cdot (n+1)}{2} + (n+1) = \frac{n \cdot (n+1)}{2} + \frac{2 \cdot (n+1)}{2} = \frac{n \cdot (n+1) + 2 \cdot (n+1)}{2} = \frac{(n+2) \cdot (n+1)}{2} = \frac{(n+1) \cdot (n+2)}{2}$$.
4. By 2 and 3, $$\sum_{i=1}^{n+1} i = \frac{(n+1) \cdot (n+2)}{2}$$.
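As a numeric sanity check (no substitute for the induction proof above), one can confirm the closed form for small $$n$$:

```python
# Spot-check the closed form proved by induction above
for n in range(1, 101):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
print("sum formula confirmed for n = 1..100")
```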

Prove via mathematical induction that $$T(n) = 2^n - 1$$ where $$T(n) = \left\{ \begin{array}{cc} 1 & n=1 \\ 2 \cdot T(n-1) + 1 & n > 1 \end{array} \right.$$

• Base Case ($$n=1$$): $$T(1) = 1 = 2^1 - 1$$.
• Inductive step: The inductive hypothesis is that $$T(n) = 2^n - 1$$. We need to show that $$T(n+1) = 2^{n+1} - 1$$.
1. $$T(n + 1) = 2 \cdot T(n + 1 - 1) + 1 = 2 \cdot T(n) + 1$$.
2. By the inductive hypothesis and 1, $$T(n + 1) = 2 \cdot (2^n - 1) + 1$$.
3. Note that $$2 \cdot (2^n - 1) + 1 = (2^{n+1} - 2) + 1 = 2^{n+1} - 1$$.
4. By 2 and 3, $$T(n+1) = 2^{n+1} - 1$$.
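Again, the closed form can be spot-checked by implementing the recurrence directly (a sketch, not part of the course material):

```python
def T(n: int) -> int:
    """The recurrence from the task: T(1) = 1 and T(n) = 2*T(n-1) + 1."""
    return 1 if n == 1 else 2 * T(n - 1) + 1

# Check the closed form T(n) = 2^n - 1 proved by induction above
for n in range(1, 21):
    assert T(n) == 2 ** n - 1
print("T(n) = 2^n - 1 confirmed for n = 1..20")
```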

## Session 4: Some More Proofs, the Deduction Theorem and Transitivity

Don't forget that in tasks you can rely on results obtained in previous tasks.

### Task 1: Reasoning by Cases

Prove: $$A \vee B, A \supset C, B \supset C \vdash C$$.

#### Solution

1. $$A \supset C$$ by premise introduction.
2. $$(A \supset C) \supset ((B\supset C) \supset ((A \vee B) \supset C))$$ by $$(A\vee 3)$$.
3. $$(B\supset C) \supset ((A \vee B) \supset C)$$ by 1, 2; MP.
4. $$B\supset C$$ by premise introduction.
5. $$(A \vee B) \supset C$$ by 3, 4; MP.
6. $$A \vee B$$ by premise introduction.
7. $$C$$ by 5, 6; MP.
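The derivation above has an obvious semantic counterpart, which can be brute-force checked over all valuations (a sketch; the encoding is an assumption, not course material):

```python
from itertools import product

# Material implication as a truth function
imp = lambda a, b: (not a) or b

# Every valuation satisfying A ∨ B, A ⊃ C and B ⊃ C also satisfies C
for A, B, C in product([True, False], repeat=3):
    if (A or B) and imp(A, C) and imp(B, C):
        assert C
print("reasoning by cases is semantically valid")
```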

### Task 2

Show that if $$\Gamma \vdash A$$ for all $$A \in \Gamma'$$ and $$\Gamma' \vdash B$$, then also $$\Gamma \vdash B$$.

#### Solution

• Suppose $$\Gamma \vdash A$$ for all $$A \in \Gamma'$$ and suppose $$\Gamma' \vdash B$$.
• Hence, there is a proof of $$B$$ from $$\Gamma'$$.
• Since proofs are finite, there is a finite subset $$\Gamma'_f = \{B_{1}, \ldots, B_{n}\}$$ of $$\Gamma'$$ such that there is a proof $$\mathcal{P}'$$ from $$\Gamma'_f$$ of $$B$$.
• For each $$B_i$$ ($$i \in \{1, \ldots, n\}$$) we know that there is a proof $$\mathcal{P}_i$$ of $$B_i$$ from $$\Gamma$$.
• We now concatenate the proofs $$\mathcal{P}_1, \ldots, \mathcal{P}_n$$ (under renaming of the proof lines) resulting in a proof $$\mathcal{P}^{\star}$$.
• Let $$\mathcal{P}''$$ be the result of
• removing each line in $$\mathcal{P}'$$ where $$B_i$$ ($$i \in \{1, \ldots, n\}$$) is introduced by Premise introduction
• and replacing each reference to a line in $$\mathcal{P}'$$ where some $$B_i$$ ($$i \in \{1, \ldots, n\}$$) is introduced by premise introduction in a justification of some line by a reference to a line in $$\mathcal{P}^{\star}$$ where $$B_i$$ is derived.
• Now we concatenate $$\mathcal{P}^{\star}$$ and $$\mathcal{P}''$$.
• This is a proof of $$B$$ from $$\Gamma$$.

### Remark

Recall that last session we proved the deduction theorem. This can drastically simplify proofs.

Here is an example of how it can be used in a proof.

Proof: $$\vdash (A \supset B) \supset ((B \supset C) \supset (A \supset C))$$.

1. Suppose $$A \supset B$$.
   1. Suppose $$B \supset C$$.
      1. Suppose $$A$$.
         • Then, by MP and 1, $$B$$.
         • By MP and 1.1, $$C$$.
      2. We have shown that $$A \supset B, B \supset C, A \vdash C$$.
      3. By the deduction theorem, $$A \supset B, B \supset C \vdash A \supset C$$.
   2. By the deduction theorem, $$A \supset B \vdash (B \supset C) \supset (A \supset C)$$.
2. By the deduction theorem, $$\vdash (A \supset B) \supset ((B \supset C) \supset (A \supset C))$$.

Show $$\vdash (A \supset B) \supset (\neg A \vee B)$$.

#### Solution

1. Suppose $$A \supset B$$.
   1. Suppose $$A$$.
      • By 1 and MP, $$B$$.
      • By $$(A\vee 2)$$, $$B \supset (\neg A \vee B)$$.
      • By MP, $$\neg A \vee B$$.
   2. We have shown that $$A \supset B, A \vdash \neg A \vee B$$.
   3. By the deduction theorem, $$A \supset B \vdash A \supset (\neg A \vee B)$$.
   4. Suppose $$\neg A$$.
      1. By $$(A\vee 1)$$, $$\neg A \supset (\neg A \vee B)$$.
      2. By MP, $$\neg A \vee B$$.
   5. We have shown that $$A \supset B, \neg A \vdash \neg A \vee B$$.
   6. By the deduction theorem, $$A \supset B \vdash \neg A \supset (\neg A \vee B)$$.
   7. Note that by $$(A \neg 3)$$, also $$A \supset B \vdash A \vee \neg A$$.
   8. By lines 3, 6, 7, Task 1 and Task 2, also $$A \supset B \vdash \neg A \vee B$$. (Do you see why?)
2. By the deduction theorem, $$\vdash (A \supset B) \supset (\neg A \vee B)$$.

Show $$\vdash (\neg A \vee B) \supset (A \supset B)$$.

#### Solution

1. Suppose $$\neg A \vee B$$.
   1. By $$(A \supset 1)$$, $$B \supset (A \supset B)$$.
   2. Suppose $$\neg A$$.
      1. Suppose $$A$$.
         1. By a previous task, $$B$$.
      2. By the deduction theorem, $$\neg A \vee B, \neg A \vdash A \supset B$$.
   3. By the deduction theorem, $$\neg A \vee B \vdash \neg A \supset (A \supset B)$$.
   4. Hence, by 1, 3, Task 1 and Task 2, also $$A \supset B$$. (Do you see why?)
2. We have shown that $$\neg A \vee B \vdash A \supset B$$.
3. By the deduction theorem, $$\vdash (\neg A \vee B) \supset (A \supset B)$$.
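Together, the last two tasks show that $$A \supset B$$ and $$\neg A \vee B$$ are interderivable. Semantically this is the familiar truth-table equivalence, which a short check confirms (a sketch, not course material):

```python
# A ⊃ B and ¬A ∨ B agree on all four valuations
imp = lambda a, b: (not a) or b
for A in (True, False):
    for B in (True, False):
        assert imp(A, B) == ((not A) or B)
print("A ⊃ B and ¬A ∨ B are truth-functionally equivalent")
```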

## Session 5: Dynamic Proofs and Adaptive Strategies

Note that in tasks you can always use results you obtained in previous tasks.

Show that: if $$\Gamma \vdash A \vee \Dab(\Delta)$$ (where $$\vdash$$ is the derivability relation of the lower limit logic) then $$A$$ is derivable on a (non-empty) condition $$\Delta$$ in an adaptive proof from the premise set $$\Gamma$$.

For the courageous: Show the other direction. (You will need an inductive argument over the length of the dynamic proof of $$A \vee \Dab(\Delta)$$.)

#### Solution

($$\Rightarrow$$)

1. Let $$\Gamma \vdash A \vee \Dab(\Delta)$$.
2. Hence, there is a proof in LLL of $$A \vee \Dab(\Delta)$$ from $$\Gamma$$.
3. Hence, there is a finite subset $$\{A_1, \ldots, A_n\}$$ of $$\Gamma$$ from which $$A \vee \Dab(\Delta)$$ is provable.
4. We now construct a dynamic proof of $$A$$ from $$\Gamma$$ as follows:
| # | Formula | Justification | Condition |
|---|---|---|---|
| 1 | $$A_1$$ | PREM | $$\emptyset$$ |
| 2 | $$A_2$$ | PREM | $$\emptyset$$ |
| $$\vdots$$ | $$\vdots$$ | PREM | $$\emptyset$$ |
| $$n$$ | $$A_n$$ | PREM | $$\emptyset$$ |
| $$n{+}1$$ | $$A$$ | $$1,\ldots,n$$; RC | $$\Delta$$ |

($$\Leftarrow$$)
We prove this by induction over the length of the proof of $$A$$ on the condition $$\Delta$$. For simplification we let '$$\vee \Dab(\emptyset)$$' denote the empty string.

Base case (line 1): $$B$$ is introduced as a premise on the condition $$\emptyset$$. Hence, $$B \in \Gamma$$ and thus $$\Gamma \vdash B$$, and hence $$\Gamma \vdash B \vee \Dab(\emptyset)$$.

Inductive Step (line $$n{+}1$$): The inductive hypothesis is that if $$C$$ is derived on the condition $$\Theta$$ at a line $$m \leq n$$, then $$\Gamma \vdash C \vee \Dab(\Theta)$$.

Suppose $$B$$ is derived at line $$n{+}1$$ on the condition $$\Lambda$$ with the justification J.

1. Case J = PREM: The argument is analogous to the base case.
2. Case J = $$l_1, \ldots, l_m$$; RU:
   1. Suppose $$A_i$$ is derived on the condition $$\Delta_i$$ at line $$l_{i}$$ (where $$1 \le i \le m$$).
   2. Hence, $$A_1, \ldots, A_m \vdash B$$ (by the definition of the rule RU) and $$\Lambda = \Delta_1 \cup \ldots \cup \Delta_m$$.
   3. For all $$i \in \{1, \ldots, m\}$$ we have:
      1. By the inductive hypothesis, $$\Gamma \vdash A_i \vee \Dab(\Delta_i)$$.
      2. Hence, $$\Gamma \vdash A_i \vee \neg\neg \Dab(\Delta_i)$$.
      3. Hence, $$\Gamma \vdash \neg\neg \Dab(\Delta_i) \vee A_i$$. (Recall: $$A \vee B \vdash B \vee A$$.)
      4. Thus, $$\Gamma \vdash \neg\Dab(\Delta_i) \supset A_i$$. (Recall: $$\neg A \vee B \vdash A \supset B$$.)
      5. Thus, $$\Gamma \cup \{\neg \Dab(\Delta_i)\} \vdash A_i$$. (Resolution Theorem)
      6. Thus, $$\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash A_i$$. (Monotonicity)
   4. Hence, $$\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash A_1 \wedge \ldots \wedge A_m$$. (Recall that $$A \vdash B_1$$ and $$A \vdash B_2$$ imply $$A \vdash B_1 \wedge B_2$$.)
   5. By 2.2, also $$A_1 \wedge \ldots \wedge A_m \vdash B$$. (Follows by transitivity.)
   6. By 2.4 and 2.5, $$\Gamma \cup \{\neg \Dab(\Delta_j) \mid j \in \{1, \ldots, m\}\} \vdash B$$. (This is transitivity.)
   7. Thus, $$\Gamma \vdash B \vee \Dab(\Delta_1) \vee \Dab(\Delta_2) \vee \ldots \vee \Dab(\Delta_m)$$. (By the deduction theorem.)
   8. Hence, $$\Gamma \vdash B \vee \Dab(\Delta_1 \cup \ldots \cup \Delta_m)$$ and thus $$\Gamma \vdash B \vee \Dab(\Lambda)$$.
3. Case J = $$l_1, \ldots, l_m$$; RC: this can be shown in a similar way as the previous case.

Show that:

1. Whenever $$A$$ is derivable on a condition $$\Delta \cup \{B\}$$ in a dynamic proof from $$\Gamma$$ then also $$A \vee B$$ is derivable from $$\Gamma$$ on the condition $$\Delta$$.
2. Whenever $$A$$ is derivable on a condition $$\Delta$$ and $$\neg A$$ is derivable on the condition $$\emptyset$$ in a dynamic proof from $$\Gamma$$, then also $$\Dab(\Delta)$$ is derivable from $$\Gamma$$ on the condition $$\emptyset$$.

#### Solution

Ad 1:

1. Suppose $$A$$ is derived on the condition $$\Delta \cup \{B\}$$ from $$\Gamma$$.
2. By Task 1, $$\Gamma \vdash A \vee \Dab(\Delta \cup \{B\})$$.
3. Hence, $$\Gamma \vdash A \vee \Dab(\Delta) \vee B$$ and thus $$\Gamma \vdash (A \vee B) \vee \Dab(\Delta)$$. (Recall that $$\vee$$ is associative ($$A \vee (B \vee C) \vdash (A \vee B) \vee C$$) and commutative ($$A \vee B \vdash B \vee A$$).)
4. Again, by Task 1, $$A \vee B$$ is derivable from $$\Gamma$$ on the condition $$\Delta$$.

Ad 2:

1. Suppose $$A$$ is derivable on the condition $$\Delta$$ and $$\neg A$$ is derivable on the condition $$\emptyset$$ in a proof from $$\Gamma$$.
2. By Task 1, $$\Gamma \vdash A \vee \Dab(\Delta)$$ and $$\Gamma \vdash \neg A$$.
3. Thus, $$\Gamma \vdash \Dab(\Delta)$$ (disjunctive syllogism).

Task 2 shows that the following generic rules are derived rules for dynamic proofs in adaptive logics:

$$\begin{array}{rl} A & \Delta \cup \Delta' \\ \hline A \vee \Dab(\Delta) & \Delta' \end{array} ~~~ [\rm RD1]$$

$$\begin{array}{rl} A & \Delta \\ \neg A & \emptyset \\\hline \Dab(\Delta) & \emptyset \end{array} ~~~[\rm RD2]$$

Indicate mistakes in the following proof fragment from $$\Gamma = \{(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q), (\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q), \circ p, \circ q\}$$ where the reliability strategy is used:

| # | Formula | Justification | Condition | Marks |
|---|---|---|---|---|
| 1 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | |
| 2 | $$(\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | |
| 3 | $$\circ p \wedge \neg p$$ | 1,2; RU | $$\emptyset$$ | |
| 4 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 5 | $$p$$ | 4; RU | $$\{\circ p \wedge \neg p\}$$ | |
| 6 | $$\circ q$$ | RU | $$\emptyset$$ | $$\checkmark$$ |
| 7 | $$q$$ | 6; RC | $$\{\circ q \wedge \neg q\}$$ | |

#### Solution

The mistakes are indicated in the rightmost column:

| # | Formula | Justification | Condition | Marks | Correction |
|---|---|---|---|---|---|
| 1 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | | |
| 2 | $$(\circ p \wedge \neg p) \vee \neg (\circ q \wedge \neg q)$$ | PREM | $$\emptyset$$ | | |
| 3 | $$\circ p \wedge \neg p$$ | 1,2; RU | $$\emptyset$$ | | |
| 4 | $$\circ p$$ | PREM | $$\emptyset$$ | | |
| 5 | $$p$$ | 4; RU | $$\{\circ p \wedge \neg p\}$$ | | RU should be RC; moreover, the line must be marked, since $$\circ p \wedge \neg p$$ is derived at line 3 |
| 6 | $$\circ q$$ | RU | $$\emptyset$$ | $$\checkmark$$ | RU should be PREM; no marking (lines with an empty condition are never marked) |
| 7 | $$q$$ | 6; RC | $$\{\circ q \wedge \neg q\}$$ | | |

Let $$\Gamma = \{\circ p, \circ( p \supset q), \neg q\}$$. Extend the following proof so that line 2 gets marked (where the reliability strategy is used).

| # | Formula | Justification | Condition |
|---|---|---|---|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |

#### Solution

| # | Formula | Justification | Condition | Marks |
|---|---|---|---|---|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ | $$\checkmark$$ |
| 3 | $$\circ(p \supset q)$$ | PREM | $$\emptyset$$ | |
| 4 | $$\neg q$$ | PREM | $$\emptyset$$ | |
| 5 | $$p \supset q$$ | 3; RC | $$\{\circ(p \supset q) \wedge \neg (p \supset q)\}$$ | |
| 6 | $$q$$ | 2,5; RU | $$\{\circ p \wedge \neg p, \circ(p \supset q) \wedge \neg (p \supset q)\}$$ | |
| 7 | $$(\circ p \wedge \neg p) \vee (\circ(p \supset q) \wedge \neg (p \supset q))$$ | 4,6; RD2 | $$\emptyset$$ | |

Let $$\Gamma = \{\circ p, \circ q, \neg q, \neg(p \wedge q)\}$$. Extend the following proof in such a way that $$p$$ is finally derived at line 2 (where the reliability strategy is used).

| # | Formula | Justification | Condition |
|---|---|---|---|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |
| 3 | $$\circ q$$ | PREM | $$\emptyset$$ |
| 4 | $$\neg(p \wedge q)$$ | PREM | $$\emptyset$$ |
| 5 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | 1,3,4; RU | $$\emptyset$$ |

#### Solution

| # | Formula | Justification | Condition |
|---|---|---|---|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |
| 3 | $$\circ q$$ | PREM | $$\emptyset$$ |
| 4 | $$\neg(p \wedge q)$$ | PREM | $$\emptyset$$ |
| 5 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q)$$ | 1,3,4; RU | $$\emptyset$$ |
| 6 | $$\neg q$$ | PREM | $$\emptyset$$ |
| 7 | $$\circ q \wedge \neg q$$ | 3,6; RU | $$\emptyset$$ |

Let $$\Gamma = \{\circ p, \circ q, \neg p \vee \neg q \vee \neg s\}$$. Finally derive $$\neg s$$ (where the reliability strategy is used).

#### Solution

| # | Formula | Justification | Condition |
|---|---|---|---|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ |
| 2 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ |
| 3 | $$\circ q$$ | PREM | $$\emptyset$$ |
| 4 | $$q$$ | 3; RC | $$\{\circ q \wedge \neg q\}$$ |
| 5 | $$\neg p \vee \neg q \vee \neg s$$ | PREM | $$\emptyset$$ |
| 6 | $$\neg q \vee \neg s$$ | 2,5; RU | $$\{\circ p \wedge \neg p\}$$ |
| 7 | $$\neg s$$ | 4,6; RU | $$\{\circ q \wedge \neg q\}$$ |

Let $$\Gamma = \{\circ p, \circ q, \neg p \vee \neg q \vee \neg s, \circ s\}$$.

1. Is $$\neg s$$ still finally derivable?
2. Finally derive $$p \vee q \vee s$$ with the minimal abnormality strategy.

#### Solution

Ad 1: No. Here is a demonstration:

| # | Formula | Justification | Condition | Marks |
|---|---|---|---|---|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 2 | $$\circ q$$ | PREM | $$\emptyset$$ | |
| 3 | $$\circ s$$ | PREM | $$\emptyset$$ | |
| 4 | $$\neg p \vee \neg q \vee \neg s$$ | PREM | $$\emptyset$$ | |
| 5 | $$\neg s$$ | 1,2,4; RC | $$\{\circ p \wedge \neg p, \circ q \wedge \neg q\}$$ | $$\checkmark_6$$ |
| 6 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q) \vee (\circ s \wedge \neg s)$$ | 1,2,3,4; RU | $$\emptyset$$ | |

Ad 2:

| # | Formula | Justification | Condition | Marks |
|---|---|---|---|---|
| 1 | $$\circ p$$ | PREM | $$\emptyset$$ | |
| 2 | $$\circ q$$ | PREM | $$\emptyset$$ | |
| 3 | $$\circ s$$ | PREM | $$\emptyset$$ | |
| 4 | $$\neg p \vee \neg q \vee \neg s$$ | PREM | $$\emptyset$$ | |
| 5 | $$\neg s$$ | 1,2,4; RC | $$\{\circ p \wedge \neg p, \circ q \wedge \neg q\}$$ | $$\checkmark_6$$ |
| 6 | $$(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q) \vee (\circ s \wedge \neg s)$$ | 1,2,3,4; RU | $$\emptyset$$ | |
| 7 | $$p$$ | 1; RC | $$\{\circ p \wedge \neg p\}$$ | |
| 8 | $$p \vee q \vee s$$ | 7; RU | $$\{\circ p \wedge \neg p\}$$ | |
| 9 | $$p \vee q \vee s$$ | 2; RC | $$\{\circ q \wedge \neg q\}$$ | |
| 10 | $$p \vee q \vee s$$ | 3; RC | $$\{\circ s \wedge \neg s\}$$ | |

Note that $$\Phi_{10}(\Gamma) = \bigl\{ \{\circ p \wedge \neg p\}, \{\circ q \wedge \neg q\}, \{\circ s \wedge \neg s\} \bigr\}$$. Hence, lines 8–10 are not marked at this point according to the marking definition of minimal abnormality.

## Session 7: Smoothness and Well-Foundedness

Let in the following $$\mathbb{N} = \{1, 2, 3, \ldots \}$$ be the set of natural numbers.

### Smoothness and Well-foundedness

Let $${\prec} \subseteq X \times X$$.

• An element $$x \in X$$ is $$\prec$$-minimal iff there is no $$y \in X$$ such that $$y \prec x$$.
• $$\prec$$ is smooth (or stoppered) iff for each $$x \in X$$, either $$x$$ is $$\prec$$-minimal or there is a $$\prec$$-minimal $$y \in X$$ such that $$y \prec x$$.
• $$\prec$$ is well-founded iff there is no infinite descending sequence $$\ldots \prec x_1 \prec x_{0}$$.
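For finite relations both notions are easy to test mechanically. The following sketch (the helper names are assumptions, not course notation) represents $$\prec$$ as a set of pairs $$(x,y)$$ meaning $$x \prec y$$; on a finite domain, well-foundedness amounts to being able to peel off $$\prec$$-minimal elements until nothing remains:

```python
def minimal(X, prec):
    """Elements of X with no ≺-predecessor in X."""
    return {x for x in X if not any((y, x) in prec for y in X)}

def is_smooth(X, prec):
    """Each x is ≺-minimal or has a ≺-minimal element below it."""
    m = minimal(X, prec)
    return all(x in m or any((y, x) in prec for y in m) for x in X)

def is_well_founded(X, prec):
    """On a finite domain: repeatedly remove minimal elements;
    if at some point none exist, there is a ≺-cycle."""
    X, prec = set(X), set(prec)
    while X:
        m = minimal(X, prec)
        if not m:
            return False  # no minimal element left: a cycle
        X -= m
        prec = {(a, b) for (a, b) in prec if a in X and b in X}
    return True

# {1,2,3} with the natural order: smooth and well-founded
order = {(1, 2), (1, 3), (2, 3)}
assert is_smooth({1, 2, 3}, order) and is_well_founded({1, 2, 3}, order)
```

The infinite examples in the tasks below are of course beyond the reach of such a finite check.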

Take $$<$$ to be the natural ordering on the natural numbers $$\mathbb{N}$$ (e.g., $$4<6$$, etc.).

1. Is $$<$$ smooth?
2. Is $$<$$ well-founded?

#### Solution

1. Yes. For any $$x \in \mathbb{N}$$, $$x =1$$ or $$1 < x$$. Clearly, $$1$$ is minimal.
2. Yes. Any strictly descending chain $$\ldots < x_1 < x_0$$ starting at some $$x_0$$ reaches $$1$$ after finitely many steps, so there is no infinite descending sequence.

Let $${\prec} \subseteq \mathbb{N} \times \mathbb{N}$$ be defined by $$x \prec y$$ iff $$y < x$$.

1. Is $$\prec$$ smooth?
2. Is $$\prec$$ well-founded?

#### Solution

1. No. There are no $$\prec$$-minimal elements because there are no $$<$$-maximal elements in $$\mathbb{N}$$.
2. If it is not smooth, it cannot be well-founded. (Try to see why!)

Let $$<$$ be the natural ordering on $$\mathbb{N} \cup \{\infty\}$$ where $$x < \infty$$ for all $$x \in \mathbb{N}$$.

1. Is $$<$$ smooth?
2. Is $$<$$ well-founded?

#### Solution

1. Yes, same reason as in task 1.
2. Yes. Any strictly descending chain starting at $$\infty$$ drops into $$\mathbb{N}$$ after one step and then reaches $$1$$ after finitely many further steps; hence there is no infinite descending sequence.

Let $${\prec} \subseteq (\mathbb{N} \cup \{\infty\}) \times (\mathbb{N} \cup \{\infty\})$$ be defined by $$x \prec y$$ iff $$y < x$$ (where $$<$$ is defined as in Task 3).

1. Is $$\prec$$ smooth?
2. Is $$\prec$$ well-founded?

#### Solution

1. Yes. Now we have the $$\prec$$-minimal element $$\infty$$.
2. No. Starting from any $$x_0 \in \mathbb{N}$$ there is an infinite descending sequence $$\ldots \prec 3 \prec 2 \prec 1$$ (since $$1 < 2 < 3 < \ldots$$); such a chain never reaches the $$\prec$$-minimal element $$\infty$$.

Suppose our first order language contains the relational symbol $$<$$. Our premise set $$\Gamma$$ consists of

1. $$\forall x \exists y (x < y)$$
2. $$\forall x \forall y ((x < y) \rightarrow \neg(x = y))$$
3. $$\forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$

Recall that a model $$M$$ comes with a domain $$D$$ and it interprets $$<$$ by giving it an extension $$\prec_M \subseteq (D \times D)$$. Every model interprets $$=$$ in the same way relative to the domain of $$M$$, namely by $${=_M} = \{(d,d) \mid d \in D\}$$.

One model $$M$$ of $$\Gamma$$ is for instance the one with domain $$\mathbb{N}$$ where $$<_M$$ is simply the natural ordering on $$\mathbb{N}$$ (so $$1 < 2< 3 <4 \ldots$$).

Formulas are then evaluated as expected, e.g.,

• $$M \models \forall x \exists y (x < y)$$ iff for all $$a \in D_M$$ there is a $$b \in D_M$$ for which $$(a,b) \in <_M$$.

#### Question 1

Suppose we define $${\sf ab}(M) = \{ (a,b) \mid (a,b) \in {<_{M}} \}$$ and order our models by $$M \prec M'$$ iff $$D_M = D_{M'}$$ and $${\sf ab}(M) \subset {\sf ab}(M')$$.

Let $$\mathcal{M}(\Gamma)$$ be the set of all models of $$\Gamma$$.

Is the order $$\prec$$ on $$\mathcal{M}(\Gamma)$$ smooth?

##### Solution

No! Take models with domain $$\mathbb{N}$$. Assume some such model $$M$$ is $$\prec$$-minimal in $$\mathcal{M}(\Gamma)$$. Note that there are $$a, b \in \mathbb{N}$$ such that $$(b,a) \in {<_M}$$, since $$M \models \forall x \exists y (x < y)$$ and $$\mathbb{N} \neq \emptyset$$. Let $$M_a$$ be just like $$M$$, except that $${<_{M_a}} = \{ (c,d) \in {<_M} \mid d \neq a\}$$. Note that $${\sf ab}(M_a) = {\sf ab}(M) \setminus \{ (c,d) \in {<_M} \mid d = a \}$$.

If $$M_a \in \mathcal{M}(\Gamma)$$ we have reached a contradiction to the $$\prec$$-minimality of $$M$$.

We now show that $$M_a \models \forall x \exists y (x < y)$$. Let $$c \in \mathbb{N}$$. Since $$M \in \mathcal{M}(\Gamma)$$, there is a $$d \in \mathbb{N}$$ for which $$(c,d) \in <_M$$. If $$d \neq a$$ then also $$(c,d) \in <_{M_a}$$. Suppose $$d = a$$. Then, there is a $$d' \in \mathbb{N}$$ such that $$(a,d') \in <_M$$ (since $$M \models \forall x \exists y (x < y)$$). Since $$M \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$ also $$(c,d') \in <_M$$. Since $$M \models \forall x \forall y ((x < y) \rightarrow (x \neq y))$$, $$a \neq d'$$. Hence, $$(c,d') \in <_{M_a}$$.

We now show that $$M_a \models \forall x \forall y ((x < y) \rightarrow (x \neq y))$$. Let therefore $$c,d \in \mathbb{N}$$ such that $$(c,d) \in <_{M_a}$$. Hence, $$(c,d) \in <_M$$. Since $$M \models \forall x \forall y ((x < y) \rightarrow (x \neq y))$$, $$(c,d) \notin =_M$$ and thus $$(c,d) \notin =_{M_a}$$.

We now show that $$M_a \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$. Let therefore $$c,d,e \in \mathbb{N}$$ such that $$(c,d) \in <_{M_a}$$ and $$(d,e) \in <_{M_a}$$. Then also $$(c,d) \in <_M$$ and $$(d,e) \in <_M$$. Since $$M \models \forall x \forall y \forall z (((x < y) \wedge (y < z)) \rightarrow (x < z))$$, also $$(c,e) \in <_M$$. By the definition of $$<_{M_a}$$, the only way for $$(c,e)$$ not to be in $$<_{M_a}$$ is if $$e = a$$. However, this is impossible since in that case $$(d,e)$$ were not in $$<_{M_a}$$ either. Hence, $$(c,e) \in <_{M_a}$$.

#### Question 2

Suppose now we define $$\Gamma'$$ by adding the following to $$\Gamma$$:

1. $$\forall x \forall y (( x < y) \rightarrow \neg( y < x))$$
2. $$\forall x \forall y ((x < y) \vee (y < x))$$

Is the order $$\prec$$ on $$\mathcal{M}(\Gamma')$$ smooth?

##### Solution

Yes, now it is. To see this, suppose there are two models $$M, M' \in \mathcal{M}(\Gamma')$$ for which $$M \prec M'$$. That means that $${\sf ab}(M) \subset {\sf ab}(M')$$ and $$D_M = D_{M'}$$. Hence, there are $$a,b \in D_M$$ for which $$(a,b) \in {<_{M'}} \setminus {<_M}$$. Since $$M \models \forall x \forall y ((x < y) \vee (y < x))$$ this means that $$(b,a) \in {<_M}$$. Since $$M \prec M'$$, also $$(b,a) \in {<_{M'}}$$. However, $$M' \models \forall x \forall y (( x < y) \rightarrow \neg( y < x))$$ and hence, since $$(a,b) \in {<_{M'}}$$, $$(b,a) \notin {<_{M'}}$$, a contradiction.

This shows that in fact every model of $$\Gamma'$$ is $$\prec$$-minimal!

## Session 8: Some properties

### Recall

• Deduction Theorem: $$\Gamma \cup \{A\} \vdash B$$ implies $$\Gamma \vdash A \supset B$$.
• Resolution Theorem: $$\Gamma \vdash A \supset B$$ implies $$\Gamma \cup \{A\} \vdash B$$.

Does the resolution theorem hold for the reliability strategy?

Consider $$\Gamma = \{\circ p, \neg(\circ q \wedge \neg q) \vee \neg p\}$$.

Can you construct a counter-example for the resolution theorem on the basis of $$\Gamma$$?

Does the resolution theorem hold for minimal abnormality?

If you found a counter-example in Task 1, can it be reused for minimal abnormality?

Does the deduction theorem hold for the reliability strategy?

Consider $$\Gamma = \{(\circ p \wedge \neg p) \vee (\circ q \wedge \neg q), \circ p, \circ q\}$$.

Can you construct on the basis of $$\Gamma$$ a counter-example for the deduction theorem?

• $$\Gamma \cup \{A\} \Vdash^m B$$ implies $$\Gamma \Vdash^m A \supset B$$
Prove: If $$\Gamma \Vdash^m A$$ for all $$A \in \Delta$$ then $$\mathcal{M}^m(\Gamma) = \mathcal{M}^m(\Gamma \cup \Delta)$$.