# Logic for Computer Science/Logics

# Sub-FO Logics

## From Propositional Logic to FO

### Propositional Logic

### Monadic FO

### Existential FO

### Universal FO

## Modal Logic

Modal logic extends propositional logic with operators for the modalities *possibility* and *necessity*. These basic *modal operators* are usually written □ (or *L*) for *necessarily* and ◇ (or *M*) for *possibly*. Each can be defined from the other by the dualities:

  □φ ≡ ¬◇¬φ    and    ◇φ ≡ ¬□¬φ
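The duality can be checked mechanically on a finite Kripke model. The following Python sketch (the model, accessibility relation, and valuation are illustrative assumptions, not from the text) evaluates □ and ◇ as universal and existential quantification over accessible worlds:

```python
# Hypothetical sketch: evaluating box/diamond in a finite Kripke model,
# illustrating the duality "possibly p" <=> "not necessarily not p".

def box(worlds, R, holds, w):
    """Necessarily: the property holds in every world accessible from w."""
    return all(holds(v) for v in worlds if (w, v) in R)

def diamond(worlds, R, holds, w):
    """Possibly: the property holds in some world accessible from w."""
    return any(holds(v) for v in worlds if (w, v) in R)

worlds = {1, 2, 3}
R = {(1, 2), (1, 3), (2, 3)}   # accessibility relation (assumed)
p = {2}                        # worlds where p is true (assumed)

for w in worlds:
    # duality: diamond p  <=>  not box (not p)
    assert diamond(worlds, R, p.__contains__, w) == \
        (not box(worlds, R, lambda v: v not in p, w))
```

Note that in a world with no accessible successors, □φ holds vacuously while ◇φ fails, and the duality still goes through.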

## Conjunctive Logic

Conjunctive logic is the fragment of FO restricted to atomic FO formulas joined by conjunctions, optionally preceded by existential quantifiers. It is especially important in database theory.

**Definition** The set CON of conjunctive formulas is defined inductively:

- every atomic formula R(x₁, …, xₙ) is in CON
- if φ and ψ are in CON, then (φ ∧ ψ) is in CON
- if φ is in CON and x is a variable, then ∃x φ is in CON

**Remarks**

- In other words: a CON-formula is a (relational) FO-formula built from atomic formulas using only conjunction and existential quantification.
- The semantics of FO applies here correspondingly.
- The 'usual' notions of free and bound variables also apply here.
- A formula of the form ∃x₁ … ∃xₖ (φ₁ ∧ … ∧ φₙ), where each φᵢ is quantifier-free, is said to be in *normal form*
- Each CON formula is equivalent to one in normal form
- All CON formulas are satisfiable
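As a sketch of how such formulas are used in database theory, the following Python snippet (the relations, data, and the helper `eval_conjunctive` are all illustrative assumptions) evaluates a normal-form conjunctive formula by searching the domain for witnesses of the existentially quantified variables:

```python
# Hypothetical sketch: evaluating a normal-form conjunctive formula such as
#   ∃x (R(x, y) ∧ S(x))
# over a small relational structure by searching for a witness for x.

from itertools import product

def eval_conjunctive(domain, atoms, free, binding):
    """Check ∃x1...∃xk (atom1 ∧ ... ∧ atomn) under a binding of the
    free variables; `atoms` is a list of (relation, variable-tuple)."""
    bound = sorted({v for _, vs in atoms for v in vs} - set(free))
    for values in product(domain, repeat=len(bound)):
        env = dict(binding, **dict(zip(bound, values)))
        if all(tuple(env[v] for v in vs) in rel for rel, vs in atoms):
            return True
    return False

# Illustrative relations and domain:
R = {(1, 2), (2, 3)}
S = {(1,), (3,)}
domain = {1, 2, 3}

# ∃x (R(x, y) ∧ S(x)) with the free variable y bound to 2:
print(eval_conjunctive(domain, [(R, ("x", "y")), (S, ("x",))],
                       free=["y"], binding={"y": 2}))   # True: witness x = 1
```

This brute-force search over the domain is exactly why conjunctive queries are decidable to evaluate; real database engines replace the search with joins.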

**Examples**

- ∃x R(x, y) has a bound variable x and a free variable y
- ∃x R(x) ∧ ∃y S(y) is equivalent to ∃x ∃y (R(x) ∧ S(y)). The latter is said to be in normal form.

**Equality**

Additionally, equality between variables or constants can be introduced by allowing atoms of the form x = y and x = c.

Example: a formula such as c = d for distinct constants c and d is not satisfiable under the usual database convention that distinct constants denote distinct elements, so with equality CON formulas are not always satisfiable. Each satisfiable CON formula with equality is equivalent to one without equality.

# Higher-order Logics

## Fixed-Point Logics

Fixed-point logics augment the syntax of FO by a fixed-point operator, a mechanism that enables recursion on expressions and thus allows higher expressive power. This idea is quite similar to adding a while-loop construct to an imperative programming language.

**Example** Transitive Closure

Consider the transitive closure of a graph G = (V, E). For distance one we have:

  T₁(x, y) := E(x, y)

for distance up to two:

  T₂(x, y) := E(x, y) ∨ ∃z (E(x, z) ∧ E(z, y))

so we could go on expanding this disjunction for further fixed distances (only). To express transitive closure for an arbitrary distance (which is essential in most situations), recursion can be introduced employing:

  φ(T)(x, y) := E(x, y) ∨ ∃z (E(x, z) ∧ T(z, y))

where T is the transitive closure up to some distance i. Thus φ(T) gives the transitive closure up to i + 1. So we can define a sequence

  J₀ := ∅,  J_{i+1} := φ(J_i)

that clearly gives the transitive closure for some k, i.e. the sequence converges (J_{k} = J_{k+j} for all j > 0) and its limit is the transitive closure.
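The stage sequence J₀, J₁, … can be sketched directly in Python; the example graph and helper names below are illustrative assumptions:

```python
# Sketch of the fixpoint sequence for transitive closure:
# J_{i+1} = phi(J_i), where phi(T) adds a pair (x, y) whenever
# E(x, y) holds or some z satisfies E(x, z) and T(z, y).

def phi(E, T):
    return E | {(x, y) for (x, z) in E for (z2, y) in T if z == z2}

def transitive_closure(E):
    J = set()                   # J_0 = empty set
    while True:
        nxt = phi(E, J)
        if nxt == J:            # fixpoint reached: J_k = J_{k+1}
            return J
        J = nxt

E = {(1, 2), (2, 3), (3, 4)}    # illustrative example graph
print(sorted(transitive_closure(E)))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

On a graph with n nodes the sequence is guaranteed to stabilize after at most n stages, since each stage only adds pairs of longer and longer paths.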

In the illustration (omitted here) an initial graph G is shown in (a), an intermediate stage of the sequence in (b), and the fixed point in (c), so that J₃ = J₄ = ⋯.

♣

As we have seen in this example, recursion can be used to express properties that would otherwise require formulas of arbitrary length. But since we are always dealing with finite formulas here, some terminator must be given for the recursion. This role can be played by the fixpoint. In the above example such a fixpoint exists, but this is not always the case, as we see next.

**Example**

TODO Jumper

♣

As we have seen in this example, a fixpoint of a recursion does not always exist. So we must be careful when defining a recursion-based logic. We extend FO by a fixpoint operator to obtain PFP, which handles the cases where a fixpoint exists.
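A minimal sketch of both behaviours, under the convention that the partial fixed point is empty when the stages never stabilize (all names and the example updates are illustrative assumptions):

```python
# Hypothetical sketch: a stage sequence that never reaches a fixpoint.
# With phi(T) = domain \ T the stages oscillate: {}, D, {}, D, ...
# so the partial result is undefined (conventionally the empty set).

def pfp(domain, phi):
    J = set()
    # Over a domain of size n there are only 2**n possible stages,
    # so after 2**n + 1 steps without a fixpoint the stages must cycle.
    for _ in range(2 ** len(domain) + 1):
        nxt = phi(J)
        if nxt == J:
            return nxt          # fixpoint found
        J = nxt
    return set()                # no fixpoint: result undefined

D = {1, 2}
print(pfp(D, lambda T: D - T))      # set(): complementation has no fixpoint
print(pfp(D, lambda T: T | {1}))    # {1}: this monotone update converges
```

The cycle-detection bound makes the check effective on finite structures, which is why PFP is well defined there despite the possible non-existence of a fixpoint.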

### Partial Fixed-Point Logics (PFP)

TODO

♣

The above assumption that a fixpoint exists is not really satisfying, so one can now think of restrictions on the recursion that enforce existence in every case. To this end, three ideas are introduced: monotonicity, positivity, and inflationary recursion.
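The inflationary idea can be sketched as follows (names and the example update are illustrative assumptions): the update is forced to be J_{i+1} = J_i ∪ φ(J_i), so the stages only grow, and on a finite domain a fixpoint therefore always exists.

```python
# Hypothetical sketch of the inflationary idea: even an update like
# complementation, which oscillates under plain iteration, converges
# once each stage is forced to contain the previous one.

def ifp(domain, phi):
    J = set()
    while True:
        nxt = J | phi(J)        # inflationary update: never shrinks
        if nxt == J:
            return J
        J = nxt

D = {1, 2, 3}
print(ifp(D, lambda T: D - T))      # {1, 2, 3}: converges despite complementation
```

Since each proper step adds at least one element of a finite domain, the loop terminates after at most |D| iterations.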

### Inflationary Fixed-Point Logics (IFP)

**Notion**

TODO Life-Game

**Definition**

TODO

### Least Fixed-Point Logics (LFP)

**Notion**

TODO

**Definition**

TODO

♣

### Comparison of IFP and LFP

TODO

♣