Understanding:
 Jeff feels VERY strongly that the students need to feel
physical pain
when they see incorrect statements like "∀Inputs
I, ∃Java program J, J(I)=P(I)".
Many students get all the way through grad
school without grasping this.
How can they do any math without it?
 We will cover lots of intuition for lots of practical
examples
that are crucial for understanding Computer Science.
 And we will balance informal intuitive proofs and formal
mechanical proofs.
How to Think About Proof Systems:
 Jeff's goal is to get the students to be as
passionate about the material as he is.
 He wants to teach them, not just what to think,
but how to think.
 This means that he tends to make up new things to
teach.
 Every term some students and professors complain
that this makes it harder for them.
 But many really like it.
NonStandard:
 Jeff tries to explain, but he upsets some professors.
They say "The Rules
taught should be the ones our forefathers devised. All else is
dangerous".
 In their view, adding meaning (semantics) or intuition is also bad.
 Jeff found our forefathers' formal proof system
overly complicated and lacking (at least his) intuition.
It's a Game:
 For 25 years, in every course, Jeff
covers an informal Game as a proof system for first order logic.
 Reading left to right, the prover provides for ∃x &
the adversary provides for ∀y.
The key is that the order in which the players go is crucial to who
wins.
 Let's each choose a number
and the one with the biggest number wins. You go first.
 The formula is proved to be a tautology by giving a winning
strategy for the corresponding game.
 For example, with "∀Inputs I, ∃Java
program J, J(I)=P(I)":

The adversary tells me his worst input I.

Knowing his I and P(I), the prover writes the one line of code J
that outputs the answer P(I).
 Even worse,
even if the prover is unable to compute P(I) from I,
he still knows that it has some well defined answer
and hence the Java program outputting it does exist, i.e. ∃J.
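As a toy version of this game (our own example, not from the course), consider "∀x ∃y x+y=0" played over the integers mod 7. Reading left to right, the adversary picks x for the ∀ and the prover then picks y for the ∃; a winning strategy for the prover is exactly a Skolem function y(x):

```python
# A sketch of the prover/adversary game for "∀x ∃y  x + y = 0",
# played over the small universe of integers mod n (numbers are ours).

n = 7  # the universe: integers mod 7

def prover_strategy(x):
    """The prover's winning strategy: a Skolem function y(x)."""
    return (-x) % n

# The prover wins no matter which x the adversary picks,
# which is exactly a proof that the sentence is true.
assert all((x + prover_strategy(x)) % n == 0 for x in range(n))

# Swap the order: "∃y ∀x  x + y = 0".  Now the prover must commit
# to y before seeing x, so no single y works (for n > 1).
assert not any(all((x + y) % n == 0 for x in range(n)) for y in range(n))
```

The second assertion is the point about order: the same matrix of the formula flips from true to false when the prover must move first.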
Local Rules and Meaning:
 Our forefathers teach some meta rules which say
how to symbolically
translate a proof in one syntax into a
Hilbert style proof, i.e. one in which:

One line follows mechanically from previous lines.

But such proofs are too hard to follow directly, even for the experts.
 These meta rules sometimes place the identical
line in two places in the proof,
where the two occurrences mean completely different things
and allow different next steps,
based on some subtle rule
about what happened much earlier in the proof.
 To solve both of these problems,
Jeff introduces new notation, x' and
y_{∃}.
 Local Rules:

While the old rule required global knowledge of the proof,
this new notation locally flags the required global information
so that which rule can be applied next is both local and clear.

The new rules seem even more symbolically mechanical and Hilbert like.
 Local Meaning:

Against our forefathers' wishes, Jeff emphasizes the semantic meaning
of each line.

Each line of the proof has a longer
implied syntax that gives it its sound semantics.

It is a generalization of the standard concept of a Global Quantification.

Jeff feels the students should not just be mechanical automata,
but need to understand this meaning.
 For Instructors and Experts: See
(ppt)
for a quick comparison of our forefathers' proof systems and ours.
Propositional Proof System from the Recursive Search Algorithm:
 Some sections of 1090 spend much of the term
teaching a complicated formal proof system
for proving propositional formulas are
tautologies
(i.e. whether a given And/Or/Not formula evaluates to true under every
true/false assignment).
 Jeff feels that theoretically, a 2^{n}
sized truth table serves as an easier to understand proof.
And according to NP vs CoNP theory,
there are likely no smaller proofs in general.
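The truth-table proof described above can be sketched in a few lines (a minimal sketch; the function names and the De Morgan example are our own choices):

```python
# A sketch of the 2^n-sized truth-table proof: check the formula
# under every true/false assignment of its n variables.
from itertools import product

def is_tautology(formula, n):
    """True iff formula holds under all 2^n assignments."""
    return all(formula(*row) for row in product([False, True], repeat=n))

# De Morgan's Law:  not(P and Q)  iff  (not P) or (not Q)
de_morgan = lambda p, q: (not (p and q)) == ((not p) or (not q))
assert is_tautology(de_morgan, 2)

# "P or Q" is not a tautology: it fails when both are False.
assert not is_tautology(lambda p, q: p or q, 2)
```

The loop visits all 2^n rows, which is exactly why this "proof" is exponentially long.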
 Gödel himself drew a nice parallel between Proofs and
Computation. Let's elaborate on this.
 Sometimes a truth table search
can be shortened using a simple recursive-backtracking Davis-Putnam algorithm
for finding a good
True/False assignment for a propositional formula.
 Because this course is meant to teach formal proof systems
for computer science,
Jeff writes down a few obvious rules.

These rules produce a proof that mirrors this Davis Putnam computation.

Because we know that Davis Putnam can handle any instance,
this mirroring ensures that this new proof system
is Complete,
i.e. it can prove all propositional
tautologies.

Because each rule follows from a tautology (with a
small truth table),
we also know that the proof system
is also Sound.

Because it has a constant size proof of each of the rules of the other
proof systems,
its smallest proof for a given formula can't be much bigger or
complicated than theirs.
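The Davis-Putnam computation that the rules mirror can be sketched as follows. This is our own hedged toy version (not Jeff's actual rule set): a formula is a list of clauses in conjunctive normal form, each clause a set of literals, and the search branches on one variable at a time and backtracks:

```python
# A sketch of the recursive-backtracking Davis-Putnam search.
# Clauses are sets of literals: +v means variable v, -v means "not v".

def dpll(clauses):
    """Return True iff some true/false assignment satisfies all clauses."""
    if not clauses:
        return True                      # every clause is satisfied
    if any(len(c) == 0 for c in clauses):
        return False                     # an empty clause cannot be satisfied
    v = abs(next(iter(clauses[0])))      # branch on some variable
    for lit in (v, -v):                  # try v = True, then v = False
        # Drop satisfied clauses; strip the now-false literal elsewhere.
        reduced = [c - {-lit} for c in clauses if lit not in c]
        if dpll(reduced):
            return True
    return False                         # both branches fail: backtrack

# (x1 or x2) and (not x1 or x2) and (not x2)  -- unsatisfiable
assert not dpll([{1, 2}, {-1, 2}, {-2}])
# (x1 or x2) and (not x1)  -- satisfiable with x2 = True
assert dpll([{1, 2}, {-1}])
```

To prove that a formula α is a tautology, one runs this search on ¬α: the trace of the search, with every branch failing, is the proof that no falsifying assignment exists.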
Propositional Proof System from Standard Proof Techniques:

The proof techniques that need to be covered are:
Deduction,
Proof by Cases,
Modus Ponens,
Contrapositive
(→ expresses correlation, not causality),
Using easy Tautologies (De Morgan's Law, Distributive, Commutative),
Transitivity,
Proof by Contradiction, and
Concluding anything from A and ¬A.
 So that this is not just magical, Jeff makes each of
these into a simple propositional tautology formula.
 Jeff then makes each into a formal proof system rule
so
that the formal proof system follows the intuitive proofs that people
actually use in practice.
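For instance, each such technique, written as a propositional formula, can be verified by its own small truth table (a sketch with our own encoding of → as "(not a) or b"):

```python
# Each proof technique as a propositional tautology,
# verified by brute force over all assignments.
from itertools import product

def taut(f, n):
    return all(f(*row) for row in product([False, True], repeat=n))

imp = lambda a, b: (not a) or b          # a -> b

# Modus Ponens:  (P and (P -> Q)) -> Q
assert taut(lambda p, q: imp(p and imp(p, q), q), 2)
# Contrapositive:  (P -> Q) -> (not Q -> not P)
assert taut(lambda p, q: imp(imp(p, q), imp(not q, not p)), 2)
# Transitivity:  ((P -> Q) and (Q -> R)) -> (P -> R)
assert taut(lambda p, q, r: imp(imp(p, q) and imp(q, r), imp(p, r)), 3)
# Concluding anything from A and not A:  (P and not P) -> Q
assert taut(lambda p, q: imp(p and not p, q), 2)
```

This is exactly the sense in which the rules are "not just magical": each one is backed by a tautology with a small truth table.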
Short and Elegant vs Useful:
 Because it is fun to have a really trimmed down
proof system from which you can prove everything,
one with a short list of rules is covered.
 Because it is also fun to have a really
intuitive and robust proof system from which proving things is easy,
one with an additional list of rules is also covered.
 All of these aforementioned rules
are then neatly subsumed by the one basic rule
(and in the darkness bind them)
that allows one to make a lemma out of
any previously proved tautology.
All Made Up:
 One could say that
Jeff just "made up"
these proof systems, but that would be too generous to him.
 Please indulge him.
 He does believe that they are
easier to understand.
 And that they do not veer too far from what our forefathers did.
 He hopes that they are better for first year students.
 Warning: Jeff has never taught 1090 before.
Cover ALL of Computer Science Theory:
 Generalizing:
One point of formal proofs is to prove theorems with as few
assumptions as possible
about the nature
of the objects in our Universe
so that we can find a wide range of strange new objects for
which the same theorems are true.
 We intend to cover
the first order logic needed to be able to express all the
concepts covered in 2001, 3101, 6111,...
Computable vs Uncomputable,
Deterministic vs Probabilistic Algorithms,
Time Complexity,
Compile Time vs Run Time, BigOh Notation,
Compiling a Java program into a TM,
Universal TM,
Deciding vs Accepting, P vs NP,
Reductions,
Uncomputable Problems,
Graphs, Finite Fields,
Vector Spaces, Information Theory.
 You must be saying "too hard"!!! But our goal is
much more humble.
We want first year students to be able to read and understand
the key first order logic
statements that arise from these topics. Examples:

Classifying Functions:
∃c∃n_{0}∀n≥n_{0}
f(n) ≤ cg(n)
We can certainly prove this
for specific example functions f and g using our proof system.
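A concrete instance (our own choice of f and g, with a finite spot check standing in for the "for all n ≥ n0" of the real proof):

```python
# Verifying  ∃c ∃n0 ∀n≥n0  f(n) ≤ c·g(n)
# for the example f(n) = 3n + 7 and g(n) = n.

f = lambda n: 3 * n + 7
g = lambda n: n

# The prover supplies the witnesses for the two ∃ quantifiers ...
c, n0 = 4, 7

# ... and the adversary may then try any n ≥ n0.  (A real proof
# argues for all n at once: 3n + 7 ≤ 4n exactly when n ≥ 7.)
assert all(f(n) <= c * g(n) for n in range(n0, 10**4))

# The order matters: with n0 = 1 the prover's c = 4 loses at n = 1.
assert not f(1) <= c * g(1)
```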

Uncomputable Problems:
∃P∀M∃I M(I)≠P(I)
Without a doubt, we want the students to learn that this means that
there are uncomputable problems.
It may be too ambitious, but the formal proof of this fits on a quarter
page.
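The diagonalization at the heart of ∃P∀M∃I M(I)≠P(I) can be illustrated with a deliberately finite toy (our own setup; the real theorem of course quantifies over all infinitely many TMs):

```python
# A toy diagonalization: pretend the universe has only three
# "programs", each a function from inputs 0..2 to {0, 1}.

programs = [
    lambda i: 0,        # M_0 always answers 0
    lambda i: 1,        # M_1 always answers 1
    lambda i: i % 2,    # M_2 answers the parity of i
]

# The diagonal problem P flips what program number i says on input i.
P = lambda i: 1 - programs[i](i)

# Every program M_m gets input I = m wrong, so no listed program
# computes P.  This is the ∀M ∃I of the sentence above.
assert all(programs[m](m) != P(m) for m in range(len(programs)))
```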

Godel's Incompleteness Theorem:
Could the students handle a 20 min explanation?
If there were a sound and complete proof system,
then it would be decidable whether a given first order logic sentence
was a tautology,
and then it would be decidable whether a given TM halted.
But this is not the case.

Compiling: ∀ Java Programs J, ∃ TM M, ∀ input I, J(I)=M(I)
We would say that a TM is a theoretical model of
computation. (Flash a picture
of one.)
The students would need to know that a proof of this involves
compiling a given Java program into an equivalent TM.
(Zero details. Maybe flash a picture.)

Graphs: Out degree one ≡
∀u∃v[Edge(u,v) & ∀v'≠v
¬Edge(u,v')]
If you can only talk about nodes and edges, then you can't say that the
graph is connected or that it is finite.
Similarly, you can't say whether your
field contains an infinite number of integers.
This is partly why we have nonstandard models.
On the other hand, if you can talk about finite paths, then you can
define whether the graph is connected, i.e.
∀u,v ∃ path P(u,v).
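The out-degree-one sentence can be evaluated directly on a small concrete graph (our own example), which makes the nesting of the quantifiers tangible:

```python
# Checking  ∀u ∃v [Edge(u,v) & ∀v'≠v ¬Edge(u,v')]
# on a 3-node directed cycle, where it holds.

nodes = [0, 1, 2]
edges = {(0, 1), (1, 2), (2, 0)}     # each node has exactly one out-edge
Edge = lambda u, v: (u, v) in edges

def out_degree_one():
    return all(                       # ∀u
        any(                          # ∃v
            Edge(u, v) and
            all(not Edge(u, w)        # ∀v'≠v ¬Edge(u,v')
                for w in nodes if w != v)
            for v in nodes)
        for u in nodes)

assert out_degree_one()

# Give node 0 a second out-edge and the sentence becomes false.
edges.add((0, 2))
assert not out_degree_one()
```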

Vector Spaces:
∀V,
∀⟨w_{1}...w_{d}⟩,
∀v∈V,
∃⟨c_{1}...c_{d}⟩,
v=c_{1}w_{1}+...+c_{d}w_{d}
Suppose they were promised that this was true.
We would define a Vector Space V to be an arbitrary set of objects
for which v=c_{1}w_{1}+...+c_{d}w_{d} is
well defined.
They then could define their own Vector Space V
and two different bases
⟨w_{1}...w_{d}⟩ and
⟨w'_{1}...w'_{d}⟩.
And be assured of having a black box oracle for translating a vector of
coefficients
⟨c_{1}...c_{d}⟩
with respect to the first basis
into a vector of coefficients
⟨c'_{1}...c'_{d}⟩
with respect to the second.
We would then devote 2 min per application, e.g.
Error Correcting Codes, Fourier Transformations, JPEG,
Integer Multiplication, Dimension Reduction, & Differentiation of Functions.
 If it becomes too much, we will cut things out as
we go.
 We think it would be good for the students
to have this early introduction.
Jeff's
Course Outline
History of Proofs (ppt):  
Euclidean Geometry  One God & one possibility for the universe 
Non-Euclidean Geometry  Different Universes 
Hilbert's Formal Proofs  Put mathematics on a solid foundation 
Theoretical Computer Science  Key for EECS2001 and EECS3101 
Godel's Incompleteness Theorem  No formal proof system is capable of proving all true and no false formulas about the arithmetic of natural numbers. 
Propositional Logic (ppt):  
Three Proof Systems/Techniques:  It is fun to have a really trimmed down proof system from which you can prove everything! 
It is fun to have a really intuitive and robust proof system from which proving things is easy!  
 DavisPutnam Axioms:  These rules are designed to mirror a DavisPutnam computation. 
This ensures that the system is Complete in that it can prove all propositional tautologies.  
 Deduction:  Assume α, prove β, conclude α→β 
 Building:  From: α, α', ¬β, and ¬β' 
Conclude: α∧α', ¬(β∧γ), ¬(γ∧β), α∨γ, γ∨α, ¬(β∨β'), γ→α, β→γ, ¬(α→β), and ¬¬α.  
 Merge:  From α→Φ, and ¬α→Φ, conclude Φ 
 Proof Technique Axioms:  For making it easier to prove things. 
 Modus Ponens:  From α and α→β, conclude β 
 Excluded Middle:  α∨¬α. The universe must decide. 
 Double Negation:  From ¬¬α, conclude α 
 Contrapositive:  From α→β, conclude ¬β→¬α 
 Separating And:  From α∧β, conclude α and β 
 Selecting Or:  From α∨β and ¬α, conclude β 
 Transitivity:  From α→β and β→γ, conclude α→γ 
 Proof by Cases:  From α∨β, α→γ, and β→γ, conclude γ 
 Proof by Contradiction:  From ¬α→β and ¬α→¬β, conclude α 
 Inconsistent:  From β and ¬β, conclude anything 
 Tautology Axioms:  Soundness proved using these 
 Modus Ponens:  From α and α→β, conclude β 
 Tautologies:  Once proved can be used 
 Lemmas:  Once proved can be used 
Example Proof:  Prove "A World without Racism" 
Hilbert style proof:  Each sentence in the Hilbert sequence follows from axioms or previous 
Operator's Truth Tables  And ∧, Or ∨, Not ¬, Implies →, Iff ↔, ExclusiveOr ⊕ 
Propositional  Combining operators ∧, ∨, and ¬ can compute any function 
 Formulas:  (¬P∧Q)∨(R⊕S)→T 
 Circuits:  Reuse computations 
Tautologies:  Relationships that are true for all setting of the variables. 
 De Morgan's Law:  ¬(P∧Q) iff ¬P∨¬Q 
 Distributive  P∧(Q∨R) iff (P∧Q)∨(P∧R) and P∨(Q∧R) iff (P∨Q)∧(P∨R) 
 Not Causality/Contrapositive:  P→Q iff ¬(P∧¬Q) iff ¬P∨Q iff ¬Q→¬P 
 Transitivity:  [(P→Q)∧(Q→R)]→(P→R) 
 Both Direction Equivalence:  (P iff Q) iff (P→Q ∧ Q→P) 
 ExclusiveOr  (P⊕Q) iff (¬P∧Q)∨(P∧¬Q) 
Proofs of Tautologies:  
 Exponential sized truth tables:  Determine true/false for each of the 2^{n} setting of the variables 
 NP & CoNP:  There are not likely smaller proofs witnessing that a formula is satisfiable or a tautology 
 Davis Putnam Alg:  Heuristic that might be fast 
 Short List Proof  The proof mirrors the Davis Putnam Algorithm 
Prove Lemmas with Tautologies:  
Compare:  
 True/Valid/Tautology:  
 Γ⊧β  β is Logical Consequence of Γ 
For every setting of variables (and universe) that makes Γ true, β is also true.  
 α→β:  → is a Boolean operator 
For every setting of variables that makes α true, β is also true.  
 Proofs:  
 Γ⊢β:  β is Theorem or Syntactic Consequence of Γ. 
ie: From formulas in Axioms_{fixed}∪Γ, β can be proved.  
 α⇒β:  If α is a line in your proof, then you can add β as a line. 
 α→β:  This can be a line of your proof. 
 Interchangeable?  α⊧β, α⊢β, α⇒β, and α→β are interchangeable when α&β are interpreted appropriately. 
But Careful. α(x)⇒∀xα(x) even though α(x)→∀xα(x) is not true.  
Horn Clauses  A solution can be found faster 
First Order Logic  Informal Understanding & Informal Proofs (ppt):  
Predicates & Quantifiers:  ∀xP(x) vs ∃xP(x) 
Order of Quantifiers:  ∀x∃yP(x,y) vs ∃y∀xP(x,y) 
Sentence Truth:  An Infinite Iterative Algorithm 
Skolem/Auxiliary Functions:  ∀x∃yP(x,y) ⇒ P(x,y_{∃}(x)) 
Negations:  ¬∃xP(x) ≡ ∀x¬P(x) 
Understanding from the Inside:  ∀x [∃y x+y=0] ≡ ∀x Property(x) 
Constructive & Contradiction:  I don't like proofs of ∀x P(x) that assume the negation so that they can construct an x. 
Instead, say "Let x be an arbitrary object."  
The Proof Viewed as a Game:  the prover provides for ∃x & the adversary provides for ∀y 
Game Tree:  ∃M^{w}_{1}∀M^{b}_{1}∃M^{w}_{2}∀M^{b}_{2}... WhiteWins(M^{w}_{1},M^{b}_{1},M^{w}_{2},M^{b}_{2},...) 
Examples:  
 Reals:  ∃a∀x∃y x=a ∨ x×y=1 
 Classifying Functions:  ∃c∃n_{0}∀n≥n_{0} f(n) ≤ cg(n) 
Our Informal Proof:  ∃y∀xP(x,y) ⇒ ∀x∃yP(x,y) 
Syntax (ppt):  Which strings are grammatically correct formulas? 
Context Free Grammar of Syntax:  S ⇒ ∀xS | ∃xS | ¬S | S∧S | S∨S | R(T,T) 
Meta Syntax:  α(x) is not itself a first order logic sentence, but denotes one 
Closure of Syntax:  If α(x) is a sentence then so is ∀xα(x) 
Parsing of Syntax:  Matching {[]()} using a stack 
Semantics (ppt):  
Meaning?:  What does formula α mean? 
 Human:  ∀x∃y x+y=0 ≡ "Integers are closed under negation" 
The Model/Universe M:  Defines the universe of objects, the functions and relations (but not free variables) 
 Sentences Truth:  Infinite recursive algorithm looping over all models and objects 
 Informal Proof:  Strategy for the ∀ ∃ game 
 Formal Proof:  Each sentence in the Hilbert sequence follows from the previous 
Free and Bound Variables:  P(x)∧∀xQ(x) 
Skolem/Auxiliary Functions:  P(x,y_{∃}(x)) implies ∀x∃yP(x,y) 
Deduction Variables x':  Assume α(x'), prove β(x'), conclude α(x)→β(x) 
Implications for Hilbert:  We need each line standing alone to state something valid. 
 Free Variables and Models:  α(x,y_{∃}(x)) implies ∀M∃y_{∃}∀x α(x,y_{∃}(x)) 
 Deduction:  β(x') implies ∀M∃y_{∃}∀x∀x' [α(x')→β(x')] 
Sound & Complete (ppt, ppt):  Proof of α iff it is valid/tautology 
Truth/Valid/Tautology:  
 M⊧α[x]:  Formula α is true under variable assignment x in universe/model M. 
 M⊧Γ[x]:  If M⊧α[x] for every formula α∈Γ 
  If M⊧α[x] for some M and x, then α is Satisfiable. 
 ⊧α:  If M⊧α[x] for all M and x, then α is Valid/Tautology. 
 Γ⊧α:  If M⊧α[x] for all M and x for which M⊧Γ[x], then α is a Logical Consequence of Γ. 
Proofs:  
 Γ⊢α:  If from formulas in Axioms_{fixed}∪Γ, one can prove α, then α is a Theorem or Syntactic Consequence of Γ. 
Sound:  A Proof system is Sound if a proof of α ensures that it is valid/tautology/true, i.e. Γ⊢α ensures Γ⊧α 
Complete:  A Proof system is Complete if α being valid/tautology/true ensures that it has a proof, i.e. Γ⊧α ensures Γ⊢α 
Formal Proofs (ppt):  
Game and Γ Example:  Comparing Informal and Formal Proofs 
NonLogical Axioms Γ (e.g., to do number theory)  
Gödel:  Completeness and Incompleteness 
Formal Proof Systems:  Really trimmed down proof system from which you can prove everything! 
Intuitive and robust proof system from which proving things is easy!  
Our Formal Proof System:  
 Lemmas via Substitutions:  Add all propositional tautologies as axioms. 
 Modus Ponens:  If α and α→β then β 
 True/False:  If Φ(P,Q), then Φ(α,β). If [α iff β], then [Φ(α) iff Φ(β)] 
 Objects:  If α(x,y), then α(t_{1},t_{2}). If [t_{1}=t_{2}], then [α(t_{1}) iff α(t_{2})] and [f(t_{1})=f(t_{2})] 
 Deduction:  Assume α(x'), prove β(x'), conclude α(x)→β(x) 
 Key Example:  
 Quantifiers:  
 Removing ∃:  ∃y α(x,y) ⇒ α(x,y_{∃}(x)) 
 Adding ∃:  α(term) ⇒ ∃y α(y) unless term contains y_{∃}(x) for x bound with ∀x 
 Removing ∀:  ∀x α(x) ⇒ α(x) 
 Adding ∀:  α(x) ⇒ ∀x α(x) unless fixed x_{∃} or x' 
 Negations:  ¬∃xα(x) iff ∀x¬α(x) 
 Warning:  Each line of the proof has a longer implied syntax that gives it its sound semantics. 
Intuition:  Balance intuitive understanding and formal mechanics 
Examples of Proofs:  
 Free Variable Fail  α(x)→∀xα(x) and its dual ∃xα(x)→α(x) are not true 
 Order of Quantifiers:  ∃y∀xP(x,y) → ∀x∃yP(x,y) 
 Distributive Laws:  Eg ∀x[α(x)→β(x)] → [∀xα(x)→∀xβ(x)] and ∀x[α(x)∧β(x)] iff [∀xα(x)∧∀xβ(x)] 
 Proof by Cases:  From α∨β, α→γ, and β→γ, prove γ 
 Proof by Duality:  Switch true/false, ∧/∨, ∀/∃, →/← 
Sequent Calculus LK:  Tree of Sequents: α_{1}∧...∧α_{n} → β_{1}∨...∨β_{m} 
 Prove each its Rules  Left and Right rule for ∧, ∨, ¬ ∃, and ∀ 
Gödel (ppt):  
Sound and Complete:  Proved iff a Tautology (i.e. True in every Model) 
Incompleteness Thm:  No formal proof system is capable of proving all tautologies and proving no nontautologies about the integers. 
To guarantee a proof, α needs to also be true in every nonstandard model.  
EXAMPLES:  
Computational Complexity (ppt):  Key for EECS2001 and EECS3101 
Computable vs Uncomputable:  ∃M∀I M(I)=P(I) vs ∃P∀M∃I M(I)≠P(I) 
Time Complexity:  ∃M∃c∀I M(I)=P(I) and Time(M,I)≤ I^{c} 
Table Look Up vs TM:  ∀k∃M_{table}∀x,y≤k M_{table}(x,y)=x×y vs ∃M∀x,y M(x,y)=x×y 
Compile Time vs Run Time:  ∃M∃k∀I∃m M(I) has k states and uses m tape cells 
Probabilistic Algorithms:  ∃M∀I Pr_{R}[M(I,R)≠P(I)] ≤ 2^{−|R|} 
Compiling:  ∀ Java Programs J, ∃ TM M, ∀ input I, J(I)=M(I) 
Universal TM:  ∃ TM M_{U}, ∀ TM M, ∀ input I, M_{U}(M,I) = M(I) 
Deciding vs Accepting:  Halt(M,I,t) is computable and Halt(M,I) ≡ ∃t Halt(M,I,t) is not 
P vs NP:  Sat(I,S) is in poly and Sat(I) ≡ ∃S Sat(I,S) is likely not 
NP:  ClauseSat(I) ≡ ∃S∀C∃x "Assignment S satisfies variable x in clause C" 
Reductions:  Sat≤P ⇒ ∀I [Sat(I) iff P(InstanceMap(I))] 
Uncomputable Problems:  ∃P∀M∃I M(I)≠P(I) 
Induction (ppt):  
Weak:  [S(0) & ∀i[S(i−1)→S(i)]] → ∀iS(i) 
Strong:  [∀i [[∀j<i S(j)]→S(i)]] → ∀iS(i) 
Eg. Planar Graph:  For every planar graph, #Edges = #Nodes + #Faces − #Components − 1 
Every planar graph can be coloured with 6 colours  
Loop Invariants:  
 Establishing:  ⟨PreCond⟩ & ⟨PreCode⟩ ⇒ ⟨LoopInv⟩ 
 Maintaining:  ⟨LoopInv_{t}⟩ & ¬⟨ExitCond⟩ & ⟨LoopCode⟩ ⇒ ⟨LoopInv_{t+1}⟩ 
 Ending:  ⟨LoopInv⟩ & ⟨ExitCond⟩ ⇒ ⟨PostCond⟩ 
Recursion:  Trust friends to solve any smaller instances meeting the preconditions 
and if small enough solve yourself  
Konig's Lemma:  Infinite Tree → Infinite Path 
Representing Numbers (ppt):  
Calculus Epsilon/Delta Proofs:  ∀x∀ε∃δ∀x'∈[x−δ,x+δ] f(x')∈[f(x)−ε,f(x)+ε] 
Vector Spaces:  ∀V, ∀⟨w_{1}...w_{d}⟩, ∀v∈V, ∃⟨c_{1}...c_{d}⟩, v=c_{1}w_{1}+...+c_{d}w_{d} 
Graphs:  ∀u∃v[Edge(u,v) & ∀v'≠v ¬Edge(u,v')] 
Induction:  [S(0) & ∀i[S(i)→S(i+1)]] → ∀iS(i) 
Fields:  
 Rules:  ∃1∀x x×1=x ∀x∃y x×y=1 
 Constructing Integers:  ∀x∃y y=x+1 
 Finite Fields:  ∀x∃y y=x+1 but 4+1=0 
 Integers mod p:  ∀primes p ∀x∃y x×y=1 (mod p) 
 Proofs for Integers:  Integers ⇒ axioms ⇒ prove great facts 
Non Constructive Proofs (pdf pg 23):  
 Expander Graphs  The probability that it exists is greater than zero. 
 An infinite number of primes  ∀n∃prime p>n (It divides into n!+1) 
Vector Spaces (ppt):  
 Spanning the Space:  ∀V, ∀⟨w_{1}...w_{d}⟩, ∀v∈V, ∃⟨c_{1}...c_{d}⟩, v=c_{1}w_{1}+...+c_{d}w_{d} 
 Linear Transformations:  A matrix times the vector ⟨c_{1}...c_{d}⟩ for one v gives the vector ⟨c'_{1}...c'_{d}⟩ for another v' 
 Change of Basis:  A matrix times the vector ⟨c_{1}...c_{d}⟩ given one basis ⟨w_{1}...w_{d}⟩ gives the ⟨c'_{1}...c'_{d}⟩ for another ⟨w'_{1}...w'_{d}⟩ 
One proof & many applications  
 Colour:  Red/Blue/Green forms a basis for humans, but in science light is infinitely dimensional 
 Error Correcting Codes:  Encoding is a linear transformation from the message to the code vectors 
 Fourier Transformation:  A change from the sine basis to the time basis 
 Integer Multiplication:  A change from the polynomial basis to the evaluation basis 
 JPEG Image Compression:  A change from the pixel basis to the sine basis 
 Dim Reduce & Face Recog:  A change from xy basis to rotated basis 
 Calculus:  Integrating is the inverse linear transformation of differentiating 
 Probabilistic Markov Process:  Each time step is a linear transformation of the probability vector 
 Quantum Probability:  Each time step is a different linear transformation of the probability vector 
Information Theory Example (ppt):  
 Knowing Mud Problem:  I know that my mom knows that my dad knows that I am good. 
 Common Knowledge:  Everybody knows that everybody knows that everybody knows 
 Def^{n} of "Knowing":  It is true in every universe that I consider possible 