COSC6115, Winter 2009
Instructor: Eric Ruppert
Office: Computer Science Building, room 3042
Email: [my last name] @cse.yorku.ca
Telephone: (416) 736-2100 ext. 33979
Facsimile: (416) 736-5872
Lectures: Tuesdays and Thursdays from 10:00 to 11:30 in room 325 of Bethune College.
Office Hours: Tuesdays and Fridays 13:00-14:00. If you cannot make it at those times, feel free to contact me to make an appointment at a different time. You can also try dropping by my office.
The best way to contact me is probably by email. Please use your cs account when sending me email, and start your subject line with "". Send messages in plain text, without attachments.
- (Apr 8) In 4(b), delta should be a positive constant. And in 4(c) I was assuming that you start counting characters from 0 (i.e. the leftmost character of a string is the 0th character); thus k should really be <= |x| -1.
- (Apr 5) In 4(c) "boolean" should be "binary".
- (Apr 1) For assignment 3(a), you can assume f(n) >= cn for some constant c>1.
- (Mar 19) For assignment 2(b): It turns out to be harder than I thought to get a tight bound on the space. So, I just want you to prove that $O(S(n))$ space is sufficient and $\Omega(S'(n))$ space is needed. You don't have to make $S(n)=S'(n)$, but the closer together they are, the better. (For question 2(a), if you are having trouble getting matching upper and lower bounds on the time, give me the best upper bound and the best lower bound you can.)
- (Mar 17) Check out the York programming contests. The first contest of the Winter term will take place on Friday, March 20, 15:00-17:00 in CSEB 1006. Also check out this Facebook group.
- (Mar 12) I will be out of town March 20 to 25, so there will be no class on March 24. (We can reschedule an extra one at the end of the term.)
- (Mar 10) Clarification of Assignment 1, part (c): I want you to show that TMs are inherently slower than STMs for some problems, but that your simulation is optimal for some such problems. More formally, I want you to show that there is a language L such that
- L can be decided in worst-case time T(n) by an STM M,
- your simulation of M in part (a) uses T'(n) steps to decide L in the worst case,
- no TM can decide L in O(T(n)) steps, and
- every TM that decides L takes Omega(T'(n)) steps.
This course is intended to give graduate students a solid grasp of the fundamentals of complexity theory, which is an important part of the foundations of the entire field of computer science.
We'll look at models of computers (e.g. Turing machines) and see how they are used to measure the resources (time, space and others) required to solve problems. If we pick a model and a bound on one or more resources, we can define a complexity class (e.g. P, NP, NC) to be the set of problems that can be solved in that model with the given resources. These classes give us a systematic way of understanding how efficiently problems can be solved.
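As a concrete instance of this recipe (standard notation, not specific to this course), fixing a machine model and taking the union over all polynomial resource bounds gives the familiar classes:

```latex
\mathrm{P} = \bigcup_{k \ge 1} \mathrm{TIME}(n^k), \qquad
\mathrm{NP} = \bigcup_{k \ge 1} \mathrm{NTIME}(n^k), \qquad
\mathrm{PSPACE} = \bigcup_{k \ge 1} \mathrm{SPACE}(n^k).
```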
In studying complexity theory, we learn about techniques that can be used to design efficient algorithms. We also learn how to identify those problems for which efficient solutions do not exist or are unlikely to exist. Such information is useful, because it often suggests how to modify our approach when faced with an intractable problem: for example, we might look for approximate solutions instead of exact solutions, or try to solve a special case of the problem that is sufficient for our needs.
Much of the material will be taught from first principles, so there are no specific prerequisites. However, some background in algorithms and theoretical models of computing (e.g. Turing machines, finite automata) would be helpful, and you should be comfortable reading and writing mathematical proofs.
The exact set of topics to be covered will depend partly on students' interests and background, but here is a tentative list of topics:
- Models of computation: various types of Turing Machines, Random Access Machines
- Definitions of some basic complexity classes (LOGSPACE, P, NP, coNP, PSPACE) & relationships between them
- Time & space hierarchy theorems
- Reductions, NP-completeness
- An introduction to randomized computation
- Parallel computation (circuits, Parallel Random Access Machines, NC, P-completeness)
- Approximating solutions to intractable problems (if time permits)
| Homework exercises | 80% |
| Class presentation | 20% |
March 5: Introduction, why study complexity, how to measure complexity, definition of Turing machine. [AB 1.1, 1.2]
March 10: Time complexity of palindromes, simulating a 2-way infinite tape. [Pap 2.1] and a write-up of the lower bound.
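As a quick illustration of why the model matters (this sketch is mine, not from the lecture): the palindrome language is trivially decidable in O(n) time on a RAM with random access to the input, whereas a one-tape Turing machine provably needs Omega(n^2) steps for the same language.

```python
def is_palindrome(x: str) -> bool:
    """Decide the palindrome language in O(n) time on a RAM-like model.
    A one-tape Turing machine, by contrast, must shuttle its head back
    and forth and needs Omega(n^2) steps in the worst case -- the lower
    bound discussed in the March 10 lecture."""
    i, j = 0, len(x) - 1
    while i < j:
        if x[i] != x[j]:
            return False
        i += 1
        j -= 1
    return True
```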
March 12: Simulating multiple tapes with a single tape [AB 1.2.2] or [Pap 2.3]. Relationship between RAMs and TMs [Pap 2.6] or [CR73].
March 17: Relationship between RAMs and TMs [Pap 2.6] or [CR73].
March 19: SPACE(c) = regular languages [HU79 2.6] (originally proved in 1959 by Rabin and Scott and by Shepherdson).
March 26: Constant factors don't matter [Pap 2.4] or [HS65]. Universal Turing Machine [AB 1.3 or Pap 3.1].
March 31: Diagonalization arguments: halting problem, time hierarchy Theorem [AB 3.1 or Pap 7.2].
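The diagonalization behind the undecidability of the halting problem can even be demonstrated concretely (a sketch of my own, with hypothetical names): from any purported halting decider we build a "diagonal" program that defeats it.

```python
def make_diagonal(halts):
    """Given any claimed total decider halts(prog, arg) -> bool,
    build the diagonal program d: on input p, d loops forever exactly
    when halts claims p halts on itself, and halts otherwise.
    Then halts(d, d) cannot answer correctly: whatever it says about
    d's behaviour on d, d does the opposite."""
    def d(p):
        if halts(p, p):
            while True:      # halts said "p halts on p" -- so loop forever
                pass
        return None          # halts said "p loops on p" -- so halt
    return d
```

For instance, a decider that always answers "loops" is refuted because the diagonal program built from it halts on itself.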
April 2: Several basic relationships between TIME and SPACE classes [Pap 7.3]
April 7: Reachability arguments: Savitch's Thm: REACHABILITY is in SPACE(log^2 n). Thus, NSPACE(S(n)) is contained in SPACE((S(n))^2)
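The heart of Savitch's proof is a space-efficient recursive reachability test: a path of length at most 2^k from u to v exists iff some midpoint w has paths of length at most 2^(k-1) on each side. A minimal Python sketch (variable names are mine, not from the lecture):

```python
def reach(nodes, adj, u, v, k):
    """True iff there is a path from u to v of length <= 2**k in the
    directed graph with node list `nodes` and successor sets `adj`
    (a dict: node -> set of successors).  The recursion depth is k and
    each stack frame stores only O(1) node names; taking k = ceil(log2 n)
    gives the O(log^2 n) space bound of Savitch's theorem."""
    if u == v:
        return True
    if k == 0:
        return v in adj.get(u, set())
    # A path of length <= 2^k exists iff some midpoint w splits it
    # into two halves, each of length <= 2^(k-1).
    return any(reach(nodes, adj, u, w, k - 1) and
               reach(nodes, adj, w, v, k - 1)
               for w in nodes)
```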
April 9: Reachability arguments: Immerman-Szelepcsényi Thm: REACHABILITY is in NSPACE(log n). Thus, co-NSPACE(S(n))=NSPACE(S(n))
April 14: Reductions (with examples)
April 16: Reductions and Complete problems.
April 21: Reachability and 2SAT are NL-complete
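The polynomial-time algorithm for 2SAT (which also underlies its NL upper bound, via reachability) works on the implication graph: each clause (a ∨ b) contributes the edges ¬a→b and ¬b→a, and the formula is unsatisfiable iff some variable lies in the same strongly connected component as its negation. A sketch under the assumption that literals are encoded as nonzero integers (i for x_i, -i for its negation):

```python
from collections import defaultdict

def two_sat(n, clauses):
    """Decide satisfiability of a 2-CNF formula over variables 1..n.
    `clauses` is a list of literal pairs.  Builds the implication graph
    (clause (a or b) gives edges -a -> b and -b -> a) and runs Kosaraju's
    SCC algorithm; the formula is satisfiable iff no variable shares a
    strongly connected component with its negation."""
    g, gr = defaultdict(list), defaultdict(list)
    for a, b in clauses:
        for u, v in ((-a, b), (-b, a)):
            g[u].append(v)
            gr[v].append(u)
    lits = [v for i in range(1, n + 1) for v in (i, -i)]
    # First pass: record literals in order of DFS finishing time.
    order, seen = [], set()
    def dfs1(u):
        seen.add(u)
        for v in g[u]:
            if v not in seen:
                dfs1(v)
        order.append(u)
    for u in lits:
        if u not in seen:
            dfs1(u)
    # Second pass on the reversed graph: label the components.
    comp = {}
    def dfs2(u, c):
        comp[u] = c
        for v in gr[u]:
            if v not in comp:
                dfs2(v, c)
    for c, u in enumerate(reversed(order)):
        if u not in comp:
            dfs2(u, c)
    return all(comp[i] != comp[-i] for i in range(1, n + 1))
```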
April 23: CKTVAL is P-complete, SAT is NP-complete (Cook's Thm).
April 28: Several NP-completeness proofs.
April 30: NP-completeness of 3D-MATCHING, 3-COLOURABILITY
May 5: Randomized algorithms.
May 7: Randomized complexity classes.
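A concrete example of a one-sided-error (coRP-flavoured) algorithm is Freivalds' matrix-product check: to test whether AB = C, multiply both sides by a random 0/1 vector in O(n^2) time instead of computing AB in O(n^3); a wrong C survives one trial with probability at most 1/2. A stdlib-only Python sketch (the function and parameter names are my own):

```python
import random

def freivalds(A, B, C, trials=20, rng=None):
    """Probabilistically check whether A @ B == C for n x n integer
    matrices given as lists of lists.  Each trial costs O(n^2); if
    AB != C, a single trial detects the mismatch with probability >= 1/2,
    so a `True` answer after `trials` trials is wrong with probability
    at most 2**-trials.  A `False` answer is always correct."""
    rng = rng or random.Random()
    n = len(A)
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    for _ in range(trials):
        r = [rng.randint(0, 1) for _ in range(n)]
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False        # certainly AB != C
    return True                 # probably AB == C
```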
May 12: Parallel classes.
May 14: Presentations. Metric Structures and Probabilistic Computation, IP=PSPACE, A class of O(n^2) Geometry Problems
May 19: Presentations. Hamilton Paths in Grid Graphs, Storage Modification Machines, Word Problems Requiring Exponential Time.
[AB] Sanjeev Arora and Boaz Barak.
Complexity Theory: A Modern Approach.
To be published by Cambridge University Press in 2009. In the meantime,
a draft of the book is available online.
[BDG] José Luis Balcázar, Josep Diaz, Joaquim Gabarró.
Structural Complexity I, 2nd ed.
Nicely written, accessible introduction.
[BC] Daniel Pierre Bovet and Pierluigi Crescenzi.
Introduction to the Theory of Complexity.
Prentice Hall, 1994.
[DK] Ding-Zhu Du and Ker-I Ko.
The Theory of Computational Complexity.
[Gol] Oded Goldreich.
Computational Complexity: A Conceptual Perspective.
Cambridge University Press, 2008.
Preliminary versions of this book are available from the author's web site.
[HS] Steven Homer and Alan L. Selman.
Computability and Complexity Theory.
[HU79] John E. Hopcroft and Jeffrey D. Ullman.
Introduction to Automata Theory, Languages and Computation, 1st ed.
[Imm] Neil Immerman.
Emphasizes connections to formal logic.
[Pap] Christos Papadimitriou.
[Sav] John E. Savage.
Models of Computation: Exploring the Power of Computing.
Part III of this book discusses complexity theory.
[Sip] Michael Sipser.
Introduction to the Theory of Computation, 2nd edition.
Part III of this book discusses complexity theory.
[CR73] S.A. Cook and R.A. Reckhow. Time bounded random access machines, J. of Computer and System Sciences, volume 7, pages 354-375, 1973.
[HS65] J. Hartmanis and R.E. Stearns. On the computational complexity of algorithms, Trans. of the AMS, volume 117, pages 285-306, 1965.
Resources for Special Topics
[ACG] G. Ausiello, P. Crescenzi, G. Gambosi, V. Kann, A. Marchetti-Spaccamela and M. Protasi.
Complexity and Approximation: Combinatorial Optimization Problems and Their Approximability Properties.
Associated with this book is the online Compendium of NP Optimization Problems, which is like a version of Garey and Johnson's appendix (with more emphasis on approximations).
[GJ] Michael R. Garey and David S. Johnson.
Computers and Intractability.
W. H. Freeman, 1979.
The bible of NP-completeness.
Johnson has also published a sequence of columns (available here) that form a kind of sequel to the book.
[GHR] Raymond Greenlaw, James Hoover, Walter L. Ruzzo.
Limits to Parallel Computation: P-completeness Theory.
Oxford University Press, 1995.
(See their web site.)
[SP] Uwe Schöning and Randall Pruim.
Gems of Theoretical Computer Science.
Each chapter presents a beautiful theorem from some field of computer science; several are from complexity theory.
More advanced complexity resources
[Aar] Scott Aaronson (zookeeper).
The Complexity Zoo.
This wiki has information on hundreds of complexity classes.
includes a huge number of papers on complexity theory.
[HO] Lane A. Hemaspaandra and Mitsunori Ogihara.
The Complexity Theory Companion.
Arranges theorems according to the techniques used to prove them.
[RW] Steven Rudich and Avi Wigderson, eds.
Computational Complexity Theory.
AMS and Institute for Advanced Study, 2004.
Lecture notes, starting from some basics, but moving quickly to more advanced topics.
[Zim] Marius Zimand.
Computational Complexity: A Quantitative Perspective.
Focussed on questions of what fraction of problems in a class have certain properties.
Updated May 15, 2009