Time and Place: Tuesdays and Thursdays, 2:30-4:00, 4-270 (or virtual, as needed). Lectures will be recorded! If you are enrolled, go here to see lecture 1!
The original theory of computation initiated by Alan Turing (and his contemporaries) studies what is computable in principle, without regard to physical constraints on computing. In this theory, the only distinguishing characteristic of what is "tractable" to compute (what can be solved at all) is the difference between the finite and the infinite: algorithms have finite descriptions, "good" algorithms produce answers after finitely many computation steps, and "bad" algorithms run for infinitely many steps.
Computational complexity theory studies the impact of limited resources on computation, and gives many refined answers to the general question of what is "tractable." For a computational problem we care about solving often, we want to solve it with the minimum necessary resources. What is a resource? Practically anything scarce that a computation could consume: we can measure computation steps/time, memory usage, computation "size" and "depth" (if we consider Boolean logic circuits), energy consumption, communication between processors, number of processors, number of random bits used, number of qubits used, quality of approximate solutions, number of inputs correct(!) ... the list goes on. (Other names for the field could have been "computational resource theory" or "computational measure theory," but "complexity" is darn catchy!) Complexity also studies how we may "trade" one resource for another: if I want to use less memory to solve a problem, how much more time do I need to take?
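To give a flavor of what such trade-offs look like once everything is pinned down, here are two standard containments relating time and space (stated informally; the classes involved appear early in Arora and Barak's book):
$$\mathsf{TIME}(t(n)) \subseteq \mathsf{SPACE}(t(n)), \qquad \mathsf{SPACE}(s(n)) \subseteq \mathsf{TIME}\big(2^{O(s(n))}\big) \quad \text{for } s(n) \ge \log n.$$
A machine running for $t(n)$ steps can touch at most $t(n)$ memory cells, while a machine using $s(n)$ memory cells has at most $2^{O(s(n))}$ distinct configurations and so can be simulated in that much time. In other words, saving space is always possible if you are willing to pay (possibly exponentially) in time; whether such blow-ups are actually necessary is exactly the kind of question complexity theory asks.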
We will study this general problem of limited resources on a completely abstract, mathematical level. It may look strange that it's even possible to mathematically study things like the "amount of time a program takes" when there are gazillions of programming languages, architectures, and operating system issues that could affect running time at any given moment. Nevertheless, we can get a handle on what is efficiently computable at a high level, by starting with Turing's model and carefully defining what a resource means, so that our measures are not model-dependent. This project began in the 1960s, and is now one of the major mathematical programs of the 21st century. Complexity is, as Arora and Barak put it, an "infant science" -- the best kind of science to get into!
The main goal for this course is to develop this mathematical theory and demonstrate its power for understanding efficient computation. Along the way, we will learn whatever math is needed to get the job done -- complexity theory often uses interesting math in unexpected ways.
Expected workload for the course:
Four problem sets, about two weeks apart. Psets will account for 60% of your grade.
(Check out pset partners)
A final project (by yourself, or with $\leq$ 2 others). This will consist of a project proposal (1-2 pages), two progress updates (1-2 pages), a final project paper ($\geq$ 5 pages), and a final presentation in class. It could be a survey of a complexity-related topic that we haven't covered in class, or it could be a new theorem (or new propositions) about such a topic. In total, the project will be 40% of your grade.
Prerequisites:
This is a graduate course, but it is open to anyone. Formally, the prerequisite is 18.404/6.840 (introduction to the theory of computation); this version of the course will not require it. You should probably come with knowledge at the level of either 18.404/6.840 or 6.045/18.400 (automata, computability, complexity), or be ready to pick it up as you go. It might also help to have had 6.046/18.410 (algorithms), but that is not necessary.
Textbook:
Most of the course will cover topics from
Sanjeev Arora and Boaz Barak. Computational Complexity: A Modern Approach. Alternate location. (The two links go to online versions of the book that should be freely accessible to MIT students. Please let me know if you have any trouble with the links.)
Your instructor has his own preferred way of presenting things, so we will also provide lecture notes. Sipser's textbook also covers some of the topics from the first few lectures of the course. Another reference that is awesome for intuition (but short on proofs) is Moore and Mertens' The Nature of Computation. It's like bedtime reading for wee complexity theorists.