6.842 Randomness and Computation
Brief course description:
The power and sources of randomness in computation.
Connections and applications to computational complexity,
computational learning theory, cryptography and combinatorics.
(1) Basic tools: polynomial zero testing, uniform generation and
approximate counting, random variables with limited independence,
Fourier analysis, testing properties of
boolean functions, the influence of variables, random walks.
(2) Randomness in various computational settings: algorithms,
computational learning theory,
communication complexity, probabilistic proofs, complexity theory.
(3) Generating randomness: pseudorandom generators, derandomization,
graph expansion, extractors.
- Time: MW 1:00-2:30
- Place: 36-155
- (02/08) Lecture 1: Introduction. The probabilistic method
(examples: hypergraph coloring, sum-free subsets).
Begin the Lovasz Local Lemma.
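The sum-free-subset example from Lecture 1 can be sketched in code. The following is an illustrative Python sketch, not course material; the function names are mine. Pick a prime p = 3k + 2, multiply the set by a random t mod p, and keep the elements whose residues land in the middle third {k+1, ..., 2k+1}, which is sum-free mod p; the expected size of the kept subset exceeds |A|/3.

```python
import random

def next_prime_2_mod_3(n):
    """Smallest prime p > n with p = 3k + 2 (i.e. p = 2 mod 3)."""
    p = n + 1
    while not (p % 3 == 2 and all(p % d for d in range(2, int(p**0.5) + 1))):
        p += 1
    return p

def is_sum_free(B):
    """No a, b, c in B (a = b allowed) satisfy a + b = c."""
    s = set(B)
    return all(a + b not in s for a in B for b in B)

def random_sum_free_subset(A, trials=200, rng=random):
    """Probabilistic method: multiply A by a random t mod p and keep the
    elements landing in the middle third {k+1, ..., 2k+1} of Z_p, which is
    sum-free mod p.  Every returned subset is sum-free over the integers;
    the best over a few trials exceeds |A|/3 in expectation."""
    p = next_prime_2_mod_3(2 * max(A))
    k = (p - 2) // 3
    middle = range(k + 1, 2 * k + 2)
    best = []
    for _ in range(trials):
        t = rng.randrange(1, p)
        B = [a for a in A if (t * a) % p in middle]
        if len(B) > len(best):
            best = B
    return best
```

Soundness does not depend on the random choice: any subset the sketch returns is sum-free, and only its size is probabilistic.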
- (02/13) Lecture 2: The Lovasz Local Lemma. Example: hypergraph
coloring. Begin Moser-Tardos Algorithm for finding solutions
guaranteed by LLL.
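The Moser-Tardos algorithm from Lecture 2 is short enough to sketch for k-SAT. This is a hedged Python sketch; the clause encoding and names are my own, not the course's. Sample a uniform assignment, and while some clause is violated, resample just that clause's variables; under the Lovasz Local Lemma condition the expected number of resamplings is linear in the number of clauses.

```python
import random

def moser_tardos(n, clauses, rng=random, max_rounds=100000):
    """Moser-Tardos resampling for k-SAT.
    clauses: list of clauses, each a list of (var, sign) literals;
    sign True means the literal x_var, False means NOT x_var.
    Repeatedly resamples the variables of an arbitrary violated clause."""
    assign = [rng.random() < 0.5 for _ in range(n)]

    def violated(clause):
        # A clause is violated iff every one of its literals is false.
        return all(assign[v] != s for v, s in clause)

    for _ in range(max_rounds):
        bad = next((c for c in clauses if violated(c)), None)
        if bad is None:
            return assign                       # all clauses satisfied
        for v, _ in bad:                        # resample the bad clause
            assign[v] = rng.random() < 0.5
    raise RuntimeError("did not converge; instance may violate the LLL condition")
```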
- (02/15) Lecture 3:
- (02/21) Lecture 4:
Randomized complexity classes. Derandomization via
enumeration. Pairwise independence and derandomization.
- (02/22) Lecture 5:
Pairwise independent hash functions. A construction.
2-point sampling. Begin interactive proofs.
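The hash-family construction from Lecture 5 can be verified exhaustively over a tiny prime field. A Python sketch (names are mine): the family h_{a,b}(x) = ax + b mod p is pairwise independent because, for distinct x and y, the linear system ax + b = u, ay + b = v has exactly one solution (a, b).

```python
from itertools import product

P = 7  # any prime works; kept tiny so the check below is exhaustive

def h(a, b, x):
    """A function from the family H = { x -> a*x + b mod P : a, b in Z_P }."""
    return (a * x + b) % P

def verify_pairwise_independence():
    """Pairwise independence: for every pair of distinct inputs x, y and
    every pair of targets u, v, exactly |H| / P^2 = 1 of the P^2 functions
    sends x -> u and y -> v."""
    for x, y, u, v in product(range(P), repeat=4):
        if x == y:
            continue
        hits = sum(1 for a, b in product(range(P), repeat=2)
                   if h(a, b, x) == u and h(a, b, y) == v)
        if hits != 1:
            return False
    return True
```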
- (02/27) Lecture 6:
Interactive proofs. The class IP. Graph nonisomorphism
is in IP. IP models with public versus private coins.
A protocol for lower bounding the size of a set (using
pairwise independent hashing).
- (02/29) Lecture 7:
Derandomizing via the method of conditional expectations.
Intro to Markov chains and random walks on graphs.
- (03/05) Lecture 8:
Stationary distributions. Cover times.
Undirected st-connectivity is in RL.
Linear algebra and mixing times.
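The stationary-distribution facts from Lecture 8 are easy to check numerically. A hedged Python sketch (the example graph and names are mine): for a connected, non-bipartite undirected graph, power iteration on the random-walk transition matrix converges to pi(v) = deg(v) / 2|E|.

```python
def stationary_by_power_iteration(adj, steps=200):
    """Approximate the stationary distribution of the simple random walk
    on an undirected graph given as an adjacency list.  For a connected,
    non-bipartite graph this converges to pi(v) = deg(v) / 2|E|, at a rate
    governed by the second-largest eigenvalue modulus (mixing time)."""
    n = len(adj)
    pi = [1.0 / n] * n                 # start from the uniform distribution
    for _ in range(steps):
        nxt = [0.0] * n
        for v in range(n):
            share = pi[v] / len(adj[v])   # walk moves to a uniform neighbor
            for w in adj[v]:
                nxt[w] += share
        pi = nxt
    return pi
```

For a triangle 0-1-2 with a pendant vertex 3 attached to 0, the degrees are (3, 2, 2, 1) and 2|E| = 8, so the limit is (3/8, 1/4, 1/4, 1/8).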
- (03/07) Lecture 9:
Saving randomness via random walks.
Self-correcting linear functions.
- (03/12) Lecture 10:
Begin linearity testing. Fourier basics.
- (03/14) Lecture 11:
More Fourier basics: Plancherel's and Parseval's theorems.
Finish proof of linearity test. Saving randomness in linearity testing.
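The BLR linearity test from Lectures 10-11 fits in a few lines. A Python sketch with inputs encoded as n-bit integer masks (the names are mine): pick random x, y and check f(x) XOR f(y) = f(x XOR y). Parities always pass, and a function far from every parity fails each trial with constant probability.

```python
import random

def blr_test(f, n, trials=100, rng=random):
    """BLR linearity test over GF(2)^n, with points encoded as n-bit ints.
    Checks f(x) XOR f(y) == f(x XOR y) for random x, y.  A parity function
    passes every trial; a function epsilon-far from linear fails a single
    trial with probability at least epsilon (as shown in lecture)."""
    for _ in range(trials):
        x = rng.getrandbits(n)
        y = rng.getrandbits(n)
        if f(x) ^ f(y) != f(x ^ y):
            return False
    return True
```

For example, the parity of the bits selected by a mask S, f(x) = popcount(x & S) mod 2, passes; AND of two bits does not.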
- (03/19) Lecture 12:
Learning under the uniform distribution.
Model. Brute force algorithm. Learning conjunctions.
Estimating one Fourier coefficient.
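Estimating a single Fourier coefficient, as in Lecture 12, is just a sampling average. A hedged Python sketch (the names and bitmask convention are mine): for f: {0,1}^n -> {-1, +1}, hat{f}(S) = E_x[f(x) * chi_S(x)], estimated to additive error O(1/sqrt(samples)) with high probability by Chernoff bounds.

```python
import random

def chi(S, x):
    """Character chi_S(x) = (-1)^(parity of x's bits inside the mask S)."""
    return -1 if bin(x & S).count("1") % 2 else 1

def estimate_fourier_coeff(f, n, S, samples=20000, rng=random):
    """Estimate hat{f}(S) = E_x[f(x) * chi_S(x)] over uniform x in {0,1}^n
    by averaging random samples; f must take values in {-1, +1}."""
    total = sum(f(x) * chi(S, x)
                for x in (rng.getrandbits(n) for _ in range(samples)))
    return total / samples
```

As a sanity check, for f = chi_T the estimate of hat{f}(T) is 1 and every other coefficient concentrates near 0.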
- (03/21) Lecture 13:
Fourier representations for basic functions.
Learning via the Fourier representation -- the low degree algorithm.
- (04/02) Lecture 14:
Applications of the low degree algorithm. Learning AC0 functions.
Learning linear threshold functions.
- (04/04) Lecture 15:
Noise sensitivity and Fourier concentration. Applications
to learning linear threshold functions.
- (04/9) Lecture 16:
Learning parity with noise: Goldreich-Levin algorithm (motivation).
- (04/11) Lecture 17:
Learning parity with noise: Goldreich-Levin algorithm (finish).
- (04/18) Lecture 18:
Applications of learning parity with noise to learning theory.
Begin weak learning of monotone functions.
- (04/23) Lecture 19:
Finish weak learning of monotone functions.
Begin strong learners from weak learners: Boosting algorithms.
- (04/25) Lecture 20:
More on boosting algorithms.
- (04/30) Lecture 21:
- (05/02) Lecture 22:
Finish boosting. Boosting and complexity theory: amplifying
hardness. Yao's XOR lemma.
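Boosting, as covered in Lectures 19-22, can be sketched with AdaBoost over a pool of weak hypotheses. This is an illustrative Python sketch, not the course's presentation; the names are mine. Each round reweights the examples toward the ones the current weak hypothesis gets wrong, and a few rounds can combine threshold stumps into a classifier no single stump matches.

```python
import math

def adaboost(points, labels, stumps, rounds=20):
    """AdaBoost sketch.  points/labels: training examples with labels in
    {-1, +1}; stumps: weak hypotheses h(x) -> -1/+1.  Maintains a
    distribution over examples, picks the stump with smallest weighted
    error each round, and reweights multiplicatively."""
    m = len(points)
    w = [1.0 / m] * m
    ensemble = []                                  # list of (alpha, h) pairs
    for _ in range(rounds):
        errs = [sum(wi for wi, x, y in zip(w, points, labels) if h(x) != y)
                for h in stumps]
        e, h = min(zip(errs, stumps), key=lambda t: t[0])
        if e == 0:
            ensemble.append((1.0, h))              # perfect weak hypothesis
            break
        if e >= 0.5:
            break                                  # no weak learner left
        alpha = 0.5 * math.log((1 - e) / e)
        ensemble.append((alpha, h))
        # Upweight mistakes, downweight correct answers, then renormalize.
        w = [wi * math.exp(-alpha * y * h(x))
             for wi, x, y in zip(w, points, labels)]
        z = sum(w)
        w = [wi / z for wi in w]

    def strong(x):
        return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
    return strong
```

On a toy 1-D interval concept (label +1 exactly on [3, 7]) with threshold stumps as the weak hypotheses, three rounds already classify every training point correctly, which no single stump can do.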
- (05/07) Lecture 23: Pseudorandomness. Computational indistinguishability.
Next bit unpredictability.
- (05/09) Lecture 24: Pseudorandomness vs. next bit unpredictability.
One way functions.
- (05/14) Lecture 25: One way permutations and pseudo-random generators.
- (05/16) Lecture 26: Hardness vs. randomness. The Nisan-Wigderson generator.
- Homework 1.
Posted February 23, 5pm. Last update February 27, 12:15pm.
Due March 14, 2012.
- Homework 2.
Posted March 18, 1:45pm. Last update March 18, 1:45pm.
Due April 4, 2012.
- Homework 3.
Posted April 4, 4:45pm. Last update April 4, 4:45pm.
Due Monday, April 23, 2012.
Hint for problem 2.
- References on Markov's and Chebyshev's inequalities and tail bounds:
Salil Vadhan's Pseudorandomness
- Information for scribes:
- Here are a sample file and the preamble.tex file.
- Please get a first draft to me within two days of the lecture.
When using the definition environment, make sure to emphasize the term being defined.