- Instructor: Ronitt Rubinfeld
- Time: MW 1:00-2:30
- Place: 36-155

- (02/08) Lecture 1: Introduction. The probabilistic method (examples: hypergraph coloring, sum-free subsets). Begin the Lovász Local Lemma. [pdf]
- (02/13) Lecture 2: The Lovász Local Lemma. Example: hypergraph coloring. Begin the Moser-Tardos algorithm for finding solutions guaranteed by the LLL. [pdf] [latex]
- (02/15) Lecture 3: Moser-Tardos Algorithm. [pdf] [latex]
- (02/21) Lecture 4: Randomized complexity classes. Derandomization via enumeration. Pairwise independence and derandomization (example: max-cut). [pdf] [latex]
- (02/22) Lecture 5: Pairwise independent hash functions. A construction. 2-point sampling. Begin interactive proofs. [pdf] [latex]
- (02/27) Lecture 6: Interactive proofs. The class IP. Graph nonisomorphism is in IP. IP models with public versus private coins. A protocol for lower bounding the size of a set (using pairwise independent hashing). [pdf]
- (02/29) Lecture 7: Derandomizing via the method of conditional expectations. Intro to Markov chains and random walks on graphs. [pdf] [tex]
- (03/05) Lecture 8: Stationary distributions. Cover times. Undirected st-connectivity is in RL. Linear algebra and mixing times. [pdf] [tex]
- (03/07) Lecture 9: Saving randomness via random walks. Self-correcting linear functions. [pdf] [latex]
- (03/12) Lecture 10: Begin linearity testing. Fourier basics. [pdf] [latex]
- (03/14) Lecture 11: More Fourier basics: Plancherel's and Parseval's theorems. Finish proof of the linearity test. Saving randomness in linearity testing. (draft) [pdf]
- (03/19) Lecture 12: Learning under the uniform distribution. Model. Brute force algorithm. Learning conjunctions. Estimating one Fourier coefficient. [pdf] [latex]
- (03/21) Lecture 13: Fourier representations for basic functions. Learning via the Fourier representation -- the low degree algorithm. [pdf]
- (04/02) Lecture 14: Applications of the low degree algorithm. Learning AC0 functions. Learning linear threshold functions. [pdf] [latex]
- (04/04) Lecture 15: Noise sensitivity and Fourier concentration. Applications to learning linear threshold functions. [pdf] [latex]
- (04/09) Lecture 16: Learning parity with noise: Goldreich-Levin algorithm (motivation). (draft) [pdf]
- (04/11) Lecture 17: Learning parity with noise: Goldreich-Levin algorithm (finish). (draft) [pdf]
- (04/18) Lecture 18: Applications of learning parity with noise to learning problems. Begin weak learning of monotone functions. [pdf] (not yet posted)
- (04/23) Lecture 19: Finish weak learning of monotone functions. Begin strong learners from weak learners: Boosting algorithms. [pdf] [latex]
- (04/25) Lecture 20: More on boosting algorithms. [pdf] [latex]
- (04/30) Lecture 21: More boosting. [pdf] [latex]
- (05/02) Lecture 22: Finish boosting. Boosting and complexity theory: amplifying hardness. Yao's XOR lemma. [pdf] [tex]
- (05/07) Lecture 23: Pseudorandomness. Computational indistinguishability. Next bit unpredictability.
- (05/09) Lecture 24: Pseudorandomness vs. next bit unpredictability. One way functions. (draft) [pdf]
- (05/14) Lecture 25: One way permutations and pseudo-random generators. Hardcore bits. (draft) [pdf]
- (05/16) Lecture 26: Hardness vs. randomness. The Nisan-Wigderson pseudo-random generator. [pdf] [latex]
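
As a companion to the pairwise-independence material in Lectures 4-6, here is a minimal sketch (not from the course notes; the prime p and the points x, y below are illustrative choices) of the standard construction h_{a,b}(x) = ax + b mod p, checked exhaustively for a small prime:

```python
from collections import Counter

p = 13      # a small prime; the family is h_{a,b}(x) = (a*x + b) mod p
x, y = 3, 7 # any two distinct points in Z_p

# Enumerate every function in the family and record the output pair (h(x), h(y)).
counts = Counter(((a * x + b) % p, (a * y + b) % p)
                 for a in range(p) for b in range(p))

# Pairwise independence: for x != y, the map (a, b) -> (ax+b, ay+b) is a
# bijection on Z_p^2, so every pair (u, v) is hit by exactly one (a, b).
assert len(counts) == p * p
assert all(c == 1 for c in counts.values())
print("pairwise independent: each output pair hit exactly once")
```

Only two random field elements (a, b) generate the whole function, which is what 2-point sampling and the set-size lower bound protocol exploit.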

- Homework 1. Posted February 23, 5pm. Last update February 27, 12:15pm. Due March 14, 2012.
- Homework 2. Posted March 18, 1:45pm. Last update March 18, 1:45pm. Due April 4, 2012.
- Homework 3. Posted April 4, 4:45pm. Last update April 4, 4:45pm. Due Monday, April 23, 2012. Hint for problem 2.
- Solutions

- References on the Markov and Chebyshev inequalities and other tail bounds:
- Avrim Blum's handout on tail inequalities.
- Avrim Blum's handout on probabilistic inequalities.
- Jin-Yi Cai's book, which derives several useful probabilistic inequalities (see pages 63-67).

- Salil Vadhan's Pseudorandomness text.

- Information for scribes:
- Here is a sample file and the preamble.tex file that it uses. Please get a first draft to me within two days of the lecture. When using the definition environment, make sure to emphasize the term being defined.
- Here are some tips.
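
For scribes, a hypothetical snippet illustrating "emphasize the term being defined" (this assumes preamble.tex declares an amsthm-style definition environment; adapt to whatever the sample file actually uses):

```latex
% Assumes preamble.tex contains something like \newtheorem{definition}{Definition}.
\begin{definition}
A family $\mathcal{H}$ of functions $h \colon D \to R$ is
\emph{pairwise independent} if for all distinct $x, y \in D$ and all
$u, v \in R$,
\[
  \Pr_{h \in \mathcal{H}}\bigl[h(x) = u \,\wedge\, h(y) = v\bigr]
  = \frac{1}{|R|^2}.
\]
\end{definition}
```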

- Information for graders.