6.842 Randomness and Computation (Spring 2022)
- Instructor: Ronitt Rubinfeld
- Teaching Assistant: Amartya Shankha Biswas
- Course email: 6842sp22 csail mit edu (with at and dots in proper places)
- Time: MW 11:00-12:30
- Place: 5-134
Brief Course description:
We study the power and sources of randomness in computation, concentrating on connections and applications to computational complexity, computational learning theory, cryptography and combinatorics.
Topics include:
(1) Basic tools: probabilistic proofs, Lovasz local lemma, uniform generation and approximate counting, Fourier analysis, influence of variables on functions, random walks, graph expansion, Szemeredi regularity lemma.
(2) Randomness vs. Predictability: pseudorandom generators, weak random sources, derandomization and recycling randomness, computational learning theory, Fourier based learning algorithms, weak vs. strong learning, boosting, average vs. worst case hardness, XOR lemma.
Course information handout.
Tentative pset due dates and project timeline
Project ideas
Announcements:
Lecture Notes: (lecture videos on Canvas only)
- (1/31) Lecture 1: Introduction. The probabilistic method
(examples: hypergraph coloring, dominating sets).
[handwritten notes] and
[scribe notes]
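As a small illustration of the probabilistic method from this lecture (not part of the course materials; the function name is mine): a random 2-coloring makes a k-uniform edge monochromatic with probability 2^(1-k), so a hypergraph with fewer than 2^(k-1) edges has a proper 2-coloring, and resampling finds one quickly.

```python
import random

def random_2_coloring(edges, n, seed=0):
    """Probabilistic method sketch: color each vertex randomly; a k-uniform
    edge is monochromatic with probability 2^(1-k), so with m < 2^(k-1)
    edges some coloring avoids all monochromatic edges. Resample until found."""
    rng = random.Random(seed)
    while True:
        color = [rng.randrange(2) for _ in range(n)]
        if all(len({color[v] for v in e}) == 2 for e in edges):
            return color

# 3-uniform hypergraph with m = 3 < 2^2 edges on 5 vertices
edges = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
coloring = random_2_coloring(edges, 5)
```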
- (2/2) Lecture 2: The Lovasz Local Lemma. Example: hypergraph coloring revisited. An algorithm for finding solutions guaranteed by LLL.
[handwritten notes] and
[scribe notes]
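The algorithmic LLL from this lecture is the Moser-Tardos resampling scheme; a minimal sketch on hypergraph 2-coloring (illustrative only, names my own):

```python
import random

def moser_tardos_2_coloring(edges, n, seed=0):
    """Moser-Tardos sketch: start from a random coloring and, while some
    edge is monochromatic, resample only that edge's vertices. Under the
    LLL condition (low edge dependency) this terminates in expected
    polynomial time; here we just run it on a tiny instance."""
    rng = random.Random(seed)
    color = [rng.randrange(2) for _ in range(n)]

    def bad_edge():
        for e in edges:
            if len({color[v] for v in e}) == 1:
                return e
        return None

    e = bad_edge()
    while e is not None:
        for v in e:  # resample only the violated event's variables
            color[v] = rng.randrange(2)
        e = bad_edge()
    return color

edges = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
coloring = moser_tardos_2_coloring(edges, 5)
```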
- (2/7) Lecture 3: Finish LLL. Intro to Polynomial identity testing.
[handwritten notes] (only up to page 22) and
[scribe notes]
- (2/9) Lecture 4:
Polynomial Identity Testing with applications to "person on the moon"
and bipartite matching.
[handwritten notes] and
[scribe notes]
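The bipartite-matching application can be sketched as follows (my own illustrative code, assuming the Edmonds-matrix formulation from lecture): the symbolic determinant is nonzero iff a perfect matching exists, and Schwartz-Zippel lets us test this by one determinant over a large prime field.

```python
import random

def det_mod_p(a, p):
    """Determinant over F_p via Gaussian elimination (p prime)."""
    a = [row[:] for row in a]
    n = len(a)
    det = 1
    for col in range(n):
        piv = next((r for r in range(col, n) if a[r][col]), None)
        if piv is None:
            return 0
        if piv != col:
            a[col], a[piv] = a[piv], a[col]
            det = -det
        det = det * a[col][col] % p
        inv = pow(a[col][col], p - 2, p)  # Fermat inverse
        for r in range(col + 1, n):
            f = a[r][col] * inv % p
            for c in range(col, n):
                a[r][c] = (a[r][c] - f * a[col][c]) % p
    return det % p

def has_perfect_matching(edges, n, p=(1 << 31) - 1, seed=0):
    """Plug random field values into the Edmonds matrix; a nonzero
    determinant certifies a perfect matching (one-sided error <= n/p)."""
    rng = random.Random(seed)
    a = [[0] * n for _ in range(n)]
    for i, j in edges:
        a[i][j] = rng.randrange(1, p)
    return det_mod_p(a, p) != 0

edges_yes = [(0, 0), (1, 1), (2, 2)]  # has a perfect matching
edges_no = [(0, 0), (1, 0), (2, 0)]   # all edges hit the same right vertex
```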
- (2/14) Lecture 5: Uniform generation of DNF satisfying assignments.
Counting problems.
Approximate counting and approximate uniform generation.
[handwritten notes] and
[scribe notes]
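A compact sketch of the Karp-Luby style estimator behind this lecture's DNF counting (illustrative only; the clause encoding is my own): sample a clause proportionally to its satisfying set, sample a uniform assignment satisfying it, and count the pair iff that clause is the first one the assignment satisfies, so each satisfying assignment is counted exactly once.

```python
import random

def karp_luby_dnf_count(clauses, n, samples=20000, seed=0):
    """Estimate the number of satisfying assignments of a DNF formula.
    Each clause is a dict {var_index: required_bool}. |S_i| = 2^(n-|c_i|);
    the estimate is (sum |S_i|) * Pr[sampled clause is the first satisfied]."""
    rng = random.Random(seed)
    sizes = [2 ** (n - len(c)) for c in clauses]
    total = sum(sizes)
    hits = 0
    for _ in range(samples):
        i = rng.choices(range(len(clauses)), weights=sizes)[0]
        a = {v: rng.randrange(2) == 1 for v in range(n)}
        a.update(clauses[i])  # force the sampled clause to be satisfied
        first = next(j for j, c in enumerate(clauses)
                     if all(a[v] == b for v, b in c.items()))
        if first == i:
            hits += 1
    return total * hits / samples

clauses = [{0: True}, {1: True}]   # x0 OR x1 over n = 2 variables: 3 solutions
est = karp_luby_dnf_count(clauses, 2)
```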
- (2/16) Lecture 6:
Finish approximate counting and approximate uniform generation.
Randomized complexity classes. Derandomization via enumeration.
Pairwise independence and derandomization (example: max-cut).
[handwritten notes] (note minor updates to notes posted for lecture 5.)
[scribe notes]
- (2/22) (Monday schedule on Tuesday) Lecture 7: MaxCut: randomized algorithm, derandomizing via pairwise independence. Generating pairwise independent bits.
[handwritten notes] and
[scribe notes]
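The derandomization in this lecture can be sketched as follows (my own illustrative code): label vertex i by the nonzero bit string of i+1, set its side to the inner product with a k-bit seed, and enumerate all 2^k = O(n) seeds; pairwise independence makes the expected cut m/2, so some seed achieves at least that.

```python
from itertools import product

def derandomized_max_cut(edges, n):
    """Pairwise-independent derandomization of the random m/2 cut:
    side(i) = <bin(i+1), seed> mod 2. Over a random seed these bits are
    pairwise independent, so each edge is cut with probability 1/2; trying
    every seed deterministically finds a cut of size >= m/2."""
    k = max(1, n.bit_length())  # labels 1..n fit in k bits and are nonzero
    best_cut, best_side = -1, None
    for seed in product((0, 1), repeat=k):
        side = []
        for i in range(n):
            label = format(i + 1, f"0{k}b")
            side.append(sum(s * int(b) for s, b in zip(seed, label)) % 2)
        cut = sum(side[u] != side[v] for u, v in edges)
        if cut > best_cut:
            best_cut, best_side = cut, side
    return best_cut, best_side

# triangle: m = 3, so some seed must give a cut of size >= 2
cut, side = derandomized_max_cut([(0, 1), (1, 2), (0, 2)], 3)
```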
- (2/23) Lecture 8: More applications of pairwise independence: reducing error, interactive proofs (IP), graph isomorphism, public coins vs. private coins.
[handwritten notes]
[scribe notes]
- (2/28) Lecture 9: More applications of pairwise independence: interactive proofs - public coins vs. private coins. Derandomization via method of conditional expectations.
[handwritten notes]
[scribe notes]
- (3/2) Lecture 10: Markov Chains and Random Walks: Stationary Distributions, Cover Times
[handwritten notes] [scribe notes]
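A quick simulation of the cover-time notion from this lecture (illustrative only; the helper name is mine). For the n-cycle, the expected cover time is n(n-1)/2, which the empirical average should track.

```python
import random

def cover_time_sample(adj, start=0, seed=0):
    """Run a simple random walk until every vertex has been visited;
    return the number of steps taken (one sample of the cover time)."""
    rng = random.Random(seed)
    v, seen, steps = start, {start}, 0
    while len(seen) < len(adj):
        v = rng.choice(adj[v])  # move to a uniformly random neighbor
        seen.add(v)
        steps += 1
    return steps

# cycle on 6 vertices: expected cover time is n(n-1)/2 = 15
adj = [[(i - 1) % 6, (i + 1) % 6] for i in range(6)]
avg = sum(cover_time_sample(adj, seed=s) for s in range(200)) / 200
```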
- (3/7) Lecture 11: Undirected s-t connectivity. Linear algebra and random walks. Saving random bits via random walks. [handwritten notes] [scribe notes]
- (3/9) Lecture 12: Linear algebra and mixing times. Saving random bits via random walks.
[handwritten notes]
[scribe notes]
- (3/14) Lecture 13: Finish saving random bits via random walks. Linearity testing and self-correcting.
[handwritten notes]
[scribe notes]
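The BLR linearity test from this lecture fits in a few lines (my own sketch, with made-up example functions): over GF(2), accept iff f(x) XOR f(y) = f(x XOR y) for random x, y; functions far from every linear (parity) function are rejected with constant probability per trial.

```python
import random

def blr_test(f, n, trials=200, seed=0):
    """BLR linearity test over GF(2): f maps n-bit inputs (as ints) to
    {0,1}. Accept iff f(x) ^ f(y) == f(x ^ y) on all sampled pairs."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = rng.randrange(1 << n)
        y = rng.randrange(1 << n)
        if f(x) ^ f(y) != f(x ^ y):
            return False
    return True

# parity of a fixed subset of bits is linear; 3-bit majority is not
linear = lambda x: bin(x & 0b1011).count('1') % 2
maj3 = lambda x: 1 if bin(x & 0b111).count('1') >= 2 else 0
```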
- (3/16) Lecture 14: Linearity testing and self-correcting. Basics of Fourier Analysis on Boolean cube.
(see lec 13 "handwritten notes before lecture"
for notes up to linearity testing)
[handwritten notes]
[scribe notes]
- (3/28) Lecture 15: Basics of Fourier Analysis on Boolean cube (some review, some new). Analysis of Linearity test. Learning Boolean functions - a model.
[handwritten notes]
[scribe notes]
- (3/30) Lecture 16: Learning Boolean functions: a model. An example: conjunctions. Occam's razor. Fourier-based learning algorithms.
[handwritten notes]
[scribe notes]
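The conjunction-learning example from this lecture is the classic elimination algorithm; a small sketch (names and sample data are mine): start with all 2n literals and delete any literal falsified by a positive example, leaving a hypothesis consistent with the sample, which Occam's razor converts into a PAC guarantee.

```python
def learn_conjunction(examples, n):
    """Learn a conjunction by elimination. A literal is a pair (i, b)
    meaning the constraint x_i == b; positive examples delete every
    literal they falsify, so the surviving conjunction is consistent."""
    lits = {(i, b) for i in range(n) for b in (0, 1)}
    for x, label in examples:
        if label:
            lits -= {(i, 1 - x[i]) for i in range(n)}
    def h(x):
        return all(x[i] == b for i, b in lits)
    return h, lits

# target concept: x0 == 1 AND x2 == 0 (hypothetical sample)
sample = [((1, 0, 0), 1), ((1, 1, 0), 1), ((0, 0, 0), 0), ((1, 0, 1), 0)]
h, lits = learn_conjunction(sample, 3)
```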
- (4/4) Lecture 17: Fourier-based learning algorithms: learning one Fourier coeff. The low degree algorithm.
[handwritten notes]
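The "learning one Fourier coefficient" step reduces to a sampling estimate (my own sketch, with f taking values in {-1, +1} and the set S encoded as a bitmask): f_hat(S) = E_x[f(x) * chi_S(x)], estimated by averaging over uniform samples.

```python
import random

def estimate_fourier_coeff(f, S, n, samples=4000, seed=0):
    """Estimate f_hat(S) = E_x[ f(x) * chi_S(x) ] by sampling uniform
    x in {0,1}^n (as ints), where chi_S(x) = (-1)^|x AND S|."""
    rng = random.Random(seed)
    acc = 0
    for _ in range(samples):
        x = rng.randrange(1 << n)
        chi = (-1) ** bin(x & S).count('1')
        acc += f(x) * chi
    return acc / samples

# the parity chi_M has f_hat(M) = 1 and every other coefficient 0
M = 0b101
parity = lambda x: (-1) ** bin(x & M).count('1')
```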
- (4/6) Lecture 18: Fourier-based learning algorithms: The low degree algorithm. Fourier concentration. Noise sensitivity.
[handwritten notes]
[scribe notes]
- (4/11) Lecture 19: Fourier-based learning algorithms. Fourier Concentration via Noise Sensitivity. Learning heavy Fourier coeffs (with queries)
[handwritten notes]
[scribe notes]
- (4/13) Lecture 20: Learning heavy Fourier coeffs (with queries) (cont.)
Weak learning of monotone functions.
[handwritten notes]
- (4/20) Lecture 21: Weak learning of monotone functions. Begin: distribution-free weak learning => strong learning.
[handwritten notes]
[scribe notes]
- (4/25) Lecture 22: Distribution-free weak learning => strong learning. "Boosting." Average vs. worst case complexity.
[handwritten notes]
[scribe notes]
- (4/27) Lecture 23: Probabilistically Checkable Proof Systems. From earlier lectures/homeworks: Freivalds' test; self-testing/correcting linear functions. The PCP model. NP ⊆ PCP(n^3, 1) via arithmetization.
[handwritten notes]
[scribe notes]
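Freivalds' test, recalled in this lecture, verifies a matrix product in O(n^2) time per trial (my own sketch): check A(Bx) = Cx for a random 0/1 vector x; a wrong C survives each trial with probability at most 1/2.

```python
import random

def freivalds(A, B, C, trials=20, seed=0):
    """Freivalds' test: accept iff A(Bx) == Cx for every sampled random
    0/1 vector x. Each trial costs three matrix-vector products; an
    incorrect C passes a single trial with probability <= 1/2."""
    rng = random.Random(seed)
    n = len(A)
    def matvec(M, x):
        return [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]
    for _ in range(trials):
        x = [rng.randrange(2) for _ in range(n)]
        if matvec(A, matvec(B, x)) != matvec(C, x):
            return False
    return True

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_good = [[19, 22], [43, 50]]   # the true product A @ B
C_bad = [[19, 22], [43, 51]]    # one wrong entry
```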
- (5/2) Lecture 24: Szemeredi's Regularity Lemma. Testing dense graph properties via SRL: triangle-freeness.
[handwritten notes from lecture 24]
[scribe notes]
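The triangle-freeness tester from this lecture is simple to state (my own sketch): sample random vertex triples and reject on seeing a triangle; the regularity-lemma argument shows that graphs far from triangle-free contain enough triangles for sampling to find one.

```python
import random

def triangle_free_test(adj, n, samples=3000, seed=0):
    """Property-testing sketch: adj is an n x n 0/1 adjacency matrix.
    Sample random triples and reject iff one spans a triangle. Graphs
    epsilon-far from triangle-free have many triangles (via the
    regularity lemma), so a sampled triple hits one with good probability."""
    rng = random.Random(seed)
    for _ in range(samples):
        u, v, w = rng.sample(range(n), 3)
        if adj[u][v] and adj[v][w] and adj[u][w]:
            return False  # found a triangle: reject
    return True

K5 = [[int(i != j) for j in range(5)] for i in range(5)]  # every triple is a triangle
empty = [[0] * 5 for _ in range(5)]
```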
Homework:
Useful information:
- References on Markov, Chebyshev and tail inequalities:
- Texts:
Information for scribes:
Instructions for graders.
Signup for graders.