6.842 Randomness and Computation (Spring 2020)
- Instructor: Ronitt Rubinfeld
- Teaching Assistant: Amartya Shankha Biswas
- Course email: 6842-sp20 csail mit edu (with at and dots in proper places)
- Time: TTh 11:00-12:30
- Place: 4-149 (NOTE NEW ROOM)
Brief Course description:
We study the power and sources of randomness in computation, concentrating on connections and applications to computational complexity, computational learning theory, cryptography and combinatorics.
Topics include:
(1) Basic tools: probabilistic proofs, Lovasz local lemma, uniform generation and approximate counting, Fourier analysis, influence of variables on functions, random walks, graph expansion, Szemeredi regularity lemma.
(2) Randomness vs. Predictability: pseudorandom generators, weak random sources, derandomization and recycling randomness, computational learning theory, Fourier based learning algorithms, weak vs. strong learning, boosting, average vs. worst case hardness, XOR lemma.
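As a flavor of the "derandomization" topic above (and not taken from the course materials), here is a hedged Python sketch of the classic pairwise-independence trick for max-cut: replace n independent coin flips with bits derived from an O(log n)-bit seed, then enumerate all seeds. The function name and parameters are illustrative.

```python
import itertools

def derandomized_maxcut(n, edges):
    """Deterministic max-cut >= |edges|/2 via pairwise independence (sketch).

    Vertex i gets the bit <label(i), seed> mod 2, where label(i) = i + 1
    is a nonzero k-bit string and the seed has only k = ceil(log2(n+1))
    bits.  For distinct nonzero labels these bits are pairwise
    independent, so a random seed cuts half the edges in expectation;
    enumerating all 2^k = O(n) seeds finds a seed achieving at least that.
    """
    k = n.bit_length()  # enough bits for the nonzero labels 1..n
    best = 0
    for seed in itertools.product([0, 1], repeat=k):
        side = []
        for i in range(n):
            label = i + 1
            bits = [(label >> j) & 1 for j in range(k)]
            side.append(sum(b * s for b, s in zip(bits, seed)) % 2)
        cut = sum(side[u] != side[v] for u, v in edges)
        best = max(best, cut)
    return best  # guaranteed >= len(edges) / 2
```

The point of the sketch is the seed length: enumerating all 2^k seeds is polynomial in n, whereas enumerating all 2^n cuts is not. On a triangle (n = 3, all three edges) this finds the optimal cut of size 2.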
Course Information Handout
Original course information sheet.
Requirements and rubric overridden as follows.
Announcements:
- Problem Set 6 posted, due May 5.
- Problem Set 4 grading signups are now open.
- Problem Set 5 posted, due April 23.
- Problem Set 4 posted, due April 14.
- Problem Set 3 grading signups are now open.
- All recordings of lectures are posted to the course lmod site.
- Zoom meeting information and link for all class activities (lectures and office hours).
- (3/18) Note that pset 3 has a new due date!
- In case you are interested in doing a project, see this.
- Note the new rubric and requirements for the course here.
- (3/12) Today we will have our last in-person meeting of the course. I will attempt a Zoom link, and if it doesn't work, I will at least attempt to record today's lecture. Handwritten lecture notes are already posted. We move to fully online meetings after spring break.
- Scribe Signup
- Grading Signup
Lecture Notes:
- (2/4) Lecture 1: Introduction. The probabilistic method
(examples: hypergraph coloring, sum-free subsets).
[scanned notes] and
[scribe notes]
- (2/6) Lecture 2: The Lovasz Local Lemma. Example: hypergraph coloring revisited. Moser-Tardos Algorithm for finding solutions guaranteed by LLL.
[scanned notes] and
[scribe notes]
- (2/11) Lecture 3: Polynomial identity testing. Uniform generation
of DNF formula.
[scanned notes] and
[scribe notes]
- (2/13) Lecture 4:
Approximate counting and approximate uniform generation.
[scanned notes] and
[scribe notes]
- (2/20) Lecture 5: Finish Approximate counting and approximate
uniform generation. Randomized complexity classes. Derandomization via enumeration. [scanned notes] and
[scribe notes]
- (2/25) Lecture 6:
Pairwise independence and derandomization (example: max-cut). [scanned notes] (note minor updates to notes posted for lecture 5.)
[scribe notes]
- (2/27) Lecture 7:
Pairwise independent hash functions.
2-point sampling. Interactive proofs. The class IP. IP protocols
for lower bounding the size of a set.
[scanned notes] and
[scribe notes]
- (3/3) Lecture 8:
Derandomizing via the method of conditional expectations.
Intro to Markov chains and random walks on graphs.
[scanned notes]
[scribe notes]
- (3/5) Lecture 9: Random Walks, Stationary Distribution, Cover Times.
[scanned notes]
[scribe notes]
- (3/10) Lecture 10: Undirected st-connectivity is in RL.
Linear algebra and mixing times.
[scanned notes] [scribe notes]
- (3/12) Lecture 11: Reducing Randomness via Random Walks on Special Graphs [scanned notes] [scribe notes]
- (3/31) Lecture 12: Undirected S-T Connectivity revisited (deterministic logspace)
[scanned notes (before lecture)]
[notes generated during lecture]
[scribe notes] [latex]
- (4/2) Lecture 13: Linearity Testing, Self-Correcting, Begin Fourier Analysis of Boolean Functions
[scanned notes (before lecture)]
[pdf generated after lecture]
- (4/7) Lecture 14: More Fourier, finish linearity testing.
(see lec 13 "scanned notes before lecture"
for notes up to linearity testing)
[pdf generated during lecture]
- (4/9) Lecture 15:
Begin learning theory. PAC model: motivation, Occam's razor,
learning conjunctions.
[scanned notes (before lecture)]
[pdf generated during lecture]
- (4/14) Lecture 16:
Learning via Fourier representation. Some nice functions and their
Fourier representations. The low degree algorithm.
Fourier concentration.
[scanned notes (before lecture)]
[pdf generated during lecture]
- (4/16) Lecture 17:
Learning via Fourier representation continued.
The low degree algorithm (continued). Fourier concentration.
[scanned notes - start at p. 13 of the pdf file, which says "p. 12" in the upper right-hand corner (before lecture)]
[may reach first pages of these scanned notes (before lecture)]
[pdf generated during lecture]
- (4/21) Lecture 18: Learning large Fourier coefficients (with queries).
[scanned notes]
[pdf generated during lecture]
- (4/23) Lecture 19:
Weakly learning monotone functions.
[scanned notes]
[pdf generated during lecture]
- (4/28) Lecture 20:
Weak vs. strong learning, boosting.
[scanned notes]
[pdf generated during lecture]
- (4/30) Lecture 21:
Amplifying hardness: Yao's XOR lemma.
[scanned notes]
[pdf generated during lecture]
- (5/5) Lecture 22:
Hardness vs. Randomness
[scanned notes]
[pdf generated during lecture]
- (5/7) Lecture 23:
NP has probabilistically checkable proof systems requiring only constant queries
[scanned notes]
[pdf generated during lecture]
- (5/12) Lecture 24:
NP has probabilistically checkable proof systems requiring only constant queries
(cont.)
[scanned notes from lecture 23]
[pdf generated during lecture]
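The polynomial identity testing topic from Lecture 3 can be illustrated with a short Schwartz-Zippel-style sketch. This is not from the course notes; the function name, the modulus, and the trial count are all made up for the illustration.

```python
import random

def identical_polynomials(p, q, num_vars, degree_bound, trials=20):
    """Randomized polynomial identity test (Schwartz-Zippel sketch).

    p and q are black-box polynomials: callables taking a list of
    num_vars integers.  By Schwartz-Zippel, two distinct polynomials of
    total degree <= degree_bound agree at a random point of F_m with
    probability at most degree_bound / m, so evaluating at a few random
    points modulo a large prime m catches any difference with high
    probability.  Identical polynomials always agree.
    """
    m = 2**61 - 1  # large prime, so degree_bound / m is negligible
    for _ in range(trials):
        point = [random.randrange(m) for _ in range(num_vars)]
        if p(point) % m != q(point) % m:
            return False  # witness point: definitely different polynomials
    return True  # agreed at every sample: identical with high probability
```

For example, `(x + y)**2` and `x**2 + 2*x*y + y**2` always test identical, while `(x + y)**2` versus `x**2 + y**2` is rejected with overwhelming probability. Note the one-sided error, matching the randomized-complexity discussion in Lecture 5: "different" answers are always correct, "identical" answers can err, but only with tiny probability.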
Homework:
- Homework 1.
Posted February 4. Last update February 7 11:30am (removed a "not to be turned in" problem).
Due February 20 in class (NOTE NEW DUE DATE).
Hint for problem 1.
Hint for problem 5.
- Homework 2.
Posted February 25, 2020. Last updated Feb 28, 2020 (11:45am).
Due March 5, 2020.
Grading Signup
- Homework 3.
Posted March 9, 2020.
NEW Due April 2, 2020.
- Homework 4.
Posted April 2, 2020.
Due April 14, 2020.
Hint for problem 1.
- Homework 5.
Posted April 15, 2020.
Due April 23, 2020.
- Homework 6.
Posted April 23, 2020.
Due May 5, 2020.
- Solutions
Useful information:
- References on Markov, Chebyshev and tail inequalities:
- Texts:
Information for scribes:
- Here is a sample file and the preamble.tex file that it uses. Please get a first draft to me within four days of the lecture.
When using the definition environment, make sure to emphasize the term being defined.
- Here are some tips.
- Scribe signup
Instructions for graders.