6.S084/18.S096 Linear Algebra and Optimization
Fall 2020
This is the pilot offering of a new course in linear algebra and optimization. We will not assume any prior knowledge of linear algebra and will start from the basics, including vectors, matrices, eigenvalues, singular values and least squares. We will somewhat downplay solving examples by hand and will instead emphasize conceptual, geometric and computational aspects.
Building on insights from linear algebra, we will cover the basics in optimization including convex optimization, linear/quadratic programming, gradient descent and regularization. We will explore a variety of applications in science and engineering where the tools we have developed give powerful ways to learn from data.
Relation to 18.06: This course will count towards the linear algebra requirement for Math majors and for EECS MEng students! However, you should not take this course if you have already taken 18.06; neither Math nor EECS will give you credit for this course in that case.
Announcement: There will be no recitation the first week of classes. The first recitation will be on Tuesday, September 8th.
Course Information
- Instructors: Ankur Moitra and Pablo Parrilo
- Lectures: Monday, Wednesday and Friday 2-3
- Canvas Site: All assignments, Zoom links for lectures, and lecture recordings will be posted on the course's Canvas site. We will still post all lecture notes in PDF format both here and on Canvas, to make at least some of the materials available outside MIT.
- Prerequisites: 18.02
- TAs: Kaarel Haenni, Rene Reyes, Yi Wang
- Recitations: Tuesdays and Thursdays, 8:30-9:30, 11-12, 12-1, 4-5 or 7-8
- Assessment: Students will be expected to solve biweekly problem sets (40%), attend recitation and complete miniprojects (10%), take two take-home exams (15% each), and take a final exam (20%).
Syllabus
Here is a tentative syllabus for the course. We will add links to instructor notes as we go along.
Part I: Working with Vectors and Matrices
- Lecture 1: A Panoramic View of Linear Algebra
- Lecture 2: The Geometry of Linear Equations
- Lecture 3: Gaussian Elimination and Applications to Circuit Analysis
- Lecture 4: Multiplying Matrices and Applications to Counting Walks
- Lecture 5: Visualizations: Projections, Reflections, Rotations and Permutations
- Lecture 6: Vector Spaces, Linear Combinations and Column/Null Spaces
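To give a flavor of Part I, here is a minimal NumPy sketch of the walk-counting idea from Lecture 4: powers of a graph's adjacency matrix count walks. The triangle graph below is an illustrative example, not one from the course.

```python
import numpy as np

# If A is the adjacency matrix of a graph, (A^k)[i, j] counts the
# walks of length k from vertex i to vertex j.
# Adjacency matrix of a triangle graph on vertices 0, 1, 2:
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])

A2 = np.linalg.matrix_power(A, 2)
# A2[0, 0] == 2: two walks of length 2 from vertex 0 back to itself
# (0 -> 1 -> 0 and 0 -> 2 -> 0).
```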
Part II: Geometric Foundations
- Lecture 7: The Rank and its Equivalent Formulations
- Lecture 8: Linear Independence, Dimension and Bases
- Lecture 9: Orthogonality and Gram-Schmidt
- Lecture 10: The Determinant and its Properties
- Lecture 11: The Matrix Inverse, Existence and Projections
- Lecture 12: Least Squares and Regularization
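As a taste of where Part II ends up, the least squares and regularization ideas of Lecture 12 fit in a few lines of NumPy. The synthetic data and the regularization weight below are illustrative choices, not course-specified values.

```python
import numpy as np

# Fit y ~ a*x + b by least squares: minimize ||A c - y||^2 over c = (a, b).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)   # noisy line, true (a, b) = (2, 1)

A = np.column_stack([x, np.ones_like(x)])           # design matrix
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b = coeffs

# Ridge regularization: minimize ||A c - y||^2 + lam * ||c||^2,
# solved in closed form via (A^T A + lam I) c = A^T y.
lam = 0.1
c_ridge = np.linalg.solve(A.T @ A + lam * np.eye(2), A.T @ y)
```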
Part III: The Singular Value Decomposition and Applications
- Lecture 13: The Singular Value Decomposition
- Lecture 14: The Condition Number and Stability
- Lecture 15: Principal Component Analysis and Applications to Genetics
- Lecture 16: Word Embeddings and Exploring Biases in Data
- Lecture 17: Eigenvalues and Eigenvectors
- Lecture 18: The Eigendecomposition and Algebraic vs. Geometric Multiplicity
- Lecture 19: The Power Method
- Lecture 20: Markov Matrices and Applications to PageRank
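A sketch of the Part III idea behind Lectures 19 and 20: repeated matrix-vector multiplication (the power method) converges to the dominant eigenvector. The matrix and iteration count here are illustrative assumptions.

```python
import numpy as np

def power_method(M, iters=200, seed=0):
    """Estimate the dominant eigenvector of M by repeated multiplication."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(M.shape[0])   # random start (almost surely not
                                          # orthogonal to the top eigenvector)
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)            # renormalize to avoid overflow
    return v

# A symmetric matrix with eigenvalues 3 and 1 (dominant eigenvector [1, 1]).
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = power_method(M)
eigval = v @ M @ v   # Rayleigh quotient estimates the dominant eigenvalue
```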
Part IV: Quadratic Programming and Applications
- Lecture 21: Linear and Quadratic Programming
- Lecture 22: Support Vector Machines and the Kernel Trick
- Lecture 23: The Perceptron Algorithm
- Lecture 24: Robust Optimization for SVMs
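The perceptron algorithm of Lecture 23 is short enough to sketch in full; on linearly separable data it is guaranteed to terminate. The four labeled points below are an illustrative dataset, not from the course.

```python
import numpy as np

# Perceptron: find w with sign(w . x_i) = y_i for linearly separable data.
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])

w = np.zeros(2)
converged = False
while not converged:
    converged = True
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:   # misclassified (or on the boundary): update
            w = w + yi * xi
            converged = False
```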
Part V: Convex Optimization and Gradient Descent
- Lecture 25: Convex Functions and Convex Domains
- Lecture 26: Gradient Descent and Examples
- Lecture 27: Analysis of Gradient Descent
- Lecture 28: Acceleration (Special Case of Quadratic Minimization)
- Lecture 29: Stochastic Gradient Descent for Nonconvex Functions
- Lecture 30: General Convex Optimization
- Lecture 31: Convex Surrogates and the Hinge Loss
- Lecture 32: Zero Sum Games and the Minimax Theorem
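To preview the gradient descent material of Lectures 26-27, here is a minimal sketch on a convex quadratic, where the minimizer can be checked against a direct linear solve. The matrix, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * x^T Q x - b^T x, with gradient Q x - b.
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

x = np.zeros(2)
step = 0.2                   # must be below 2 / lambda_max(Q) to converge
for _ in range(500):
    x = x - step * (Q @ x - b)

x_star = np.linalg.solve(Q, b)   # exact minimizer, for comparison
```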
Instructor Notes
- Lecture 1: A Panoramic View of Linear Algebra
Links
Here are links to other courses with overlapping content: