Randomized Algorithms for Matrices

Professor Liang Zhao

Course Rationale

Matrices are a popular way to model data, such as term-document data, social network data, machine learning kernels, and so on. As the size of real-world datasets grows rapidly, classical deterministic algorithms for processing them are often no longer feasible. A common theme is the use of efficient randomized methods that return approximate results with high probability. This course gives an introduction to modern algorithmic techniques for tackling large-scale matrix computations efficiently.

Course Description

This course will cover the theory and practice of randomized algorithms for large-scale matrix problems arising in modern massive data analysis.

Prerequisites

General mathematical sophistication and a solid understanding of algorithms, linear algebra, and probability theory at the advanced undergraduate level.

Topic List

  • Subspace embedding
  • Approximating matrix multiplication
  • Scalar and matrix concentration
  • Random projections and the Johnson-Lindenstrauss lemma
  • Least squares regression
  • Least absolute deviations regression
  • Additive-error low rank approximation
  • Relative-error low rank approximation
  • Distributed low rank approximation
  • CUR approximation
  • Tensor decomposition
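
To give a flavor of the sketch-and-solve paradigm behind several of these topics, here is a minimal NumPy sketch of randomized least squares regression. The Gaussian sketching matrix, problem sizes, and random seed are illustrative choices, not part of the course materials; any subspace embedding could play the same role.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem: minimize ||Ax - b||_2 with n >> d.
n, d = 10_000, 20
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Gaussian sketch S of size m x n with d < m << n; scaled so that
# S preserves the column span of [A, b] with high probability
# (an oblivious subspace embedding).
m = 500
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Solve the much smaller sketched problem: min ||SAx - Sb||_2.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
print(f"relative error of sketched solution: {rel_err:.3e}")
```

The sketched problem has m = 500 rows instead of n = 10,000, yet its solution is close to the exact one with high probability over the draw of S; the course develops the embedding dimension m needed for a given accuracy guarantee.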

Learning Objectives

  • Learn the mathematical theory underlying randomized matrix algorithms.
  • Understand the numerical and computational issues that arise when implementing these algorithms in different environments.
  • Gain experience applying randomized algorithms to large-scale data applications.

Course Requirements

Problem sets will be assigned most weeks. Each student will present a critical review summarizing the problems and solutions given in a selected research paper. Grading policy: homework 50%, final project 50%.