# Convex Relaxation of #P-Hard Problems & the Kadison-Singer Conjecture

### Instructor

Professor Leonid Gurvits

### Description

The Kadison-Singer Problem was posed by two famous mathematical physicists in 1959, in terms of operator theory on infinite-dimensional Hilbert spaces, and is known to be equivalent to many problems ranging from the foundations of quantum physics to signal processing and computer science. It was eventually distilled by Nic Weaver into a transparent finite-dimensional **vector discrepancy** conjecture. Weaver's conjecture was very recently (June 2013) proved by three theoretical computer scientists [1].
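For concreteness, the vector discrepancy statement can be phrased roughly as follows (this is the quantitative Weaver-type formulation established in [1], stated here informally for the reader; see the paper for the precise constants):

```latex
\[
\sum_{i=1}^{m} v_i v_i^* = I, \qquad \|v_i\|^2 \le \varepsilon \ \ (1 \le i \le m)
\;\Longrightarrow\;
\exists\, S_1 \sqcup S_2 = \{1,\dots,m\}:\quad
\Big\| \sum_{i \in S_j} v_i v_i^* \Big\| \le \frac{\bigl(1+\sqrt{2\varepsilon}\bigr)^2}{2},
\quad j = 1,2.
\]
```

That is, any collection of short vectors forming a decomposition of the identity can be split into two halves, each of which is close to half the identity in operator norm.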

Not only did they solve an important long-standing problem, but their proof can be fully understood by reasonably educated humans.

### Course Objectives

The ultimate goal of the course is to present a fairly complete proof from [1], with some shortcuts. The proof is magically woven from several small, rather elementary pieces. Yet there is an underlying, unifying concept behind the magic: **the hyperbolicity (aka (real) stability) of multivariate polynomials**; and two notable predecessors (and inspirers) of the proof: the (relatively old) Heilmann-Lieb theory of matching polynomials (aka the **Monomer-Dimer problem in Statistical Physics**) and the (quite recent) approach, developed by the instructor, to lower bounds via induction by partial differentiation of stable polynomials.
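Since real stability is the unifying concept of the course, it may help to recall the standard definition here (it is not spelled out in the announcement itself):

```latex
\[
p \in \mathbb{R}[z_1,\dots,z_n] \text{ is \emph{real stable} if }
p(z_1,\dots,z_n) \neq 0 \text{ whenever } \operatorname{Im}(z_i) > 0 \text{ for all } i.
\]
```

For univariate polynomials this reduces to having only real roots; the matching polynomials of Heilmann-Lieb are the classical example.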

### Course Content

How did so many highly skilled mathematicians and physicists fail for 50+ years to solve a problem that was finally nailed down by three (particular) computer scientists?

We will try throughout the course to answer this puzzle, illuminating the multidisciplinary nature of **Modern (Theoretical) Computer Science**, which combines the mundane and the sacred; practical, yet not-always-kosher, heuristics and cool, yet rigorous, mathematics; the discrete and the continuous; the old and the new...

The course will start with **the permanent** and will gently proceed to the ultimate goal via **the mixed discriminant, the mixed volume, the (generalized) matching polynomials, Ramanujan graphs**, some hidden convexities and concavities, convex relaxations of #P-hard problems and algorithms to compute them, and many useful inequalities, including Newton's and the entropic one used in the Baum-Welch algorithm for Hidden Markov Models.
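To fix ideas about the course's starting object: computing the permanent exactly is the canonical #P-hard problem, yet the best exact algorithm, Ryser's inclusion-exclusion formula, runs in roughly O(2^n n^2) time rather than the naive O(n! n). A minimal sketch (the function name and structure are mine, purely for illustration; the formula itself is classical):

```python
from itertools import combinations

def permanent(A):
    """Permanent of a square matrix A via Ryser's formula:
    perm(A) = (-1)^n * sum over column subsets S of
              (-1)^{|S|} * prod_i (sum_{j in S} A[i][j]).
    Exponential time, but far better than summing over all n! permutations."""
    n = len(A)
    total = 0
    for k in range(n + 1):
        for S in combinations(range(n), k):  # inclusion-exclusion over column subsets
            prod = 1
            for i in range(n):
                prod *= sum(A[i][j] for j in S)
            total += (-1) ** k * prod
    return (-1) ** n * total

# For example, perm([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10,
# and the permanent of the all-ones n x n matrix is n!.
```

The convex relaxations discussed in the course (e.g., via hyperbolic polynomials) aim to *approximate* this quantity efficiently, which is exactly why its hardness matters.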

After the completion of the main proof, a few new conjectures and applications will be presented, including new properties of Quantum Linear Optical Distributions.

The course should be of interest to computer scientists, mathematicians, physicists, control theoreticians, poets... to name a few.

It is probably the first ever semester-long course about this cutting-edge research, presented by one of the main active players of the game. A few guest lecturers will be invited as well, possibly including Adam Marcus (Yale), Elliott Lieb (Princeton) and Jim Renegar (Cornell). A solid grasp of standard university-level undergraduate mathematics is required.

### References

[1] Adam Marcus, Daniel A. Spielman and Nikhil Srivastava; Interlacing Families II: Mixed Characteristic Polynomials and the Kadison-Singer Problem, 2013, available at http://arxiv.org/pdf/1306.3969v3.pdf