Looks like the Kadison-Singer conjecture has finally been proven.  This conjecture has a remarkably large number of equivalent formulations across operator algebras, high-dimensional geometry, and linear algebra; one formulation is the "paving conjecture": for any n x n matrix of operator norm 1 with zeroes on the diagonal, the indices {1,...,n} can be partitioned into O(1) cells, such that the principal submatrix associated to each cell has operator norm less than any specified constant, say 1/2 (with the number of cells depending only on that constant).  (Bourgain and Tzafriri showed that at least one large cell of this type exists, which is necessary but not sufficient for the conjecture.  The equivalence of the paving conjecture with the original formulation of the Kadison-Singer conjecture was established somewhat earlier by Anderson.)
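For concreteness, here is a small numerical sketch of the objects appearing in the paving conjecture (the matrix and the two-cell partition below are arbitrary illustrative choices of mine, not anything produced by the theorem):

```python
import numpy as np

# Illustrative sketch only: build one random self-adjoint matrix with zero
# diagonal and operator norm 1, then inspect the operator norms of the
# principal submatrices cut out by an (arbitrary) partition into two cells.
rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)
A /= np.linalg.norm(A, 2)          # normalize to operator norm 1

cells = [np.arange(0, n // 2), np.arange(n // 2, n)]
norms = [np.linalg.norm(A[np.ix_(c, c)], 2) for c in cells]
print(norms)  # the conjecture asserts a partition exists making these < 1/2
```

For a single fixed partition like this one there is of course no guarantee that the norms drop below 1/2; the content of the conjecture is that some partition into boundedly many cells always works.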

It turns out that the proof uses some random matrix theory, in that it is based on understanding the operator norm of a sum of independent random rank one matrices, although the heart of the argument seems to come from the theory of interlacing polynomials rather than from random matrix theory.  (I had briefly looked at this conjecture some time ago, when it was suggested to me that random matrix theory might be useful to solve this problem, but it quickly became clear that while random matrix theory was somewhat relevant, it was not going to be the primary tool to resolve matters.)
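As a toy numerical illustration of the random-matrix ingredient (my own sketch under Gaussian assumptions, not taken from the paper), one can observe that a sum of many independent random rank-one matrices v_i v_i^T concentrates around its expectation in operator norm:

```python
import numpy as np

# Toy illustration: rows v_i are independent Gaussian vectors with
# E[v_i v_i^T] = I/d, so the sum S = sum_i v_i v_i^T has expectation (m/d) I.
# For m >> d the relative deviation in operator norm is small
# (roughly 2*sqrt(d/m) by standard random matrix heuristics).
rng = np.random.default_rng(1)
d, m = 20, 2000
V = rng.standard_normal((m, d)) / np.sqrt(d)   # rows v_i
S = V.T @ V                                    # sum of rank-one matrices
expected = (m / d) * np.eye(d)
deviation = np.linalg.norm(S - expected, 2) / np.linalg.norm(expected, 2)
print(deviation)
```

The hard part in the Kadison-Singer setting is that one needs such control not merely with high probability but for at least one explicit choice of signs/partition, which is where the interlacing-polynomial machinery enters.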

[Amusingly, it looks like Marcus, Spielman and Srivastava actually do the one thing that experts in random matrix theory counsel NOT to do, which is to try to control eigenvalues by first controlling the characteristic polynomial.  Usually the characteristic polynomial is so difficult to understand that we retreat to other spectral statistics (traces, moments, resolvents, etc.) in order to avoid it, but apparently in this setting the properties of this polynomial are in fact key to solving the whole problem.]
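For a hint of the interlacing phenomenon that makes the characteristic polynomial tractable here, one can check numerically the classical Cauchy interlacing theorem (a much simpler statement than the "interlacing families" of Marcus, Spielman and Srivastava, but in the same spirit):

```python
import numpy as np

# Cauchy interlacing: the eigenvalues of an (n-1) x (n-1) principal
# submatrix of a symmetric matrix interlace those of the full matrix,
# i.e. lambda_k <= nu_k <= lambda_{k+1} for each k.
rng = np.random.default_rng(2)
n = 6
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # symmetric, so eigenvalues are real
mu = np.linalg.eigvalsh(A)             # sorted eigenvalues of A
nu = np.linalg.eigvalsh(A[:-1, :-1])   # eigenvalues of principal submatrix
print(all(mu[k] <= nu[k] <= mu[k + 1] for k in range(n - 1)))  # True
```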