Approximating the Exponential, the Lanczos Method and an O(m polylog(n))-Time Spectral Algorithm for Balanced Graph Partitioning
ABSTRACT: In the balanced graph partitioning problem, given a graph, the goal is to output a cut of minimum conductance such that each side of the cut contains at least a constant fraction of the vertices. This problem plays a central role in many settings, such as Markov chains, VLSI layout, clustering, computer vision, and social networks; with the explosion of data, fast and high-quality algorithms for it are all the more desirable. Recursive applications of spectral algorithms, which relate the second eigenvector to the standard random walk on the graph and, in turn, to the conductance, have been very popular for this problem. In the worst case, however, these algorithms can take Ω(nm) time, where m is the number of edges and n the number of vertices in the graph.
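To make the classical approach concrete, here is a minimal sketch of one round of spectral partitioning (the baseline the talk improves upon, not the talk's algorithm): compute the second eigenvector of the normalized Laplacian and return the prefix "sweep cut" of least conductance. The function name and the dense-matrix representation are illustrative only.

```python
import numpy as np

def spectral_sweep_cut(adj):
    """Classical spectral sweep: sort vertices by the second eigenvector of
    the normalized Laplacian and return the prefix cut of least conductance.
    adj: dense symmetric 0/1 adjacency matrix (illustrative sketch only)."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    # normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
    L = np.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L)
    # second-smallest eigenvector, scaled back to the random-walk embedding
    x = d_inv_sqrt * vecs[:, 1]
    order = np.argsort(x)
    vol = deg.sum()
    in_S = np.zeros(n, dtype=bool)
    cut_edges, vol_S = 0.0, 0.0
    best_cut, best_phi = None, np.inf
    for v in order[:-1]:
        in_S[v] = True
        vol_S += deg[v]
        # adding v removes its edges into S and adds its edges into V \ S
        cut_edges += deg[v] - 2.0 * adj[v, in_S].sum()
        phi = cut_edges / min(vol_S, vol - vol_S)
        if phi < best_phi:
            best_phi, best_cut = phi, in_S.copy()
    return best_cut, best_phi
```

On a "barbell" of two 5-cliques joined by a single edge, the sweep recovers the balanced split of conductance 1/21.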
The main content of this talk is an algorithm for the balanced graph partitioning problem that runs in time O(m), up to polylog(n) factors, and achieves the approximation guarantee of the Cheeger-inequality-based spectral algorithm mentioned above, thus settling an important open problem.
Technically, a new kind of random walk, the "accelerated heat kernel" (AHK) walk, is first introduced, and it is shown that O(log n) runs of it suffice either to recover a cut with the desired approximation guarantee or to certify that none exists. It then remains to simulate the AHK walk in O(m polylog(n)) time. This is achieved by an O(m polylog(n))-time algorithm that computes the product of a vector with a matrix of the form exp(-A), where A is the sum of the graph Laplacian and a rank-one "accelerator" matrix.
This latter part relies on low-degree uniform approximations to e^{-x} and on how they translate into efficient algorithms for the matrix exponential-vector product via a variant of the Lanczos method from numerical linear algebra.
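As a rough illustration of this numerical-linear-algebra ingredient (a textbook sketch only, not the authors' variant, which additionally exploits the approximation-theoretic bounds and fast Laplacian solvers), the Lanczos method approximates exp(-A)v by projecting A onto a small Krylov subspace and exponentiating the resulting tridiagonal matrix:

```python
import numpy as np

def lanczos_exp_neg(A_mv, v, k):
    """Approximate exp(-A) @ v with k steps of the Lanczos method.

    A_mv : callable returning A @ x for a symmetric matrix A.
    The k-by-k tridiagonal projection T is exponentiated exactly,
    and the answer is ||v|| * Q_k @ exp(-T) @ e_1.
    """
    n = v.shape[0]
    nv = np.linalg.norm(v)
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k)
    Q[:, 0] = v / nv
    for j in range(k):
        w = A_mv(Q[:, j])
        alpha[j] = Q[:, j] @ w
        # full reorthogonalization against all previous Lanczos vectors
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        if j + 1 == k:
            break
        b = np.linalg.norm(w)
        if b < 1e-12:        # invariant subspace found: shrink the basis
            k = j + 1
            break
        beta[j] = b
        Q[:, j + 1] = w / b
    # T = tridiag(beta, alpha, beta); exponentiate via its eigendecomposition
    T = np.diag(alpha[:k]) + np.diag(beta[:k - 1], 1) + np.diag(beta[:k - 1], -1)
    evals, evecs = np.linalg.eigh(T)
    coeffs = evecs @ (np.exp(-evals) * evecs[0, :])  # exp(-T) @ e_1
    return nv * (Q[:, :k] @ coeffs)
```

For a positive semidefinite A (as a Laplacian is), the degree k needed for a fixed accuracy grows only slowly with the spectral range of A, which is what makes uniform polynomial approximations to e^{-x} the right tool here.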
This talk, based on joint work with Lorenzo Orecchia (MIT) and Sushant Sachdeva (Princeton), will highlight both the random walk aspect of this result and the work that goes into designing an O(m polylog(n)) algorithm to compute exp(-A)v, which combines techniques from numerical linear algebra and approximation theory.
BIO: Nisheeth’s research interests span several aspects of algorithms and complexity. He has worked primarily on approximation algorithms, hardness of approximation, and linear-time algorithms for cut/flow problems. In addition, he has become interested in theoretical and computational problems arising from computer vision and viral evolution.
Nisheeth obtained his Ph.D. from Georgia Tech and studied computer science at IIT Bombay as an undergraduate. Among notable honors, he is a recipient of the Best Paper Award at FOCS 2005, the IBM Research Pat Goldberg Memorial Award, and the Indian National Science Academy Young Scientist Award.