Sublinear Optimization

In many modern optimization problems, specifically those arising in machine learning, the amount of data is too large to apply standard convex optimization methods. We’ll discuss new optimization algorithms that use randomization to prune the data and produce a correct solution while running in time smaller than the size of the data representation, i.e. sublinear running time. We’ll present such sublinear-time algorithms for linear classification, support vector machine training, semi-definite programming and other optimization problems. These new algorithms are based on a primal-dual approach, and use a combination of novel sampling techniques and randomized implementations of online learning algorithms. We’ll describe information-theoretic lower bounds showing that our running times are nearly best possible in the unit-cost RAM model.
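
To make the primal-dual idea concrete, here is a minimal illustrative sketch in the spirit of such sublinear algorithms for linear classification; it is an assumption-laden toy, not the algorithm from the talk. The primal weight vector is updated by perceptron-style steps on examples drawn from a dual distribution, while the dual distribution over examples is updated via multiplicative weights using single-coordinate margin estimates, so each iteration touches O(n + d) numbers rather than the full O(nd) data matrix. All function names and parameters below are hypothetical.

```python
import numpy as np

def sublinear_primal_dual_sketch(X, y, T=1000, eta=0.1, seed=0):
    """Illustrative primal-dual sketch (not the exact algorithm from the talk).

    Primal: a weight vector w, updated by perceptron-style steps on
    examples drawn from the dual distribution.
    Dual: a multiplicative-weights distribution over the n examples,
    updated from a single sampled coordinate of w, so each iteration
    costs O(n + d) rather than O(n * d).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    logp = np.zeros(n)  # log-domain dual weights over examples

    for _ in range(T):
        # Sample a training example from the dual distribution.
        p = np.exp(logp - logp.max())
        p /= p.sum()
        i = rng.choice(n, p=p)

        # Perceptron-style primal step on the sampled example (O(d)).
        if y[i] * (w @ X[i]) <= 0:
            w += eta * y[i] * X[i]
            norm = np.linalg.norm(w)
            if norm > 1.0:
                w /= norm  # project back to the unit ball

        # Dual step: estimate every example's margin from ONE sampled
        # coordinate of w (an unbiased estimate of w @ x_i), then
        # upweight low-margin examples via multiplicative weights (O(n)).
        j = rng.integers(d)
        margins = d * w[j] * X[:, j] * y
        logp -= eta * np.clip(margins, -1.0, 1.0)

    return w
```

With a number of iterations on the order of 1/ε², a scheme of this shape runs in roughly O((n + d)/ε²) time, which is the sense in which such algorithms are sublinear in the O(nd) input size.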

The talk will be self-contained – no prior knowledge of convex optimization is assumed.

Speaker Details

Elad Hazan is an assistant professor at the Technion, Israel Institute of Technology, in the faculty of IE&M. His main research area is machine learning and its relationship to game theory, optimization and theoretical computer science. Prior to joining the Technion, Elad spent four years as a research staff member in the Theory Group at the IBM Almaden Research Center. He received his Ph.D. in Computer Science from Princeton University, where he was advised by Prof. Sanjeev Arora. Before diving into machine learning and optimization, Elad was obsessed with computational complexity; his master’s thesis, on the topic of hardness of approximation, was completed at Tel Aviv University under Muli Safra.

Date:
Speakers:
Elad Hazan
Affiliation:
Israel Institute of Technology
Jeff Running

Series: Microsoft Research Talks