Sham M. Kakade

Principal Research Scientist
Microsoft Research, New England




Research

The focus of my work is on designing (and implementing) both statistically and computationally efficient algorithms for machine learning, statistics, and artificial intelligence.
Recently, I have been focusing on three areas: 1) designing effective algorithms for estimating probabilistic models with latent structure (such as HMMs, LDA, and mixtures of Gaussians); 2) efficient optimization algorithms in statistical settings (i.e., how fast can we optimize when we are interested in statistical rather than numerical accuracy?); and 3) understanding what is responsible for the recent (and remarkable) successes in deep learning (this last question is largely an empirical exploration, in both computer vision and speech applications).
More broadly, I am currently interested in probability theory, algebraic and tensor methods, signal processing/information theory, and numerous domain-specific settings (with a recent focus on natural language processing and computer vision).
My previous body of work has addressed problems in unsupervised (and representational) learning, concentration of measure, reinforcement learning, statistical learning theory, optimization, algorithmic game theory, and economics. As a graduate student, I focused on reinforcement learning and computational neuroscience. My thesis was on sample complexity issues in reinforcement learning.

Bio

I am a principal research scientist at Microsoft Research, New England, a lab in Cambridge, MA. Previously, I was an associate professor in the Department of Statistics at the Wharton School, University of Pennsylvania (2010-2012), and before that an assistant professor at the Toyota Technological Institute at Chicago. Prior to this, I was a postdoc in the Computer and Information Science department at the University of Pennsylvania under the supervision of Michael Kearns. I completed my PhD at the Gatsby Unit, where my advisor was Peter Dayan. Before Gatsby, I was an undergraduate at Caltech, where I earned my BS in physics.

Activities and Services

Program committee for the third New England Machine Learning Day, May 13, 2014.
Co-chair for New York Computer Science and Economics Day V, December 3, 2012.
Program committee for the first New England Machine Learning Day, May 16, 2012.
Program chair for the 24th Annual Conference on Learning Theory (COLT 2011), which took place in Budapest, Hungary, July 9-11, 2011.

Tutorials

Tensor Decompositions for Learning Latent Variable Models, AAAI 2014

Course Links

Stat 928: Statistical Learning Theory
Stat 991: Multivariate Analysis, Dimensionality Reduction, and Spectral Methods
Large Scale Learning
Learning Theory

Former Postdocs

Daniel Hsu (while at UPenn)

Former Interns (in reverse chronological order)

Aaron Sidford
Roy Frostig
David Belanger
Chen Wang
QingQing Huang
Jaehyun Park
Karl Stratos
Do-kyum Kim
Praneeth Netrapalli
Rashish Tandon
Rong Ge
Adel Javanmard
Matus Telgarsky
Daniel Hsu (at TTI-C)
Sathyanarayan Anand (at TTI-C)

Contact Info

Email: skakade [at] microsoft [dot] com

Microsoft Research, Office 14060
One Memorial Drive
Cambridge, MA 02142