An Introduction to Concentration Inequalities and Statistical Learning Theory

The aim of this tutorial is to introduce tools and techniques that are used to analyze machine learning algorithms in statistical settings. Our focus will be on learning problems such as classification, regression, and ranking. We will look at concentration inequalities and other commonly used techniques such as uniform convergence and symmetrization, and use them to prove learning-theoretic guarantees for algorithms in these settings.
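As a small taste of the first of these topics, a concentration inequality such as Hoeffding's bound states that the mean of n i.i.d. random variables bounded in [0, 1] deviates from its expectation by at least t with probability at most 2·exp(−2nt²). The snippet below is an illustrative sketch (not part of the talk material) that checks this empirically for fair coin flips:

```python
import math
import random

# Hoeffding's inequality for n i.i.d. variables taking values in [0, 1]:
#   P(|sample mean - expectation| >= t) <= 2 * exp(-2 * n * t**2)
random.seed(0)
n, t, trials = 1000, 0.05, 2000
bound = 2 * math.exp(-2 * n * t ** 2)  # theoretical tail bound

# Estimate the deviation probability for fair coin flips (expectation 0.5).
deviations = 0
for _ in range(trials):
    mean = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(mean - 0.5) >= t:
        deviations += 1

print(f"empirical tail: {deviations / trials:.4f}, Hoeffding bound: {bound:.4f}")
```

The empirical deviation frequency comes out well below the theoretical bound, which is typical: Hoeffding's inequality is distribution-free and hence often loose for any particular distribution.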

The talk will be largely self-contained. However, it would help if the audience could brush up on basic probability and statistics concepts such as random variables, events, probabilities of events, and Boole's inequality. There are several good resources for these online and I do not wish to recommend one over another; a few nice resources are listed below.

  1. https://www.khanacademy.org/math/probability
  2. http://ocw.mit.edu/courses/mathematics/18-05-introduction-to-probability-and-statistics-spring-2014/
  3. https://en.wikipedia.org/wiki/Boole%27s_inequality
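For reference, Boole's inequality (also known as the union bound) states that P(A₁ ∪ … ∪ Aₖ) ≤ P(A₁) + … + P(Aₖ) for any events A₁, …, Aₖ. The sketch below (my own illustration, not part of the talk material) verifies this empirically for a set of overlapping events:

```python
import random

# Boole's inequality (the union bound):
#   P(A1 ∪ ... ∪ Ak) <= P(A1) + ... + P(Ak)
# Check it empirically for nested events A_i = {x < 0.1 * (i + 1)},
# where x is uniform on [0, 1].
random.seed(1)
k, trials = 5, 100_000
union_hits = 0
individual_hits = [0] * k
for _ in range(trials):
    x = random.random()
    events = [x < 0.1 * (i + 1) for i in range(k)]
    if any(events):
        union_hits += 1
    for i, hit in enumerate(events):
        individual_hits[i] += hit

p_union = union_hits / trials                      # ≈ P(x < 0.5) = 0.5
sum_p = sum(h / trials for h in individual_hits)   # ≈ 0.1 + 0.2 + ... + 0.5 = 1.5
print(f"P(union): {p_union:.3f} <= sum of P(A_i): {sum_p:.3f}")
```

Note that the empirical union frequency can never exceed the sum of the individual frequencies (every union hit contributes to at least one individual count), mirroring the proof of the inequality itself.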

Speaker Details

I am a post-doctoral Research Fellow with the Machine Learning and Optimization Group at the Microsoft Research Lab, Bengaluru. Previously, I completed my PhD at IIT Kanpur, where I was jointly advised by Prof. Harish C. Karnick and Prof. Manindra Agrawal. I am interested in all aspects of machine learning, with special emphasis on large-scale learning for kernel machines. I also maintain an avid interest in complexity theory, data streaming algorithms, computational geometry, and cognitive science.

Speakers:
Purushottam Kar
Affiliation:
MSRI