Applied Nonparametric Bayes and Statistical Machine Learning

Bayesian approaches to learning problems have many virtues, including their ability to make use of prior knowledge and their ability to link related sources of information, but they also have many vices, notably the strong parametric assumptions that are often invoked in practical Bayesian modeling. Nonparametric Bayesian methods offer a way to make use of the Bayesian calculus without the parametric handcuffs.

In this talk I describe several recent explorations in nonparametric Bayesian modeling and inference, including various versions of “Chinese restaurant process priors” that allow flexible structures to be learned and allow sharing of statistical strength among sets of related structures. I discuss computational issues and applications to problems in bioinformatics.
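To make the "Chinese restaurant process" metaphor concrete, here is a minimal, illustrative sketch of how such a prior induces a random partition of data points into clusters, with each new point joining an existing cluster in proportion to its size or starting a new one in proportion to a concentration parameter. The function name crp_partition and the parameter alpha are illustrative choices, not something specified in the talk.

```python
import random

def crp_partition(n, alpha, seed=None):
    """Sample a random partition of n items from a Chinese restaurant process.

    Customer i joins an existing table with probability proportional to its
    current occupancy, or opens a new table with probability proportional to
    the concentration parameter alpha.
    """
    rng = random.Random(seed)
    tables = []       # tables[k] = number of customers seated at table k
    assignments = []  # assignments[i] = table index chosen by customer i
    for i in range(n):
        # Weights: existing tables weighted by size, a new table weighted by alpha.
        weights = tables + [alpha]
        total = i + alpha
        r = rng.uniform(0, total)
        cum, choice = 0.0, len(tables)
        for k, w in enumerate(weights):
            cum += w
            if r <= cum:
                choice = k
                break
        if choice == len(tables):
            tables.append(1)      # open a new table (a new cluster)
        else:
            tables[choice] += 1   # join an existing table
        assignments.append(choice)
    return assignments, tables

if __name__ == "__main__":
    labels, sizes = crp_partition(n=20, alpha=1.0, seed=0)
    print("table assignments:", labels)
    print("table sizes:", sizes)
```

The number of occupied tables grows with the number of customers rather than being fixed in advance, which is the sense in which the prior lets flexible structures be learned from data.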

[Joint work with David Blei and Yee Whye Teh].

Speaker Details

Michael Jordan is a professor in the Department of Electrical Engineering and Computer Science and the Department of Statistics at the University of California, Berkeley. He received his master's degree from Arizona State University and earned his PhD from the University of California, San Diego. He was a professor at the Massachusetts Institute of Technology for eleven years. He has published over 200 research papers on topics in computer science, electrical engineering, statistics, biology, and cognitive science. His research in recent years has focused on probabilistic graphical models, on kernel machines, and on applications of statistical machine learning to problems in bioinformatics, information retrieval, and signal processing.

Date:
Speakers: Michael Jordan
Affiliation: University of California at Berkeley