Efficient Online Bootstrapping for Large Scale Learning

Zhen Qin, Vaclav Petricek, Nikos Karampatziakis, Lihong Li, and John Langford

Abstract

Bootstrapping is a useful technique for estimating the uncertainty of a predictor, for example, confidence intervals for its predictions. It is typically used only on small- to moderate-sized datasets, due to its high computational cost. This work describes a highly scalable online bootstrapping strategy, implemented inside Vowpal Wabbit, that is several times faster than traditional strategies. Our experiments indicate that, in addition to providing a black-box method for estimating uncertainty, our implementation of online bootstrapping may also help train models with better prediction performance due to model averaging.
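The abstract does not spell out the strategy, but online bootstrapping is commonly realized by giving each of several models a random Poisson(1) importance weight for every incoming example, which approximates sampling with replacement in a single pass. The sketch below illustrates that general idea on a toy online linear regressor; the function names and the learner itself are illustrative assumptions, not the Vowpal Wabbit implementation.

```python
import random


def poisson1(rng):
    # Draw from Poisson(lambda = 1) via Knuth's method.
    threshold = 2.718281828459045 ** -1  # e^{-1}
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1


def online_bootstrap(stream, n_models, learning_rate=0.1, seed=0):
    """Train n_models 1-D linear regressors in one pass over the stream.

    Each model applies every example k times, with k ~ Poisson(1),
    which approximates a bootstrap resample without storing the data.
    (Toy sketch; the paper's actual learner lives inside Vowpal Wabbit.)
    """
    rng = random.Random(seed)
    models = [[0.0, 0.0] for _ in range(n_models)]  # (weight, bias) pairs
    for x, y in stream:
        for m in models:
            for _ in range(poisson1(rng)):  # k = 0 means "skip this example"
                err = (m[0] * x + m[1]) - y
                m[0] -= learning_rate * err * x
                m[1] -= learning_rate * err
    return models
```

At prediction time, the mean of the per-model predictions serves as the averaged estimate, and their spread gives a bootstrap-style uncertainty measure.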

Details

Publication type: TechReport
Number: MSR-TR-2013-132
Publisher: arxiv.org