Jeremy Jancsary, Sebastian Nowozin, Toby Sharp, and Carsten Rother
10 April 2012
We introduce Regression Tree Fields (RTFs), a fully conditional random field model for image labeling problems. RTFs gain their expressive power from the use of nonparametric regression trees that specify a tractable Gaussian random field, thereby ensuring globally consistent predictions. Our approach improves on the recently introduced decision tree field (DTF) model in three key ways: (i) RTFs have tractable test-time inference, making efficient optimal predictions feasible and orders of magnitude faster than for DTFs; (ii) RTFs can be applied to both discrete and continuous vector-valued labeling tasks; and (iii) the entire model, including the structure of the regression trees and the energy function parameters, can be efficiently and jointly learned from training data. We demonstrate the expressive power and flexibility of the RTF model on a wide variety of tasks, including inpainting, colorization, denoising, and joint detection and registration. We achieve excellent predictive performance that is on par with, or even surpasses, that of DTFs on all tasks where a comparison is possible.
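The tractability claim above rests on a basic property of Gaussian random fields: because the energy is quadratic in the labels, the optimal (MAP) labeling is the solution of a sparse linear system. The sketch below is not the authors' code; it illustrates the idea on a toy 1-D chain model with illustrative unary weights `a`, targets `b`, and pairwise smoothing strength `c`, solving the resulting tridiagonal system exactly with the Thomas algorithm.

```python
# Minimal sketch (not the RTF implementation): exact MAP inference in a
# 1-D Gaussian MRF. The energy
#     E(y) = sum_i a[i]*(y[i]-b[i])^2 + c * sum_i (y[i]-y[i+1])^2
# is quadratic, so setting its gradient to zero gives a tridiagonal system
#     (a[i] + c*deg_i) * y[i] - c*y[i-1] - c*y[i+1] = a[i]*b[i],
# where deg_i is the number of chain neighbours of node i.

def map_inference_chain(a, b, c):
    """Return the exact minimizer of the chain energy (requires len(a) >= 2)."""
    n = len(a)
    diag = [a[i] + c * ((i > 0) + (i < n - 1)) for i in range(n)]
    off = [-c] * (n - 1)              # sub-/super-diagonal entries
    rhs = [a[i] * b[i] for i in range(n)]

    # Thomas algorithm: forward elimination ...
    cp = [0.0] * (n - 1)
    dp = [0.0] * n
    cp[0] = off[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - off[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = off[i] / m
        dp[i] = (rhs[i] - off[i - 1] * dp[i - 1]) / m

    # ... then back substitution.
    y = [0.0] * n
    y[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        y[i] = dp[i] - cp[i] * y[i + 1]
    return y
```

With `c = 0` the nodes decouple and the solution is simply `b`; a large `c` pulls neighbouring labels together, which is the globally consistent smoothing behaviour the abstract alludes to. In the actual 2-D image models the system is sparse rather than tridiagonal, but the same "inference = one linear solve" principle applies.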
Published in: IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR)
Publisher: IEEE Computer Society