# Quantifying the Value of Constructive Induction, Knowledge, and Noise Filtering on Inductive Learning

**Carl M. Kadie**

Microsoft Research, Bldg 9S

Redmond, WA 98052-6399

**Author Email:** carlk@microsoft.com

### Abstract:

Learning research, as one of its central goals, tries to measure, model, and understand how learning-problem properties affect average-case learning performance. For example, we would like to quantify the value of constructive induction, noise filtering, and background knowledge. This paper describes the *effective dimension*, a new learning measure that helps link problem properties to learning performance. Like the Vapnik-Chervonenkis (VC) dimension, the effective dimension often stands in a simple linear relation with problem properties. Unlike the VC dimension, the effective dimension can be estimated empirically and makes average-case predictions. It is therefore more widely applicable to machine and human learning research. The measure is demonstrated on several learning systems, including Backpropagation. Finally, the measure is used to precisely predict the benefit of using FRINGE, a feature construction system. The benefit is found to decrease as the complexity of the target concept increases.
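To make the abstract's claim of empirical estimation concrete, the sketch below measures the average-case test error of a toy threshold learner at several training-set sizes and backs out a dimension-like constant by assuming the error curve has the form error ≈ d/m. This is a hypothetical illustration only, not the paper's actual definition or estimation procedure for the effective dimension; the learner, the error model, and all names are assumptions.

```python
import random

def average_error(m, trials=2000, rng=None):
    """Average-case error of a midpoint threshold learner trained on m examples.

    Target concept: x >= t for a threshold t drawn uniformly from [0, 1].
    Learner: place the decision boundary midway between the largest negative
    and smallest positive training example. The error of a hypothesis with
    boundary g is |g - t| (the probability mass it misclassifies).
    """
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(trials):
        t = rng.random()
        xs = [rng.random() for _ in range(m)]
        below = [x for x in xs if x < t]
        above = [x for x in xs if x >= t]
        lo = max(below) if below else 0.0
        hi = min(above) if above else 1.0
        total += abs((lo + hi) / 2.0 - t)
    return total / trials

# If average error really behaves like d/m, then m * error should be
# roughly constant across m -- that constant plays the role of a
# dimension-like measure in this toy setting.
for m in (8, 16, 32, 64):
    print(m, m * average_error(m))
```

Running this shows the product m × error hovering near a constant, which is the flavor of linear relationship the abstract attributes to the effective dimension.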

*Proceedings of the Eighth International Conference on Machine Learning*,
Evanston, Illinois, 1991.