Quantifying the Value of Constructive Induction, Knowledge, and Noise Filtering on Inductive Learning

  • Carl Kadie

Published by Morgan Kaufmann Publishers

Learning research, as one of its central goals, tries to measure, model, and understand how learning-problem properties affect average-case learning performance. For example, we would like to quantify the value of constructive induction, noise filtering, and background knowledge. This paper works toward this goal by combining psychology’s mathematical learning theory with computational learning theory. It defines the effective dimension, a new learning measure that empirically links problem properties to learning performance. Like the Vapnik-Chervonenkis (VC) dimension, the effective dimension is often in a simple linear relation with problem properties. Unlike the VC dimension, the effective dimension is estimated empirically and makes average-case predictions. It is therefore more widely applicable to machine- and human-learning research. The measure is demonstrated on several learning systems, including backpropagation. Finally, the measure is used to precisely predict the benefit of using FRINGE, a feature construction system. The benefit is found to decrease as the complexity of the target concept increases.
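To make the idea of an empirically estimated dimension concrete, the sketch below fits a simple inverse learning-curve model, error ≈ d/m for m training examples, to observed error rates and reads off d as an effective-dimension-style parameter. This is an illustrative model chosen for the sketch, not the paper's exact formulation; the function name and the d/m form are assumptions.

```python
import numpy as np

def estimate_effective_dimension(sample_sizes, error_rates):
    """Estimate a dimension-like parameter d from an empirical
    learning curve by least-squares fitting error ~ d / m.
    (Illustrative model only, not the paper's exact estimator.)"""
    m = np.asarray(sample_sizes, dtype=float)
    err = np.asarray(error_rates, dtype=float)
    x = 1.0 / m
    # Least-squares slope through the origin: d = sum(x*err) / sum(x*x)
    return float(np.dot(x, err) / np.dot(x, x))

# Synthetic, noiseless learning curve for a hypothetical class with d = 8
sizes = np.array([10, 20, 40, 80, 160])
errors = 8.0 / sizes
d_hat = estimate_effective_dimension(sizes, errors)
print(round(d_hat, 2))  # recovers d = 8.0 on this noiseless data
```

Because the estimate comes from observed error rates rather than a combinatorial analysis of the hypothesis space, the same procedure can be applied to any learner, human or machine, for which a learning curve can be measured.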