Infer.NET user guide : Tutorials and examples : Multi-class classification
## Sparse Multi-class Bayes Point Machine

Here, we present the details of implementing a sparse multi-class Bayes point machine using Infer.NET. Sparse, in this context, refers to the fact that every data point has only a subset of relevant features. Hence, a data point is represented by a set of indices (into the feature set) and the corresponding values for those features. During the training phase, we have labeled data, and the goal is to find the distribution over the weight vectors for each class. During the testing phase, given unlabeled data, the goal is to find the posterior distribution over the class labels for the data. We therefore specify two models, a trainModel and a testModel, to be used during training and testing, respectively.

## Model for training

During training, all examples are labeled, so we group training examples by their label. For a particular class, we have the number of items nItem, a range over these items, the number of features in each item (represented using a VariableArray), the range over these features, and, of course, the values and the indices of those features. This means that the values and indices need to be represented as three-deep jagged arrays indexed by class, item, and feature, where each inner range has a size that depends on the outer indices. The syntax is as follows:
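The declarations might look like the following sketch, assuming the Microsoft.ML.Probabilistic.Models namespace; variable names such as nClass and xValueCount are illustrative:

```csharp
// A sketch of the jagged data declarations for the training model.
var nClass = Variable.New<int>().Named("nClass");
Range c = new Range(nClass).Named("c");

// Number of training items per class, and a range over those items
var nItem = Variable.Array<int>(c).Named("nItem");
Range item = new Range(nItem[c]).Named("item");

// Number of relevant features per item, and a range over those features
var xValueCount = Variable.Array(Variable.Array<int>(item), c).Named("xValueCount");
Range itemFeature = new Range(xValueCount[c][item]).Named("itemFeature");

// Three-deep jagged arrays of feature values and indices: [class][item][feature]
var xValues = Variable.Array(Variable.Array(Variable.Array<double>(itemFeature), item), c).Named("xValues");
var xIndices = Variable.Array(Variable.Array(Variable.Array<int>(itemFeature), item), c).Named("xIndices");
```

Note how the range over items is built from nItem[c], and the range over features from xValueCount[c][item], so each inner range size depends on the outer indices.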
Weights corresponding to each feature have independent Gaussian priors that are set at the start of the training phase. We require this independence assumption because we want to work with only a subset of these features for each training example.
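A sketch of how the weights and their priors might be declared (nFeatures and wPrior are illustrative names; Gaussian comes from Microsoft.ML.Probabilistic.Distributions):

```csharp
// Independent Gaussian prior on every weight, one weight array per class (sketch)
var nFeatures = Variable.New<int>().Named("nFeatures");
Range feature = new Range(nFeatures).Named("feature");

// The priors are observed values, so they can be reset between training runs
var wPrior = Variable.Array(Variable.Array<Gaussian>(feature), c).Named("wPrior");
var w = Variable.Array(Variable.Array<double>(feature), c).Named("w");
w[c][feature] = Variable<double>.Random(wPrior[c][feature]);
```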
A data point belonging to a particular class k should have maximal score under that class. To enforce this, we constrain each training example to have maximal score only under the class to which it is labeled as belonging.
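One way this constraint might be expressed is sketched below. ConstrainMaximum is a hypothetical helper, and because the training data are grouped by class, the group index c serves as the true label; the score array is assumed to range over a clone of c so that the loops can nest:

```csharp
// Clone of the class range, so scores can be indexed inside ForEach(c)
Range cClone = c.Clone().Named("cClone");

// Each item's score must be maximal under its own class (sketch)
using (Variable.ForEach(c))        // c is the true label of every item in this group
{
    using (Variable.ForEach(item))
    {
        var score = ComputeClassScores(w, xValues[c][item], xIndices[c][item], itemFeature, cClone);
        ConstrainMaximum(c.Index, score);
    }
}

// Hypothetical helper: constrain score[k] > score[j] for every class j != k
static void ConstrainMaximum(Variable<int> k, VariableArray<double> score)
{
    Range j = score.Range;
    using (Variable.ForEach(j))
    {
        using (Variable.IfNot(k == j.Index))
        {
            Variable.ConstrainPositive(score[k] - score[j]);
        }
    }
}
```

ComputeClassScores is the sparse score computation described in the next section.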
## Computing class scores for the sparse model

Since only a relatively small number of features are pertinent to any given example, the score can be computed efficiently. We can obtain the weights, wSparse, for the pertinent features of an item using Variable.Subarray. (Variable.Subarray is efficient when the indices to retrieve are unique; for non-unique indices, use Variable.GetItems.) We compute the score by summing (using Variable.Sum) the element-wise product of wSparse and the observed feature values.
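A sparse score computation along these lines might look as follows. This is a hypothetical helper: cClone is assumed to be a clone of the class range, and noisePrec is an assumed noise precision, added because the Bayes point machine conventionally makes the maximum constraint soft with Gaussian noise:

```csharp
// Hypothetical helper: compute per-class scores for one item's sparse features.
static VariableArray<double> ComputeClassScores(
    VariableArray<VariableArray<double>, double[][]> w,
    VariableArray<double> xValues,
    VariableArray<int> xIndices,
    Range itemFeature,
    Range cClone,
    double noisePrec = 0.1)
{
    var score = Variable.Array<double>(cClone);
    using (Variable.ForEach(cClone))
    {
        // Weights for just this item's relevant features (indices assumed unique)
        var wSparse = Variable.Subarray(w[cClone], xIndices);
        var product = Variable.Array<double>(itemFeature);
        product[itemFeature] = wSparse[itemFeature] * xValues[itemFeature];
        // Sum of element-wise products, plus noise to soften the constraint
        score[cClone] = Variable.GaussianFromMeanAndPrecision(
            Variable.Sum(product), noisePrec);
    }
    return score;
}
```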
## Training the model

The above model can be neatly encapsulated as trainModel, which can then be used to train either in batch mode or incrementally.

## Training the model in batch mode

We set the prior over the weights to standard Gaussians, then use the observed data to set the observed values of the model variables. Inference is then performed to obtain the posterior over w. Finally, we reset the prior to the inferred posterior distribution so that we can perform incremental training and use the posterior at testing time.
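Batch training might then look like this sketch, where engine is an InferenceEngine and numClasses, itemCounts, and the other data arrays are illustrative names for the observed training data:

```csharp
// Standard Gaussian priors on all weights (sketch)
var priors = new Gaussian[numClasses][];
for (int k = 0; k < numClasses; k++)
{
    priors[k] = new Gaussian[numFeatures];
    for (int f = 0; f < numFeatures; f++)
        priors[k][f] = Gaussian.FromMeanAndVariance(0.0, 1.0);
}
wPrior.ObservedValue = priors;

// Attach the observed training data
nClass.ObservedValue = numClasses;
nFeatures.ObservedValue = numFeatures;
nItem.ObservedValue = itemCounts;           // int[]       : items per class
xValueCount.ObservedValue = featureCounts;  // int[][]     : features per item
xValues.ObservedValue = values;             // double[][][]: feature values
xIndices.ObservedValue = indices;           // int[][][]   : feature indices

// Infer the posterior over the weights, then reset the prior to it
// so that incremental training and testing can pick up from here
var engine = new InferenceEngine();
Gaussian[][] wPosterior = engine.Infer<Gaussian[][]>(w);
wPrior.ObservedValue = wPosterior;
```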
## Training the model in incremental mode

In this case, we use the posterior distribution over w inferred during the previous learning stage as the prior distribution for w, and perform inference to obtain the updated posterior distribution over w.

## Using shared variables

We can easily convert the sparse Bayes Point Machine to make use of shared variables. See Multi-class Bayes Point machine using shared variables for an example.
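The incremental update described above reduces to reusing the previous posterior as the prior, reusing the illustrative names from the batch-mode sketch:

```csharp
// Incremental update: previous posterior becomes the new prior (sketch)
wPrior.ObservedValue = wPosterior;   // posterior from the earlier training run
// ... set the observed data variables to the new batch, as in batch mode ...
wPosterior = engine.Infer<Gaussian[][]>(w);
```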