Knowledge about local invariances with respect to given pattern transformations can greatly improve the accuracy of classification. Previous approaches are based either on regularisation or on the generation of virtual (transformed) examples. We develop a new framework for learning linear classifiers under known transformations based on semidefinite programming. We present a new learning algorithm, the Semidefinite Programming Machine (SDPM), which is able to find a maximum-margin hyperplane when the training examples are polynomial trajectories instead of single points. The solution is found to be sparse in the dual variables and makes it possible to identify the points on each trajectory with minimal real-valued output as virtual support vectors. Extensions to segments of trajectories, to more than one transformation parameter, and to learning with kernels are discussed. In experiments we use a Taylor expansion to locally approximate rotational invariance in pixel images from the USPS dataset and find improvements over known methods.
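The reduction that makes the SDPM tractable is a classical fact: a quadratic polynomial p(θ) = a₀ + a₁θ + a₂θ², such as the margin constraint along a second-order Taylor approximation of the transformation, is nonnegative for every real θ exactly when the 2×2 matrix M = [[a₀, a₁/2], [a₁/2, a₂]] is positive semidefinite, since p(θ) = [1, θ] M [1, θ]ᵀ. This turns a constraint over infinitely many values of θ into a single semidefinite constraint. A minimal pure-Python sketch of the equivalence (the function name is ours, not from the paper):

```python
def nonneg_for_all_theta(a0, a1, a2):
    """Return True iff p(theta) = a0 + a1*theta + a2*theta**2 >= 0
    for every real theta, tested via positive semidefiniteness of
    M = [[a0, a1/2], [a1/2, a2]], because p(theta) = [1, theta] M [1, theta]^T.
    """
    # A symmetric 2x2 matrix is PSD iff both diagonal entries
    # and the determinant are nonnegative.
    return a0 >= 0 and a2 >= 0 and a0 * a2 - (a1 / 2) ** 2 >= 0


# p(theta) = 1 - 2*theta + 2*theta^2 = (1 - theta)^2 + theta^2 >= 0
assert nonneg_for_all_theta(1, -2, 2)
# p(theta) = 1 - 3*theta + theta^2 has real roots, so it dips below zero
assert not nonneg_for_all_theta(1, -3, 1)
# A non-constant affine p(theta) = 1 + 2*theta is unbounded below
assert not nonneg_for_all_theta(1, 2, 0)
```

In the full SDPM, one such PSD block is imposed per training trajectory inside the maximum-margin optimisation, yielding a semidefinite program rather than the quadratic program of a standard SVM.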

Thore Graepel and Ralf Herbrich. Invariant Pattern Recognition by Semidefinite Programming Machines. In Advances in Neural Information Processing Systems 16, pages 33–40. MIT Press, January 2004. URL: http://research.microsoft.com/apps/pubs/default.aspx?id=65631