We present a modified version of the perceptron learning algorithm (PLA) which solves semidefinite programs (SDPs) in polynomial time. The algorithm is based on the following three observations: (i) semidefinite programs are linear programs with infinitely many (linear) constraints; (ii) every linear program can be solved by a sequence of constraint satisfaction problems with linear constraints; (iii) in general, the perceptron learning algorithm solves a constraint satisfaction problem with linear constraints in finitely many updates. Combining the PLA with a probabilistic rescaling algorithm (which, on average, increases the size of the feasible region) results in a probabilistic algorithm for solving SDPs that runs in polynomial time. We present preliminary results which demonstrate that the algorithm works, but is not competitive with state-of-the-art interior point methods.

},
  author = {Thore Graepel and Ralf Herbrich and Andriy Kharechko and John Shawe-Taylor},
  booktitle = {Advances in Neural Information Processing Systems 16},
  month = {January},
  pages = {457--464},
  publisher = {MIT Press},
  title = {Semidefinite Programming by Perceptron Learning},
  url = {http://research.microsoft.com/apps/pubs/default.aspx?id=65630},
  year = {2004},
}