Multi-Relational Latent Semantic Analysis

Kai-Wei Chang, Wen-tau Yih, and Chris Meek

Abstract

We present Multi-Relational Latent Semantic Analysis (MRLSA), which generalizes Latent Semantic Analysis (LSA). MRLSA provides an elegant approach to combining multiple relations between words by constructing a 3-way tensor. Similar to LSA, a low-rank approximation of the tensor is derived using a tensor decomposition. Each word in the vocabulary is thus represented by a vector in the latent semantic space, and each relation is captured by a latent square matrix. The degree to which two words have a specific relation can then be measured through simple linear algebraic operations. We demonstrate that by integrating multiple relations from both homogeneous and heterogeneous information sources, MRLSA achieves state-of-the-art performance on existing benchmark datasets for two relations, antonymy and is-a.
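To make the abstract's description concrete, the sketch below illustrates the general idea on a toy example: build a small word-by-word-by-relation tensor, take a low-rank Tucker-style (HOSVD) decomposition with shared word factors, and score a word pair for a relation with simple linear algebra. This is not the authors' implementation; the vocabulary, relations, rank, and helper names are hypothetical, and the decomposition shown is a generic HOSVD approximation rather than the exact procedure used in the paper.

```python
# Minimal illustrative sketch of an MRLSA-style setup (toy data, not the paper's method).
import numpy as np

words = ["hot", "cold", "warm", "cool", "animal", "dog"]
idx = {w: i for i, w in enumerate(words)}
relations = ["synonym", "antonym", "is-a"]

n, m = len(words), len(relations)
T = np.zeros((n, n, m))  # 3-way tensor: (word, word, relation)

def add(rel, pairs):
    """Mark word pairs as holding the given relation (symmetric here for simplicity)."""
    k = relations.index(rel)
    for a, b in pairs:
        T[idx[a], idx[b], k] = 1.0
        T[idx[b], idx[a], k] = 1.0

add("synonym", [("hot", "warm"), ("cold", "cool")])
add("antonym", [("hot", "cold"), ("warm", "cool")])
add("is-a",    [("dog", "animal")])

# HOSVD-style low-rank factors shared across relation slices.
rank = 4
mode1 = T.reshape(n, -1)                     # mode-1 unfolding (rows = first word)
mode2 = T.transpose(1, 0, 2).reshape(n, -1)  # mode-2 unfolding (rows = second word)
U, _, _ = np.linalg.svd(mode1, full_matrices=False); U = U[:, :rank]
V, _, _ = np.linalg.svd(mode2, full_matrices=False); V = V[:, :rank]

# Each relation k is captured by a small latent square matrix R_k = U^T W_k V,
# where W_k is the k-th slice of the tensor.
R = [U.T @ T[:, :, k] @ V for k in range(m)]

def score(w1, w2, rel):
    """Degree to which (w1, w2) stand in the given relation, via the latent factors."""
    k = relations.index(rel)
    return float(U[idx[w1]] @ R[k] @ V[idx[w2]])

print(score("hot", "cold", "antonym"), score("hot", "warm", "antonym"))
```

Under this sketch, each word ends up as a row of the factor matrices and each relation as a small square matrix, so scoring any pair for any relation reduces to a vector-matrix-vector product, mirroring the "simple linear algebraic operations" described in the abstract.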

Details

Publication type: Inproceedings
Published in: Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP)
Publisher: ACL – Association for Computational Linguistics