Efficacy of a Constantly Adaptive Language Modeling Technique for Web-Scale Applications

  • Kuansan Wang,
  • Xiaolong (Shiao-Long) Li

Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP'2009) |

Published by Institute of Electrical and Electronics Engineers, Inc.

Taipei, Taiwan, April 19-24, 2009

In this paper, we describe CALM, a method for building statistical language models for the Web. CALM addresses several challenges unique to Web content. First, CALM does not require the whole corpus to be available before the language model is built. Instead, CALM progressively adapts itself as chunks of the Web are made available by the crawler. Second, because Web content changes dynamically and dramatically, CALM is designed to quickly enrich its lexicon and N-grams as new vocabulary and phrases are discovered. To reduce the heuristics and human intervention typically needed for model adaptation, we derive an information-theoretic formula that lets CALM adapt optimally in the maximum a posteriori (MAP) sense. Testing against a collection of Web chunks in which new vocabulary and phrases are dominant, we show that CALM achieves comparable and satisfactory model quality as measured by perplexity. We also show that CALM is robust against overtraining and insensitive to the initial condition, suggesting that the impact of any assumptions made in obtaining the initial model gradually diminishes as CALM runs its full course and adapts to more data.
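The abstract describes the core idea at a high level: the model is updated incrementally, chunk by chunk, by combining the existing model with statistics estimated from each newly crawled chunk, with the combination governed by the paper's information-theoretic MAP formula. The Python sketch below illustrates only this incremental-adaptation idea under simplifying assumptions: it uses unigram counts and a fixed interpolation weight `lam` (both hypothetical choices made for brevity), not the paper's actual N-gram model or MAP-derived adaptation weight.

```python
from collections import Counter

class IncrementalUnigramLM:
    """Toy incremental language model: interpolates the current model with
    statistics from each newly crawled chunk. This is an illustrative sketch,
    not CALM's actual MAP formulation."""

    def __init__(self, lam=0.3):
        self.lam = lam      # assumed fixed adaptation weight; CALM derives it from the data
        self.probs = {}     # current word -> probability estimate
        self.vocab = set()

    def adapt(self, chunk_tokens):
        """Update the model in place with one chunk of tokens from the crawler."""
        counts = Counter(chunk_tokens)
        total = sum(counts.values())
        chunk_probs = {w: c / total for w, c in counts.items()}
        self.vocab |= set(chunk_probs)          # enrich the lexicon with new words
        if not self.probs:                      # first chunk: take its distribution as-is
            self.probs = chunk_probs
            return
        new_probs = {}
        for w in self.vocab:
            p_old = self.probs.get(w, 0.0)
            p_new = chunk_probs.get(w, 0.0)
            new_probs[w] = (1 - self.lam) * p_old + self.lam * p_new
        self.probs = new_probs

    def prob(self, word, floor=1e-9):
        """Return the current estimate, with a crude floor for unseen words."""
        return self.probs.get(word, floor)


if __name__ == "__main__":
    lm = IncrementalUnigramLM(lam=0.3)
    for chunk in [["web", "search", "engine"], ["new", "web", "slang"]]:
        lm.adapt(chunk)
    print(lm.prob("web"), lm.prob("slang"))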
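In the paper itself the adaptation weight is not fixed; it is computed from the new chunk so that the update is optimal in the MAP sense, which is what removes the heuristics and hand tuning mentioned in the abstract. The sketch fixes the weight only to keep the example short.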