RNNLM - Recurrent Neural Network Language Modeling Toolkit

Tomas Mikolov, Stefan Kombrink, Anoop Deoras, Lukas Burget, and Jan Honza Cernocky

Abstract

We present a freely available open-source toolkit for training recurrent neural network based language models. It can easily be used to improve existing speech recognition and machine translation systems. It can also serve as a baseline for future research on advanced language modeling techniques. In this paper, we discuss optimal parameter selection and the toolkit's different modes of functionality. The toolkit, example scripts, and basic setups are freely available at http://rnnlm.sourceforge.net/.
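To make the underlying model concrete, the following is a minimal sketch of the simple (Elman-type) recurrent network language model that the toolkit trains: a 1-of-N word input, a sigmoid recurrent hidden layer, and a softmax output distribution over the vocabulary. All names, sizes, and initialization choices here are illustrative assumptions, not the toolkit's actual implementation.

```python
# Sketch of a simple recurrent network language model (illustrative only;
# vocabulary size, hidden size, and initialization are arbitrary choices).
import math
import random

random.seed(0)

V = 10  # toy vocabulary size
H = 8   # hidden-layer size

def mat(rows, cols):
    """Small random weight matrix as a list of lists."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

U = mat(H, V)  # input (1-of-N word vector) -> hidden
W = mat(H, H)  # hidden -> hidden (the recurrent connection)
Y = mat(V, H)  # hidden -> output (softmax over the vocabulary)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(z):
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def step(word_id, h_prev):
    """One time step: consume a word id, update the hidden state,
    and return the next-word distribution plus the new hidden state."""
    # U @ x for a 1-of-N input x is just column word_id of U.
    h = [sigmoid(U[i][word_id] + sum(W[i][j] * h_prev[j] for j in range(H)))
         for i in range(H)]
    p = softmax([sum(Y[k][j] * h[j] for j in range(H)) for k in range(V)])
    return p, h

# Run the network over a toy word-id sequence; p is P(next word | history).
h = [0.0] * H
for w in [1, 4, 2]:
    p, h = step(w, h)
```

Training in the actual toolkit proceeds by stochastic gradient descent with backpropagation through time; this sketch shows only the forward computation.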

Details

Publication type: Inproceedings
Publisher: IEEE Automatic Speech Recognition and Understanding Workshop