Microsoft Research
Computational User Experiences

Search Vox: Leveraging Multimodal Refinement and Partial Knowledge for Mobile Voice Search

In conjunction with MLAS (Machine Learning and Applied Statistics) and Speech Technology, we are working on a mobile search interface called Search Vox that not only facilitates touch and text refinement whenever speech fails, but also allows users to assist the recognizer via text hints. Search Vox can also take advantage of any partial knowledge users may have about the business listing by letting them express their uncertainty in an intuitive way using verbal wildcards.
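As a rough illustration of the text-hint idea (the function and data names here are hypothetical, not from the Search Vox implementation), a partial text hint can be used to filter the recognizer's n-best list down to consistent hypotheses:

```python
def filter_nbest(nbest, hint):
    """Keep only recognition hypotheses consistent with the user's text hint.

    A hypothesis is considered consistent if some word in it starts with
    the hint (case-insensitive). This is a simplified sketch of how typed
    text can assist the speech recognizer.
    """
    hint = hint.lower()
    return [hyp for hyp in nbest
            if any(word.lower().startswith(hint) for word in hyp.split())]

nbest = ["saks fifth avenue", "sax fifth avenue", "zacks appliance"]
print(filter_nbest(nbest, "sak"))  # -> ["saks fifth avenue"]
```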

 

Project Team

Search Vox



In light of the challenges of mobile voice search, we developed Search Vox, a multimodal interface that tightly couples speech with touch and text in both directions: users can not only use touch and text to refine their queries whenever speech fails, but can also fall back on speech whenever text entry becomes burdensome. We facilitate this tight coupling through interaction techniques that leverage wildcard queries, that is, search queries that use wildcards (*) to match zero or more characters. Wildcard queries allow users to take advantage of any partial knowledge they may have about the words in the business listing. For example, a user may only remember that the listing starts with a word beginning with “s” and also contains “avenue.” Likewise, the user may only remember “saks something,” where “something” expresses uncertainty about the words that follow.
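As a minimal sketch of wildcard matching (the listing data and function names are illustrative, not from the actual system), a query such as “s* avenue” or “saks *” can be translated into a regular expression and run against the set of business listings:

```python
import re

def wildcard_match(query, listings):
    """Match a wildcard query against business listings.

    Each "*" in the query matches zero or more characters; all other
    text is treated literally, ignoring case.
    """
    # Escape the literal fragments, rejoin them with ".*", and anchor
    # the pattern to the whole listing.
    pattern = ".*".join(re.escape(part) for part in query.split("*"))
    regex = re.compile("^" + pattern + "$", re.IGNORECASE)
    return [listing for listing in listings if regex.match(listing)]

listings = ["Saks Fifth Avenue", "Sears", "Second Avenue Deli"]
print(wildcard_match("s* avenue", listings))  # -> ["Saks Fifth Avenue"]
print(wildcard_match("saks *", listings))     # -> ["Saks Fifth Avenue"]
```

Here the spoken word “something” would simply be rewritten to “*” before matching.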

 

Search Vox Video for UIST 2008 [.wmv (2.2MB), .mov (32.7MB)]

Publications

Search Vox: Leveraging Multimodal Refinement and Partial Knowledge for Mobile Voice Search

Tim Paek, Bo Thiesson, Y.C. Ju, Bongshin Lee

ACM UIST 2008, pp. 141–150
