Microsoft Research
Computational User Experiences

 

Archived Projects

Trajectory Search

Trajectory search predicts your trajectory of movement and your potential destinations based on GPS readings to make your mobile search results more relevant to where you are headed.
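As an illustrative sketch only (the project's actual predictor and its features are not described here), re-ranking candidate destinations from a handful of GPS fixes might look like extrapolating the current heading and preferring destinations that lie along it:

```python
import math

def bearing(p, q):
    """Initial compass bearing in degrees from point p to q, each (lat, lon)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def rank_destinations(gps_fixes, candidates):
    """Rank candidate destinations by how closely they lie along the
    user's current heading, estimated from the last two GPS fixes."""
    heading = bearing(gps_fixes[-2], gps_fixes[-1])
    def deviation(dest):
        d = abs(bearing(gps_fixes[-1], dest) - heading)
        return min(d, 360 - d)  # wrap-around angular difference
    return sorted(candidates, key=deviation)
```

For example, for a user moving north, a candidate due north of the last fix ranks ahead of one due south.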

 
Voice Typing

We introduce Voice Typing, a new speech interaction model in which users’ utterances are transcribed as they are produced, enabling real-time error identification. For fast correction, users leverage a marking menu operated with touch gestures. Voice Typing aspires to create an experience akin to having a secretary type for you while you monitor and correct the text.

 
PocketTouch

PocketTouch utilizes capacitive sensing to detect finger-strokes through fabric (e.g., while your phone is still in your pocket).

 
Humantenna: Sensing Gestures Using the Body as an Antenna

Home environments frequently offer a signal that is unique to locations and objects within the home: electromagnetic noise. In this work, we use the body as a receiving antenna and leverage this noise for gestural interaction.

 
SoundWave: Using the Doppler Effect to Sense Gestures

We present SoundWave, a technique that leverages commodity speakers and microphones to sense in-air gestures. We generate an inaudible tone, which gets frequency-shifted when it reflects off of moving objects; we measure this shift with the microphone to infer various gestures.
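The core measurement can be sketched in a few lines. The pilot-tone frequency, window size, and peak-picking details below are illustrative assumptions, not SoundWave's actual pipeline; the physics is the standard two-way Doppler relation Δf ≈ 2·v·f₀/c:

```python
import numpy as np

FS = 44100   # sample rate (Hz)
F0 = 18000   # inaudible pilot tone (Hz), an illustrative choice
C = 343.0    # speed of sound (m/s)

def doppler_shift(velocity):
    """Frequency shift of the reflection off an object moving at
    `velocity` m/s toward the speaker/mic pair (two-way path)."""
    return 2 * velocity * F0 / C

def detect_shift(mic_samples):
    """Locate the dominant spectral peak near F0 and return its
    offset from the emitted tone, in Hz."""
    windowed = mic_samples * np.hanning(len(mic_samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(mic_samples), 1 / FS)
    band = (freqs > F0 - 500) & (freqs < F0 + 500)  # search window around F0
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak - F0

# Simulate a reflection off a hand moving toward the device at 0.5 m/s.
t = np.arange(FS) / FS  # one second of audio
echo = np.sin(2 * np.pi * (F0 + doppler_shift(0.5)) * t)
```

A motion toward the device produces a positive shift; motion away produces a negative one, which is how the direction of a gesture is inferred.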

 
Cross-Device User Experiences

People often use several different computing devices throughout the day, at each moment selecting the one that offers the right balance of convenience, input expressivity, and display requirements. We have been working to understand how systems might support more seamless experiences across our PCs and mobile phones and better handle activities that span these devices.

 
Mobile Phone Interaction Techniques

With their small screens and limited input capabilities, mobile phones demand novel methods for entering, viewing, and interacting with data. Here we highlight some of our explorations into improving data input and access on mobile phones.

 
GyroTab

We present GyroTab, a relatively flat handheld device that utilizes the gyroscopic effect to provide torque feedback. GyroTab relies on the user to produce an input torque and provides feedback by opposing that torque, making its feedback reactive to the user’s motion.
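The underlying physics: the reaction torque is the cross product of the user's rotation rate with the flywheel's spin angular momentum, with magnitude τ = I·ω_spin·ω_turn when the axes are perpendicular. A sketch with illustrative numbers (GyroTab's actual flywheel parameters are not given here):

```python
import math

def gyro_torque(inertia, spin_rate_hz, turn_rate_dps):
    """Magnitude of the gyroscopic reaction torque (N*m) when a flywheel
    with moment of inertia `inertia` (kg*m^2), spinning at `spin_rate_hz`,
    is rotated about a perpendicular axis at `turn_rate_dps` deg/s."""
    omega_spin = 2 * math.pi * spin_rate_hz   # rad/s
    omega_turn = math.radians(turn_rate_dps)  # rad/s
    return inertia * omega_spin * omega_turn

# Illustrative: a small 1e-4 kg*m^2 flywheel at 100 Hz, tilted at 90 deg/s,
# opposes the user's motion with roughly a tenth of a newton-meter.
tau = gyro_torque(1e-4, 100, 90)
```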

 
AnatOnMe: Patient-Friendly Medical Information Displays

High rates of patient non-compliance with self-care instructions are a serious challenge in care. In this work, we explore how lightweight handheld projector technologies can enhance patient-doctor communication in clinical settings, with the goal of bridging the motivation gap and improving patient compliance with physical therapy exercise homework.

 
ClassSearch

The ClassSearch system provides shared awareness of Web search activity in classroom environments. Through this prototype, we explore the use of social learning — improving knowledge skills by observing peer behavior — for Web search skill acquisition.

 
Muscle-Computer Interfaces

Muscle-computer interfaces directly sense and decode human muscular activity rather than relying on physical device actuation or perceptible user actions. We believe this is a first step toward tapping into the vast amount of information contained in human physiology.

 
Skinput: Appropriating the Body as an Input Surface

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body.
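As an illustrative sketch only (Skinput's real system uses an armband of tuned vibration sensors and a richer classifier), resolving tap location from vibration signals might look like a nearest-centroid model over coarse spectral features:

```python
import numpy as np

def features(vibration, n_bands=8):
    """Coarse spectral-energy profile of a bioacoustic tap signal."""
    spectrum = np.abs(np.fft.rfft(vibration)) ** 2
    bands = np.array_split(spectrum, n_bands)
    energy = np.array([b.sum() for b in bands])
    return energy / energy.sum()  # normalize overall tap strength away

def train(labeled_taps):
    """Average the feature vectors of example taps per body location."""
    return {location: np.mean([features(x) for x in examples], axis=0)
            for location, examples in labeled_taps.items()}

def classify(vibration, centroids):
    """Assign a new tap to the location with the closest centroid."""
    f = features(vibration)
    return min(centroids, key=lambda loc: np.linalg.norm(f - centroids[loc]))
```

The premise, per the blurb, is that taps at different locations excite the body differently, so their vibration spectra are separable.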

 
Multi-touch (Graph) Manipulation on Surface

The design of natural and intuitive gestures is a difficult problem because we do not know how users will approach a new multi-touch interface or which gestures they will attempt to use. We study whether familiarity with other environments influences how users approach interaction with a multi-touch surface computer, particularly for graph manipulation and understanding.

 
Bioinformatics Visualization

We are building visual interfaces for the domain of biology. Examples include PhyloDet, a scalable visualization tool for mapping multiple traits to large evolutionary trees; GeneShelf, a web-based visual interface for a spinal cord injury study; and GOTreePlus, an interactive Gene Ontology visualization for proteomics projects.

 
Brain-Computer Interfaces

We are working on making brain sensing technologies accessible and useful to the mass consumer market. Our work in this area includes modeling cognitive state for interface evaluation and adaptive interfaces, and also using implicit cognitive processing to perform useful tasks such as image classification.

 

Bionic Contact Lenses

Millions of people wear contact lenses daily. We are incorporating technology directly into the structure of these lenses, which provides an unparalleled opportunity to build augmented reality displays, explore new interaction paradigms, and perform continuous healthcare monitoring.

 
EnsembleMatrix

EnsembleMatrix is an interactive visualization system that presents a graphical view of confusion matrices to help users understand the relative merits of various classifiers. It allows users to directly interact with the visualizations in order to explore and build combination models.
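A minimal sketch of the two ingredients. In EnsembleMatrix the combination weights are set interactively by the user; here they are simply function arguments:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Counts of (true class, predicted class) pairs -- the view the
    system renders graphically for each classifier."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m

def combine(prob_outputs, weights):
    """Linear combination of several classifiers' class-probability
    matrices (one row per example), then argmax to pick a label."""
    blended = sum(w * p for w, p in zip(weights, prob_outputs))
    return blended.argmax(axis=1)
```

Adjusting the weights and re-rendering the combined model's confusion matrix is the exploration loop the blurb describes.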

 
Phrase Builder

Phrase Builder is a real-time query expansion (RTQE) interface that reduces keystrokes by facilitating the selection of individual query words and by leveraging back-off query techniques to offer completions for out-of-index queries.
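A toy sketch of the back-off idea (the real system operates over a large query index; the function and data here are illustrative): try to complete the whole query against the index, and if nothing matches, back off to completing just the final word.

```python
def complete(query, index_queries):
    """Suggest completions: full-query prefix matches first; if none,
    back off to completing only the last word from the index vocabulary."""
    full = [q for q in index_queries if q.startswith(query)]
    if full:
        return full
    # Back-off: the full query is out-of-index, so complete the last term.
    *head, last = query.split()
    vocab = {w for q in index_queries for w in q.split()}
    return [" ".join(head + [w]) for w in sorted(vocab) if w.startswith(last)]
```

So a query no one has issued before can still receive word-level completions, which is how keystrokes are saved on out-of-index queries.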

 
Search Vox

Search Vox is a mobile search interface that not only facilitates touch and text refinement whenever speech fails, but also allows users to assist the recognizer via text hints. Search Vox can also take advantage of any partial knowledge users may have about the business listing by letting them express their uncertainty in an intuitive way using verbal wildcards.
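The wildcard idea can be sketched with simple pattern matching. The keyword "something" below is a hypothetical stand-in for however Search Vox lets users voice their uncertainty, and the matching scheme is an illustration, not the system's actual recognizer integration:

```python
import re

def wildcard_match(spoken, listings):
    """Match a spoken business-listing query in which the token
    'something' (hypothetical wildcard keyword) stands for any word
    the user is unsure of."""
    parts = [re.escape(w) if w != "something" else r"\S+"
             for w in spoken.lower().split()]
    pattern = re.compile(r"^" + r"\s+".join(parts) + r"$")
    return [l for l in listings if pattern.match(l.lower())]
```

A user who remembers only part of a listing's name can still narrow the results to the listings consistent with what they do know.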

 
Collabio: Social Computation

Your friends have complex models of your history, opinions, personality, interests and expertise. This is the information that interactive applications need to perform tasks like personalization, expert matching, and friend finding. To draw out this latent information for interactive applications, we propose an approach that fuses human computation with online social networks.

 
iSee: Interactive Scenario Explorer for Entertainment

iSee is a system of interactive visualizations that allows players to project potential outcomes as well as explore future scenarios. This makes online tournament-style fantasy games vastly more compelling (socially and cognitively).

 
Songsmith

Songsmith generates musical accompaniment to match a singer’s voice. Just choose a musical style, sing into your PC’s microphone, and Songsmith will create backing music for you. Then share your songs with your friends and family, post your songs online, or create your own music videos. Songsmith evolved directly from a research collaboration between the CUE and Knowledge Tools groups at MSR.

 
Data-Driven Exploration of Musical Chord Sequences

We present data-driven methods for supporting musical creativity by capturing the statistics of a musical database. Specifically, we introduce a system that supports users in exploring the high-dimensional space of musical chord sequences by parameterizing the variation among chord sequences in popular music. We provide a novel user interface that exposes these learned parameters as control axes, and we propose two automatic approaches for defining these axes. One approach is based on a novel clustering procedure, the other on principal components analysis.
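The PCA variant can be sketched as follows; the chord vocabulary and one-hot encoding here are toy assumptions, and the real system's feature representation may differ:

```python
import numpy as np

CHORDS = ["C", "Dm", "Em", "F", "G", "Am"]  # toy vocabulary

def encode(sequence):
    """One-hot encode a chord sequence into a flat feature vector."""
    v = np.zeros(len(sequence) * len(CHORDS))
    for i, chord in enumerate(sequence):
        v[i * len(CHORDS) + CHORDS.index(chord)] = 1.0
    return v

def control_axes(corpus, n_axes=2):
    """PCA over a corpus of equal-length chord sequences: the top
    principal components become the learned control axes (sliders)."""
    X = np.stack([encode(s) for s in corpus])
    X_centered = X - X.mean(axis=0)
    # Eigen-decomposition of the covariance matrix; eigh returns
    # eigenvalues in ascending order, so take the last columns.
    _, vecs = np.linalg.eigh(np.cov(X_centered.T))
    return vecs[:, -n_axes:][:, ::-1].T  # largest-variance axes first
```

Moving along an axis then corresponds to traversing the dominant directions of variation among chord sequences in the corpus.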

 
Dynamic Mapping of Physical Controls for Tabletop Groupware

Multi-touch interactions are a promising means of control for interactive tabletops. However, a lack of precision and tactile feedback makes multi-touch controls a poor fit for tasks where precision and feedback are crucial. We present an approach that offers precise control and tactile feedback for tabletop systems through the integration of dynamically re-mappable physical controllers with the multi-touch environment, and we demonstrate this approach in our collaborative tabletop audio editing environment.

 
MySong

MySong automatically generates chords to accompany a vocal melody, and lets a user with no knowledge of chords or harmony manipulate those chords with intuitive parameters.
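MySong's published approach trains a hidden Markov model over chord transitions and melody observations; the chord vocabulary, scoring, and bonus value in this toy dynamic-programming analogue are illustrative only:

```python
# Toy chord vocabulary mapped to pitch classes (not MySong's actual model).
CHORD_TONES = {"C": {0, 4, 7}, "F": {5, 9, 0}, "G": {7, 11, 2}, "Am": {9, 0, 4}}

def harmonize(melody_bars, transition_bonus=0.5):
    """Viterbi-style pass: pick one chord per bar, scoring how many
    melody notes (MIDI numbers) fall on chord tones, plus a small bonus
    for keeping the previous chord (favoring smooth progressions)."""
    chords = list(CHORD_TONES)
    best = {c: (0.0, []) for c in chords}  # best (score, sequence) ending in c
    for bar in melody_bars:
        new_best = {}
        for c in chords:
            fit = sum(1 for note in bar if note % 12 in CHORD_TONES[c])
            prev = max(chords, key=lambda p: best[p][0]
                       + (transition_bonus if p == c else 0.0))
            score = best[prev][0] + (transition_bonus if prev == c else 0.0) + fit
            new_best[c] = (score, best[prev][1] + [c])
        best = new_best
    return max(best.values(), key=lambda t: t[0])[1]
```

The intuitive parameters the blurb mentions would correspond to knobs like the transition bonus, which a user can adjust without knowing any harmony.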

 
SuperBreak

SuperBreak adds hands-free interactivity to traditional ergonomic break-reminder software, using vision-based input.

 
SearchBar

SearchBar is a browser history organized around search topics and queries, rather than the less intuitive constructs, such as domain and date, that current browsers use to organize Web history.

 
©2010 Microsoft Corporation. All rights reserved.