Our research
Projects 1–9 of 9
The SlideShow Gestures-WPF sample shows how to use the Kinect for Windows SDK to control Windows applications with gestures. It draws on research from the Microsoft Research Cambridge lab to trigger events when the user performs a gesture.
Labs: Cambridge
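The core pattern here, recognizing a gesture from a stream of tracked hand positions and raising an event, can be shown with a minimal sketch. This is not the sample's actual C#/WPF code; the class name SwipeDetector, the thresholds, and the on_swipe callback are assumptions made for illustration.

```python
# Minimal sketch (not the SlideShow Gestures-WPF source): detect a horizontal
# swipe from successive hand positions and fire a callback, the same pattern
# the sample uses to trigger application events.
from collections import deque
import time

class SwipeDetector:
    def __init__(self, on_swipe, min_distance=0.4, max_duration=0.5):
        self.on_swipe = on_swipe          # callback, receives "left" or "right"
        self.min_distance = min_distance  # how far the hand must travel
        self.max_duration = max_duration  # seconds the motion may take
        self.history = deque()            # (timestamp, hand_x) samples

    def update(self, hand_x):
        """Call once per tracking frame with the hand's x coordinate."""
        now = time.monotonic()
        self.history.append((now, hand_x))
        # Drop samples older than the allowed swipe duration.
        while self.history and now - self.history[0][0] > self.max_duration:
            self.history.popleft()
        start_x = self.history[0][1]
        if hand_x - start_x > self.min_distance:
            self.on_swipe("right")
            self.history.clear()
        elif start_x - hand_x > self.min_distance:
            self.on_swipe("left")
            self.history.clear()

# Usage: feed update() with hand positions from any skeleton-tracking source.
detector = SwipeDetector(on_swipe=lambda d: print(f"swipe {d}: change slide"))
```

Any skeleton-tracking source can drive update(); in the sample the detected swipe advances the slide show.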
We explore grip and motion sensing to afford new techniques that leverage how users naturally manipulate tablet and stylus devices during pen-and-touch interaction. We can detect whether the user holds the pen in a writing grip or tucked between their fingers. We can distinguish bare-handed inputs, such as drag and pinch gestures, from touch gestures produced by the hand holding the pen; we can also sense which hand grips the tablet and determine the screen's orientation relative to the pen.
Labs: Redmond
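A hedged sketch of the kind of input routing this sensing enables: given the sensed pen grip and whether a touch comes from the hand holding the pen, decide how the touch should be interpreted. The grip labels, the Touch fields, and route_touch are illustrative assumptions, not the project's actual API.

```python
# Illustrative routing of touches based on sensed pen grip (assumed labels).
from dataclasses import dataclass

@dataclass
class Touch:
    x: float
    y: float
    from_pen_hand: bool   # assumed output of the grip/motion correlation

def route_touch(touch: Touch, pen_grip: str) -> str:
    """Map a touch to an action; pen_grip is "writing", "tucked", or "not_held"."""
    if pen_grip == "writing" and touch.from_pen_hand:
        # Palm resting while writing: suppress it rather than pan the canvas.
        return "palm_reject"
    if touch.from_pen_hand:
        # Touches from the hand holding a tucked pen get a restricted
        # gesture set in pen-and-touch designs (e.g. tool switching).
        return "pen_hand_gesture"
    # Bare-handed input keeps the full drag / pinch vocabulary.
    return "direct_manipulation"

print(route_touch(Touch(0.3, 0.7, from_pen_hand=False), "tucked"))
```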
SecondLight is a new surface computing technology that can project images and detect gestures "in mid-air" above the display, in addition to supporting multitouch interactions on the surface.
Labs: Cambridge
Motivated by advances in touch-sensing technologies, Surface Computing, and multi-touch input support in Windows 7, we have developed a number of prototype mice that explore different physical form factors and sensing techniques. These devices allow us to do more than point and click: they can detect the position of users' hands and fingers, recognise and react to gestures, and support novel interaction techniques.
SideSight expands the multi-touch capabilities of small mobile devices beyond the screen. Infrared sensors embedded along each side of the device detect the presence and position of fingers in close proximity to the device. When the device rests on a flat surface, such as a table, the user can carry out single- and multi-touch gestures using the space around the device. This gives a larger input space than would otherwise be possible, which may be used in conjunction with …
Labs: Cambridge
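As an illustration of how an edge-mounted sensor strip can yield a finger position, the sketch below takes an intensity-weighted centroid of infrared proximity readings above a noise threshold. The sensor count, threshold, and normalisation are assumptions, not SideSight's implementation.

```python
# Illustrative sketch: estimate a finger's position along one edge from a
# strip of normalised infrared proximity readings.
def finger_position(readings, threshold=0.2):
    """Return the estimated position in sensor-index units, or None if no
    finger is close enough to the edge."""
    active = [(i, r) for i, r in enumerate(readings) if r > threshold]
    if not active:
        return None
    total = sum(r for _, r in active)
    return sum(i * r for i, r in active) / total

# Example: a finger hovering near sensors 4-5 of a ten-sensor strip.
print(finger_position([0.0, 0.0, 0.1, 0.3, 0.8, 0.7, 0.2, 0.0, 0.0, 0.0]))
```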
Stroke Recovery with Kinect is an interactive rehabilitation system that helps stroke patients improve their upper-limb motor functioning in the comfort of their own home. By using the Microsoft Kinect sensor’s gesture recognition technology, the system recognizes and interprets the user’s movements, assesses their rehabilitation progress, and adjusts the level of difficulty for subsequent therapy sessions.
Labs: Asia
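The session-to-session adaptation the description implies can be sketched as a simple rule that maps an assessed completion rate to the next session's difficulty. The 1-10 scale and thresholds below are assumptions for illustration, not the system's actual model.

```python
# Assumed adaptation rule: raise or lower difficulty based on how completely
# the patient performed the prescribed movements in the last session.
def next_difficulty(current: int, completion_rate: float) -> int:
    """completion_rate: fraction of target movements performed correctly."""
    if completion_rate > 0.8:
        return min(current + 1, 10)   # progressing well: make it harder
    if completion_rate < 0.5:
        return max(current - 1, 1)    # struggling: ease off
    return current                    # hold steady

print(next_difficulty(4, 0.85))  # -> 5
```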
Conversational systems interact with people through language to assist, enable, or entertain. Research at Microsoft spans dialogs that use language exclusively or in conjunction with additional modalities such as gesture; language that is spoken or written; and a variety of settings, from conversational systems in apps and devices to situated interactions in the real world.
Labs: Redmond
We contribute a thin, transparent, and low-cost design for electric field sensing, allowing for 3D finger and hand tracking, as well as in-air gestures on mobile devices.
Labs: Cambridge
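Purely as an illustration of recovering a coarse 3D position from an electrode array, this sketch takes a signal-weighted centroid for (x, y) and treats total field disturbance as a proxy for height above the surface. The grid layout and baseline value are assumptions, not the project's method.

```python
# Illustrative sketch: coarse 3D hand position from a grid of normalised
# electric-field disturbance readings, one value per electrode.
def estimate_hand_position(grid, baseline=0.05):
    sx = sy = total = 0.0
    for row, values in enumerate(grid):
        for col, v in enumerate(values):
            if v > baseline:
                sx += col * v
                sy += row * v
                total += v
    if total == 0.0:
        return None                   # no hand in range
    # Stronger overall disturbance suggests the hand is closer to the plane.
    z = 1.0 / total
    return (sx / total, sy / total, z)

print(estimate_hand_position([[0.0, 0.1, 0.0],
                              [0.2, 0.9, 0.3],
                              [0.0, 0.1, 0.0]]))
```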
We present a new type of augmented mechanical keyboard, sensing rich and expressive motion gestures performed both on and directly above the device.
Labs: Cambridge