Featured Research Videos
New this week
Krysta Svore on quantum computing and machine learning
00:01:38

Senior Researcher Krysta Svore comments on the potential effect quantum computing can have on machine learning.



Recently featured
Handpose: Fully Articulated Hand Tracking
00:02:29

Introducing a new real-time articulated hand tracker that enables new possibilities for human-computer interaction (HCI). Our system accurately reconstructs complex hand poses across a variety of subjects using only a single depth camera. It is also highly robust, continually recovering from tracking failures. The most distinctive aspect of our tracker, however, is its flexibility in camera placement and operating range.

Please note, we are using a standard Xbox One Kinect without any hardware modifications. The sunglasses are optional; they were originally worn partly for anonymity and partly tongue-in-cheek.

Opening Keynote: Faculty Summit 2014
01:30:32

Harry Shum, executive vice president of Microsoft’s Technology and Research group, opens the Faculty Summit by highlighting major efforts at Microsoft Research. Two of significance are the integration of Microsoft Academic Search into Bing with Cortana (Microsoft’s new personal digital assistant), and major improvements in computer vision via deep learning techniques.

Learning to Be a Depth Camera for Close-Range Human Capture and Interaction
00:03:42

Among Microsoft Research's contributions to SIGGRAPH 2014 is a machine learning technique for estimating absolute, per-pixel depth using any conventional monocular 2D camera with minor hardware modifications. Our approach targets close-range human capture and interaction, where dense 3D estimation of hands and faces is desired. We use hybrid classification-regression forests to learn how to map from near-infrared intensity images to absolute, metric depth in real time. We demonstrate a variety of human-computer interaction and capture scenarios. Experiments show an accuracy that outperforms a conventional light fall-off baseline and is comparable to high-quality consumer depth cameras, but with dramatically reduced cost, power consumption, and form factor.
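
To give a flavor of the learning setup (not the paper's actual hybrid classification-regression forest pipeline), here is a minimal Python sketch that trains an off-the-shelf regression forest to map local near-infrared intensity patches to metric depth. The patch size, forest parameters, and training arrays are hypothetical stand-ins for real paired NIR/depth data.

    # Illustrative sketch only: a plain regression forest mapping local
    # near-infrared intensity patches to per-pixel metric depth, assuming
    # paired NIR/depth training frames are available.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    PATCH = 9  # hypothetical square patch size in pixels

    def extract_patches(nir_image, depth_image):
        # Collect (NIR patch, center-pixel depth) training pairs from one frame.
        h, w = nir_image.shape
        r = PATCH // 2
        X, y = [], []
        for i in range(r, h - r):
            for j in range(r, w - r):
                d = depth_image[i, j]
                if d > 0:  # skip pixels without ground-truth depth
                    X.append(nir_image[i - r:i + r + 1, j - r:j + r + 1].ravel())
                    y.append(d)
        return np.array(X), np.array(y)

    # Hypothetical paired training data: NIR intensity and metric depth (meters).
    nir = np.random.rand(64, 64).astype(np.float32)
    depth = np.random.uniform(0.2, 1.0, size=(64, 64)).astype(np.float32)

    X_train, y_train = extract_patches(nir, depth)
    forest = RandomForestRegressor(n_estimators=10, max_depth=12, n_jobs=-1)
    forest.fit(X_train, y_train)

    # At test time, per-pixel depth is predicted from new NIR patches.
    pred_depth = forest.predict(X_train[:5])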

Haptic Feedback at the Fingertips
00:03:59

Presenting fingertip haptics: touch feedback on flat keyboards and touchscreens. Imagine feeling key clicks while typing on a Touch Cover or a Windows Phone, and locating a tile on a touchscreen through its unique tactile texture. Such effects are realized with piezoelectric actuators and electrostatic haptics technology.


Quantum Computing 101

FiRe2014: Artificial Intelligence Helping Humans: Future Research

An interview with Peter Lee, Corporate Vice President, hosted by BBC presenter Ed Butler.

Skype Translator: Breaking down language barriers
00:01:59

Peter Lee, Microsoft Research VP, shares insights and a sneak peek at Skype Translator, derived from decades of research in speech recognition, automatic translation, and machine learning. Skype Translator is now being developed jointly by the Skype and Microsoft Research teams; it combines voice and IM technologies with Microsoft Translator and neural network-based speech recognition to deliver near real-time cross-lingual communication. With Skype Translator, we're one step closer to universal communication across language barriers, allowing people to connect in ways never before possible. In Lee's words, 'It's truly magical.'
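
To make the combination concrete, here is a minimal sketch of a generic speech-to-speech translation pipeline (speech recognition, then machine translation, then speech synthesis). The functions and data types below are illustrative placeholders, not Skype Translator's actual components or APIs.

    # Illustrative sketch of a generic speech-to-speech translation pipeline.
    # Every stage below is a placeholder standing in for a real system.
    from dataclasses import dataclass

    @dataclass
    class Utterance:
        text: str
        language: str

    def recognize_speech(audio: bytes, language: str) -> Utterance:
        # Placeholder for neural-network-based speech recognition.
        return Utterance(text="hello, how are you?", language=language)

    def translate(utterance: Utterance, target_language: str) -> Utterance:
        # Placeholder for machine translation of the recognized text.
        return Utterance(text="hola, ¿cómo estás?", language=target_language)

    def synthesize_speech(utterance: Utterance) -> bytes:
        # Placeholder for text-to-speech synthesis in the target language.
        return utterance.text.encode("utf-8")

    def translate_segment(audio: bytes, src: str, dst: str) -> bytes:
        # One audio segment flows through the three stages in order.
        recognized = recognize_speech(audio, src)
        translated = translate(recognized, dst)
        return synthesize_speech(translated)

    # Example: translate an English audio segment into Spanish.
    output_audio = translate_segment(b"...", src="en", dst="es")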

Skype Translator demonstration from Code Conference 2014