Our research
Showing 1–25 of 304 projects.
Project details
Labs: Redmond
This research project investigates the design of an open-source peer-economy platform, created with and for service providers. The project is an early prototype of a worker dispatch system.
Project details
Labs: FUSE Labs
Project details
Labs: Redmond
Microsoft Research is conducting a study of a new device called Timecard. Timecard allows you to organise photos and other content around a timeline and display them on a dedicated screen in your home.
Project details
Labs: Cambridge
NUIgraph is a prototype Windows 10 app for visually exploring data in order to discover and share insight.
Project details
Labs: Redmond
To increase user awareness of third-party tracking, we have investigated designs for visualizing cookie traffic as users browse the Web.
Project details
Labs: Cambridge
Platform for Situated Interaction
Project details
Labs: Redmond
Instructions for Mechanical Turk tasks on classifying images
Project details
Labs: Redmond
We present a new interactive approach to 3D scene understanding. Our system, SemanticPaint, allows users to scan their environment while interactively segmenting the scene simply by reaching out and touching any desired object or surface. The system continuously learns from these segmentations and labels new, unseen parts of the environment. Unlike offline systems, where capture, labeling, and batch learning often take hours or even days, our approach is fully online.
Project details
The Ability team is a virtual team consisting of members of MSR's Labs who work on accessible technologies for people with disabilities.
Project details
The RoomAlive Toolkit is an open source SDK that enables developers to calibrate a network of multiple Kinect sensors and video projectors. The toolkit also provides a simple projection mapping sample that can be used as a basis to develop new immersive augmented reality experiences similar to those of the IllumiRoom and RoomAlive research projects.
Project details
Labs: Redmond
Microsoft believes the Surface Hub will be as empowering and as transformative to teams and the shared work environment as the PC was to individuals and the desk. The Surface Hub creates new modalities for creating and brainstorming with its unique large-screen productivity apps and capabilities. We believe it will be a critical component for the modern workplace, home, or other venue where people need to come together to think, ideate, and produce. This RFP is now closed.
Project details
Labs: Redmond
Presenter Camera is a desktop application designed to improve the quality of video seen by remote attendees of a presentation.
Project details
Labs: Redmond
Project details
Labs: Redmond
The Eye Gaze keyboard is a project to enable people who are unable to speak or use a physical keyboard to communicate using only their eyes. Our initial prototypes are based around an on-screen QWERTY keyboard, very similar to the 'taptip' keyboard built into Windows 8, which has been extended to respond to eye-gaze input from a sensor bar such as the Tobii EyeX. Our goal is to improve communication speed by 25% compared with experienced users of off-the-shelf Speech Generating Devices.
This project aims to enable people to converse with their devices. We are trying to teach devices to engage with people using human language in ways that feel seamless and natural. Our research focuses on statistical methods by which devices can learn from human-human conversational interactions and can situate responses in the verbal context and in physical or virtual environments.
Project details
Labs: Redmond
Project Blush explores the materiality of digital ephemera and people's receptiveness to 'digital jewellery', investigating the materials and aesthetics that may allow wearables to become jewellables. Project Blush originates from the Human Experience and Design group (HXD), which specialises in designing and fabricating new human experiences with computing. These play on many different kinds of human values, from amplifying efficiency and effectiveness to creating delight.
Project details
Labs: Cambridge
Team Three Rs is a group of Microsoft researchers working on the Global Learning XPRIZE challenge, which aims to create software to help children in the developing world achieve success in learning the "Three Rs" (reading, writing, and arithmetic).
Project details
We explore grip and motion sensing to afford new techniques that leverage how users naturally manipulate tablet and stylus devices during pen-and-touch interaction. We can detect whether the user holds the pen in a writing grip or tucked between their fingers. We can distinguish bare-handed inputs, such as drag and pinch gestures, from touch gestures produced by the hand holding the pen, and we can sense which hand grips the tablet and determine the screen's relative orientation to the pen.
Project details
Labs: Redmond
Mano-a-Mano is a unique spatial augmented reality system that combines dynamic projection mapping, multiple perspective views, and device-less interaction to support face-to-face, or dyadic, interaction with 3D virtual objects. Its main advantage over more traditional AR approaches is that users are able to interact with 3D virtual objects and with each other without cumbersome devices that obstruct face-to-face interaction.
Project details
Labs: Redmond
We present a new real-time articulated hand tracker that enables new possibilities for human-computer interaction (HCI). Our system accurately reconstructs complex hand poses across a variety of subjects using only a single depth camera. It is also highly robust, continually recovering from tracking failures. The most distinctive aspect of our tracker, however, is its flexibility in terms of camera placement and operating range.
Project details
Labs: Redmond
The physical charts are an attempt to make data and data visualisations legible to ordinary people in their daily lives. In response to the increasing sophistication of data visualisations and the seemingly unquestioning quest for novelty, the charts make playful use of long-established and highly familiar representations such as pie charts and bar graphs. Rather than estranging viewers, the objective is to enable them to engage with and comprehend data at a glance.
Project details
Labs: Cambridge
EmotoCouch is a furniture prototype that uses lights, changing patterns and haptic feedback to change its appearance and thereby convey emotion. EmotoCouch is built using the Lab of Things platform.