MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop

CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

Published by ACM

Instrumented with a single depth camera, a stereoscopic projector, and a curved screen, MirageTable is an interactive system designed to merge real and virtual worlds into a single spatially registered experience on top of a table. Our depth camera tracks the user's eyes and performs a real-time capture of both the shape and the appearance of any object placed in front of the camera (including the user's body and hands). This real-time capture enables perspective-correct stereoscopic 3D visualizations for a single user that account for deformations caused by physical objects on the table. In addition, the user can interact with virtual objects through physically realistic freehand actions without any gloves, trackers, or instruments. We illustrate these unique capabilities through three application examples: virtual 3D model creation, interactive gaming with real and virtual objects, and a 3D teleconferencing experience that not only presents a 3D view of a remote person, but also a seamless 3D shared task space. We also evaluated users' perception of projected 3D objects in our system, which confirmed that users can correctly perceive such objects even when they are projected over backgrounds of different colors and geometries (e.g., gaps, drops).
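The view-dependent rendering described above hinges on rebuilding an off-axis (asymmetric) viewing frustum from the tracked eye position every frame. The sketch below shows one standard way to do this for a planar display surface, Kooima's generalized perspective projection; it is illustrative only (the function name, corner parameters, and NumPy implementation are our assumptions, not the paper's code), and MirageTable's curved screen would additionally require correcting against the captured surface geometry.

```python
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near, far):
    """Generalized perspective projection (Kooima, 2008) for a planar
    screen defined by three corners, viewed from a tracked eye point.

    eye        -- tracked 3D eye position (e.g., from the depth camera)
    pa, pb, pc -- screen corners: lower-left, lower-right, upper-left
    near, far  -- clipping plane distances
    Returns a 4x4 projection*view matrix mapping world space to clip space.
    """
    pa, pb, pc, eye = map(np.asarray, (pa, pb, pc, eye))

    # Orthonormal screen basis: right, up, and normal vectors.
    vr = pb - pa; vr = vr / np.linalg.norm(vr)
    vu = pc - pa; vu = vu / np.linalg.norm(vu)
    vn = np.cross(vr, vu); vn = vn / np.linalg.norm(vn)

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye

    # Distance from the eye to the screen plane.
    d = -np.dot(va, vn)

    # Frustum extents on the near plane (asymmetric in general).
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style asymmetric frustum matrix.
    P = np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,            -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,            -1,                     0]])

    # Rotate the world into the screen's basis, then move the eye to the origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M @ T
```

Rendering the scene twice with this matrix, once per eye with positions offset by half the interpupillary distance, yields the stereo pair that the projector displays.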

In MirageTable, a stereoscopic 3D projector projects content directly onto the curved screen. The scene is captured by the Kinect camera, which also tracks the user's eyes. This enables the presentation of correct perspective views to a single user on top of the dynamically changing geometry of the real world.
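Projecting onto dynamically changing geometry is commonly handled as a two-pass warp: render the scene from the tracked eye, then, for each projector pixel, look up the 3D surface point that pixel will illuminate (known from the calibrated depth capture) and sample the color the eye should see there. The sketch below illustrates that idea under our own assumptions; the function name, parameters, and nearest-neighbor sampling are hypothetical and not taken from the paper.

```python
import numpy as np

def warp_for_projection(eye_image, eye_vp, surface_points):
    """Illustrative two-pass projective-texture warp.

    eye_image      -- HxWx3 image rendered from the eye's viewpoint
    eye_vp         -- 4x4 projection*view matrix used for that render
    surface_points -- (Hp, Wp, 3) world-space surface point hit by each
                      projector pixel (from calibrated depth capture)
    Returns an (Hp, Wp, 3) image to send to the projector.
    """
    Hp, Wp = surface_points.shape[:2]
    H, W = eye_image.shape[:2]
    out = np.zeros((Hp, Wp, 3), eye_image.dtype)

    # Homogeneous world points, one per projector pixel.
    pts = np.concatenate([surface_points.reshape(-1, 3),
                          np.ones((Hp * Wp, 1))], axis=1)

    # Project every surface point into the eye's clip space.
    clip = pts @ eye_vp.T
    ndc = clip[:, :2] / clip[:, 3:4]          # perspective divide

    # NDC [-1, 1] -> pixel coordinates in the eye-view image.
    u = ((ndc[:, 0] + 1) * 0.5 * (W - 1)).round().astype(int)
    v = ((1 - ndc[:, 1]) * 0.5 * (H - 1)).round().astype(int)

    # Keep only projector pixels whose surface point is visible in the
    # eye image; leave the rest black.
    ok = (clip[:, 3] > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    out.reshape(-1, 3)[ok] = eye_image[v[ok], u[ok]]
    return out
```

On the GPU this warp amounts to a single projective-texturing pass over the captured depth mesh; the NumPy version above just makes the per-pixel geometry explicit.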