MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop
Hrvoje Benko¹, Ricardo Jota¹﹐², Andrew D. Wilson¹
¹Microsoft Research, ²Technical University of Lisbon
Instrumented with a single depth camera, a stereoscopic projector, and a curved screen, MirageTable is an interactive system designed to merge real and virtual worlds into a single spatially registered experience on top of a table. Our depth camera tracks the user's eyes and performs a real-time capture of both the shape and the appearance of any object placed in front of the camera (including the user's body and hands). This real-time capture enables perspective-correct stereoscopic 3D visualizations for a single user that account for deformations caused by physical objects on the table. In addition, the user can interact with virtual objects through physically realistic freehand actions without any gloves, trackers, or instruments. We illustrate these capabilities through three example applications: virtual 3D model creation, interactive gaming with real and virtual objects, and a 3D teleconferencing experience that not only presents a 3D view of the remote person but also a seamless shared 3D task space. We also evaluated users' perception of projected 3D objects in our system, confirming that users correctly perceive such objects even when they are projected over backgrounds of varying color and geometry (e.g., gaps, drops).
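The abstract does not detail the rendering pipeline, but perspective visualizations coupled to a tracked eye position are conventionally produced with a generalized off-axis projection computed from the eye and the screen plane. The sketch below, with illustrative names and not the authors' implementation, shows the standard computation of the near-plane frustum extents from three screen corners and the tracked eye position; these extents would then feed a glFrustum-style projection matrix.

```python
import math

# Minimal 3-vector helpers (plain lists, no external dependencies).
def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def off_axis_frustum(pa, pb, pc, pe, near):
    """Frustum extents (left, right, bottom, top) at the near plane for an
    eye at pe viewing a planar screen with corners pa (lower-left),
    pb (lower-right), and pc (upper-left) in world coordinates.

    This is the standard generalized perspective projection for
    head-coupled displays; the resulting extents parameterize an
    off-axis projection matrix (e.g., via glFrustum).
    """
    vr = normalize(sub(pb, pa))      # screen-space right axis
    vu = normalize(sub(pc, pa))      # screen-space up axis
    vn = normalize(cross(vr, vu))    # screen normal, pointing toward the eye
    va = sub(pa, pe)                 # eye to lower-left corner
    vb = sub(pb, pe)                 # eye to lower-right corner
    vc = sub(pc, pe)                 # eye to upper-left corner
    d = -dot(va, vn)                 # perpendicular eye-to-screen distance
    s = near / d                     # scale extents onto the near plane
    return (dot(vr, va) * s, dot(vr, vb) * s,
            dot(vu, va) * s, dot(vu, vc) * s)

# An eye centered in front of a 2x2 screen yields a symmetric frustum;
# moving the eye sideways skews the frustum, which is what keeps projected
# geometry perspectively correct as the tracked head moves.
l, r, b, t = off_axis_frustum([-1, -1, 0], [1, -1, 0], [-1, 1, 0],
                              [0, 0, 1], near=0.1)
```

For stereo, the same computation is performed twice per frame, once per eye position offset by half the interocular distance.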
Hrvoje Benko, Ricardo Jota, and Andrew Wilson. 2012. MirageTable: freehand interaction on a projected augmented reality tabletop. In Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems (CHI '12). ACM, New York, NY, USA, 199-208.