SnapToReality: Aligning Augmented Reality to the Real World

CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems

Published by ACM


Augmented Reality (AR) applications may require the precise alignment of virtual objects to the real world. We propose automatically aligning virtual objects to physical constraints calculated from the real world in real time ("snapping to reality"). We demonstrate SnapToReality alignment techniques that allow users to position, rotate, and scale virtual content against dynamic, real-world scenes. Our proof-of-concept prototype extracts 3D edge and planar surface constraints. We further discuss the unique design challenges of snapping in AR, including the user's limited field of view, noise in constraint extraction, issues with changing the view in AR, and visualizing constraints. We also report the results of a user study evaluating SnapToReality, confirming that aligning objects to the real world is significantly faster when assisted by snapping to dynamically extracted constraints. Perhaps more importantly, we also found that snapping in AR enables a fresh and expressive form of AR content creation.
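The abstract does not detail the snapping computation itself, but a minimal sketch conveys the idea of snapping to an extracted planar constraint: project the manipulated object's position onto the nearest plane if it falls within a snap threshold. The `Plane` class, `snap_to_planes` helper, and 5 cm threshold below are illustrative assumptions, not the paper's actual implementation, and the constraint-extraction step (from depth data) is omitted.

```python
import numpy as np

class Plane:
    """A planar constraint given by a point on the plane and its normal."""
    def __init__(self, point, normal):
        self.point = np.asarray(point, dtype=float)
        self.normal = np.asarray(normal, dtype=float)
        self.normal /= np.linalg.norm(self.normal)

    def signed_distance(self, p):
        return float(np.dot(p - self.point, self.normal))

def snap_to_planes(position, planes, threshold=0.05):
    """Snap position onto the closest plane within threshold (meters)."""
    position = np.asarray(position, dtype=float)
    best = None
    for plane in planes:
        d = plane.signed_distance(position)
        if abs(d) <= threshold and (best is None or abs(d) < abs(best[0])):
            best = (d, plane)
    if best is None:
        return position  # no constraint close enough; leave unsnapped
    d, plane = best
    return position - d * plane.normal  # project onto the plane

# Example: a cursor hovering 2 cm above a tabletop at y = 0.75 m snaps down.
table = Plane(point=[0.0, 0.75, 0.0], normal=[0.0, 1.0, 0.0])
print(snap_to_planes([0.2, 0.77, -0.4], [table]))  # -> [0.2, 0.75, -0.4]
```

A full system would score edge constraints the same way and resolve rotation and scale snaps jointly, but the nearest-constraint-within-threshold test above captures the core mechanic.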


FoveAR: Combining an Optically See-Through Near-Eye Display with Projector-Based Spatial AR

Optically see-through (OST) augmented reality glasses can overlay spatially registered computer-generated content onto the real world. However, current optical designs and weight considerations limit their diagonal field of view to less than 40 degrees, making it difficult to create a sense of immersion or give the viewer an overview of the augmented reality space. We combine OST glasses with a projection-based spatial augmented reality display to achieve a novel display hybrid, called FoveAR, capable of a field of view greater than 100 degrees, view-dependent graphics, extended brightness and color, as well as interesting combinations of public and personal data display. We contribute details of our prototype implementation and an analysis of the interactive design space that our system enables. We also contribute four prototype experiences showcasing…
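One way to picture the hybrid is as a routing decision: content near the view direction can be rendered on the narrow-FOV near-eye display, while peripheral content falls to the wide-FOV projector. The sketch below is a hypothetical illustration of that partition, not FoveAR's actual pipeline; the 20-degree half-angle and the `assign_display` helper are assumptions for the example.

```python
import math

GLASSES_HALF_FOV_DEG = 20.0  # assumed half-angle of a ~40-degree OST display

def angular_offset_deg(view_dir, obj_dir):
    """Angle in degrees between the view direction and the object direction."""
    dot = sum(a * b for a, b in zip(view_dir, obj_dir))
    norms = (math.sqrt(sum(a * a for a in view_dir))
             * math.sqrt(sum(b * b for b in obj_dir)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))

def assign_display(view_dir, obj_dir):
    """Route an object to the near-eye display or the projector by eccentricity."""
    if angular_offset_deg(view_dir, obj_dir) <= GLASSES_HALF_FOV_DEG:
        return "near-eye"   # view-dependent, private, high-detail content
    return "projector"      # wide-FOV, public, spatial content

print(assign_display((0, 0, -1), (0.1, 0.0, -1.0)))  # -> near-eye (~6 deg off-axis)
print(assign_display((0, 0, -1), (1.0, 0.0, -0.5)))  # -> projector (~63 deg off-axis)
```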