
The goal of the Spin project is to enable users to capture photorealistic 3D models of objects using just an ordinary camera -- with no special lighting, sensors, or other equipment. Our approach works equally well for a mobile phone, a point-and-shoot camera, or a digital SLR camera. The results can be shared and viewed on a phone, in a web browser, or in a desktop application.

Photosynth Spin technology is available to the public in the latest Photosynth Preview. Give it a try!

Some of our inspiration comes from ideas first presented in Chen's 1995 "QuickTime VR" paper, which included techniques for encoding and viewing both panoramas and "object movies." QuickTime VR's object movies are essentially just flip-book animations of a sequence of images, and therefore they require special capture rigs to ensure smooth rotation. Because QuickTime VR object movies don't use any view interpolation, they are limited to a fixed, discrete set of viewing angles.
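To make the contrast with view interpolation concrete, here is a minimal sketch (with hypothetical names, not code from QuickTime VR) of how such an object movie selects a view: the viewer simply snaps to the nearest of N pre-captured frames for a requested angle, with nothing rendered in between.

```python
def frame_for_angle(angle_deg: float, num_frames: int) -> int:
    """Return the index of the captured frame nearest to angle_deg.

    Assumes the object was photographed at num_frames evenly spaced
    positions around a full 360-degree turntable rotation.
    """
    step = 360.0 / num_frames              # angular spacing of the capture rig
    return round((angle_deg % 360.0) / step) % num_frames

# e.g. 36 frames captured every 10 degrees:
frame_for_angle(97.0, 36)   # snaps to frame 10, the 100-degree view
```

Because the mapping is purely a lookup, any viewpoint between two captured frames is unavailable, which is exactly the limitation that view interpolation (and the geometric proxies discussed below) removes.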

The Photo Tourism project, a collaboration between Microsoft Research and the University of Washington, took these ideas into the realm of hand-held and crowd-sourced photography. The “spin” idea is called an “orbit” in the follow-on “Finding Paths” paper. However, in that work, the geometric proxies are just vertically oriented planes.

Photosynth was created when Microsoft's Live Labs (and then Bing Maps) turned the Photo Tourism idea into a web service and photo sharing community.  (See Photosynth's background page for more historical information.) The orbits are informally called “donuts” by the Photosynth team and also use planar proxies.

Work on view interpolation continued at Microsoft Research: a paper on interactive 3D architectural modeling showed the benefits of more detailed proxies (first demonstrated in the Lumigraph and Façade papers), followed by a fully automated system for computing piecewise planar proxies, which forms the basis of the MSR Spin pipeline.

This software was refined, and a mobile capture demo was produced for TechFest 2011, demonstrated to the press, and presented at MIX11 alongside a related technology, Rich Interactive Narratives, that Eric helped develop.

In parallel, the Photosynth team continued to enhance and expand the Photosynth family of services, including real-time mobile panorama acquisition and stitching software.

At Microsoft Research, we refined the Spin technology, producing an Azure-based service, a series of interactive viewers for various platforms, and better real-time capture apps for Windows Phone. The algorithm for generating smooth arcs around an object is described in our SIGGRAPH 2012 paper, "Image-based rendering for scenes with reflections."

The Photosynth team, meanwhile, was exploring a variety of multi-viewpoint “panorama packet” ideas, and the two efforts merged in the fall of 2012: the Photosynth team adopted our Azure service for creating spins and began re-engineering the Photosynth system around a cloud-based offering.