Generating Sharp Panoramas from Motion-blurred Videos

To appear at CVPR 2010

Yunpeng Li (Cornell)
Sing Bing Kang (Microsoft Research)
Neel Joshi (Microsoft Research)
Steve Seitz (U. of Washington)
Dan Huttenlocher (Cornell)

Figure 1. Stitching example. (left) Result of directly stitching the input frames. (right) Result of our technique.
Abstract

In this paper, we show how to generate a sharp panorama from a set of motion-blurred video frames. Our technique is based on joint global motion estimation and multi-frame deblurring. It also automatically computes the duty cycle of the video, namely the fraction of the time between frames that is actually exposure time. Knowing the duty cycle is necessary for accurately extracting, and then removing, the blur kernels. We demonstrate our technique on a number of videos.
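To illustrate the role of the duty cycle, the following sketch (not the authors' code; it assumes purely translational motion and a uniform exposure) shows how the duty cycle scales the estimated inter-frame motion down to the blur extent actually seen in each frame:

```python
# Illustrative sketch only: relating duty cycle and inter-frame motion
# to a 1-D blur kernel, assuming purely translational motion.
import numpy as np

def blur_kernel_1d(motion_px, duty_cycle):
    """Build a 1-D box blur kernel whose support is duty_cycle * motion_px.

    motion_px: estimated displacement between consecutive frames (pixels).
    duty_cycle: fraction of the inter-frame interval that is exposure time.
    """
    extent = max(1, int(round(abs(motion_px) * duty_cycle)))
    kernel = np.ones(extent) / extent  # uniform (box) blur over the exposure
    return kernel

# Example: 20 px of motion between frames with a 50% duty cycle
# gives a 10-tap box kernel; the kernel is normalized to sum to 1.
k = blur_kernel_1d(20, 0.5)
print(len(k), k.sum())
```

The point of the example is that the same motion estimate implies very different blur kernels at different duty cycles, which is why the duty cycle must be recovered before deblurring.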
Paper (0.97 MB)
Supplementary Webpage

