Animating Pictures with Stochastic Motion Textures

Yung-Yu Chuang, Dan B Goldman, Ke Colin Zheng, Brian Curless, David H. Salesin, and Richard Szeliski

Abstract

In this paper, we explore the problem of enhancing still pictures with subtly animated motions. We limit our domain to scenes containing passive elements that respond to natural forces in some fashion. We use a semi-automatic approach, in which a human user segments the scene into a series of layers to be individually animated. Then, a “stochastic motion texture” is automatically synthesized using a spectral method, i.e., the inverse Fourier transform of a filtered noise spectrum. The motion texture is a time-varying 2D displacement map, which is applied to each layer. The resulting warped layers are then recomposited to form the animated frames. The result is a looping video texture created from a single still image, which has the advantages of being more controllable and of generally higher image quality and resolution than a video texture created from a video source. We demonstrate the technique on a variety of photographs and paintings.
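The spectral synthesis described above can be illustrated with a minimal sketch: shape a white-noise spectrum with a band-pass filter, then take the inverse Fourier transform to get a smooth, looping displacement signal over time. The filter shape, parameter names, and normalization below are illustrative assumptions, not the paper's actual per-layer spectra.

```python
import numpy as np

def stochastic_motion_texture(num_frames, num_points, peak_freq=2.0,
                              bandwidth=0.5, amplitude=3.0, seed=0):
    """Sketch of spectral synthesis: filter a noise spectrum, then inverse FFT.

    Returns a time-varying 2D displacement of shape (num_frames, num_points, 2).
    (Illustrative only; spectrum shape and parameters are assumptions.)
    """
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(num_frames)          # temporal frequencies, cycles/frame
    target = peak_freq / num_frames              # center frequency of the band-pass
    gain = np.exp(-0.5 * ((freqs - target) / (bandwidth / num_frames)) ** 2)
    gain[0] = 0.0                                # zero out DC so the clip loops without drift

    disp = np.empty((num_frames, num_points, 2))
    for p in range(num_points):
        for axis in range(2):                    # independent x and y displacement tracks
            phase = rng.uniform(0.0, 2.0 * np.pi, freqs.shape)
            spectrum = gain * np.exp(1j * phase)  # filtered noise spectrum
            signal = np.fft.irfft(spectrum, n=num_frames)
            signal *= amplitude / (np.abs(signal).max() + 1e-12)
            disp[:, p, axis] = signal
    return disp

# Example: a 120-frame displacement track for 5 control points of one layer
d = stochastic_motion_texture(num_frames=120, num_points=5)
print(d.shape)  # (120, 5, 2)
```

In the paper's pipeline, such per-point displacements would drive the warping of each segmented layer before recompositing; this sketch only shows the filtered-noise / inverse-FFT step.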

Details

Publication type: Article
Published in: ACM Transactions on Graphics
URL: http://grail.cs.washington.edu/projects/StochasticMotionTextures/
Pages: 853-860
Volume: 24
Number: 3
Publisher: Association for Computing Machinery, Inc.