Illumination Normalization with Time-Dependent Intrinsic Images for Video Surveillance

  • Yasuyuki Matsushita,
  • Ko Nishino,
  • Katsushi Ikeuchi,
  • Masao Sakauchi

Published by the Institute of Electrical and Electronics Engineers, Inc.


Variation in illumination conditions caused by weather, time of day, and similar factors makes it difficult to build video surveillance systems for real-world scenes. Cast shadows are especially troublesome for object tracking from a fixed viewpoint, since they cause the appearance of an object to vary depending on whether it is inside or outside a shadow. In this paper, we handle such appearance variations by removing shadows from the image sequence; this can be considered a preprocessing stage leading to robust video surveillance. To achieve this, we propose a framework based on the idea of intrinsic images. Unlike previous methods for deriving intrinsic images, we derive time-varying reflectance images and corresponding illumination images from a sequence of images, instead of assuming a single reflectance image. Using the obtained illumination images, we normalize the input image sequence with respect to the incident lighting distribution to eliminate shadowing effects. We also propose an illumination normalization scheme that can potentially run in real time, utilizing an illumination eigenspace, which captures the illumination variation due to weather, time of day, and so on, together with a shadow interpolation method based on shadow hulls. This paper describes the theory of the framework with simulation results and demonstrates its effectiveness with object tracking results on real-scene data sets.
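The core idea, decomposing each frame I(x, t) into a reflectance image R(x) and an illumination image L(x, t) with I = R · L, then dividing out L to remove shadows, can be sketched as follows. This is a minimal illustration, not the authors' actual algorithm: it assumes a static scene and uses a temporal median in the log domain as the reflectance estimate; the function names and the `eps` stabilizer are my own choices.

```python
import numpy as np

def intrinsic_decompose(seq, eps=1e-6):
    """Split an image sequence I(x, t) into a reflectance estimate R(x)
    and per-frame illumination images L(x, t), assuming I = R * L.

    seq: array of shape (T, H, W) with positive intensities.
    The temporal median in the log domain serves as the reflectance
    estimate (an assumption for this sketch, valid when the lighting
    at each pixel is "balanced" over time in the log domain)."""
    log_seq = np.log(seq + eps)
    log_R = np.median(log_seq, axis=0)   # static reflectance estimate
    log_L = log_seq - log_R              # residual = per-frame illumination
    return np.exp(log_R), np.exp(log_L)

def normalize_illumination(seq, L, target=1.0, eps=1e-6):
    """Divide each frame by its illumination image so that shadowed and
    sunlit regions receive the same effective lighting."""
    return seq / (L + eps) * target
```

On a synthetic sequence where the same reflectance pattern is seen under uniform lighting factors of 0.5, 1.0, and 2.0, the median-log reflectance is recovered almost exactly, and the normalized frames all match it; real surveillance footage would of course need the paper's time-varying reflectance treatment.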