Sensor Localization and Camera Calibration using Low Resolution Cameras

Dimitrios Lymberopoulos, Andrew Barton-Sweeney, and Andreas Savvides

Abstract

Camera sensors constitute an information-rich sensing modality with many potential applications in sensor networks. Their effectiveness in a sensor network setting, however, relies heavily on their ability to calibrate with respect to each other and to other sensors in the field. This paper examines node localization and camera calibration using the shared field of view of camera pairs. We compare two approaches from computer vision and propose an algorithm that combines a sparse set of distance measurements with image information to accurately localize nodes in 3D. We evaluate our algorithms in a real testbed using a COTS camera interfaced to our sensor nodes. Sensors identify themselves to cameras using modulated LED emissions. Our indoor experiments yielded a 2-7cm error in a 6x6m room. Our outdoor experiments in a 30x30m field resulted in errors of 20-80cm, depending on the method used.
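Localizing a node from the shared field of view of a camera pair typically reduces to triangulating the node's LED from its pixel coordinates in the two calibrated views. The paper's specific algorithm is not reproduced here; the snippet below is only a minimal sketch of standard two-view linear (DLT) triangulation, with made-up intrinsics and camera poses used purely for illustration.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (intrinsics @ [R | t]).
    x1, x2 : (u, v) pixel coordinates of the same LED seen by each camera.
    Returns the estimated 3D point in the common world frame.
    """
    u1, v1 = x1
    u2, v2 = x2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

if __name__ == "__main__":
    # Hypothetical calibrated camera pair observing the same node (illustrative values only).
    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])               # first camera at the origin
    P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])   # second camera, 1 m baseline

    def project(P, X):
        x = P @ np.append(X, 1.0)
        return x[:2] / x[2]

    X_true = np.array([0.4, 0.2, 3.0])                               # assumed node position
    X_est = triangulate_point(P1, P2, project(P1, X_true), project(P2, X_true))
    print(np.round(X_est, 3))                                        # recovers [0.4, 0.2, 3.0]
```

With noisy detections the linear solution is usually refined nonlinearly; the paper additionally fuses a sparse set of distance measurements, which this sketch does not model.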

Details

Publication type: Inproceedings
Published in: International Workshop on Broadband Advanced Sensor Networks (Basenets)