Real-Time Visual Localisation and Mapping with a Single Camera

In my work over the past five years I have generalised the Simultaneous Localisation and Mapping (SLAM) methodology of sequential probabilistic mapping, developed to enable mobile robots to navigate in unknown environments, to demonstrate real-time 3D motion estimation and visual scene mapping with a single agile camera. Via my MonoSLAM algorithm, a webcam attached to a laptop becomes a low-cost but high-performance position and mapping sensor, which can be used by advanced mobile robots or coupled to a wearable computer for personal localisation.

When hard real-time performance (e.g. 30Hz or more) is required, the limited processing resources of practical computers mean that fundamental issues of uncertainty propagation and selective visual attention must be addressed via the rigorous application of methods from probabilistic inference. The result is an approach which harnesses background knowledge of the scenario with the aim of obtaining maximum value from visual processing. I will present recent advances in information-theoretic active search, mosaicing, feature initialisation and surface estimation, and compare my work with other state-of-the-art approaches to real-time tracking and mapping. Practically, this technology has a host of interesting potential applications in many areas of robotics, wearable computing and augmented reality.
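For readers unfamiliar with the probabilistic machinery referred to above, the following is a minimal, illustrative sketch of an EKF-style predict/update cycle together with an uncertainty-driven choice of which feature to measure next, written in Python with NumPy. It is not the MonoSLAM implementation: it assumes a simplified planar camera state, a linear measurement model and hypothetical function names, whereas the real system estimates a full 3D camera pose with a perspective projection model.

import numpy as np

def ekf_predict(x, P, Q, dt):
    """Constant-velocity motion model on the camera part of the state.

    Assumed state layout for this sketch:
      x[0:2] camera position, x[2:4] camera velocity, then 2D feature points.
    """
    F = np.eye(len(x))
    F[0, 2] = dt
    F[1, 3] = dt
    x = F @ x
    P = F @ P @ F.T
    P[:4, :4] += Q                          # process noise on the camera states only
    return x, P

def ekf_update(x, P, z, feat_idx, R):
    """Fuse one measurement: z = feature position relative to the camera."""
    i = 4 + 2 * feat_idx
    H = np.zeros((2, len(x)))
    H[:, i:i+2] = np.eye(2)                 # d(z)/d(feature)
    H[:, 0:2] = -np.eye(2)                  # d(z)/d(camera position)
    y = z - (x[i:i+2] - x[0:2])             # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def most_informative_feature(x, P, R, n_feats):
    """Active-search heuristic: measure the feature whose predicted measurement
    is most uncertain (largest innovation covariance), i.e. where a measurement
    is expected to yield the most information."""
    best, best_score = 0, -np.inf
    for k in range(n_feats):
        i = 4 + 2 * k
        H = np.zeros((2, len(x)))
        H[:, i:i+2] = np.eye(2)
        H[:, 0:2] = -np.eye(2)
        S = H @ P @ H.T + R
        score = np.linalg.det(S)
        if score > best_score:
            best, best_score = k, score
    return best

# Example usage (hypothetical numbers; camera state plus three mapped features):
#   x0 = np.zeros(4 + 2 * 3); P0 = np.eye(len(x0)) * 0.1
#   x1, P1 = ekf_predict(x0, P0, Q=np.eye(4) * 1e-3, dt=1.0 / 30.0)
#   k = most_informative_feature(x1, P1, R=np.eye(2) * 1e-2, n_feats=3)
#   x2, P2 = ekf_update(x1, P1, z=np.array([0.5, -0.2]), feat_idx=k, R=np.eye(2) * 1e-2)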

The presentation will include a live demonstration.

Speaker Details

Andrew Davison read physics at the University of Oxford, gaining his BA (first class honours) in 1994. Transferring to Oxford’s Robotics Research Group, in his doctoral research under the supervision of Prof. David Murray he developed one of the very first real-time Simultaneous Localisation and Mapping (SLAM) systems, particularly novel in using computer vision as the primary sensor. On receiving his D.Phil. in 1998 he took up a European Union Science and Technology Fellowship and spent two years at AIST, Tsukuba, Japan, expanding his work on visual robot navigation. He returned to further postdoctoral work in Oxford in 2000 and was awarded a five-year EPSRC Advanced Research Fellowship in 2002. He transferred to Imperial College London in 2005 to take up a lectureship and was promoted to Reader in Robot Vision in 2008. In his recent research he continues to work on advancing the basic technology of real-time localisation and mapping using vision while collaborating to apply these techniques in robotics, wearable computing and augmented reality. He has recently been awarded a five-year ERC Starting Grant in the inaugural funding round of the new European Research Council.

Date:
Speaker:
Andrew Davison
Affiliation:
Imperial College London