Mobile Video Search

Established: February 17, 2014

Mobile video is quickly becoming a mass consumer phenomenon: more and more people use their smartphones to search and browse video content while on the move. This project develops an innovative instant mobile video search system that lets users discover videos by simply pointing their phones at a screen and capturing a few seconds of what they are watching.

The system indexes large-scale video data in the cloud using a new layered audio-video indexing approach, while extracting lightweight joint audio-video signatures in real time and performing progressive search on mobile devices. Unlike most existing mobile video search applications, which simply send the original video query to the cloud, the proposed system is one of the first attempts at instant and progressive video search that leverages the lightweight computing capacity of mobile devices.
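As a rough illustration of the joint signature idea, the sketch below pairs a generic spectral-peak audio fingerprint with a simple difference-hash video signature. The project's actual descriptors are not specified on this page, so both choices, along with the sampling rate, frame sizes, and hash lengths, are assumptions made only for illustration.

```python
# Illustrative sketch only: generic stand-ins for the lightweight joint
# audio-video signature, not the project's actual descriptors.
import numpy as np

def audio_fingerprint(samples, frame_len=1024, hop=512, peaks_per_frame=3):
    """Hash the strongest spectral peaks of each short audio frame."""
    hashes = []
    for start in range(0, len(samples) - frame_len, hop):
        frame = samples[start:start + frame_len] * np.hanning(frame_len)
        spectrum = np.abs(np.fft.rfft(frame))
        top_bins = np.argsort(spectrum)[-peaks_per_frame:]          # dominant frequency bins
        hashes.append(hash(tuple(sorted(int(b) for b in top_bins))) & 0xFFFFFFFF)
    return hashes

def video_signature(frames, hash_size=8):
    """Compute a difference hash (dHash) per downsampled grayscale frame."""
    sigs = []
    for frame in frames:                                            # frame: 2-D grayscale array
        h, w = frame.shape
        ys = np.linspace(0, h - 1, hash_size).astype(int)
        xs = np.linspace(0, w - 1, hash_size + 1).astype(int)
        small = frame[np.ix_(ys, xs)]
        bits = (small[:, 1:] > small[:, :-1]).flatten()             # sign of horizontal gradient
        sigs.append(int("".join("1" if b else "0" for b in bits), 2))
    return sigs

def joint_signature(samples, frames):
    """Bundle the audio and video parts into one lightweight query signature."""
    return {"audio": audio_fingerprint(samples), "video": video_signature(frames)}

# Example: one second of synthetic audio (8 kHz) and five synthetic 120x160 frames.
sig = joint_signature(np.random.randn(8000),
                      [np.random.rand(120, 160) for _ in range(5)])
print(len(sig["audio"]), len(sig["video"]))
```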

The system is characterized by four unique properties:

  1. a joint audio-video signature to deal with the large aural and visual variances associated with the query video captured by the mobile phone,
  2. layered audio-video indexing to holistically exploit the complementary nature of audio and video signals,
  3. light-weight fingerprinting to comply with mobile processing capacity, and
  4. a progressive query process to significantly reduce computational costs and improve the user experience—the search process can stop anytime once a confident result is achieved.
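The progressive query in property 4 can be pictured as a client-side loop that grows the query signature one second at a time and stops as soon as the cloud-side layered index returns a confident match. The sketch below assumes a hypothetical search_layered_index() call (mocked here) and an arbitrary confidence threshold; the real scoring and client-server protocol are not described on this page.

```python
# Sketch of the progressive query loop on the client, under the assumptions above.
import random

CONFIDENCE_THRESHOLD = 0.8      # assumed cutoff for a "confident" result
MAX_QUERY_SECONDS = 10          # the reported experiments use queries under 10 seconds

def search_layered_index(signature):
    """Mock cloud lookup. In the real system the audio layer would filter
    candidates and the video layer would re-rank them; here the confidence
    simply grows with the amount of query signature received."""
    confidence = min(1.0, len(signature["audio"]) / 80.0)
    return "video_0042", confidence

def progressive_query(capture_one_second):
    """Grow the joint signature one second per round; stop early once confident."""
    accumulated = {"audio": [], "video": []}
    for seconds_used in range(1, MAX_QUERY_SECONDS + 1):
        chunk = capture_one_second()                    # joint signature for the latest second
        accumulated["audio"].extend(chunk["audio"])
        accumulated["video"].extend(chunk["video"])
        match, confidence = search_layered_index(accumulated)
        if confidence >= CONFIDENCE_THRESHOLD:          # confident result: stop capturing
            break
    return match, confidence, seconds_used

# Example with placeholder per-second signature chunks (random hash values).
capture = lambda: {"audio": [random.getrandbits(32) for _ in range(15)],
                   "video": [random.getrandbits(64) for _ in range(5)]}
print(progressive_query(capture))
```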

We collected 1,400 query videos captured by 25 mobile users from a dataset of 600 hours of video. Experiments show that the system outperforms state-of-the-art methods, achieving 90.79% precision when the query video is shorter than 10 seconds and 70.07% even when it is shorter than 5 seconds.

Mobile video query dataset

You can find the Mobile video query dataset here.