QBT-Extended: An annotated dataset of melodically contoured tapped queries

Blair Kaneshiro, Hyung-Suk Kim, Jorge Herrera, Jieun Oh, Jonathan Berger, and Malcolm Slaney

Abstract

Query by tapping remains an intuitive yet underdeveloped form of content-based querying. Tapping databases suffer from small size and often lack useful annotations about users and query cues. More broadly, tapped representations of music are inherently lossy, as they lack pitch information. To address these issues, we publish QBT-Extended—an annotated dataset of over 3,300 tapped queries of pop song excerpts, along with a system for collecting them. The queries, collected from 60 users for 51 songs, contain both time stamps and pitch positions of tap events and are annotated with information about the user, such as musical training and familiarity with each excerpt. Queries were performed from both short-term and long-term memory, cued by lyrics alone or lyrics and audio. In the present paper, we characterize and evaluate the dataset and perform initial analyses, providing early insights into the added value of the novel information. While the current data were collected under controlled experimental conditions, the system is designed for large-scale, crowdsourced data collection, presenting an opportunity to expand upon this richer form of tapping data.

Details

Publication type: Inproceedings
Published in: Proceedings of the 14th International Society for Music Information Retrieval Conference (ISMIR 2013)
URL: https://ccrma.stanford.edu/groups/qbtextended/
Publisher: International Society for Music Information Retrieval