Music Intelligence & the “Taste Profile” – What Computers Think of You and Your Music Taste

Over 200 million people now trust an algorithm they've never met to help them listen to and discover music. But music needs more care than collaborative filtering or automated editorial approaches can give it, and before we let Facebook automatically make mixtapes for our crushes, we should step back, look at what music analysis can actually do, and consider how we can give it more respect.

For the past 10 years I've been working on automatic music analysis, first academically and now as the co-founder and CTO of The Echo Nest, a company you've never heard of but one that powers most of the music discovery experiences you have on the internet today, from Spotify to Clear Channel to MTV. I'll show how the interaction between listeners and music is being modeled today, where it is amazing and where it falls flat, and how connections are being drawn between your music taste and your identity.

Speaker Details

Brian is recognized as a leading scientist in music and text retrieval and natural language processing. He received his doctorate from MIT's Media Lab in 2005 and co-founded The Echo Nest to bring music recommendation, search, playlisting, fingerprinting, and personalization technology based on his research to much of the online music industry. As CTO of The Echo Nest, Brian leads new product development and focuses on future taste profile and music analytics products.

Date:
Speakers:
Brian Whitman
Affiliation:
The Echo Nest