Enhancing the Musical Experience – From the Acoustic to the Digital… and Back.

Over the last decade, inspired by the prospect of innovating at the core of the musical experience, I have explored a number of research directions in which digital technology bears the promise of revolutionizing the medium. The three directions I identified (gestural expression, collaborative networks, and constructionist learning) aim to create musical experiences that cannot be facilitated by acoustic means. The first direction builds on the notion that novel sensing and mapping techniques can uncover new expressive musical gestures that are not supported by current acoustic instruments. Such gestures, unconstrained by the physical limitations of acoustic sound production, open a vast space of expressive and creative musical experiences for novice as well as trained musicians. The second direction utilizes the digital network to create new collaborative experiences, allowing players to take an active role in determining and influencing not only their own musical output but also that of their co-performers. By using the network to interdependently share and control musical materials in a group, musicians can combine their musical ideas into a constantly evolving collaborative musical activity that is both novel and inspiring. The third direction draws on constructionist learning, which promises to transform music education by providing hands-on access to programmable music making.

While these research directions facilitate novel musical experiences that cannot be achieved by traditional means, their digital nature often leads to flat, inanimate speaker-generated sound that lacks the physical richness and visual expression of acoustic music. In my current work, therefore, I attempt to combine the benefits of digital computation with those of acoustic richness by exploring the concept of “robotic musicianship.” I define this concept as the combination of musical, perceptual, and social skills with the capacity to produce rich acoustic responses in a physical and visual manner. The robotic musicianship project aims to combine human creativity, emotion, and aesthetic judgment with algorithmic computational capabilities, allowing human and robotic players to cooperate and build on one another’s ideas. A perceptual, improvisatory robot can best facilitate such interactions by bringing the computer into the physical world both acoustically and visually.

The first robot to demonstrate these capabilities is Haile, a perceptual and interactive robotic percussionist designed to “listen like a human and improvise like a machine.” Haile listens to live human players, analyzes perceptual aspects of their playing in real time, and uses the product of this analysis to play along in a collaborative and improvisatory manner. Its perceptual modules detect and analyze low-level musical percepts such as onsets, pitch, and velocity, as well as higher-level percepts such as beat, similarity, stability, and tension. Haile’s interaction and improvisation modules utilize mathematical constructs that humans are unlikely to employ, such as genetic algorithms and fractal functions, embedded in a variety of collaborative interaction schemes. When playing with human musicians, the robot’s improvisational techniques are designed to inspire players to interact with it in novel ways that may transform the musical experience and perhaps, in the future, music itself.
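To make the genetic-algorithm idea concrete, the following minimal Python sketch shows one way a rhythmic improviser in this spirit could work: it evolves a binary onset pattern toward a target degree of similarity with a human phrase, so the evolved response echoes the player without copying outright. All names, parameters, and the similarity-target fitness are illustrative assumptions, not Haile’s actual implementation.

import random

# Hypothetical sketch of a genetic-algorithm rhythmic improviser.
# A pattern is one measure at sixteenth-note resolution: 1 = hit, 0 = rest.
STEPS = 16
POP_SIZE = 40
GENERATIONS = 60
MUTATION_RATE = 0.05
TARGET_SIMILARITY = 0.75   # assumed: echo the human phrase, don't clone it

def random_pattern():
    return [random.randint(0, 1) for _ in range(STEPS)]

def fitness(pattern, human):
    # Fraction of steps that agree with the human phrase.
    matches = sum(p == h for p, h in zip(pattern, human)) / STEPS
    # Fitness peaks at a similarity below 1.0, so the best patterns
    # resemble the input while still diverging from it.
    return 1.0 - abs(matches - TARGET_SIMILARITY)

def crossover(a, b):
    cut = random.randrange(1, STEPS)   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(pattern):
    # Flip each step with a small probability.
    return [1 - s if random.random() < MUTATION_RATE else s for s in pattern]

def evolve_response(human):
    population = [random_pattern() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=lambda p: fitness(p, human), reverse=True)
        parents = population[: POP_SIZE // 2]       # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children
    return max(population, key=lambda p: fitness(p, human))

if __name__ == "__main__":
    human_phrase = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0]
    print("human:", human_phrase)
    print("robot:", evolve_response(human_phrase))

Peaking the fitness below full similarity is one simple way to encode the “respond, don’t imitate” goal; Haile’s actual interaction schemes operate on far richer percepts, such as beat, stability, and tension, rather than on raw onset patterns alone.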

Speaker Details

Gil Weinberg is the Director of Music Technology at the Georgia Institute of Technology, where he founded the Master of Science in Music Technology program and the new cross-campus Music Technology Research Center. He holds professorship positions in both the Music Department and the College of Computing. Dr. Weinberg received his M.S. and Ph.D. degrees in Media Arts and Sciences from the Massachusetts Institute of Technology, after co-founding and holding a number of positions in the music and media software industry in his home country of Israel. In his academic work, Weinberg attempts to expand musical expression, creativity, and learning through meaningful applications of technology. His research interests include new instruments for musical expression, musical networks, machine and robotic musicianship, sonification, and music education. Weinberg’s music has been featured in festivals and conferences such as Ars Electronica, SIGGRAPH, ICMC, and NIME, and performed by orchestras such as the Deutsches Symphonie-Orchester Berlin, the National Irish Symphony Orchestra, and the BBC Scottish Symphony Orchestra. He has published more than 30 peer-reviewed papers in publications such as Computer Music Journal (MIT Press), Leonardo Music Journal (MIT Press), Organised Sound (Cambridge University Press), and Personal Technologies (Springer), among others. His interactive musical installations have been presented in museums such as the Smithsonian, the Cooper-Hewitt Museum, and the Boston Children’s Museum. With his perceptual robotic percussionist, Haile, he has traveled around the world, performing dozens of concerts in Asia, Europe, and North America. As a result of this project, Weinberg was recently awarded a National Science Foundation grant to continue exploring the concepts of machine and robotic musicianship. Based on his most recent project, a set of musical applications for cell phones, he is currently in the process of establishing a new company that will attempt to bring innovative research in music technology to the general public.

Speakers: Gil Weinberg
Affiliation: Georgia Institute of Technology