Thursday, 14 March 2013

Leveraging Behavioral Models of Sounding Objects for Gesture-Controlled Sound Design


The authors of the paper present a new, intuitive approach to Foley and sound design that uses "off the shelf" motion-sensing devices for quick, fluid sound triggering and sound modelling, grounded in the everyday experience of handling physical objects. They combine the traditional methods of Foley and sound design with a more natural workflow.


Foley artists and sound designers spend long hours automating effects, synchronising sounds, scrolling through sample banks and recording new sounds. This process can be very slow, normally requires manual setup, and every parameter is entered through the keyboard and mouse. The authors of the paper present a faster, more natural way for Foley artists and sound designers to interact with and create their soundscapes.

With the arrival of commercial motion-detection devices and OSC (Open Sound Control), the possibilities for interacting with digital sound have opened up considerably. The authors of the paper have designed a toolbox of plugins built around basic behavioural models of physical sounding objects, combining the characteristic methods of a Foley workflow with newly available digital sound design techniques.
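To give a feel for what OSC actually carries over the wire, here is a minimal, self-contained sketch of encoding an OSC 1.0 message in pure Python. The address `/foley/impact` and the use of a single float argument are illustrative assumptions, not anything from the paper; in practice a library such as python-osc would handle this.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad with null bytes to a multiple of 4, as OSC 1.0 requires."""
    return data + b"\x00" * ((4 - len(data) % 4) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build a binary OSC message carrying float32 arguments."""
    # Address pattern: null-terminated ASCII string, padded to 4 bytes
    msg = osc_pad(address.encode("ascii") + b"\x00")
    # Type tag string: ',' followed by one 'f' per float argument
    msg += osc_pad(b"," + b"f" * len(floats) + b"\x00")
    # Arguments: big-endian 32-bit floats
    for value in floats:
        msg += struct.pack(">f", value)
    return msg

# Hypothetical example: a controller gesture mapped to an intensity of 0.5
packet = osc_message("/foley/impact", 0.5)
# The packet would typically be sent over UDP to the DAW's OSC port, e.g.:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 9000))
```

The padding rules are what make OSC easy to parse on the receiving end: every field starts on a 4-byte boundary.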

The core principle behind this system is a set of models that mimic the behavioural patterns of physical sounding objects. The user interacts with the controller (e.g. Wii, Kinect, Move) in the same way they would interact with the physical object being modelled. The system is broken down into several pseudo-physical models that aim to give the sound designer instant access to a wide variety of sounds and modulation techniques.
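The paper does not publish its mapping code, but the idea of a pseudo-physical model can be sketched: estimate how vigorously the controller is being moved from its accelerometer stream, then map that energy onto a parameter of the sound model. Everything below is a hypothetical illustration; the function names, the "grain rate" parameter and the numeric ranges are my own assumptions, not the authors' implementation.

```python
import math

def gesture_energy(accel_samples):
    """Root-mean-square magnitude of (x, y, z) accelerometer samples,
    a rough proxy for how vigorously the controller is being shaken.
    Note: a controller at rest still reads ~1 g from gravity."""
    if not accel_samples:
        return 0.0
    total = sum(x * x + y * y + z * z for x, y, z in accel_samples)
    return math.sqrt(total / len(accel_samples))

def map_to_grain_rate(energy, min_rate=2.0, max_rate=60.0, ceiling=3.0):
    """Map gesture energy onto a shaker model's grains-per-second,
    clamped so gentle movement stays sparse and hard shakes saturate."""
    norm = min(energy / ceiling, 1.0)
    return min_rate + norm * (max_rate - min_rate)

still = [(0.0, 0.0, 1.0)] * 8    # controller at rest (gravity only)
shaken = [(2.0, -1.5, 2.5)] * 8  # vigorous shaking
```

Driving a shaker or rattle model this way is what makes the interaction feel physical: the harder the performer shakes, the denser the sound, just as with a real maraca.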

Below is a video of the Wii controlling the same DAW (Ableton Live) that the authors used for testing.

I find this paper very interesting. This is currently a very active area, especially in the field of digital music performance. The authors have broken their system down into an intuitive set of models aimed particularly at sound designers. It is the first purpose-built system using motion-detection technology that I have seen for Foley artists and sound designers. It tested well with professional sound designers, who said "the system was particularly suitable for exploring the possibilities of the sounds and quickly recording ideas". Test users were drawn to the fun factor of the system, but did express the need for high accuracy in the final sound.

For more information on David Black and to review many of the other sound design projects he has worked on visit:


Kristian Gohlke, David Black, and Jörn Loviscach. 2011. Leveraging behavioral models of sounding objects for gesture-controlled sound design. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '11). ACM, New York, NY, USA, 245-248. DOI=10.1145/1935701.1935750
