Thursday 14 March 2013

Leveraging Behavioral Models of Sounding Objects for Gesture-Controlled Sound Design

Introduction:

The authors of the paper present a new, intuitive approach to Foley and sound design built around "off the shelf" motion-sensing devices, allowing quick and fluid sound triggering and sound modelling based on the everyday actions we use when handling objects. They take traditional Foley and sound design methods and combine them with an exciting, more natural workflow.

Review

Foley artists and sound designers spend a long time automating effects, synchronising sounds, scrolling through sample banks and recording new sounds. This process can be very slow, normally requires manual setup, and all the parameters are input through the keyboard and mouse. The authors of the paper present a faster, more natural way for Foley artists and sound designers to interact with and create their soundscapes.

With the arrival of commercial motion-detection devices and OSC (Open Sound Control), the possibilities for interacting with digital sound have opened up widely. The authors have designed a toolbox of plugins built on basic behavioural models of physical sounding objects, combining the characteristic methods of a Foley workflow with newly available digital sound design methods.
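To give a rough idea of how motion data reaches a sound engine over OSC, here is a minimal sketch in Python using the python-osc library. The OSC addresses, port and scaling are my own illustrative assumptions, not the authors' actual toolbox.

from pythonosc.udp_client import SimpleUDPClient

# Sound engine (e.g. a DAW with an OSC plugin) assumed to listen on port 9000.
client = SimpleUDPClient("127.0.0.1", 9000)

def send_motion(accel_x, accel_y, accel_z):
    # Forward raw accelerometer values (roughly -1..1 g) as one OSC message.
    client.send_message("/gesture/accel", [accel_x, accel_y, accel_z])
    # A derived control: overall shake intensity drives a filter cutoff.
    intensity = (accel_x ** 2 + accel_y ** 2 + accel_z ** 2) ** 0.5
    client.send_message("/sound/filter_cutoff", min(1.0, intensity))

send_motion(0.1, -0.4, 0.9)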

The core principle behind the system is a set of models that mimic the behavioural patterns of physical sounding objects. The user interacts with the controller (e.g. Wii, Kinect, Move) in the same way they would interact with the physical object being modelled. The system is broken down into several pseudo-physical models that focus on instantly giving the sound designer a wide variety of sounds and modulation techniques.
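To make the idea of a pseudo-physical model more concrete, here is a hedged sketch of one possible "bouncing object" model: a single shake or drop gesture triggers a series of bounce impulses whose timing and loudness decay the way a dropped object's would. The paper does not spell its models out at this level, so the structure and numbers below are purely illustrative.

# Illustrative "bouncing object" model (not the authors' code): one trigger
# produces a series of bounce events with decaying spacing and energy.
def bounce_events(initial_energy, restitution=0.6, min_energy=0.01):
    """Yield (time_offset_seconds, amplitude) pairs for successive bounces."""
    time, energy = 0.0, initial_energy
    while energy > min_energy:
        yield time, energy
        time += 0.25 * energy ** 0.5   # more energy -> longer gap to next bounce
        energy *= restitution          # each bounce loses energy

# A hard shake of the controller could map to a high initial energy:
for t, amp in bounce_events(initial_energy=1.0):
    print(f"bounce at {t:.2f}s, amplitude {amp:.2f}")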

Below is a video of a Wii remote controlling the same DAW (Ableton Live) that the authors used for testing.


I find this paper very interesting. This is currently a very busy area, especially in the field of digital music performance. The authors have broken their system down into an intuitive set of models aimed specifically at sound designers, and it is the first purpose-built system using motion-detection technology for Foley artists and sound designers that I have seen. It tested well with professional sound designers, who said "the system was particularly suitable for exploring the possibilities of the sounds and quickly recording ideas". Test users were drawn to the fun factor of the system but did express the need for high accuracy in the final sound.

For more information on David Black and to review many of the other sound design projects he has worked on visit: http://www.daveblack.org/

Reference:

Kristian Gohlke, David Black, and Jörn Loviscach. 2011. Leveraging behavioral models of sounding objects for gesture-controlled sound design. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '11). ACM, New York, NY, USA, 245-248. DOI=10.1145/1935701.1935750 http://doi.acm.org/10.1145/1935701.1935750

Thursday 7 March 2013

Hypo Chrysos: Mapping in Interactive Action Art Using Bioacoustic Sensing

Introduction:

Marco Donnarumma presents Hypo Chrysos, a work of action art and biophysical media. The audio and visual content is driven by continuous bioacoustic signals: blood flow, muscle sound bursts and bone crackles produced as the performer pulls heavy weights attached to their arms. The bioacoustic signals are amplified, giving a low-frequency sound, and then distorted to add mid-range and high frequencies. All the content is created in real time and depends solely on how the performer interacts with the weights on stage.


Body:

All audio and visual content is created in real time by the performer's body using the Xth Sense. The Xth Sense, a new instrument recently voted "world's most innovative new musical instrument" by the Georgia Tech Center for Music Technology (US), works by attaching wearable biosensors to the body. These biosensors feed a digital framework that processes the acoustic biosignals in real time; the signals are then amplified and played out over several speakers. The performer straps two weights (30 kg combined) to their arms, and the music is created by pulling these weights around the stage. The performer must force themselves through the pain until the piece is completed, and the strain on the performer's body defines the music played back to the audience. At first the bioacoustic sounds from the veins and muscles build slowly, delivering low, punchy sounds. As the sounds build they are distorted and fed back into the piece, creating higher sounds.
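The paper describes amplifying the quiet, low-frequency bioacoustic signal and then distorting it to add mid and high frequencies. Here is a hedged sketch of that kind of chain on a buffer of samples using NumPy; the gain and the tanh clipping curve are my assumptions, not Donnarumma's actual patch.

import numpy as np

def process_biosignal(samples, gain=20.0, drive=4.0):
    # 'samples' is a NumPy array of raw sensor values in the range -1..1.
    # The gain stage brings up the quiet low-frequency rumble; the tanh soft
    # clipper adds harmonics, filling in the mid and high range.
    amplified = samples * gain
    return np.tanh(drive * amplified)

# Example with a synthetic 100 Hz "muscle burst":
sr = 44100
t = np.arange(sr) / sr
burst = 0.02 * np.sin(2 * np.pi * 100 * t) * np.exp(-5 * t)
out = process_biosignal(burst)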

Watch Marco Donnarumma perform live here: Hypo Chrysos | Action art for vexed body and biophysical media (Xth Sense)

Hypo Chrysos is a fascinating and inspiring piece: to achieve a musical performance, the performer must go through a certain level of pain and suffering. I see it almost as a physical metaphor for the hard work musicians go through to bring a new piece of music to the world. The piece also presents the inner musical workings of the body in a way never heard before.


Additional information: Marco Donnarumma's piece is inspired by "the sixth Bolgia of Dante's Inferno, located in one of the lowest of the circles of hell. Here, the poet encounters the hypocrites walking along wearing gilded cloaks filled with lead. It was Dante's punishment for the falsity hidden behind their behaviour; a malicious use of reason which he considered unique to human beings."

Thursday 28 February 2013

MirrorFugue: Communicating Hand Gesture in Remote Piano Collaboration

Introduction:

The authors of the paper present three interfaces: Shadow, Reflection, and Organ. These interfaces are designed for synchronous, remote collaboration, with a focus on remote lessons for beginners. By displaying the hand gestures of a teacher directly on the piano, a student can better grasp the skills and hand movements needed to play.

Review:

Over the years many companies, including Yamaha, Casio and Moog, have added extra functions to their keyboards and pianos to aid the learning process. There have also been many remote learning platforms introduced to guide students in learning the piano. No platform or company, however, has offered a system that allows the player to sit at the piano and watch the hand gestures of their teacher at hand level. The authors of this paper offer exactly that.

MirrorFugue is designed around a mirror system. Using wide-angle cameras and projectors (with Max/MSP and Jitter managing video and sound), the team were able to deliver 640 x 480 video at 30 fps over an ethernet connection between two locations in the same building. This allowed a teacher to sit at one keyboard and play, while the student at the other keyboard could watch and mimic the teacher's hand gestures and the notes played.
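The authors worked in Max/MSP and Jitter; purely to illustrate what capturing and sending 640 x 480 video at 30 fps can look like in code, here is a hedged sketch using OpenCV and a plain TCP socket. The hostname, port and length-prefixed framing are my own simplifications, not the system described in the paper.

import socket
import struct
import cv2

cap = cv2.VideoCapture(0)                       # wide-angle camera over the keys
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
cap.set(cv2.CAP_PROP_FPS, 30)

sock = socket.create_connection(("remote-piano.local", 5000))  # hypothetical host

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # JPEG-compress each frame and prefix it with its length so the receiver
    # can split the stream back into individual frames for projection.
    ok, jpeg = cv2.imencode(".jpg", frame)
    data = jpeg.tobytes()
    sock.sendall(struct.pack(">I", len(data)) + data)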


The team at the Tangible Media Group have broken their development down into three sections.

Shadow:

This system works by projecting the shadows of the teacher's hands directly onto the keyboard. In the actual implementation, a video of the teacher's hands is projected instead of a silhouette.



Reflection

This system takes its inspiration from the reflections on a lacquered piano, where the keys and the player's hands are reflected back at them on the surface in front of the keys. The system displays the reflection of the remote partner's hands directly in front of the keys.


Organ 

  
This system displays an unaltered top-down image of the teacher's keyboard. The keys in the projected image are lined up directly with the physical keys on the student's keyboard.
 

After extensive testing the team concluded that the Organ system offered the best learning platform for students.

The team plan to keep researching this area, with plans to show more of the player's body in the projection.

MirrorFugue is an exciting new development that, in my opinion, has a lot of potential to be brought into the classrooms of many music schools. It offers a clever system that helps improve student learning.


For more information visit their blog:

http://web.media.mit.edu/~x_x/mirrorfugue/

There has also been a second paper, MirrorFugue2, which can be found here:

http://tmg-trackr.media.mit.edu:8020/SuperContainer/RawData/Papers/516-MirrorFugue2%20EmbodiedRepresentation%20of/Published/PDF

Also visit http://tangible.media.mit.edu/ to view many more interesting projects from the Tangible Media Group.



Link to paper:

Xiao Xiao and Hiroshi Ishii. 2011. MirrorFugue: communicating hand gesture in remote piano collaboration. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '11). ACM, New York, NY, USA, 13-20. DOI=10.1145/1935701.1935705 http://doi.acm.org/10.1145/1935701.1935705

Thursday 21 February 2013

Multisensor Broadband High Dynamic Range Sensing: for a Highly Expressive Step-based Musical Instrument

Introduction:

This paper presents two new results. The first is a High Dynamic Range (HDR) sensor for sensing touch, vibration, pressure, force, sound, or seismic waves. The second is the use of the HDR sensor to sense vibrations in solid matter, with a specific application to a newly invented musical instrument called the "Andantephone".


The "Andantephone"


Review:

The Andantephone is a newly invented musical instrument. The player interacts with it by walking along a row of pads. Each pad is programmed with a particular chord or melody fragment of a song, so the player can walk along the sequence to play the song back (in effect, walking through the song's timeline). The player dictates the tempo as they speed up and slow down, and if the player stops walking the music stops. The pads are not programmed to play back keys in order; they are designed to teach people the basic principles of music, rhythm and tempo by transforming walking into a musical act.
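A toy sketch of the playback idea: each step simply advances through a pre-programmed sequence of chords, so the walker's pace becomes the tempo and stopping walking stops the music. The chord data and print-style output are my own illustration, not the instrument's actual software.

# Toy Andantephone-style sequencer: each pad trigger plays the next chord,
# so tempo is set entirely by how fast the player walks.
SONG = [
    ["C4", "E4", "G4"],
    ["A3", "C4", "E4"],
    ["F3", "A3", "C4"],
    ["G3", "B3", "D4"],
]

class StepInstrument:
    def __init__(self, song):
        self.song = song
        self.position = 0

    def on_pad_step(self):
        # Called whenever the walker lands on the next pad.
        chord = self.song[self.position % len(self.song)]
        self.position += 1
        print("play chord:", chord)   # stand-in for real note output

andante = StepInstrument(SONG)
andante.on_pad_step()   # walking faster simply calls this more often
andante.on_pad_step()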



To build an instrument like the "Andantephone", the research team had to build a sensor with a huge dynamic range: it must pick up heavy forces like stomping, kicking or vehicles driving over it while also sensing very subtle, light touches like bare skin, feet or socks moving across it. The HDR sensor they designed caters for these needs. Its extremely high dynamic range allows the player to play the instrument in ways similar to a piano or guitar (quick attack), combined with the ability to sustain notes over a long period of time (like an organ).
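The paper's actual sensor design is not reproduced here, but the general trick behind high dynamic range sensing can be sketched as combining a high-gain channel (which resolves light touches but clips on stomps) with a low-gain channel (which stays linear under heavy force but is noisy for light touches). The clip threshold and gain ratio below are assumptions for illustration only.

def hdr_combine(low_gain_reading, high_gain_reading, gain_ratio=100.0,
                clip_level=0.95):
    # Both readings are assumed normalised to -1..1 for the same vibration.
    if abs(high_gain_reading) < clip_level:
        # The sensitive channel is still linear: trust it.
        return high_gain_reading / gain_ratio
    # Otherwise fall back to the sturdier low-gain channel.
    return low_gain_reading

print(hdr_combine(0.002, 0.2))   # light touch: resolved by the high-gain channel
print(hdr_combine(0.6, 1.0))     # stomp: high-gain channel clipped, use low-gain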



The "Andantephone" offers a great learning platform for children to interact with, learn with and have fun with. The "Andantephone" has been used in many schools and has also being installed as permanent fixtures in many playgrounds. I think it's a great way to introduce children to learning some of the core principals of music through a fun social user friendly interface.  


If you would like to read more about the Andantephone visit:

http://www.wearcam.org/andantephone

and 

http://www.thestar.com/entertainment/music/2010/03/11/andantephone_brings_church_organ_into_21st_century.html

Link to paper:

Steve Mann, Ryan E. Janzen, and Tom Hobson. 2011. Multisensor broadband high dynamic range sensing: for a highly expressive step-based musical instrument. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '11). ACM, New York, NY, USA, 21-24. DOI=10.1145/1935701.1935706 http://doi.acm.org/10.1145/1935701.1935706

Thursday 14 February 2013

Modular Musical Objects Towards Embodied Control Of Digital Music

Introduction:

The authors of the paper present an ensemble of tangible objects linked to software modules designed for musical interaction and performance with digital music. The central concept is to let musicians decide on the musical function of the objects, favouring customisation, assembly and the repurposing of existing everyday objects.






The Review:

As a digital music producer who has never been formally trained to play an instrument but is very interested in performing digital music live, I was very excited to review this paper. There is a big shift currently happening in electronic music: we are seeing a crossover from the traditional "DJ" (who plays digital music from a medium such as vinyl, CDs or MP3s, manipulating and remixing it) to the live digital music performer. Most live performers rely on MIDI controllers, with some artists branching out further, for example Gustavo Bravetti and his shows with alternative controllers, where he uses Ableton Live with specially designed hand controllers, Wii remotes and tubes to manipulate his music.



The authors of this paper present a new way to interact with digital music, encouraging musicians to build new controllers and turn existing objects into controllers using a bottom-up design approach (http://en.wikipedia.org/wiki/Top-down_and_bottom-up_design). The team designed several hardware objects that let the user interact with digital sounds wirelessly, using either new or traditional playing techniques. In this video you can see people interacting with one of the objects; different interactions create different sounds or notes.




By using these hardware objects with Max/MSP, musicians can program multiple ways to interact with and perform live digital music. With the addition of Max for Live (https://www.ableton.com/en/live/max-for-live/) it is even easier to create these digital hardware instruments. In the video below we see an object that can be attached to any surface and played; by using different hand gestures, musicians can create different notes and sounds.
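As a hedged illustration of the kind of gesture-to-note mapping these objects make possible, here is a small sketch that listens for gesture labels over OSC (as they might arrive from a Max/MSP patch) and turns them into MIDI-style notes. The gesture names, note choices and OSC address are my own, not the Interlude project's.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

GESTURE_TO_NOTE = {
    "tap": 60,      # C4
    "shake": 64,    # E4
    "tilt": 67,     # G4
    "swipe": 72,    # C5
}

def on_gesture(address, gesture_name):
    note = GESTURE_TO_NOTE.get(gesture_name)
    if note is not None:
        print(f"gesture '{gesture_name}' -> play MIDI note {note}")

dispatcher = Dispatcher()
dispatcher.map("/object/gesture", on_gesture)   # hypothetical OSC address

server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
server.serve_forever()   # each incoming gesture message triggers a note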




In the following video we see people interacting with everyday kitchen objects: they attach a sensor to the object they want to perform with, which lets their interactions with the object trigger reactions, thus creating music.



This research presents exciting results for musicians who want to perform live. If you are interested and would like to read more about this subject, check out their blog here:

http://interlude.ircam.fr/wordpress/

and Create Digital Music's full-length review:

http://createdigitalmusic.com/2011/03/what-makes-a-truly-new-instrument-human-gestures-power-winners-of-guthman-competition/


Nicolas Rasamimanana, Frederic Bevilacqua, Norbert Schnell, Fabrice Guedy, Emmanuel Flety, Come Maestracci, Bruno Zamborlin, Jean-Louis Frechin, and Uros Petrevski. 2011. Modular musical objects towards embodied control of digital music. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '11). ACM, New York, NY, USA, 9-12. DOI=10.1145/1935701.1935704 http://doi.acm.org/10.1145/1935701.1935704

Tuesday 5 February 2013

Water-Hammer Piano as an Interactive Percussion Surface

Introduction

The paper proposes using the natural phenomenon known as water-hammer as an interactive percussion surface. Water-hammer, generally viewed as an undesirable effect, occurs when water flowing through a pipe is suddenly cut off, producing a clanging sound; plumbers typically go to great lengths to stop it from happening. The paper presents several carefully designed instruments that use the water-hammer effect to create acoustic notes. Traditional musical instruments make their fundamental sound from vibrations in either a solid (strings and percussion) or a gas (woodwinds and brass), whereas the water-hammer instruments' fundamental sound comes, at least in part, from water in its liquid state. The embodiments presented have been heavily tested and shown publicly through live performances and workshops in various schools and organisations, with promising results.

The Review

The strength of this paper is the very well documented process undertaken by the authors. It is well written, and the physics of the instruments is well explained and relatively easy to understand.

The first embodiment we are introduced to, named "Nessie", is made from twelve pipes.



These pipes have extremely thick walls compared to the thickness of a normal pipe. The instrument works by water emerging from each of the twelve pipes, each named a "Nessonator™" (a portmanteau of "Nessie", the instrument's name, and "resonator"), and is played by either abruptly stopping the water flow with a finger or hitting the pipe with a rubber mallet. The sudden stop to the water flow creates a forceful impulse, which resonates as a musical note. The pitch of each note is determined by a modified wine bottle that the water passes through, encased in concrete. It is encased in concrete to handle the pressure from the water-hammer and to ensure that the sound is created purely by the vibrations of the water and not by the solid it is encased in.
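For some context on why the walls need to be so thick, the pressure spike from stopping a flow abruptly is commonly estimated with the Joukowsky relation, ΔP = ρ · c · Δv. The numbers below are generic textbook values, not measurements from the paper.

def water_hammer_pressure(density=1000.0, wave_speed=1400.0, velocity_change=2.0):
    # density: water, ~1000 kg/m^3
    # wave_speed: pressure-wave speed in water in a stiff pipe, ~1400 m/s
    # velocity_change: how fast the flow was moving before being stopped, in m/s
    return density * wave_speed * velocity_change   # surge in pascals

surge = water_hammer_pressure()
print(f"~{surge / 1e5:.0f} bar surge from instantly stopping a 2 m/s flow")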
In the following video we can watch people from around the world perform on "Nessie". 



As you can hear from the performances in the video, "Nessie" creates beautiful harmonic tones and has a full-range scale. While it has a full scale, none of the "hydraulophones" follow the rules of IPN (International Pitch Notation, http://www.flutopedia.com/octave_notation.htm), as the authors ran into five problems with it. To overcome this notation problem, the hydraulophones follow NPN (Natural Pitch Notation, where the notes follow a natural order, i.e. A to G).

The second embodiment of the water-hammer piano is comprised of solid pipes connected to thick, elastic-walled rubber hoses.



This instrument is played by slapping the top of the cap, which sends shockwaves through the column of water, creating a transient disturbance that resonates as a harmonic tone. Oscillations occur within the pipes through the interaction between the inertia of the water in the pipe and the elasticity of the end cap on the bottom of the pipe. In this video we see a more advanced version of this prototype being played live in a public performance: https://vimeo.com/23136730.
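That interplay between the inertia of the water column and the elasticity of the end cap behaves roughly like a mass on a spring, so the pitch can be estimated with f = (1/2π) · √(k/m). The stiffness and dimensions below are made-up illustrative numbers, not values from the paper.

import math

def column_resonance(pipe_length_m, pipe_radius_m, cap_stiffness_n_per_m,
                     water_density=1000.0):
    # Rough mass-on-a-spring estimate of the tone's fundamental frequency.
    area = math.pi * pipe_radius_m ** 2
    water_mass = water_density * area * pipe_length_m   # kg of water in the pipe
    return math.sqrt(cap_stiffness_n_per_m / water_mass) / (2.0 * math.pi)

# Entirely illustrative: a 1 m column in a 2 cm radius pipe with a stiff rubber cap.
print(f"{column_resonance(1.0, 0.02, 5.0e4):.1f} Hz")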

My personal thoughts on both instruments are positive: both create beautiful harmonic notes and offer musicians and non-musicians alike a fully interactive musical experience with water. Steve Mann and I both see this musical interaction with water as a way of reconnecting with nature, given that the human body is mostly water. I also found this research refreshing: we live in a time where everything is going digital, and many modern musical inventions are digital, but this paper presents new musical instruments in a very natural, organic way.

Steve Mann and his research team wanted to create a three-dimensional experience, as the first two embodiments are essentially one-dimensional instruments: the user had to quickly run their hand up or down the instrument to cut off the water and achieve the desired note. A three-dimensional body of water can be touched, hit and swirled from side to side in any way, making it far more interactive.

To create this three-dimensional instrument the researchers took inspiration from an ancient African tradition called "Liquindi" (http://en.wikipedia.org/wiki/Liquindi), a form of water drumming performed by Baka women and children. In groups they hit the water with cupped hands to trap the air and create a percussive sound. Because these percussive sounds need both liquid and air (gas), they are not purely hydraulophonic. The following video shows a "Liquindi" performance.




The "Water-Drum"




The "water-drum" is an interesting instrument that needs to take many factors into account. Firstly, unlike the other instruments the water-drum needs to be filtered and sub and ultra sonic frequencies need to be pitch shifted to be within the human auditory range. Although these changes are made digitally it does not change the fact that this instrument is an acoustic instrument as the sounds are being processed from an acoustic signal, in the same way an acoustic guitar can be plugged into a mixing desk. The water-drum also needs 8 - 12 hydrophones (underwater microphones) to amplify the signal. Again this does not change the instrument from an acoustic to digital instrument, this instrument is a Physiphone (natural interact for musical expression). The water-drum works by the user percussively hitting the water with their hand, but there is also an added option with the water-drum that you don't get with traditional drums, the user can swirl their hand in any direction cause new harmonic tones. A jet spray can also be sprayed onto the water-drum and acts like that of a bow on a violin, the only difference is the bow effect on the water-drum can be infinite. 

Here is a video of the water-drum in action.



The water-drum is an exciting instrument and ultimately looks enjoyable to play. It also seems to have a fun social element. I would love to get my hands on these instruments and try them out. From the videos I have watched they seem to perform well, and, as I said earlier, they tie in nicely with the natural side of things.

I am not sure these instruments will ever surpass the traditional and digital instruments we use today, but I think they are a nice addition to the collection of instruments we have. This approach to creating music and instruments alike is very refreshing and exciting.

User tests also showed that these instruments were very popular with people from all demographics.
Here is one last video of the hydraulophones, in which Steve Mann and Ryan Janzen perform a duet.


These hydraulophonic instruments can now be found in water parks around the world. They have also been donated to many less well-off countries that cannot afford musical instruments. Steve Mann, the lead researcher on the paper, has worked on many interesting papers and ideas; to find out more about him and his work, follow this link (http://www.singularityweblog.com/cyborg-steve-mann/) to watch a full interview.
The authors of this paper are Steve Mann, Ryan Janzen, Jason Huang, Matthew Kelly, Lei Jimmy Ba and Alexander Chen from the University of Toronto.

For more information contact hydraulophone@gmail.com

Reference:

Steve Mann, Ryan Janzen, Jason Huang, Matthew Kelly, Lei Jimmy Ba, and Alexander Chen. 2011. User-interfaces based on the water-hammer effect: water-hammer piano as an interactive percussion surface. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '11). ACM, New York, NY, USA, 1-8. DOI=10.1145/1935701.1935703 http://doi.acm.org/10.1145/1935701.1935703