Playability maps as aid for musicians

Vasileios Chatziioannou –

Department of Music Acoustics, University of Music and Performing Arts Vienna, 1030 Vienna, Austria

Alex Hofmann
Department of Music Acoustics, University of Music and Performing Arts Vienna, 1030 Vienna, Austria

Popular version of 5aMU6 – Two-dimensional playability maps for single-reed woodwind instruments
Presented at the 185th ASA Meeting
Read the abstract at

Please keep in mind that the research described in this Lay Language Paper may not have yet been peer reviewed.

Musicians show incredible flexibility when generating sounds with their instruments. Nevertheless, some control parameters must stay within certain limits for a sound to be produced at all. Take, for example, a clarinet player: using too much or too little blowing pressure results in no sound from the instrument. The required pressure (which depends on the note being played and on other instrument properties) has to stay within certain limits. One way to study these limits is to generate ‘playability diagrams’. Such diagrams have commonly been used to analyze bowed-string instruments, but may also be informative for wind instruments, as suggested by Woodhouse at the 2023 Stockholm Music Acoustics Conference. Following this direction, such diagrams in the form of playability maps can highlight the playable regions of a musical instrument as certain control parameters are varied, and may eventually support performers in choosing their equipment.

One way to fill in these diagrams is via physical modeling simulations, which predict the generated sound while some of the control parameters are varied slowly. Figure 1 shows such an example, where a playability region is obtained by varying the blowing pressure and the stiffness of the clarinet reed. (Strictly, the parameter varied on the y-axis is the effective stiffness per unit area of the reed, i.e., the reed stiffness once it is mounted on the mouthpiece and the musician’s lip is in contact with it.) Black regions indicate ‘playable’ parameter combinations, whereas white regions indicate combinations where no sound is produced.
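As a rough illustration of how such a map can be filled in, the sketch below couples a textbook massless-reed valve (Bernoulli flow through the reed channel) to a lossless cylindrical bore modeled as a delay line with sign inversion, and marks a pressure-stiffness combination as playable when the simulated mouthpiece pressure sustains an oscillation. This is a minimal toy model, not the simulation used in the study, and all numerical values (bore geometry, reed opening, stiffness per area, loss factor, playability threshold) are illustrative assumptions.

```python
import numpy as np

def simulate_note(p_mouth, k_area, dur=0.15, fs=44100):
    """Toy clarinet: lossless cylindrical bore (delay line with sign
    inversion) driven by a massless-spring reed valve with Bernoulli flow.
    Returns the RMS of the mouthpiece pressure over the last 50 ms."""
    rho, c = 1.2, 343.0                  # air density, speed of sound
    radius, L = 7.5e-3, 0.5              # bore radius and length (illustrative)
    Zc = rho * c / (np.pi * radius**2)   # characteristic impedance of the bore
    h0, width = 4e-4, 1.5e-2             # reed rest opening, channel width (m)
    loss = 0.95                          # lumped round-trip loss factor
    delay = int(round(2 * L / c * fs))   # round-trip travel time in samples
    out = np.zeros(delay)                # circular buffer of outgoing waves
    p = np.zeros(int(dur * fs))
    for n in range(len(p)):
        p_in = -loss * out[n % delay]    # wave reflected at the open end
        # Bore pressure pb satisfies pb = 2*p_in + Zc*u(p_mouth - pb); the
        # mismatch below is monotone in pb, so bisection finds the root.
        def residual(pb):
            dp = p_mouth - pb
            h = max(0.0, h0 - dp / k_area)   # reed closes as dp grows
            u = width * h * np.sign(dp) * np.sqrt(2.0 * abs(dp) / rho)
            return pb - (2.0 * p_in + Zc * u)
        lo, hi = -1e5, 1e5
        f_lo = residual(lo)
        for _ in range(40):
            mid = 0.5 * (lo + hi)
            if f_lo * residual(mid) <= 0.0:
                hi = mid
            else:
                lo, f_lo = mid, residual(mid)
        p[n] = 0.5 * (lo + hi)
        out[n % delay] = p[n] - p_in
    tail = p[-int(0.05 * fs):]
    return float(np.sqrt(np.mean((tail - tail.mean()) ** 2)))

def playability_map(pressures, stiffnesses, thresh=10.0):
    """Boolean grid: True where the toy model sustains an oscillation."""
    return np.array([[simulate_note(pm, k) > thresh for pm in pressures]
                     for k in stiffnesses])
```

With these illustrative values the reed closes completely once the mouth pressure approaches k_area × h0 (about 3200 Pa for k_area = 8e6 Pa/m), so for each stiffness the playable band lies between a threshold pressure and a closing pressure, qualitatively mirroring the shape of Figure 1.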

Figure 1: Pressure-stiffness playability map. The black regions correspond to parameter combinations that generate sound.

One possible observation is that players who wish to play with a larger blowing pressure (resulting in louder sounds) should use stiffer reeds. As indicated by the plot, for a reed with a stiffness per area equal to 0.6 Pa/m (a soft reed) it is not possible to generate a note with a blowing pressure above 2750 Pa. With a harder reed (say, a stiffness of 1 Pa/m) one can play at larger blowing pressures, but in that case it is impossible to play with a pressure lower than 3200 Pa. Varying other control parameters could highlight similar effects for various instrument properties. For instance, playability maps for different mouthpiece geometries could be obtained, which would be valuable information for musicians and instrument makers alike.

Virtual Reality Musical Instruments for the 21st Century

Rob Hamilton –
Twitter: @robertkhamilton

Rensselaer Polytechnic Institute, 110 8th St, Troy, New York, 12180, United States

Popular version of 1aCA3 – Real-time musical performance across and within extended reality environments
Presented at the 184th ASA Meeting
Read the abstract at

Have you ever wanted to just wave your hands to make beautiful music? Sad that your epic air-guitar skills don’t translate into pop/rock superstardom? Given the speed and accessibility of modern computers, it may come as little surprise that artists and researchers have been looking to virtual and augmented reality to build the next generation of musical instruments. Borrowing heavily from video game design, a new generation of digital luthiers is already exploring new techniques to bring the joys and wonders of live musical performance into the 21st century.

Image courtesy of Rob Hamilton.

One such instrument is ‘Coretet’: a virtual reality bowed string instrument that can be reshaped by the user into familiar forms such as a violin, viola, cello or double bass. While wearing a virtual reality headset such as Meta’s Oculus Quest 2, performers bow and pluck the instrument in familiar ways, albeit without any physical contact with strings or wood. Sound is generated in Coretet by a computer model of a bowed or plucked string, called a ‘physical model’, driven by the motion of the performer’s hands and their VR game controllers. Borrowing from multiplayer online games, Coretet performers can also join a shared network server and perform music together.
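The paper does not detail Coretet’s synthesis code, but the classic Karplus-Strong algorithm is a minimal example of the kind of plucked-string physical model described here: a delay line filled with noise (the ‘pluck’) whose output is repeatedly fed back through a gentle low-pass filter, so the stored waveform rings at a pitch set by the delay length and loses energy, high frequencies first.

```python
import numpy as np

def karplus_strong(freq, dur=1.0, fs=44100, damping=0.996, seed=0):
    """Karplus-Strong plucked string: noise-filled delay line with a
    two-point averaging filter in the feedback loop."""
    n = max(2, int(fs / freq))            # delay-line length sets the pitch
    rng = np.random.default_rng(seed)
    buf = rng.uniform(-1.0, 1.0, n)       # the 'pluck': random initial shape
    out = np.empty(int(dur * fs))
    for i in range(len(out)):
        out[i] = buf[i % n]
        # average two adjacent samples: energy decays, treble decays fastest
        buf[i % n] = damping * 0.5 * (buf[i % n] + buf[(i + 1) % n])
    return out

tone = karplus_strong(220.0)   # roughly A3; write to a WAV file to listen
```

In a VR instrument, parameters like `freq` and `damping` would be driven in real time by hand and controller motion rather than fixed in advance.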

Our understanding of music, and of live performance on traditional physical instruments, is tightly coupled to time: when a finger plucks a string, or a stick strikes a drum head, we expect a sound immediately, without any delay or latency. And while modern computers are capable of streaming large amounts of data at the speed of light (significantly faster than the speed of sound), bottlenecks in the CPUs or GPUs themselves, in the code designed to mimic our physical interactions with instruments, or in the network connections between users and computers often introduce latency, making virtual performances feel sluggish or awkward.
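To put rough numbers on the buffering part of that latency, the back-of-envelope helpers below compute the delay contributed by audio I/O buffers and a network round trip. The function names and the double-buffering assumption are illustrative, not taken from the paper.

```python
def audio_latency_ms(buffer_size, sample_rate, n_buffers=2):
    """Delay (ms) from audio I/O buffering alone; audio interfaces commonly
    double-buffer, hence n_buffers defaults to 2."""
    return 1000.0 * n_buffers * buffer_size / sample_rate

def total_latency_ms(buffer_size, sample_rate, network_rtt_ms=0.0):
    """A rough end-to-end budget: local buffering plus network round trip."""
    return audio_latency_ms(buffer_size, sample_rate) + network_rtt_ms

# Two 512-sample buffers at 48 kHz already cost about 21 ms on their own.
local = audio_latency_ms(512, 48000)
```

Shrinking the buffer reduces this delay, but raises the risk of audible dropouts when the CPU cannot refill the buffer in time, which is one of the trade-offs instrument designers must balance.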

This research focuses on some common causes for this kind of latency and looks at ways that musicians and instrument designers can work around or mitigate these latencies both technically and artistically.

Coretet overview video: Video courtesy of Rob Hamilton.