ASA PRESSROOM

Acoustical Society of America
ASA/EAA/DAGA '99 Meeting
Lay Language Papers


Course on the Perception of Music by the Human Brain: Cutting-Edge Science for Music Students

Juan G. Roederer - Jgr@giuaf.gi.alaska.edu
Geophysical Institute, University of Alaska-Fairbanks
Fairbanks, Alaska, USA

Popular version of paper 2pMU1
Presented Tuesday Afternoon, March 16, 1999
ASA/EAA/DAGA '99 Meeting, Berlin, Germany

The study of the perception of music is a fascinating interdisciplinary endeavor in which musicians, physicists, physiologists, neuroscientists and psychologists work together. Recent advances in the understanding of how musical sounds are detected by the ear, converted into neural signals, processed and interpreted in the brain, make the subject very attractive as an elective introductory university course. Musical acoustics is a subject taught at a number of American and European universities, mostly from a physics point of view. It is now time to expand the horizon and offer a basic course on how the brain processes music.

Music students often shy away from science courses. They fear that "they will not understand"--in other words, that they lack the required background in physics, mathematics or biology. However, experience shows that in the field of music perception many topics can be presented in a generally understandable way, without scientific knowledge beyond the secondary-school level [see J. G. Roederer, The Physics and Psychophysics of Music, 3rd ed., Springer-Verlag, New York, 1995; German ed.: Physikalische und Psychoakustische Grundlagen der Musik, Springer-Verlag, Heidelberg, 1993 (3rd ed. in preparation)]. In such a course, many questions that have long puzzled musicians, physicists and psychoacousticians can be addressed--and answered!

One example is the fascinating history of the study of pitch perception. Von Helmholtz interpreted his first experiments with cochleas of dead animals by assuming that musical tones elicited standing waves on the basilar membrane (a band of elastic tissue in the snail-like inner ear duct which holds the organ for acoustical detection). These waves exhibited a maximum amplitude at a position along the basilar membrane which depended on the pitch of the tone in question. Many decades later, von Békésy showed that the basilar membrane really behaves like a "waving flag", with traveling waves reaching their maximum amplitude in resonance regions that indeed depend on pitch.

However, many questions remained unanswered. Why do we perceive only one pitch when we listen to a musical tone made up of many harmonics? Why do we perceive this pitch even when the fundamental frequency is absent? Why do we resolve pitch so well, despite the relatively broad mechanical resonance regions on the basilar membrane? Concerning the first two questions, psychoacoustical and neurophysiological experiments show that for pitch identification, in addition to the position of the resonance region, the brain may also use information on the temporal shape of the acoustical vibration pattern. For low frequencies this information is encoded in the time-sequence of electrical nerve pulses--a sort of "neural Morse code". This may explain why, when hearing a musical tone, we perceive a single pitch--that of the first harmonic, or fundamental--even if this frequency is physically absent from the stimulus. This part of the course can be illustrated with several psychoacoustical demonstrations using "laboratory" equipment such as a reasonably sized pipe organ!
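The "missing fundamental" effect can even be sketched numerically. The following toy Python snippet (an illustration assumed here, not part of the original paper or course) builds a tone from harmonics 2-4 of 200 Hz only, then computes the waveform's autocorrelation--a rough stand-in for the temporal regularity that the "neural Morse code" of nerve pulses would convey. The strongest repetition period still corresponds to the absent 200 Hz fundamental:

```python
import numpy as np

# Hypothetical stimulus: a "200 Hz tone" built only from its 2nd-4th
# harmonics (400, 600, 800 Hz) -- the 200 Hz fundamental itself is absent.
fs = 16000                               # sample rate in Hz (assumed)
f0 = 200.0                               # fundamental we will NOT include
t = np.arange(0, 0.1, 1.0 / fs)          # 100 ms of signal
signal = sum(np.sin(2 * np.pi * k * f0 * t) for k in (2, 3, 4))

# Autocorrelation: the vibration pattern still repeats every 1/f0 seconds.
ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
min_lag = int(fs / 1000)                 # ignore lags shorter than 1 ms
peak_lag = min_lag + int(np.argmax(ac[min_lag:]))
estimated_pitch = fs / peak_lag
print(round(estimated_pitch))            # 200 -- the missing fundamental
```

The peak lag falls at 80 samples (5 ms), the common period of all three harmonics, so the "perceived" pitch comes out at 200 Hz even though no energy exists at that frequency.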

Concerning the third question above, it was not until 15 years ago that in vivo measurements revealed the astounding fact that some of the acoustical detector cells on the basilar membrane have motility. They vibrate under their own power at acoustical frequencies and act as feedback-controlled "electromechanical" amplifiers of the external stimulus, thus greatly sharpening the resonance regions! At times these cells even vibrate without any external input--our ears are not only acoustical receptors, but can also be acoustical emitters!

Today, even the basic operation of pitch perception is viewed as a pattern recognition process by the brain, in analogy to the more familiar pattern recognition processes in the visual system. Our brain carries (or builds through experience) "templates" against which the signals of incoming sounds are compared--if there is a match with the template that corresponds to a harmonic tone, a musical tone sensation with definite pitch is evoked. This is not unlike the graphic symbol A evoking in your brain a single cognitive signal "it's an A" (and not "it's two inclined lines forming a vertex with a horizontal bar across"). And just as happens with the optical system, if part of the acoustical stimulus is missing, or if the stimulus is somewhat distorted, our brain is still capable of providing the correct sensation! Today, we even speak of "acoustical illusions" to describe many of these effects! An important spin-off of these studies is the formulation of a neural network-based theory of harmony, as well as the explanation of many musical "universals" such as the role of the almighty octave, consonance and dissonance, etc.
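The template idea can also be made concrete with a toy model (again an assumed sketch for illustration, not the actual neural-network theory mentioned above). Each candidate pitch carries a "template" of expected harmonic frequencies; the stimulus is assigned the pitch whose template best explains its partials, even when the fundamental itself is missing:

```python
import numpy as np

# Toy harmonic-template matcher (illustrative assumption, not the author's
# model). A template for candidate fundamental f0 is the set of its
# harmonics up to the highest stimulus partial; the score rewards templates
# that explain the partials while leaving few expected harmonics unmatched.
def template_score(partials, f0, tol=0.01):
    top = max(partials) * (1 + tol)
    harmonics = [k * f0 for k in range(1, int(top / f0) + 1)]
    hits = sum(1 for h in harmonics
               if any(abs(p - h) / h < tol for p in partials))
    matched = sum(1 for p in partials
                  if any(abs(p - h) / h < tol for h in harmonics))
    return (hits / len(harmonics)) * (matched / len(partials))

# Stimulus: partials at 440, 660 and 880 Hz -- harmonics 2-4 of 220 Hz,
# with the 220 Hz fundamental itself absent.
partials = [440.0, 660.0, 880.0]
candidates = np.arange(100.0, 500.0, 1.0)
scores = [template_score(partials, f0) for f0 in candidates]
best = float(candidates[int(np.argmax(scores))])
print(best)   # the winning template lies near 220 Hz
```

Just as with the letter "A", a partially specified pattern still selects the right template: the matcher settles on a fundamental near 220 Hz rather than on any of the partials actually present.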

Another fascinating question that can be discussed in a modern interdisciplinary course on music perception is: Why is there music? Musicians are often startled when confronted with this question. They exclaim: "How silly! How trivial! The world is full of music--it is a sublime expression of human culture, the manifestation of human appreciation of beauty, a divine expression of human emotion!" But why has there been music since the dawn of humankind? There is no music in the environment, and there seems to be no immediately obvious evolutionary advantage to music (bird song is music to us, but for the birds it's just their way of communicating). What information does music communicate? Why do the simple musical sounds of a mother's lullaby calm a baby? Why can we get goose bumps when we listen to a passage of music which has no relation whatsoever to any sounds in our natural environment?

Many scientists today believe that there is a link between the motivation to listen to and create music, and the evolution of human language. In the perception of human speech the auditory perceptual and cognitive systems are pushed to their limits of information-processing. It is therefore conceivable that with the evolution of human language a drive emerged to train the acoustical sense in sophisticated sound pattern recognition and interpretation as part of a human instinct to acquire language from the moment of birth. Simple musical sounds as vocalized by the mother arouse the attention of an infant to listen to, analyze and store sounds as a prelude to the acquisition of language. This may be the ultimate basis of the motivation to listen to, analyze, store and also vocalize musical sounds, and the emotional reaction or limbic reward when this is done.

Finally, many of the recent advances in understanding brain function have come from the new non-invasive techniques of functional magnetic resonance imaging and positron emission tomography. They provide amazing views of how music is processed, and their discussion could be the real highlight of a course on music perception.

For instance, the question of what happens in your brain when you imagine music (a process of crucial relevance to any composer!) is beginning to be understood. The principal stages of neural processing in the perception of sound have been known for some time. They involve neural signal transmission and processing starting in the inner ear and progressing through the brain stem to the so-called receiving areas of the auditory cortex, then on to the cortical association areas and to the frontal lobes, where the higher cognitive functions are performed. We now know, in analogy with the visual sense (for which detailed experimental information is more easily obtained), that a mental image of sound results from the triggering of an inverse process: a neural command from the frontal and prefrontal cortices elicits specific neural activity that propagates all the way "down" to the primary auditory cortex. The activity triggered is very similar to the activity that would appear had the imagined sound actually been fed into the ears. In other words, if you imagine the "Ta-ta-ta-taah" of the opening bars of Beethoven's Fifth Symphony, not much different happens in your brain than when you actually hear that music played in concert!

In summary, a truly interdisciplinary course on the perception of music is an excellent venue to deliver interesting and useful information to university students of any orientation, offering them a glimpse into cutting edge science. It allows the student to experience not only "Science of Music" but also the beauty and harmony of the "Music of Science"!