

Acoustical Society of America
159th Meeting Lay Language Papers





How Does the Brain Pay Attention to Interesting Sounds? The Role of Top-down Signals in Shaping the Listening Preferences of Chameleon Neurons.

 

Jonathan B. Fritz - ripple@isr.umd.edu

Stephen V. David - svd@umd.edu

Daniel Winkowski - winkows@umd.edu

Pingbo Yin - pyin@umd.edu

Shihab A. Shamma - sas@isr.umd.edu

Institute for Systems Research

Department of Electrical and Computer Engineering

University of Maryland, College Park, MD 20742

 

Mounya Elhilali

Department of Electrical and Computer Engineering

Johns Hopkins University, 3400 N Charles Street

Baltimore, MD 21218

 

Popular version of paper 5pAB2

Presented Friday afternoon, April 23, 2010

159th ASA Meeting, Baltimore, MD

 

 

One of the greatest challenges in understanding how our brains function, now being intensely studied in many laboratories around the world, is discovering the neural mechanisms underlying attention. We can effortlessly zoom in on one conversation at a crowded cocktail party or focus our attention on a violin soloist playing in an orchestra. How do we do it? Over one hundred years ago, the famous American psychologist William James wrote in his Principles of Psychology (1890): "Everyone knows what attention is. It is the taking possession by the mind, in clear and vivid form, of one out of what may seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others and is a condition which has a real opposite in the confused, dazed, scatterbrained state of distraction in the blooming, buzzing confusion of the world."

 

In order to make progress in the study of attention, we have recently begun working on the neural basis of attention in the ferret, a smart and inquisitive carnivore. Our first insight into the role of attention in modulating brain function was the discovery that individual neurons in the auditory cortex of the ferret could swiftly change their listening preferences, or receptive field properties, depending upon what the ferret was most interested in listening to at that moment. By developing techniques to take snapshot pictures of these dynamically changing listening preferences in the awake, behaving ferret, we were able to monitor moment-by-moment changes in neuronal receptive fields as the ferret switched from attending to one sound to attending to another, for example, changing from focusing on a low tone to focusing on a high tone, or discriminating a harmonic chord from a noise. Contrary to the traditional view, which described a fixed and unchanging receptive field for each neuron, we soon realized that many neurons have chameleon-like properties that allow them to rapidly change their listening preferences in order to optimize their ability to perceive an attended, salient sound. Our results add to a growing chorus of scientific evidence for extraordinary, attention-mediated neuronal plasticity, even in the adult animal. Similar findings of rapid, adaptive brain plasticity during task switching and attentional focus have been shown by brain imaging techniques in humans.
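For readers curious how such "snapshot pictures" of a neuron's listening preferences can be taken, a standard approach in auditory neuroscience is to estimate the neuron's spectro-temporal receptive field (STRF) by reverse correlation: averaging the recent stimulus history, weighted by how strongly the neuron fired. The Python sketch below is a minimal textbook illustration of that idea, not our actual analysis pipeline; the array names (spectrogram, spike_rate) and the binning are hypothetical placeholders.

import numpy as np

def estimate_strf(spectrogram, spike_rate, n_lags=20):
    """Estimate a spectro-temporal receptive field (STRF) by
    reverse correlation: average the stimulus energy that preceded
    each time bin, weighted by the neuron's firing rate at that bin.

    spectrogram : (n_freqs, n_times) array of stimulus energy
    spike_rate  : (n_times,) array of the neuron's firing rate
    n_lags      : number of time bins of stimulus history to keep
    """
    n_freqs, n_times = spectrogram.shape
    strf = np.zeros((n_freqs, n_lags))
    for t in range(n_lags, n_times):
        # weight the preceding stimulus window by the current firing rate
        strf += spike_rate[t] * spectrogram[:, t - n_lags:t]
    return strf / spike_rate[n_lags:].sum()

Comparing an STRF estimated while the animal merely hears a sound with one estimated while it actively attends to that same sound is what reveals the rapid "chameleon" changes described above.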

 

So, how do chameleon neurons know how to change their listening preferences? Do they receive an instructive signal from elsewhere in the brain? In order to answer this question, we began to study the interaction between the frontal cortex and the auditory cortex in the ferret. Previous studies in monkeys had suggested the importance of top-down signals from the frontal cortex, so we began recording from individual cells in the ferret frontal cortex while simultaneously recording from cells in the auditory cortex, in order to eavesdrop on the conversation between these two brain areas during attentive behavior.

 

Our second discovery was that some frontal cortex neurons showed rapid "aha!" or recognition responses, which allowed these cells to zero in on the sound of interest and categorically distinguish acoustic foreground from background stimuli, thus acting like pure "attention" cells. The frontal cells did not respond at all to a sound presented in the background, but if the same sound became the focus of attentive interest, they began to respond dramatically. This behavior is quite unlike that of neurons in the ferret auditory cortex, which are tuned to specific acoustic features of a sound. In contrast, the responses of the frontal neurons were much more abstract: they encoded the meaning of the sound, often independent of its acoustic properties, and they responded only when the sound was selectively attended.

 

Our third insight into the attention system came from simultaneous recordings in frontal and auditory cortex. We found a striking change in the coherence of ensemble neuronal activity in the two areas when the ferret engaged in attentive behavior, and this change in coherence was highly specific to the pitch of the sound the ferret was attending to. These results suggest a sharply tuned interaction between frontal cortex and auditory cortex, in which the frontal cortex modulates the specific areas in auditory cortex that respond to a sound of interest, in effect shining an attentional spotlight there.
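To give a concrete sense of what "coherence" means here, the sketch below computes the magnitude-squared coherence between two simultaneously recorded signals; it is near 1 at frequencies where the signals fluctuate together and near 0 where they are unrelated. This is a generic illustration on synthetic data using the standard scipy.signal.coherence routine, not our recording or analysis code; the signal names and sampling rate are assumptions.

import numpy as np
from scipy.signal import coherence

fs = 1000.0                                    # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
shared = rng.standard_normal(10_000)           # stand-in for a common, task-driven drive
lfp_frontal = shared + rng.standard_normal(10_000)    # hypothetical frontal signal
lfp_auditory = shared + rng.standard_normal(10_000)   # hypothetical auditory signal

# Magnitude-squared coherence as a function of frequency: values near 1
# mean the two areas fluctuate together at that frequency.
freqs, coh = coherence(lfp_frontal, lfp_auditory, fs=fs, nperseg=512)

In the experiments, it is a frequency-specific rise in this kind of coherence during attentive behavior, tuned to the attended pitch, that points to a targeted conversation between the two areas.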

 

What would happen to the chameleon cells in auditory cortex if we were to simulate attentional effects by artificially stimulating frontal cortex? In recent experiments, we have stimulated frontal cortex while simultaneously playing a tone. We find that we can induce the chameleon cells to change their receptive field patterns in much the same way that we observe during natural behavior.

 

These results suggest that there is an attentional network for active listening that includes key, interactive components in the frontal and auditory cortex. Our current research focuses on elucidating other components in this brain network, figuring out the mechanisms of their action, and discovering how they work together to allow us to selectively extract and listen to one voice in a complex world of multiple, overlapping sounds.