Sound Localization

Julianne
February 8, 2023

Editor's Note: Barrett St. George, while at the University of Arizona, did some extensive work on sound localization that led to his Ph.D. dissertation. I have asked him to summarize his research in a five-minute read for our Pathways readers.

 

by Barrett St. George
Hearing & Balance Doctors, St. George, Utah

Sound localization is an auditory skill often overlooked in terms of the functional value it affords the listener. The auditory scene envelops the listener over a full 360 degrees, and environmental sounds are often heard well before their sources are actually seen. For example, recall crossing a busy street. It’s really nice to be able to hear (and localize) oncoming traffic, even when you’re looking the opposite way… Humans would literally need eyes on the back (and top, and bottom) of their heads for the visual scene to rival that of audition. Furthermore, sounds can often be heard through visual barriers, including doors and even walls. For these reasons, spatial listening skills are vital to situational awareness and personal safety. Moreover, the ability to understand speech in the presence of competing noise(s) depends on spatial hearing ability; this phenomenon is often referred to as the “cocktail party effect” or “auditory stream segregation.” Overall, knowing where to listen improves situational awareness, sound source identification, and speech perception, especially in noise.

For humans, sound localization in the horizontal (azimuthal) plane provides the most functional value, compared to localization in the median or vertical plane. To determine the azimuthal location of a sound source, the central auditory nervous system (CANS) compares information between the ears: specifically, interaural level differences (ILDs; the sound is louder in one ear than the other) and interaural time or phase differences (ITDs/IPDs; the sound arrives at one ear before the other).

The CANS has remarkably acute sensitivity to these interaural differences. Classic psychoacoustic research has shown that ILDs as small as 1 dB and ITDs of just 10 microseconds can be exploited in spatial discrimination tasks. Just so we are on the same page… 10 microseconds is 0.00001 seconds!
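To make these cues concrete, here is a minimal Python sketch of how an ILD and an ITD can be imposed on a signal to create a lateralized stereo token. Everything here (the sample rate, function names, and parameter values) is my own illustrative choice for this article, not a clinical tool or the stimulus software from the studies below. Note that a 10 µs ITD is less than one sample at common audio rates, so the delay is applied as a frequency-domain phase shift rather than a whole-sample shift.

```python
import numpy as np

FS = 48_000  # sample rate in Hz (illustrative choice)

def fractional_delay(x: np.ndarray, delay_s: float) -> np.ndarray:
    """Delay x by delay_s seconds using a frequency-domain phase shift,
    which accommodates sub-sample delays such as a 10-microsecond ITD."""
    freqs = np.fft.rfftfreq(len(x), d=1 / FS)
    return np.fft.irfft(np.fft.rfft(x) * np.exp(-2j * np.pi * freqs * delay_s), len(x))

def spatialize(x: np.ndarray, ild_db: float = 0.0, itd_s: float = 0.0) -> np.ndarray:
    """Return an (N, 2) left/right signal carrying the requested interaural
    level difference (dB) and time difference (s); positive values make the
    right ear louder / leading, lateralizing the image to the right."""
    half_gain = 10 ** (ild_db / 40)          # split the ILD symmetrically across ears
    left = fractional_delay(x, itd_s / 2) / half_gain
    right = fractional_delay(x, -itd_s / 2) * half_gain
    return np.stack([left, right], axis=1)

# Example: a 100 ms noise burst lateralized by a 1 dB ILD plus a 10 µs ITD.
burst = np.random.default_rng(0).standard_normal(int(0.1 * FS))
stereo = spatialize(burst, ild_db=1.0, itd_s=10e-6)
```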

Many individuals experience spatial listening difficulties, including people with hearing loss (unilateral, bilateral, conductive, and sensorineural), auditory processing disorder, older adults, and even those who have experienced traumatic brain injury. Accordingly, it is important to be able to evaluate spatial hearing ability so that targeted treatment/rehabilitation can be provided when needed. Currently, however, there is no clinical measure of spatial processing for ILDs and ITDs.

There has been a wealth of research pertaining to auditory spatial processing using behavioral (e.g., psychoacoustic) and neurophysiological methods (e.g., electroencephalography, magnetoencephalography, functional magnetic resonance imaging, positron emission tomography). The majority of that research has focused on auditory stimuli/targets in a fixed or stationary spatial location. Yet many of the sounds we experience in the real world are dynamic with respect to our physical location. Hence, my research evaluated the relationship between azimuthal-plane spatial perception (behavioral data) and cortical electrophysiology (physiological data) for both stationary and moving auditory targets by systematically manipulating ILDs and ITDs. From a clinical perspective, developing perceptual and electrophysiologic tests to assess spatial hearing is a necessary precursor to targeted rehabilitation when deficits in spatial processing are identified.


My research employed two levels of cortical auditory processing to investigate differences between stationary and moving sound source localization. My first study took a “top-down” perspective: the electrophysiology studied was the P300 event-related potential (ERP), which is known to be closely tied to perception and cognition. My second study took a “bottom-up” perspective, utilizing the Acoustic Change Complex (ACC), an obligatory ERP known to be tightly linked to earlier cortical processing of stimulus characteristics.
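As background for both studies: an ERP is far too small to see in the raw EEG of a single trial, so it is recovered by averaging many stimulus-locked epochs, which cancels background EEG while preserving activity time-locked to the stimulus. The sketch below shows that generic recipe on synthetic single-channel data; the numbers and the embedded “P300-like” bump are made up for illustration and are not the analysis pipeline from these studies.

```python
import numpy as np

FS = 1_000  # EEG sample rate in Hz (illustrative choice)

def average_erp(eeg: np.ndarray, onsets: np.ndarray,
                tmin: float = -0.1, tmax: float = 0.6) -> np.ndarray:
    """Cut stimulus-locked epochs from a single-channel EEG trace,
    baseline-correct each one, and average them into an ERP."""
    pre, post = int(-tmin * FS), int(tmax * FS)
    epochs = np.stack([eeg[t - pre: t + post] for t in onsets])
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)  # pre-stimulus baseline
    return epochs.mean(axis=0)

# Synthetic demo: 60 s of noisy "EEG" (in microvolts) with a small positive
# deflection embedded 250-350 ms after each of ~58 stimulus onsets.
rng = np.random.default_rng(3)
eeg = rng.normal(0, 5, 60 * FS)
onsets = np.arange(1 * FS, 59 * FS, FS)        # one stimulus per second
for t in onsets:
    eeg[t + 250: t + 350] += 3.0               # the "P300-like" bump
erp = average_erp(eeg, onsets)                 # the bump emerges from the noise
```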

In the first study, ILDs were manipulated for behavioral and P300 measures of spatial perception. Recall that the P300 evoked response is closely tied to a listener’s perception of an auditory event and the cognitive load of the listening task. In this case, standard (spatially centered) and deviant (spatially lateralized) tokens were presented in an oddball paradigm, and listeners were required to make judgments while listening to the stimuli (a sketch of such a standard/deviant sequence follows the results below). The main results in this study were that:

  • Stationary targets were more strongly lateralized relative to moving targets, given equivalent ILDs. This finding shed light on a perceptual “time weighting” that listeners apply to the perceived spatial location of moving auditory targets.
  • All spatialized targets (i.e., both stationary and moving) with larger ILDs evoked larger P300 responses (amplitude), with shorter latencies. This result was not entirely surprising: for the P300, we can interpret latency as “processing speed” and amplitude as “amount of neural activation.” So, larger ILDs (i.e., targets that deviated further from the standard, spatially centered target) were more perceptually salient, and therefore yielded faster processing speeds and greater neural activation as reflected in the P300.
  • There was a strong negative relationship between the perceived velocity of moving auditory targets and P300 latency. Specifically, as the perceived velocity increased, P300 latency decreased. So, in this case, P300 latency can be interpreted as a biomarker for perceived stimulus velocity.
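For readers unfamiliar with oddball paradigms, the sketch below builds a pseudorandom trial sequence of frequent standards and rare deviants. The 20% deviant probability and the no-two-deviants-in-a-row constraint are common conventions I am assuming for illustration, not parameters taken from the study.

```python
import numpy as np

def oddball_sequence(n_trials: int = 300, p_deviant: float = 0.2,
                     seed: int = 1) -> list[str]:
    """Generate a standard/deviant trial order for an oddball paradigm.
    Deviants are never presented back-to-back, a typical constraint that
    also makes the realized deviant rate slightly below p_deviant."""
    rng = np.random.default_rng(seed)
    seq, prev = [], "standard"
    for _ in range(n_trials):
        prev = "deviant" if (prev != "deviant" and rng.random() < p_deviant) else "standard"
        seq.append(prev)
    return seq

trials = oddball_sequence()
print(trials[:10])
print(f"realized deviant rate: {trials.count('deviant') / len(trials):.2f}")
```

In the study itself, each “standard” would be a spatially centered token and each “deviant” a lateralized one (e.g., carrying a nonzero ILD), with the P300 evoked by the rare deviants.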

In the second study, both ILDs and ITDs were systematically manipulated for behavioral and Acoustic Change Complex (ACC) measures of spatial perception. This study focused more on the relationship between perceived spatial location and electrophysiology (amplitude and latency), as well as on the hemispheric differences involved in spatial processing. Recall that the presence of the ACC indicates that a change in the acoustic environment has been discriminated at the level of the auditory cortex, and, because it is an obligatory response, listeners are not required to attend to the stimuli. The main results in this study were that:

  • Analogous to the first study – stationary targets were more strongly lateralized relative to moving targets, given equivalent ILDs and ITDs. This also supports the notion of a perceptual “time weighting” listeners employ for spatial processing of moving auditory targets.
  • There was a robust positive relationship (R² = .85) between spatial perception and ACC amplitude for stationary targets: the more perceptually lateralized the target was, the larger the ACC amplitude. So, the ACC has the potential to be used as an objective measure of spatial processing, with great clinical utility! (A toy illustration of this kind of R² computation follows this list.)
  • There were significant hemispheric activity differences for slow vs. fast velocity stimuli: the left hemisphere had greater activity for faster velocity stimuli, whereas the right hemisphere had increased activity for slower velocity stimuli. This finding is in line with previous neurophysiological research suggesting that the right and left hemispheres operate on different “timescales.”
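To show what an R² like the one above measures, here is a toy computation on synthetic numbers. The data are invented purely for illustration (they are not the study's measurements); R² is simply the proportion of variance in ACC amplitude explained by a linear fit on perceived lateralization.

```python
import numpy as np

rng = np.random.default_rng(2)
lateralization = np.linspace(0.0, 1.0, 20)   # 0 = centered, 1 = fully lateralized
acc_amp_uv = 2.0 + 4.0 * lateralization + rng.normal(0, 0.5, 20)  # invented µV values

# Ordinary least-squares line, then R^2 = 1 - SS_residual / SS_total.
slope, intercept = np.polyfit(lateralization, acc_amp_uv, 1)
predicted = slope * lateralization + intercept
ss_res = np.sum((acc_amp_uv - predicted) ** 2)
ss_tot = np.sum((acc_amp_uv - acc_amp_uv.mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.2f}")
```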

The next step I would like to focus on in this line of research is age-related effects on perception and electrophysiology for stationary and moving auditory targets. Given what we know from previous literature, my hypothesis is that there will be an age-related decline in ITD processing, evident in both behavioral and electrophysiological measures, that is even more pronounced for moving auditory targets.
