12/23/2023

Soundsource headphone eq

In quiet, free-field acoustic environments, humans can localize a single broadband sound source accurately and precisely on the basis of independent acoustic spatial cues: interaural timing and level differences (ITDs and ILDs, respectively) for the azimuth angle (see 1, 2; see 3 for the definition of the coordinate system), and pinna-induced spectral-shape cues for elevation (see 1, 4-7). To program a goal-directed response in the presence of acoustic reflections, however, the audio-motor system should suppress the detection of time-delayed sources. We therefore examined the effects of spatial separation and interstimulus delay on the ability of human listeners to localize a pair of broadband sounds in the horizontal plane. We systematically varied the spatial-temporal disparities of the two sounds and instructed listeners to generate goal-directed head movements. Participants indicated how many sounds they heard and where these were perceived by making one or two head-orienting localization responses.

Whenever participants heard one sound, their localization responses for synchronous sounds were oriented to a weighted average of both source locations. For short delays, responses were directed toward the leading stimulus location, and increasing the spatial separation enhanced this effect. For longer delays, responses were again directed toward a weighted average. When participants perceived two sounds, the first and second responses were directed to either of the leading and lagging source locations, but the perceived locations were often interchanged in their temporal order (in ∼40% of trials). Leading and lagging stimuli in close spatial proximity required longer stimulus delays to be perceptually separated than stimuli farther apart. These results suggest that perceptual fusion of the two sounds depends on both delay and spatial separation. Thus, even when the auditory system has accurate representations of both sources, it still has trouble deciding whether the scene contained one or two sounds, and in which order they appeared.

NEW & NOTEWORTHY Sound localization requires spectral and temporal processing of implicit acoustic cues and is seriously challenged when multiple sources coincide closely in space and time. We show that perceiving two sounds requires sufficient spatiotemporal separation, after which localization can be performed with high accuracy. We propose that the percept of the temporal order of two concurrent sounds results from a different process than localization, and we discuss how dynamic lateral excitatory-inhibitory interactions within a spatial sensorimotor map could explain the findings.
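The delay-dependent weighting described above can be sketched as a toy model: for synchronous sounds the response lands at the average of the two azimuths, for short delays the weight shifts toward the leading source, and for long delays it relaxes back toward an average. This is only an illustrative sketch; the function name, the exponential weighting shape, and the time constant `tau_ms` are assumptions, not the paper's fitted model or parameter values.

```python
import math


def predicted_single_response(loc_lead_deg, loc_lag_deg, delay_ms, tau_ms=20.0):
    """Toy weighted-average model of the single-percept localization response.

    loc_lead_deg / loc_lag_deg: azimuths of the leading and lagging sources.
    delay_ms: interstimulus delay; tau_ms: assumed delay of peak dominance.
    """
    # Leading-source dominance peaks near delay == tau_ms and decays for
    # longer delays (gamma-like shape); 0 at delay == 0 (synchronous sounds).
    dominance = (delay_ms / tau_ms) * math.exp(1.0 - delay_ms / tau_ms)
    # Shift the leading source's weight from 0.5 up to at most 0.9.
    w_lead = 0.5 + 0.4 * min(1.0, dominance)
    return w_lead * loc_lead_deg + (1.0 - w_lead) * loc_lag_deg


# Synchronous sounds: response at the midpoint of -30 deg and +30 deg.
predicted_single_response(-30.0, 30.0, 0.0)    # → 0.0
# Short delay: response pulled toward the leading source at -30 deg.
predicted_single_response(-30.0, 30.0, 20.0)   # → -24.0
```

At long delays (e.g. 200 ms) the dominance term decays toward zero, so the prediction returns to near the average, mirroring the "again directed toward a weighted average" result above.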