Change Detection in Auditory Textures

Yves Boubenec, Jennifer Lawlor, Shihab Shamma, Bernhard Englitz

Many natural sounds, such as wind, fire, or rain, carry spectrotemporal signatures only at the statistical level. While their local structure is highly variable, the spectrotemporal statistics of these auditory textures can be used for recognition, which suggests the existence of a neural representation of these statistics. To explore this encoding, we investigated how the detectability of a change in spectral statistics depends on the properties of the change.
To achieve precise parameter control, we designed a minimal sound texture, a modified cloud of tones, that retains the central property of auditory textures: predictability solely at the statistical level. Listeners had to rapidly detect a change, occurring at a random time, in the frequency marginal probabilities of the tone cloud.
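
For illustration, the following Python sketch generates such a minimal tone cloud whose frequency marginal switches at a random chord; all concrete values (sampling rate, chord duration, frequency set, marginal probabilities, change time) are assumptions for the example, not the parameters used in the experiment.

    import numpy as np

    # Hypothetical stimulus parameters; the abstract does not specify the actual values.
    FS = 44100                    # sampling rate (Hz)
    CHORD_DUR = 0.03              # duration of one tone "chord" (s)
    N_CHORDS = 100                # chords per trial
    FREQS = np.logspace(np.log10(300), np.log10(4800), 8)   # candidate tone frequencies (Hz)

    def tone_cloud(marginal_pre, marginal_post, change_chord, rng=None):
        """Tone cloud whose frequency marginal switches at chord `change_chord`.

        Each chord draws one frequency from the current marginal distribution,
        so only the statistics change, never any deterministic local pattern."""
        rng = rng or np.random.default_rng()
        n = int(FS * CHORD_DUR)
        t = np.arange(n) / FS
        ramp = np.hanning(n)                      # smooth onset/offset to avoid clicks
        chords = []
        for k in range(N_CHORDS):
            p = marginal_pre if k < change_chord else marginal_post
            f = rng.choice(FREQS, p=p)
            chords.append(np.sin(2 * np.pi * f * t) * ramp)
        return np.concatenate(chords)

    # Example trial: the change boosts the probability of the two highest frequencies.
    pre = np.full(8, 1 / 8)                                  # uniform marginal before the change
    post = np.array([.05, .05, .05, .05, .1, .1, .3, .3])
    stim = tone_cloud(pre, post, change_chord=np.random.randint(30, 70))
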
The size of the change, as well as the time available to sample the original statistics, correlated positively with hit rate and negatively with reaction time, consistent with the accumulation of noisy evidence. In summary, we quantified dynamic aspects of change detection in statistically defined contexts and found evidence for the integration of statistical information.
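
One way to make this interpretation concrete, purely as a hedged sketch and not the model used in the study, is a sequential detector that accumulates the log-likelihood of each incoming tone under a recent empirical marginal versus the baseline marginal. The window length, smoothing, and threshold below are illustrative assumptions; the tone indices are draws from the frequency set, e.g. one per chord from the generator above. Larger changes raise the per-tone evidence and longer exposure to the original statistics yields a more reliable baseline, qualitatively reproducing the observed effects on hit rate and reaction time.

    import numpy as np

    def detect_change(tone_indices, p_baseline, window=20, threshold=5.0):
        """Toy noisy-evidence accumulator for detecting a change in the marginal.

        Each new tone contributes the log-likelihood ratio of that tone under the
        empirical marginal of the last `window` tones versus the baseline marginal;
        evidence is clipped at zero (CUSUM-style) and a change is reported once it
        crosses `threshold`."""
        tone_indices = np.asarray(tone_indices, dtype=int)
        n_bins = len(p_baseline)
        evidence = 0.0
        for k, idx in enumerate(tone_indices):
            recent = tone_indices[max(0, k - window):k]            # tones before the current one
            counts = np.bincount(recent, minlength=n_bins) + 1     # add-one smoothing
            p_recent = counts / counts.sum()
            evidence = max(0.0, evidence + np.log(p_recent[idx] / p_baseline[idx]))
            if evidence > threshold:
                return k          # chord index at which the change is reported
        return None               # miss: no report before the end of the trial
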