Abstract
Very little is known about how auditory categories are learned incidentally, without instructions to search for category-diagnostic dimensions, overt category decisions, or experimenter-provided feedback. This is an important gap because learning in the natural environment does not arise from explicit feedback and there is evidence that the learning systems engaged by traditional tasks are distinct from those recruited by incidental category learning. We examined incidental auditory category learning with a novel paradigm, the Systematic Multimodal Associations Reaction Time (SMART) task, in which participants rapidly detect and report the appearance of a visual target in 1 of 4 possible screen locations. Although the overt task is rapid visual detection, a brief sequence of sounds precedes each visual target. These sounds are drawn from 1 of 4 distinct sound categories that predict the location of the upcoming visual target. These many-to-one auditory-to-visuomotor correspondences support incidental auditory category learning. Participants incidentally learn categories of complex acoustic exemplars and generalize this learning to novel exemplars and tasks. Further, learning is facilitated when category exemplar variability is more tightly coupled to the visuomotor associations than when the same stimulus variability is experienced across trials. We relate these findings to phonetic category learning.
| Original language | English |
| --- | --- |
| Pages (from-to) | 1124-1138 |
| Number of pages | 15 |
| Journal | Journal of Experimental Psychology: Human Perception and Performance |
| Volume | 41 |
| Issue number | 4 |
| DOIs | |
| State | Published - 1 Aug 2015 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2015 American Psychological Association.
Keywords
- Auditory category learning
- Category training
- Incidental learning
- Speech
- Statistical learning
ASJC Scopus subject areas
- Experimental and Cognitive Psychology
- Arts and Humanities (miscellaneous)
- Behavioral Neuroscience