A comparison of neural circuits underlying auditory and visual object categorization

Reginald B. Adams, Petr Janata

Research output: Contribution to journal › Article › peer-review


Abstract

Knowledge about environmental objects derives from representations of multiple object features both within and across sensory modalities. While our understanding of the neural basis for visual object representation in the human and nonhuman primate brain is well advanced, a similar understanding of auditory objects is in its infancy. We used a name verification task and functional magnetic resonance imaging (fMRI) to characterize the neural circuits that are activated as human subjects match visually presented words with either simultaneously presented pictures or environmental sounds. The difficulty of the matching judgment was manipulated by varying the level of semantic detail at which the words and objects were compared. We found that blood oxygen level dependent (BOLD) signal was modulated in ventral and dorsal regions of the inferior frontal gyrus of both hemispheres during auditory and visual object categorization, potentially implicating these areas as sites for integrating polymodal object representations with concepts in semantic memory. As expected, BOLD signal increases in the fusiform gyrus varied with the semantic level of object categorization, though this effect was weak and restricted to the left hemisphere in the case of auditory objects.

Original language: English (US)
Pages (from-to): 361-377
Number of pages: 17
Journal: NeuroImage
Volume: 16
Issue number: 2
DOIs
State: Published - 2002

All Science Journal Classification (ASJC) codes

  • Neurology
  • Cognitive Neuroscience

