
A Distributed Network for Multimodal Experiential Representation of Concepts. J Neurosci 2022 Sep 14;42(37):7121-7130

Date

08/09/2022

Pubmed ID

35940877

Pubmed Central ID

PMC9480893

DOI

10.1523/JNEUROSCI.1243-21.2022

Scopus ID

2-s2.0-85138448742

Abstract

Neuroimaging, neuropsychological, and psychophysical evidence indicate that concept retrieval selectively engages specific sensory and motor brain systems involved in the acquisition of the retrieved concept. However, it remains unclear which supramodal cortical regions contribute to this process and what kind of information they represent. Here, we used representational similarity analysis of two large fMRI datasets with a searchlight approach to generate a detailed map of human brain regions where the semantic similarity structure across individual lexical concepts can be reliably detected. We hypothesized that heteromodal cortical areas typically associated with the default mode network encode multimodal experiential information about concepts, consistent with their proposed role as cortical integration hubs. In two studies involving different sets of concepts and different participants (both sexes), we found a distributed, bihemispheric network engaged in concept representation, composed of high-level association areas in the anterior, lateral, and ventral temporal lobe; inferior parietal lobule; posterior cingulate gyrus and precuneus; and medial, dorsal, ventrolateral, and orbital prefrontal cortex. In both studies, a multimodal model combining sensory, motor, affective, and other types of experiential information explained significant variance in the neural similarity structure observed in these regions that was not explained by unimodal experiential models or by distributional semantics (i.e., word2vec similarity). These results indicate that during concept retrieval, lexical concepts are represented across a vast expanse of high-level cortical regions, especially in the areas that make up the default mode network, and that these regions encode multimodal experiential information.

SIGNIFICANCE STATEMENT Conceptual knowledge includes information acquired through various modalities of experience, such as visual, auditory, tactile, and emotional information. We investigated which brain regions encode mental representations that combine information from multiple modalities when participants think about the meaning of a word. We found that such representations are encoded across a widely distributed network of cortical areas in both hemispheres, including temporal, parietal, limbic, and prefrontal association areas. Several areas not traditionally associated with semantic cognition were also implicated. Our results indicate that the retrieval of conceptual knowledge during word comprehension relies on a much larger portion of the cerebral cortex than previously thought and that multimodal experiential information is represented throughout the entire network.
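The core technique named in the abstract, representational similarity analysis (RSA), compares the similarity structure of neural activation patterns within a searchlight to the similarity structure predicted by a model (e.g., multimodal experiential features or word2vec vectors). The sketch below is a minimal, illustrative Python example of that second-order comparison for a single searchlight; the toy data, variable names, and use of NumPy/SciPy are assumptions made for illustration and are not the authors' analysis code.

# Minimal RSA sketch for one searchlight (illustrative assumptions throughout)
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_concepts = 50    # number of lexical concepts (toy value)
n_voxels = 100     # voxels in one searchlight sphere (toy value)
n_features = 48    # experiential feature dimensions (toy value)

# Hypothetical inputs: per-concept activation patterns and experiential feature ratings
neural_patterns = rng.normal(size=(n_concepts, n_voxels))
experiential_features = rng.normal(size=(n_concepts, n_features))

# Representational dissimilarity matrices (condensed upper triangles)
neural_rdm = pdist(neural_patterns, metric="correlation")
model_rdm = pdist(experiential_features, metric="correlation")

# Second-order similarity: does the model's similarity structure
# predict the neural similarity structure in this searchlight?
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"RSA correlation: rho = {rho:.3f}, p = {p:.3g}")

In a whole-brain searchlight analysis of the kind described, this comparison would be repeated at every cortical location, and competing model RDMs (unimodal, multimodal, distributional) would be evaluated for the unique variance each explains.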

Author List

Tong J, Binder JR, Humphries C, Mazurchuk S, Conant LL, Fernandino L

Authors

Jeffrey R. Binder, MD, Professor in the Neurology department at Medical College of Wisconsin
Leonardo Fernandino, PhD, Assistant Professor in the Neurology department at Medical College of Wisconsin




MeSH terms used to index this publication

Brain Mapping
Comprehension
Female
Humans
Magnetic Resonance Imaging
Male
Parietal Lobe
Semantics