Frequency-specific directed interactions between whole-brain regions during sentence processing using multimodal stimulus
Document Type
Article
Publication Date
8-24-2023
Abstract
Neural oscillations subserve a broad range of speech processing and language comprehension functions. Using electroencephalography (EEG), we investigated frequency-specific directed interactions between whole-brain regions while participants processed Chinese sentences presented in different stimulus modalities (i.e., auditory, visual, and audio-visual). The results indicate that low-frequency responses correspond to the aggregation of information flow in the primary sensory cortices of the respective modalities. Information flow dominated by high-frequency responses exhibited bottom-up characteristics, flowing from the left posterior temporal region to the left frontal region. The network pattern of top-down information flowing out of the left frontal lobe was reflected in the joint dominance of low- and high-frequency rhythms. Overall, our results suggest that the brain may be modality-independent when processing higher-order language information.
Identifier
85166468219 (Scopus)
Publication Title
Neuroscience Letters
External Full Text Location
https://doi.org/10.1016/j.neulet.2023.137409
e-ISSN
1872-7972
ISSN
0304-3940
PubMed ID
37487970
Volume
812
Grant
23ZDYF0961
Fund Ref
Social Trends Institute
Recommended Citation
Pei, Changfu; Huang, Xunan; Qiu, Yuan; Peng, Yueheng; Gao, Shan; Biswal, Bharat; Yao, Dezhong; Liu, Qiang; Li, Fali; and Xu, Peng, "Frequency-specific directed interactions between whole-brain regions during sentence processing using multimodal stimulus" (2023). Faculty Publications. 1514.
https://digitalcommons.njit.edu/fac_pubs/1514