Every day we integrate meaningful information from different sensory modalities, and previous work has debated whether conceptual knowledge is represented in modality-specific neural stores specialized for particular types of information, in an amodal shared system, or both. In the current study, we investigated semantic processing with a cross-modal paradigm that asked whether auditory semantic processing could be modulated by the constraints of context built up across a meaningful visual narrative sequence. We recorded event-related brain potentials (ERPs) to auditory words and sounds associated with events in visual narratives, i.e., seeing images of someone spitting while hearing either a word (Spitting!) or a sound (the sound of s...
Understanding language always occurs within a situational context and, therefore, often implies comb...
Phonological and semantic processing was studied using high-resolution event-related brain potential...
Researchers have long questioned whether information presented through different sensory modalities ...
Events in the world are inherently multimodal. A ball bouncing provides correlated auditory and visu...
How do comprehenders build up overall meaning representations of visual real-world events? Th...
The efficacy of audiovisual (AV) integration is reflected in the degree of cross-modal suppression o...
The time-course of cross-modal semantic interactions between pictures and either naturalistic sounds...
Does lexical processing rely on a specialized semantic network in the brain, or does it draw on more...
To investigate the processing of environmental sounds, previous researchers have compared the semant...
This study investigates the N400, the event-related potential (ERP) that reflects semantic processing in...
Conceptual knowledge is thought to be represented in a large distributed network, indexing a range o...
In real-world circumstances, we often hear auditory input whil...