PapersFlow Research Brief

Social Sciences · Psychology

Multisensory perception and integration
Research Guide

What is Multisensory perception and integration?

Multisensory perception and integration is the process by which the brain combines information from multiple sensory modalities, such as vision, touch, and hearing, to form a unified perceptual experience.

Multisensory integration involves crossmodal processing in which inputs from different senses interact, as when audiovisual interactions enhance perception in classic experiments. There are 37,043 works on this topic in experimental and cognitive psychology. Research covers neural correlates, synesthetic experiences, perceptual enhancement, sensory expectations, temporal integration, neuronal oscillations, and cortical connectivity.

Topic Hierarchy

Social Sciences → Psychology → Experimental and Cognitive Psychology → Multisensory perception and integration
37.0K Papers · 5yr Growth: N/A · 699.8K Total Citations

Why It Matters

Multisensory integration improves perceptual accuracy in everyday tasks: Ernst and Banks (2002) showed that humans combine visual and haptic cues in a statistically near-optimal manner, weighting each cue by its reliability. McGurk and MacDonald (1976) revealed audiovisual speech integration through the McGurk effect, in which conflicting visual lip movements alter the speech sounds listeners hear, with implications for speech therapy and auditory prosthesis design. Craig (2002) established interoception as a sensory modality integrated with exteroceptive senses, influencing emotion recognition and clinical interventions for anxiety disorders. These findings apply to virtual reality systems, rehabilitation robotics, and human-computer interfaces requiring precise sensory feedback.

Reading Guide

Where to Start

'Hearing lips and seeing voices' by McGurk and MacDonald (1976) is the best starting point for beginners: it provides a concrete, replicable demonstration of audiovisual integration through the McGurk effect and is foundational to understanding the field's basic principles.

Key Papers Explained

McGurk and MacDonald (1976) established audiovisual speech integration, which Ernst and Banks (2002) extended to vision-haptics with optimal statistical principles, while Craig (2002) incorporated interoception as a key modality converging in insula networks detailed by Menon and Uddin (2010). Mesulam (1998) frames these processes within a broader sensory hierarchy from sensation to cognition, and Ekman and Friesen (1971) link facial emotion universals to crossmodal recognition.

Paper Timeline

1971 · Constants across cultures in the face and emotion · 5.3K cites
1976 · Hearing lips and seeing voices · 6.0K cites (most cited)
1977 · Forest before trees: The precedence of global features in visu… · 4.0K cites
2002 · How do you feel? Interoception: the sense of the physiological condition of the body · 5.8K cites
2002 · Humans integrate visual and haptic information in a statistica… · 4.8K cites
2010 · Saliency, switching, attention and control: a network model of insula function · 5.4K cites
2016 · Perception and Communication · 4.7K cites

Papers ordered chronologically; the most-cited paper (McGurk and MacDonald, 1976) is marked.

Advanced Directions

Current research emphasizes neural correlates of temporal integration and cortical connectivity, building on Mesulam's (1998) hierarchy and Menon and Uddin's (2010) insula model. Ongoing investigations probe individual variability in perceptual enhancement and the role of sensory expectations.

Papers at a Glance

#. Paper · Year · Venue · Citations
1. Hearing lips and seeing voices · 1976 · Nature · 6.0K
2. How do you feel? Interoception: the sense of the physiological… · 2002 · Nature reviews. Neuros… · 5.8K
3. Saliency, switching, attention and control: a network model of… · 2010 · Brain Structure and Fu… · 5.4K
4. Constants across cultures in the face and emotion. · 1971 · Journal of Personality… · 5.3K
5. Humans integrate visual and haptic information in a statistica… · 2002 · Nature · 4.8K
6. Perception and Communication · 2016 · 4.7K
7. Forest before trees: The precedence of global features in visu… · 1977 · Cognitive Psychology · 4.0K
8. Visual search and stimulus similarity. · 1989 · Psychological Review · 3.6K
9. The NimStim set of facial expressions: Judgments from untraine… · 2009 · Psychiatry Research · 3.4K
10. From sensation to cognition · 1998 · Brain · 3.1K

Frequently Asked Questions

What is the McGurk effect?

The McGurk effect is an illusion in which visual lip movements incongruent with heard speech sounds cause perceivers to experience a blended audiovisual percept. McGurk and MacDonald (1976) demonstrated this in 'Hearing lips and seeing voices,' showing that the integration is robust, with vision sometimes dominating what listeners hear. It illustrates automatic multisensory binding in speech perception.

How do humans integrate visual and haptic information?

Humans integrate visual and haptic information in a statistically optimal fashion by weighting each sense according to its reliability. Ernst and Banks (2002) showed in experiments that perceived object size matches Bayesian predictions when visual and touch cues conflict. This optimality holds across varying sensory noise levels.
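The reliability-weighted (maximum-likelihood) combination rule that Ernst and Banks (2002) tested can be sketched in a few lines of Python. This is a minimal illustration of the model, not code from the paper; the function name and the example estimates and noise levels are hypothetical.

```python
def combine_cues(est_v, sigma_v, est_h, sigma_h):
    """Fuse a visual and a haptic estimate by inverse-variance weighting.

    Each cue's weight is proportional to its reliability (1 / variance),
    so the noisier sense contributes less to the fused percept.
    """
    rel_v = 1.0 / sigma_v**2          # reliability of the visual cue
    rel_h = 1.0 / sigma_h**2          # reliability of the haptic cue
    w_v = rel_v / (rel_v + rel_h)     # normalized visual weight
    fused = w_v * est_v + (1.0 - w_v) * est_h
    # The fused estimate is never noisier than either cue alone.
    sigma_fused = (1.0 / (rel_v + rel_h)) ** 0.5
    return fused, sigma_fused

# A reliable (low-noise) visual estimate dominates a noisier haptic one:
size, noise = combine_cues(est_v=55.0, sigma_v=1.0, est_h=52.0, sigma_h=2.0)
# size ≈ 54.4 (pulled toward vision), noise ≈ 0.89 (below both 1.0 and 2.0)
```

Inverse-variance weighting is what "statistically optimal" means here: among all weighted averages of the two cues, it yields the unbiased estimate with the lowest variance, which is the pattern Ernst and Banks observed in observers' size judgments.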

What role does the insula play in multisensory integration?

The insula functions in saliency detection, attention switching, and interoceptive awareness within multisensory networks. Menon and Uddin (2010) proposed a network model in 'Saliency, switching, attention and control: a network model of insula function' linking it to integration of bodily and external signals. It modulates sensory processing for adaptive behavior.

What is interoception in multisensory perception?

Interoception is the sense of the physiological condition of the body, integrated with other sensory modalities. Craig (2002) mapped its neural pathways in 'How do you feel? Interoception: the sense of the physiological condition of the body,' emphasizing anterior insula convergence. It contributes to emotional and self-awareness states.

How does sensory information progress to cognition?

Sensory information progresses from primary sensory areas through unimodal association, heteromodal, and paralimbic regions to cognition. Mesulam (1998) described this hierarchy in 'From sensation to cognition,' highlighting attentional modulation and associative elaboration. This pathway enables unified multisensory representations.

Open Research Questions

  • How do neuronal oscillations facilitate temporal binding of asynchronous multisensory inputs?
  • What are the precise neural mechanisms underlying reliability-based weighting in crossmodal integration?
  • How does cortical connectivity vary across individuals in synesthetic experiences?
  • What factors determine the dominance of one sensory modality over another in conflicting stimuli?
  • How do expectations modulate early versus late stages of multisensory processing?

Research Multisensory perception and integration with AI

PapersFlow provides specialized AI tools for Psychology researchers working on this topic.

See how researchers in Social Sciences use PapersFlow

Field-specific workflows, example queries, and use cases.

Social Sciences Guide

Start Researching Multisensory perception and integration with AI

Search 474M+ papers, run AI-powered literature reviews, and write with integrated citations — all in one workspace.

See how PapersFlow works for Psychology researchers