PapersFlow Research Brief

Life Sciences · Neuroscience

Tactile and Sensory Interactions
Research Guide

What Are Tactile and Sensory Interactions?

Tactile and sensory interactions refer to the neural and perceptual processes by which touch, haptic feedback, and other sensory modalities integrate with visual, auditory, and motor systems, including cross-modal plasticity and sensory substitution mechanisms.

This field encompasses 62,796 works on tactile perception, cross-modal plasticity, haptic interfaces, and sensory substitution devices. Research examines neural reorganization after sensory loss, such as visual cortex activation in blind individuals during Braille reading and the use of vibrotactile displays. Key studies demonstrate that humans integrate visual and haptic information in a statistically optimal fashion.

Topic Hierarchy

Life Sciences → Neuroscience → Cognitive Neuroscience → Tactile and Sensory Interactions

62.8K papers · 5-year growth: N/A · 808.4K total citations

Why It Matters

Tactile and sensory interactions enable practical applications in haptic interfaces for virtual reality and in sensory substitution devices that aid blind individuals through vibrotactile displays and Braille reading systems. In "Humans integrate visual and haptic information in a statistically optimal fashion," Ernst and Banks (2002) showed that the brain combines these senses with weights proportional to their reliabilities, achieving lower-variance estimates than either modality alone; this finding informs the design of multimodal interfaces in clinical monitoring and human-machine interaction. Epidermal electronics, described by Kim et al. (2011) in "Epidermal Electronics," match the mechanical properties of human skin for wearable health-monitoring sensors, while the stretchable strain sensors reviewed by Amjadi et al. (2016) in "Stretchable, Skin‐Mountable, and Wearable Strain Sensors and Their Potential Applications: A Review" support motion detection and soft robotics.

Reading Guide

Where to Start

"Humans integrate visual and haptic information in a statistically optimal fashion" by Ernst and Banks (2002) is the recommended starting point for beginners: it provides a clear, empirical foundation for multisensory integration, with quantifiable results that are accessible to newcomers.

Key Papers Explained

Ernst and Banks (2002), in "Humans integrate visual and haptic information in a statistically optimal fashion," establish optimal visual-haptic integration. Steuer (1992), in "Defining Virtual Reality: Dimensions Determining Telepresence," frames telepresence in terms of sensory richness, and Ishii and Ullmer (1997) build tangible haptic interfaces on related perceptual principles in "Tangible bits." Kim et al. (2011) in "Epidermal Electronics" and Amjadi et al. (2016) in "Stretchable, Skin‐Mountable, and Wearable Strain Sensors and Their Potential Applications: A Review" advance the hardware for haptic sensing, applying integration insights to wearables. Together these papers connect perception, theory, interfaces, and device technology.

Paper Timeline

1974 · Effects of noise letters upon th... · 7.2K cites (most-cited)
1983 · A new method for off-line remova... · 5.0K cites
1992 · Defining Virtual Reality: Dimensions Determining Telepresence · 5.2K cites
1992 · Understanding motor events: a ne... · 3.5K cites
1997 · Tangible bits · 3.7K cites
2002 · Humans integrate visual and haptic information in a statistically optimal fashion · 4.8K cites
2011 · Epidermal Electronics · 4.5K cites

Papers ordered chronologically; the most-cited paper is marked.

Advanced Directions

Current frontiers focus on wearable strain sensors for human-machine interfaces and soft robotics, as reviewed in Amjadi et al. (2016), alongside epidermal electronics for clinical tactile monitoring from Kim et al. (2011). No recent preprints or news specify further shifts.

Papers at a Glance

#   Paper                                                              Year  Venue                       Citations
1   Effects of noise letters upon the identification of a target l...  1974  Perception & Psychophy...   7.2K
2   Defining Virtual Reality: Dimensions Determining Telepresence      1992  Journal of Communication    5.2K
3   A new method for off-line removal of ocular artifact               1983  Electroencephalography...   5.0K
4   Humans integrate visual and haptic information in a statistica...  2002  Nature                      4.8K
5   Epidermal Electronics                                              2011  Science                     4.5K
6   Tangible bits                                                      1997                              3.7K
7   Understanding motor events: a neurophysiological study             1992  Experimental Brain Res...   3.5K
8   Stretchable, Skin‐Mountable, and Wearable Strain Sensors and T...  2016  Advanced Functional Ma...   3.0K
9   Unconscious cerebral initiative and the role of conscious will...  1985  Behavioral and Brain S...   2.5K
10  Signal-dependent noise determines motor planning                   1998  Nature                      2.5K

Frequently Asked Questions

What is cross-modal plasticity in tactile and sensory interactions?

Cross-modal plasticity involves neural reorganization where the loss of one sense, such as vision, leads to activation of visual cortex by tactile inputs like Braille reading in blind individuals. This mechanism underlies sensory substitution devices that convert visual information into tactile or auditory signals. The field documents such adaptations through studies on vibrotactile displays and auditory localization tied to touch.

How do humans integrate visual and haptic information?

Humans integrate visual and haptic information in a statistically optimal fashion, weighting each sense by its reliability. Ernst and Banks (2002) demonstrated this in "Humans integrate visual and haptic information in a statistically optimal fashion," where combined cues yield lower variance than individual modalities. This optimality holds across varying reliability levels of the senses.
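The reliability weighting described above can be sketched numerically. The following is a minimal illustration of maximum-likelihood cue combination, the model Ernst and Banks tested; the estimates and noise levels are made-up values for illustration, not data from the paper.

```python
# Reliability-weighted cue combination: each cue's weight is its
# reliability (1 / variance) divided by the summed reliabilities.

def combine_cues(est_v, sigma_v, est_h, sigma_h):
    """Fuse a visual and a haptic estimate of the same property
    (e.g. object size), weighting each by its reliability."""
    r_v = 1.0 / sigma_v**2           # visual reliability
    r_h = 1.0 / sigma_h**2           # haptic reliability
    w_v = r_v / (r_v + r_h)          # visual weight
    w_h = r_h / (r_v + r_h)          # haptic weight
    combined_est = w_v * est_v + w_h * est_h
    combined_var = 1.0 / (r_v + r_h)  # below either single-cue variance
    return combined_est, combined_var

# Vision is twice as precise as touch here, so it gets 4x the weight.
est, var = combine_cues(est_v=50.0, sigma_v=1.0, est_h=54.0, sigma_h=2.0)
```

Because reliabilities add, the combined variance is always below that of the better single cue, which is the signature of statistically optimal integration.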

What are haptic interfaces in this context?

Haptic interfaces provide tactile feedback in virtual environments and sensory substitution systems. Steuer (1992) defined key dimensions in "Defining Virtual Reality: Dimensions Determining Telepresence," linking telepresence to sensory richness including touch. Ishii and Ullmer (1997) introduced physical-tactile interactions in "Tangible bits," bridging digital bits with tangible atoms.

What role does tactile perception play in sensory substitution?

Tactile perception enables sensory substitution by mapping lost sensory inputs, like vision, onto touch via vibrotactile displays. This recruits visual cortex in blind users during tasks such as Braille reading. This research area also documents the neural mechanisms supporting pleasant touch and haptic feedback in these devices.
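The visual-to-tactile mapping at the heart of such devices can be illustrated with a toy sketch. The grid size and the linear brightness-to-intensity mapping below are assumptions for illustration, not the design of any specific device.

```python
# Toy vision-to-touch substitution: average image brightness (0-255)
# into a coarse grid of vibrotactile actuator intensities in [0, 1].

def image_to_tactile(image, rows, cols):
    """Downsample a 2D brightness array onto a rows x cols actuator grid."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols     # block of pixels per actuator
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [image[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            row.append(sum(block) / len(block) / 255.0)
        grid.append(row)
    return grid

# A 4x4 image with a bright right half maps to a 2x2 actuator grid.
img = [[0, 0, 255, 255]] * 4
tactile = image_to_tactile(img, rows=2, cols=2)
```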

Which papers demonstrate neural mechanisms in tactile interactions?

Ernst and Banks (2002) in "Humans integrate visual and haptic information in a statistically optimal fashion" quantify sensory integration optimality. Kim et al. (2011) in "Epidermal Electronics" enable skin-like tactile sensing for monitoring. Amjadi et al. (2016) review strain sensors for wearable tactile applications.
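For the strain sensors mentioned above, the standard figure of merit is the gauge factor: the relative resistance change per unit strain. A minimal sketch, with made-up numbers rather than values from the review:

```python
# Gauge factor GF = (ΔR / R0) / ε, where ε is the applied strain.

def gauge_factor(r0, r_strained, strain):
    """Relative resistance change divided by strain."""
    return ((r_strained - r0) / r0) / strain

# A sensor going from 100 Ω to 110 Ω under 5% strain has GF = 2.0.
gf = gauge_factor(r0=100.0, r_strained=110.0, strain=0.05)
```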

Open Research Questions

  • How does the weighting in visual-haptic integration adapt to dynamic changes in sensory reliability during real-world tasks?
  • What are the precise neural pathways for visual cortex recruitment in tactile Braille reading among congenitally blind individuals?
  • Can epidermal electronics fully replicate pleasant touch perception for therapeutic applications in sensory substitution?
  • How does signal-dependent noise influence planning in multimodal tactile-motor interactions?
  • What limits the telepresence achieved through tangible haptic interfaces in virtual environments?
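The signal-dependent noise question can be made concrete: in the model of Harris and Wolpert (1998, "Signal-dependent noise determines motor planning"), the standard deviation of motor noise grows with the size of the motor command, so large, fast commands are noisier. A toy sketch of that scaling, with an arbitrary noise constant k:

```python
# Signal-dependent noise: each command u contributes independent noise
# with standard deviation k*u, so variances (k*u)**2 accumulate.

def endpoint_variance(commands, k=0.1):
    """Summed variance of independent signal-dependent noise terms."""
    return sum((k * u) ** 2 for u in commands)

# Doubling every command quadruples the accumulated endpoint variance,
# which is why the model favors smooth, gradual commands.
slow = endpoint_variance([1.0, 1.0, 1.0])
fast = endpoint_variance([2.0, 2.0, 2.0])
```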

Research Tactile and Sensory Interactions with AI

PapersFlow provides specialized AI tools for Neuroscience researchers working on this topic.

See how researchers in Life Sciences use PapersFlow

Field-specific workflows, example queries, and use cases.

Life Sciences Guide

Start Researching Tactile and Sensory Interactions with AI

Search 474M+ papers, run AI-powered literature reviews, and write with integrated citations — all in one workspace.

See how PapersFlow works for Neuroscience researchers