Multimodal interaction of auditory spatial cues and passive observer movement in simulated self motion

William L. Martens, Shuichi Sakamoto, Yôiti Suzuki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

When the movement of an observer through a three-dimensional (3D) space is simulated via a multimodal display system, the synchrony of cues from different sensory modalities can help or hinder the creation of illusions of self motion. Stereoscopic stimuli provide particularly strong cues to self motion, through both binocular and motion disparity, but in the absence of these visual cues, movement cues can also be provided through spatial audio and through "suggestive" movement of an observer using a motion platform. Indeed, passive movement of observers watching 3DTV can suggest more dramatic movement in space, which can facilitate interpretation of auditory cues, reinforcing illusions of observer movement through a presented space. This multimodal interaction is important for successful presentation of 3D information in future 3DTV applications, since changes in listening position can otherwise produce apparent sound source motion relative to a fixed listening position, rather than supporting illusions of self motion through a static scene. In order to quantitatively measure such effects, a multimodal interaction study was initiated in which visual cues were eliminated so as to focus upon sensitivity to temporal synchrony between passive whole-body motion and auditory spatial information. For simple spatial trajectories of two sound sources passing by the observer's position, the relative timing of passive movement of the observer was manipulated to produce a range of intermodal delays. It was found that decreasing the peak velocity reached by the moving sound sources made it easier for observers to tolerate asynchronies between the timing of this velocity peak and the peak in whole-body motion, especially when the motion peak occurred earlier in time than the peak in sound source velocity.

Original language: English
Title of host publication: 3DTV-CON 2009 - 3rd 3DTV-Conference
Subtitle of host publication: The True Vision - Capture, Transmission and Display of 3D Video, Proceedings
Publication status: Published - 2009 Oct 19
Event: 3rd 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2009 - Potsdam, Germany
Duration: 2009 May 4 - 2009 May 6

Publication series

Name: 3DTV-CON 2009 - 3rd 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, Proceedings

Other

Other: 3rd 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2009
Country: Germany
City: Potsdam
Period: 09/5/4 - 09/5/6

Keywords

  • Auditory induced vection
  • Multimodal display systems
  • Multimodal information processing
  • Self motion perception
  • Spatial perception

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction
