Vibrotactile rendering of camera motion for bimanual experience of first-person view videos

Daniel Gongora, Hikaru Nagano, Masashi Konyo, Satoshi Tadokoro

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

We propose a vibrotactile rendering method for the motion of a camera in first-person view videos that enables people to feel the movement of the camera with both hands. Concretely, we consider an arrangement of two vibrotactile actuators to render panning movements on the horizontal axis as vibrations that move from hand to hand, and to represent sudden vertical displacements of the camera as transient vibrations on both hands. We investigate three representation methods for the panning motion based on the estimated velocity and acceleration of the camera and a combination of both. In a preliminary user experiment, we observed favorable effects of applying our proposed rendering method on the perceived realism and satisfaction associated with the experience of watching a video.
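The hand-to-hand panning and bimanual transients described above could be sketched roughly as follows. This is a hypothetical illustration, not the authors' implementation: `pan_amplitudes` uses constant-power panning to split a vibration between two actuators according to the estimated horizontal camera velocity, and `vertical_transient` fires a brief vibration on both hands when vertical acceleration crosses a threshold; all parameter values (`v_max`, `threshold`, `gain`) are assumed for illustration.

```python
import math

def pan_amplitudes(velocity, v_max=1.0):
    """Map estimated horizontal camera velocity to left/right actuator
    amplitudes using constant-power panning (hypothetical scheme).
    Negative velocity shifts the vibration toward the left hand,
    positive toward the right."""
    # Normalize velocity to the pan position range [-1, 1]
    p = max(-1.0, min(1.0, velocity / v_max))
    # Map pan position to an angle in [0, pi/2] so that
    # left^2 + right^2 == 1 (constant perceived intensity)
    theta = (p + 1.0) * math.pi / 4.0
    return math.cos(theta), math.sin(theta)

def vertical_transient(accel, threshold=5.0, gain=0.1):
    """Return a transient vibration amplitude for BOTH hands when the
    estimated vertical acceleration exceeds a threshold; zero otherwise.
    Threshold and gain are placeholder values."""
    if abs(accel) > threshold:
        return min(1.0, gain * abs(accel))
    return 0.0
```

In this sketch a camera at rest drives both actuators equally, while a fast pan concentrates the vibration in one hand, loosely matching the paper's description of vibrations that "move from hand to hand."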

Original language: English
Title of host publication: 2017 IEEE World Haptics Conference, WHC 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 454-459
Number of pages: 6
ISBN (Electronic): 9781509014255
DOIs
Publication status: Published - 2017 Jul 21
Event: 7th IEEE World Haptics Conference, WHC 2017 - Munich, Germany
Duration: 2017 Jun 6 - 2017 Jun 9

Publication series

Name: 2017 IEEE World Haptics Conference, WHC 2017

Other

Other: 7th IEEE World Haptics Conference, WHC 2017
Country: Germany
City: Munich
Period: 17/6/6 - 17/6/9

ASJC Scopus subject areas

  • Instrumentation
  • Cognitive Neuroscience
  • Sensory Systems
  • Human Factors and Ergonomics
  • Human-Computer Interaction

