Abstract
In virtual reality (VR), 360° video services provided through head-mounted displays (HMDs) or smartphones are widely used. In many 360° video services, however, although the image changes seamlessly in correspondence with the user's head motion, the sound delivered through the headphones remains fixed. We have been studying acoustic immersion technology that, for example, generates binaural sounds corresponding to the user's head motion. Our method consists of angular region-wise source enhancement using array observation signals, multichannel audio encoding based on MPEG-4 Audio Lossless Coding (ALS), and binaural synthesis of the enhanced signals using head-related transfer functions (HRTFs). In this paper, we constructed a smartphone-based real-time system for streaming and viewing 360° video with acoustic immersion and evaluated it through subjective tests.
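The abstract does not include code; as a rough illustration of the binaural-synthesis step it describes, the sketch below convolves per-region enhanced signals with HRIRs selected according to the current head yaw. The function names, the dictionary-based HRIR layout, and the nearest-direction HRIR selection are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import fftconvolve


def circular_distance(a, b):
    """Smallest angular distance between two azimuths in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)


def binaural_render(region_signals, hrirs, head_yaw_deg):
    """Binaurally render enhanced angular-region signals for the current head yaw.

    region_signals : dict {region_azimuth_deg: mono ndarray}, all the same length
    hrirs          : dict {azimuth_deg: (hrir_left, hrir_right)}, all the same length
    head_yaw_deg   : head yaw (deg) reported by the HMD/smartphone orientation sensor
    Returns an (N, 2) ndarray of left/right samples.
    """
    sig_len = len(next(iter(region_signals.values())))
    hrir_len = len(next(iter(hrirs.values()))[0])
    out = np.zeros((sig_len + hrir_len - 1, 2))
    directions = sorted(hrirs.keys())
    for az, sig in region_signals.items():
        # Compensate for head rotation: the region appears at (az - yaw) relative to the listener.
        rel = (az - head_yaw_deg) % 360.0
        # Pick the nearest measured HRIR direction (no interpolation in this sketch).
        nearest = min(directions, key=lambda d: circular_distance(d, rel))
        h_l, h_r = hrirs[nearest]
        out[:, 0] += fftconvolve(sig, h_l)
        out[:, 1] += fftconvolve(sig, h_r)
    return out
```

In a real-time system the convolution would run block-wise and the HRIR selection would be updated per block as the head orientation changes; the offline form above is only meant to show the signal flow.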
| Original language | English |
| --- | --- |
| Publication status | Published - 2016 |
| Event | 141st Audio Engineering Society International Convention 2016, AES 2016 - Los Angeles, United States. Duration: 2016 Sep 29 → 2016 Oct 2 |
Other
| Other | 141st Audio Engineering Society International Convention 2016, AES 2016 |
| --- | --- |
| Country/Territory | United States |
| City | Los Angeles |
| Period | 2016 Sep 29 → 2016 Oct 2 |
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Modelling and Simulation
- Acoustics and Ultrasonics