Generating stable and realistic haptic feedback during mid-air gesture interaction has recently garnered significant research interest. However, limitations of the sensing technologies, such as unstable tracking, limited range, nonuniform sampling durations, self-occlusion, and motion-recognition faults, significantly distort motion-based haptic feedback. In this paper, we propose and implement a hidden Markov model (HMM)-based motion synthesis method to generate stable concurrent and terminal vibrotactile feedback. The system tracks human gestures during interaction and recreates smooth, synchronized motion data from the detected HMM states. Four gestures (tapping, three-fingered zooming, vertical dragging, and horizontal dragging) were used in the study to evaluate the performance of the motion synthesis method. The reference motion curves and the corresponding primitive motion elements to be synthesized for each gesture were obtained from multiple subjects at different interaction speeds using a stable motion-tracking sensor. Both objective and subjective evaluations were conducted to assess the performance of the motion synthesis model in controlling concurrent and terminal vibrotactile feedback. The objective evaluation shows that the synthesized motion data correlated more closely in shape and end timing with the reference motion data than either the raw measured data or the moving-average-filtered data did. The mean R² value for the synthesized motion data was always greater than 0.7, even under unstable tracking conditions. The subjective evaluation with nine subjects showed a significant improvement in the perceived synchronization of vibrotactile feedback driven by the synthesized motion.
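The core idea described above, decoding noisy tracked motion into discrete HMM states and resynthesizing a clean motion signal from per-state reference values, can be illustrated with a minimal sketch. This is not the authors' implementation: the three-state "tap" model, its transition and emission probabilities, the quantized fingertip-height observations, and the per-state reference values are all hypothetical, chosen only to show how Viterbi decoding absorbs tracking glitches before the motion curve is regenerated.

```python
# Illustrative sketch (not the paper's implementation): decode noisy,
# quantized motion samples into HMM states with the Viterbi algorithm,
# then resynthesize a motion curve from per-state reference values.
# All model parameters below are hypothetical.
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state sequence for `obs`."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
          for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = prev
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Hypothetical 3-state model of a "tap" gesture: approach, contact, retract.
states = ["approach", "contact", "retract"]
start_p = {"approach": 0.8, "contact": 0.1, "retract": 0.1}
trans_p = {  # tiny back-transition probabilities avoid log(0)
    "approach": {"approach": 0.7, "contact": 0.3, "retract": 0.0001},
    "contact":  {"approach": 0.0001, "contact": 0.6, "retract": 0.4},
    "retract":  {"approach": 0.0001, "contact": 0.0001, "retract": 1.0},
}
# Emission probabilities over quantized fingertip height.
emit_p = {
    "approach": {"high": 0.7, "low": 0.3},
    "contact":  {"high": 0.1, "low": 0.9},
    "retract":  {"high": 0.8, "low": 0.2},
}
# Per-state reference value standing in for a primitive motion element.
ref = {"approach": 1.0, "contact": 0.0, "retract": 1.0}

# Noisy observations: the stray "low" at index 4 is a tracking glitch.
obs = ["high", "high", "low", "high", "low", "high", "high"]
decoded = viterbi(obs, states, start_p, trans_p, emit_p)
synth = [ref[s] for s in decoded]  # resynthesized motion curve
```

Because the transition model strongly favors the approach-contact-retract ordering, the decoded state sequence stays monotone through the gesture phases even though the raw observations flicker, which is the property that lets state-based resynthesis trigger concurrent and terminal feedback at stable times.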