HMM-based state classification of a user with a walking support system using visual PCA features

Sajjad Taghvaei, Kazuhiro Kosuge

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

The improvement of safety and dependability in systems that physically interact with humans requires investigation of the possible states of the user's motion and methods to recognize these states. In this study, we propose a method for real-time visual state classification of a user with a walking support system. Visual features are extracted using principal component analysis, and classification is performed with hidden Markov models, both for real-time fall detection (one-class classification) and real-time state recognition (multi-class classification). The algorithms are tested in experiments with a passive-type walker robot called "RT Walker," equipped with servo brakes and a depth sensor (Microsoft Kinect). The experiments are performed with 10 subjects, including an experienced physiotherapist who can imitate the walking patterns of the elderly and people with disabilities. The results of the state classification can be used to improve fall-prevention control algorithms for walking support systems. The proposed method can also be applied to other vision-based classification tasks that require real-time abnormality detection or state recognition.
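The pipeline the abstract describes, PCA feature extraction over depth frames followed by HMM likelihood scoring for one-class fall detection, can be sketched roughly as follows. Everything here (the synthetic "depth frames," the 2-state parameters, the detection threshold) is a hypothetical illustration of the general technique, not the paper's actual trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for flattened Kinect depth frames:
# 200 frames of 64 "pixels" each (purely synthetic data).
frames = rng.normal(size=(200, 64))

# --- PCA feature extraction (SVD on mean-centered frames) ---
centered = frames - frames.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
features = centered @ vt[:3].T            # keep top 3 principal components

# --- Scaled forward algorithm: log-likelihood under a Gaussian HMM ---
def hmm_loglik(obs, pi, A, means, var):
    """Sequence log-likelihood under an HMM with isotropic Gaussian
    emissions (shared scalar variance `var`), using per-step scaling
    to avoid numerical underflow."""
    d = obs.shape[1]

    def emis(x):  # emission density of observation x for every state
        sq = np.sum((x - means) ** 2, axis=1)
        return np.exp(-0.5 * sq / var) / (2 * np.pi * var) ** (d / 2)

    alpha = pi * emis(obs[0])
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for x in obs[1:]:
        alpha = (alpha @ A) * emis(x)
        scale = alpha.sum()
        loglik += np.log(scale)
        alpha /= scale
    return loglik

# --- One-class fall detection: threshold the per-frame log-likelihood ---
pi = np.array([0.5, 0.5])                  # hypothetical 2-state "normal walking" HMM
A = np.array([[0.9, 0.1], [0.1, 0.9]])
means = features[:2].copy()                # stand-in state means (learned in practice)
var = features.var()
score = hmm_loglik(features, pi, A, means, var) / len(features)
is_fall = score < -50.0                    # threshold would be tuned on training data
```

For multi-class state recognition, the same forward-pass score would be computed under one HMM per motion state, and the state with the highest likelihood selected.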

Original language: English
Pages (from-to): 219-230
Number of pages: 12
Journal: Advanced Robotics
Volume: 28
Issue number: 4
DOIs
Publication status: Published - 2014 Feb 16

Keywords

  • PCA feature extraction
  • hidden Markov models
  • human state classification
  • walking support systems

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications

