Alignment of a flexible sheet object with position-based and image-based visual servoing

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)


This paper describes visual servoing methods for aligning the end edge of a flexible sheet object, such as paper, with a specified line segment in 3D space. Because the end edge may be distant from the point where the robot hand grasps the flexible sheet, we incorporate online estimation of the relative pose between the end edge and the robot hand into the visual servoing control laws. Two types of regulated feature values are defined: position-based and image-based. The position-based features consist of the 3D pose parameters of the end edge, estimated by fitting a polynomial surface model to stereo-measured edge points on the sheet. The image-based features consist of image pixel values within narrow regions around the end edge. Through numerical simulations, we confirm that the position-based features achieve quick response, while the image-based features are robust against measurement noise. To combine these contrasting behaviors, we introduce a weighted combination of the position-based and image-based features. Experimental results show that both the position-based method and the weighted combination method work well, and suggest that the weighted combination method can reduce final alignment errors.
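As a rough illustration of the position-based feature estimation, the abstract describes fitting a polynomial surface model to stereo-measured edge points. The sketch below fits a degree-2 surface z = f(x, y) by (optionally weighted) least squares; the function name, the quadratic basis, and the weighting scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fit_quadratic_surface(points, weights=None):
    """Fit z ~ c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2 by least squares.

    points  : (N, 3) array of stereo-measured 3D points on the sheet
    weights : optional (N,) per-point weights (e.g. down-weighting noisy
              stereo measurements, as in a weighted least squares fit)
    Returns the 6 polynomial coefficients.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    if weights is not None:
        sw = np.sqrt(weights)[:, None]   # weighted LS = scale rows by sqrt(w)
        A, z = A * sw, z * sw[:, 0]
    c, *_ = np.linalg.lstsq(A, z, rcond=None)
    return c

# Synthetic check: recover the coefficients of a known, gently curved sheet
rng = np.random.default_rng(1)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))
z = 0.1 + 0.2 * xy[:, 0] - 0.3 * xy[:, 1] + 0.05 * xy[:, 0] ** 2
pts = np.column_stack([xy, z])
c = fit_quadratic_surface(pts)
print(np.round(c, 3))
```

Once the surface model is fitted, the end-edge pose parameters (the position-based features) can be read off by evaluating the surface along the edge of the fitted domain.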
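The weighted combination of position-based and image-based features can be sketched as stacking the two weighted error vectors and interaction matrices into one least-squares control law. All names, the scalar weight, and the stacking scheme below are illustrative assumptions for a generic hybrid visual-servoing law, not the authors' exact controller.

```python
import numpy as np

def weighted_servo_velocity(e_pbvs, J_pbvs, e_ibvs, J_ibvs, w=0.5, gain=1.0):
    """6-DOF velocity command from a weighted stack of two error terms.

    e_pbvs : (m,) pose-parameter error of the sheet's end edge
    J_pbvs : (m, 6) interaction matrix for the pose features
    e_ibvs : (n,) intensity error in narrow regions around the end edge
    J_ibvs : (n, 6) interaction matrix for the image features
    w      : weight in [0, 1]; w=1 -> pure PBVS, w=0 -> pure IBVS
    """
    # Stack the weighted errors and Jacobians, then solve in the
    # least-squares sense for the velocity that drives both errors down.
    e = np.concatenate([w * e_pbvs, (1.0 - w) * e_ibvs])
    J = np.vstack([w * J_pbvs, (1.0 - w) * J_ibvs])
    v, *_ = np.linalg.lstsq(J, -gain * e, rcond=None)
    return v

# Toy example with random errors/Jacobians, just to show the shapes
rng = np.random.default_rng(0)
v = weighted_servo_velocity(rng.normal(size=4), rng.normal(size=(4, 6)),
                            rng.normal(size=20), rng.normal(size=(20, 6)))
print(v.shape)  # 6-DOF velocity command
```

Sliding `w` toward 1 favors the quick response of the position-based features, while sliding it toward 0 favors the noise robustness of the image-based features, matching the trade-off the abstract reports.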

Original language: English
Pages (from-to): 965-978
Number of pages: 14
Journal: Advanced Robotics
Issue number: 15
Publication status: Published - 2016 Aug 2


Keywords

  • Hybrid visual servo
  • deformable object
  • high-speed visual feedback
  • weighted least square

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications

