TY - GEN
T1 - Implementing tactile behaviors using FingerVision
AU - Yamaguchi, Akihiko
AU - Atkeson, Christopher G.
N1 - Funding Information:
This material is based upon work supported in part by the US National Science Foundation under grant IIS-1717066.
Publisher Copyright:
© 2017 IEEE.
PY - 2017/12/22
Y1 - 2017/12/22
N2 - We explore manipulation strategies that use vision-based tactile sensing. FingerVision is a vision-based tactile sensor that provides rich tactile sensation as well as proximity sensing. Although many other tactile sensing methods are expensive in terms of cost and/or processing, FingerVision is a simple and inexpensive approach. We use a transparent skin for fingers. Tracking markers placed on the skin provides contact force and torque estimates, and processing images obtained by seeing through the transparent skin provides static (pose, shape) and dynamic (slip, deformation) information. Since it is vision-based, FingerVision can sense nearby objects even when there is no contact. Slip detection is also independent of contact force, which is effective even when the force is too small to measure, such as with origami objects. The results of experiments demonstrate that several manipulation strategies with FingerVision are effective. For example, the robot can grasp and pick up an origami crane without crushing it. Video: https://youtu.be/L-YbxcyRghQ.
AB - We explore manipulation strategies that use vision-based tactile sensing. FingerVision is a vision-based tactile sensor that provides rich tactile sensation as well as proximity sensing. Although many other tactile sensing methods are expensive in terms of cost and/or processing, FingerVision is a simple and inexpensive approach. We use a transparent skin for fingers. Tracking markers placed on the skin provides contact force and torque estimates, and processing images obtained by seeing through the transparent skin provides static (pose, shape) and dynamic (slip, deformation) information. Since it is vision-based, FingerVision can sense nearby objects even when there is no contact. Slip detection is also independent of contact force, which is effective even when the force is too small to measure, such as with origami objects. The results of experiments demonstrate that several manipulation strategies with FingerVision are effective. For example, the robot can grasp and pick up an origami crane without crushing it. Video: https://youtu.be/L-YbxcyRghQ.
UR - http://www.scopus.com/inward/record.url?scp=85044473458&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85044473458&partnerID=8YFLogxK
U2 - 10.1109/HUMANOIDS.2017.8246881
DO - 10.1109/HUMANOIDS.2017.8246881
M3 - Conference contribution
AN - SCOPUS:85044473458
T3 - IEEE-RAS International Conference on Humanoid Robots
SP - 241
EP - 248
BT - 2017 IEEE-RAS 17th International Conference on Humanoid Robotics, Humanoids 2017
PB - IEEE Computer Society
T2 - 17th IEEE-RAS International Conference on Humanoid Robotics, Humanoids 2017
Y2 - 15 November 2017 through 17 November 2017
ER -