TY - GEN
T1 - Action Recognition of Construction Machinery from Simulated Training Data Using Video Filters
AU - Sim, Jinhyeok
AU - Kasahara, Jun Younes Louhi
AU - Chikushi, Shota
AU - Yamakawa, Hiroshi
AU - Tamura, Yusuke
AU - Nagatani, Keiji
AU - Chiba, Takumi
AU - Yamamoto, Shingo
AU - Chayama, Kazuhiro
AU - Yamashita, Atsushi
AU - Asama, Hajime
N1 - Publisher Copyright:
© 2020 Proceedings of the 37th International Symposium on Automation and Robotics in Construction, ISARC 2020: From Demonstration to Practical Use - To New Stage of Construction Robot. All rights reserved.
PY - 2020
Y1 - 2020
N2 - In the construction industry, continuous monitoring of the actions performed by construction machinery is critical for improving productivity and efficiency. However, measuring and recording each individual machine's actions is both time-consuming and expensive when conducted manually by humans. Therefore, automatic action recognition of construction machinery is highly desirable. Inspired by the success of Deep Learning approaches for human action recognition, an increasing number of studies have addressed action recognition of construction machinery using Deep Learning. However, those approaches require large amounts of training data, which are difficult to obtain since construction machinery is usually located in the field. Therefore, this paper proposes a method for action recognition of construction machinery using only training data generated from a simulator, which is much easier to obtain than actual training data. In order to bridge the feature domain gap between simulator-generated data and actual field data, a video filter was used. Experiments using a model of an excavator, one of the most commonly used types of construction machinery, showed the potential of our proposed method.
AB - In the construction industry, continuous monitoring of the actions performed by construction machinery is critical for improving productivity and efficiency. However, measuring and recording each individual machine's actions is both time-consuming and expensive when conducted manually by humans. Therefore, automatic action recognition of construction machinery is highly desirable. Inspired by the success of Deep Learning approaches for human action recognition, an increasing number of studies have addressed action recognition of construction machinery using Deep Learning. However, those approaches require large amounts of training data, which are difficult to obtain since construction machinery is usually located in the field. Therefore, this paper proposes a method for action recognition of construction machinery using only training data generated from a simulator, which is much easier to obtain than actual training data. In order to bridge the feature domain gap between simulator-generated data and actual field data, a video filter was used. Experiments using a model of an excavator, one of the most commonly used types of construction machinery, showed the potential of our proposed method.
KW - Action recognition
KW - Deep learning
KW - Video filter
UR - http://www.scopus.com/inward/record.url?scp=85103738430&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85103738430&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85103738430
T3 - Proceedings of the 37th International Symposium on Automation and Robotics in Construction, ISARC 2020: From Demonstration to Practical Use - To New Stage of Construction Robot
SP - 595
EP - 599
BT - Proceedings of the 37th International Symposium on Automation and Robotics in Construction, ISARC 2020
PB - International Association on Automation and Robotics in Construction (IAARC)
T2 - 37th International Symposium on Automation and Robotics in Construction: From Demonstration to Practical Use - To New Stage of Construction Robot, ISARC 2020
Y2 - 27 October 2020 through 28 October 2020
ER -