TY - JOUR
T1 - Semantic Mapping of Construction Site from Multiple Daily Airborne LiDAR Data
AU - Westfechtel, Thomas
AU - Ohno, Kazunori
AU - Akegawa, Tetsu
AU - Yamada, Kento
AU - Neto, Ranulfo Plutarco Bezerra
AU - Kojima, Shotaro
AU - Suzuki, Taro
AU - Komatsu, Tomohiro
AU - Shibata, Yukinori
AU - Asano, Kimitaka
AU - Nagatani, Keiji
AU - Miyamoto, Naoto
AU - Suzuki, Takahiro
AU - Harada, Tatsuya
AU - Tadokoro, Satoshi
N1 - Funding Information:
Manuscript received October 15, 2020; accepted February 15, 2021. Date of publication February 26, 2021; date of current version March 19, 2021. This letter was recommended for publication by Associate Editor F. Ruggiero and Editor P. Pounds upon evaluation of the reviewers’ comments. This work was supported by CREST No. 14532298 and NEDO No. 18065741. (Corresponding author: Thomas Westfechtel.) Thomas Westfechtel, Keiji Nagatani, and Tatsuya Harada are with The University of Tokyo, Tokyo 113-8654, Japan (e-mail: thomas@mi.t.u-tokyo.ac.jp; keiji@ieee.org; harada@isi.imi.i.u-tokyo.ac.jp).
Publisher Copyright:
© 2021 IEEE.
PY - 2021/4
Y1 - 2021/4
N2 - Semantic maps are an important tool for providing robots with high-level knowledge about the environment, enabling them to better react to and interact with their surroundings. However, since a single measurement of the environment is merely a snapshot of a specific time, it does not necessarily reflect the underlying semantics. In this work, we propose a method to create a semantic map of a construction site by fusing multiple daily measurements. The construction site is surveyed by an unmanned aerial vehicle (UAV) equipped with a LiDAR. We extract clusters above ground level from the measurements and classify them using either a random forest or a deep-learning-based classifier. Furthermore, we combine the classification results of several measurements to generalize the classification of the individual measurements and create a general semantic map of the working site. We measured two construction fields for our evaluation. The classification models achieve an average intersection over union (IoU) score of 69.2% on the Sanbongi field, which is used for training, validation, and testing, and an IoU score of 49.16% on a held-out test field. In a final step, we show how the semantic map can be employed to suggest a parking spot for a dump truck and, in addition, show that the semantic map can be utilized to improve path planning inside the construction site.
AB - Semantic maps are an important tool for providing robots with high-level knowledge about the environment, enabling them to better react to and interact with their surroundings. However, since a single measurement of the environment is merely a snapshot of a specific time, it does not necessarily reflect the underlying semantics. In this work, we propose a method to create a semantic map of a construction site by fusing multiple daily measurements. The construction site is surveyed by an unmanned aerial vehicle (UAV) equipped with a LiDAR. We extract clusters above ground level from the measurements and classify them using either a random forest or a deep-learning-based classifier. Furthermore, we combine the classification results of several measurements to generalize the classification of the individual measurements and create a general semantic map of the working site. We measured two construction fields for our evaluation. The classification models achieve an average intersection over union (IoU) score of 69.2% on the Sanbongi field, which is used for training, validation, and testing, and an IoU score of 49.16% on a held-out test field. In a final step, we show how the semantic map can be employed to suggest a parking spot for a dump truck and, in addition, show that the semantic map can be utilized to improve path planning inside the construction site.
KW - Field robots
KW - robotics and automation in construction
KW - semantic scene understanding
UR - http://www.scopus.com/inward/record.url?scp=85101844031&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85101844031&partnerID=8YFLogxK
U2 - 10.1109/LRA.2021.3062606
DO - 10.1109/LRA.2021.3062606
M3 - Article
AN - SCOPUS:85101844031
SN - 2377-3766
VL - 6
SP - 3073
EP - 3080
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 2
M1 - 9364688
ER -