Abstract
A rover operating on the surface of an extraterrestrial body must recognize dangerous zones autonomously, because communication delays rule out real-time teleoperation. However, to the best of our knowledge, there are few annotated terrain recognition datasets for extraterrestrial bodies, and this lack of data hinders the training and evaluation of recognition algorithms. We therefore first built the Chang’e 3 terrain recognition (CE3TR) dataset to address terrain recognition and semantic segmentation on the lunar surface. Although the moon is one of the celestial bodies nearest to the earth, our work is geared towards extraterrestrial bodies in general. The images in our dataset were captured by the Yutu moon rover, so they retain the real illumination conditions and terrain environment of the moon. We also propose a residual grounding transformer network (RGTNet) to identify unsafe areas such as rocks and craters. The residual grounding transformer facilitates cross-scale interactions among features at different levels. Another notable component of RGTNet is a local binary pattern feature fusion module, which helps extract the boundaries of different obstacles. We further present a new loss, the smooth intersection over union loss, which mitigates overfitting. To evaluate RGTNet, we conducted extensive experiments on our CE3TR dataset. The results demonstrate that our model recognizes risky terrain readily and outperforms other state-of-the-art methods.
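The abstract names a smooth intersection over union loss but does not define it; the paper's exact formulation is not reproduced here. As a rough illustration only, a standard soft IoU segmentation loss with an assumed smoothing constant `eps` (which keeps the ratio well-defined for empty masks and softens gradients) can be sketched as:

```python
import numpy as np

def smooth_iou_loss(pred, target, eps=1.0):
    """Generic soft IoU loss with a smoothing constant.

    NOTE: this is a hypothetical sketch, not the RGTNet paper's
    formulation. `eps` is an assumed smoothing term.
    pred:   predicted probabilities in [0, 1]
    target: binary ground-truth mask of the same shape
    """
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    intersection = (pred * target).sum()
    union = pred.sum() + target.sum() - intersection
    iou = (intersection + eps) / (union + eps)
    # Loss is 1 - IoU: 0 for a perfect prediction, approaching 1
    # as predicted and ground-truth masks become disjoint.
    return 1.0 - iou
```

A perfect prediction yields a loss of 0, while fully disjoint masks yield a loss close to 1.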
© 2021 Optical Society of America