A Method for Localization and Dense Mapping in Indoor Dynamic Environments Based on an Improved ORB-SLAM2 Algorithm
DOI: https://doi.org/10.54097/exr3jx13

Keywords: Dense mapping, Semantic segmentation, Depth consistency detection, Keyframe selection, Point cloud fusion

Abstract
To address the difficulty ORB-SLAM2 faces in constructing dense maps of complex indoor dynamic environments, this paper proposes an improved method for localization and dense mapping in dynamic indoor scenes. The approach integrates a DeepLabV3+-based semantic segmentation module and introduces a dedicated semantic thread to obtain scene semantic information. During tracking, the semantic segmentation results are combined with a dual depth-consistency checking strategy to identify and remove dynamic feature points while preserving reliable static features. The original keyframe selection mechanism is further enhanced by incorporating the relative pose variation between consecutive frames as an additional criterion, which reduces redundant keyframes and improves localization accuracy. A dense mapping thread is also introduced to fuse multi-frame point clouds into a dense three-dimensional map of the indoor environment. Experiments on the TUM RGB-D dataset demonstrate that the proposed method significantly improves trajectory accuracy: the average localization error is reduced by more than 95% in highly dynamic sequences and by 39.60% in low-dynamic sequences. The results show that the proposed approach achieves more accurate pose estimation and dense map reconstruction, providing effective support for mobile robot localization and navigation in indoor dynamic environments.
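Two of the abstract's mechanisms lend themselves to a brief illustration: flagging a feature as dynamic only when its projected depth disagrees with the measured depth in multiple reference frames, and triggering keyframe insertion when the relative pose change between consecutive frames is large. The sketch below is illustrative only; the function names, thresholds (`tau`, `t_thresh`, `r_thresh_deg`), and the exact form of the dual check are assumptions, not the paper's actual formulation or parameter values.

```python
import numpy as np

def is_dynamic(projected_depths, measured_depths, tau=0.05):
    """Dual depth-consistency check (sketch): a feature is flagged as
    dynamic only if its projected depth disagrees with the sensor's
    measured depth, by more than tau meters, in *both* reference views."""
    return all(abs(d_proj - d_meas) > tau
               for d_proj, d_meas in zip(projected_depths, measured_depths))

def should_insert_keyframe(T_rel, t_thresh=0.10, r_thresh_deg=5.0):
    """Relative-pose keyframe criterion (sketch): insert a keyframe when
    the inter-frame translation norm or rotation angle exceeds a threshold.
    T_rel is a 4x4 homogeneous transform between consecutive frames."""
    t = np.linalg.norm(T_rel[:3, 3])                      # translation magnitude
    cos_theta = (np.trace(T_rel[:3, :3]) - 1.0) / 2.0     # rotation angle from trace
    theta = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
    return t > t_thresh or theta > r_thresh_deg
```

In a full pipeline these predicates would run inside the tracking thread, with `is_dynamic` applied only to features that the semantic mask already marks as potentially movable.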
References
License
Copyright (c) 2026 Journal of Computing and Electronic Information Management

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.