Design and Integrated Implementation of a Unified Architecture for Intelligent Unmanned Ground Vehicles Based on MATLAB
DOI: https://doi.org/10.62517/jike.202604127
Author(s)
Zhanxin Ye
Affiliation(s)
School of Mechanical Engineering (Mechatronic Engineering), UNSW Sydney, NSW 2052, Australia
Abstract
This paper addresses the challenges of complex hardware integration, fragmented software frameworks, and inconsistent interactions encountered during the development of unmanned ground vehicles (UGVs), and proposes and implements an intelligent UGV integrated control system centred on the MATLAB/Simulink platform. The system adopts a layered architecture comprising 'remote monitoring-onboard fusion-real-time control-sensor-based execution', integrating modules such as LiDAR terrain perception, tightly coupled GNSS/IMU positioning, Galil motion control, and Xbox-based teleoperation. A unified multi-layer communication protocol and a safety arbitration mechanism are designed, and MATLAB is used for the entire integration workflow, from algorithm development and simulation validation to real-time monitoring. In simulation experiments the system demonstrates stable module coordination and reliable closed-loop transmission of control commands, validating the effectiveness and engineering practicality of the proposed integrated solution and providing a replicable approach for rapid UGV prototyping.
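For illustration, the sketch below shows one plausible form of the command arbitration described in the abstract, written in plain MATLAB. The function name arbitrateCommand, the command struct fields (v, omega, stamp), and the 0.5 s watchdog timeout are assumptions for this example, not the paper's implementation: remote (Xbox) commands pre-empt autonomous ones, and a stale command stream triggers a safe stop.

function cmd = arbitrateCommand(remoteCmd, autoCmd, tNow)
% ARBITRATECOMMAND  Minimal sketch of a priority-based safety arbiter.
%   remoteCmd / autoCmd are structs with illustrative fields:
%     v     - linear velocity command (m/s)
%     omega - yaw-rate command (rad/s)
%     stamp - time the command was issued (s)
%   tNow is the current controller time (s).
%   The timeout value and struct layout are assumptions, not the
%   protocol actually used in the paper.

WATCHDOG_TIMEOUT = 0.5;   % seconds; older commands are treated as stale

stopCmd = struct('v', 0, 'omega', 0, 'stamp', tNow);

if tNow - remoteCmd.stamp < WATCHDOG_TIMEOUT
    cmd = remoteCmd;      % remote operator always pre-empts autonomy
elseif tNow - autoCmd.stamp < WATCHDOG_TIMEOUT
    cmd = autoCmd;        % fall back to the onboard planner
else
    cmd = stopCmd;        % both streams stale: command a safe stop
end
end

In a Simulink deployment, logic of this kind would typically sit in a MATLAB Function block between the command sources and the motor-control interface, so that every command reaching the actuators has passed through the arbiter.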
Keywords
MATLAB; UGVs; GNSS; IMU; Galil