Outdoor 5G Autonomous Fleet Mobility

Sensors installed on an autonomous vehicle often have blind spots (areas around the shuttle not visible to onboard sensors) due to limited fields of view and object occlusions. These blind spots can pose significant safety risks because they may contain pedestrians, cyclists, or other vehicles. To address this issue, a 5G-based outdoor autonomous fleet mobility system is studied.

The team has chosen a one-kilometer route within UW’s North Campus, which includes 14 light posts where the sensor nodes have been deployed. Each sensor node consists of 2 cameras, 1 LiDAR, and 1 5G (or Wi-Fi) module, together with integrated software running on a low-power Jetson Orin Nano, and the whole system is powered by a solar panel. Experimental verification of the sensor configuration, perception, and communication has also been performed on data collected from the sensor nodes.
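As a concrete summary of the node composition described above, the sketch below records the same configuration as a simple Python data structure; the class and field names are illustrative assumptions rather than part of the deployed software.

    # Minimal sketch of a per-node configuration record. Field names and default
    # values mirror the description above; they are illustrative assumptions, not
    # the project's actual software interface.
    from dataclasses import dataclass

    @dataclass
    class InfrastructureSensorNode:
        node_id: int                       # 1..14, one per instrumented light post
        cameras: int = 2                   # two cameras per node
        lidars: int = 1                    # one LiDAR per node
        uplink: str = "5G"                 # "5G" or "Wi-Fi" communication module
        compute: str = "Jetson Orin Nano"  # low-power edge computer
        power: str = "solar panel"         # each node is solar powered

    # Example: the 14 nodes deployed along the one-kilometer North Campus route.
    isn_fleet = [InfrastructureSensorNode(node_id=i + 1) for i in range(14)]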

Outdoor Framework of 5G Autonomous Fleet Mobility

14 Infrastructure Sensor Nodes (ISNs) installed on the University of Waterloo North Campus for outdoor 5G autonomous mobility testing and evaluation. Left: locations of the ISNs (green circles) and the route used for experimental testing; Middle: an ISN installed on a street light; Right: the ISN sensor suite (LiDAR and 2 cameras) with its processing and communication box.

Outdoor Real-Time Perception

The cameras and LiDAR have been carefully calibrated for sensor fusion. Unnecessary LiDAR points are removed via point cloud background subtraction, which keeps only the points belonging to objects that can affect driving. A YOLOv8 object detection model, customized and optimized for our application, provides 2D bounding boxes of objects of interest such as cars, pedestrians, buses, and animals. LiDAR-camera association, which considers spatial, semantic, and temporal information, is then applied to produce the final perception results. The perception results from each node are sent to the cloud via 5G, where they are fused and used by downstream modules such as global planning and global control.
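The sketch below outlines this per-node perception flow in Python: background subtraction on the LiDAR point cloud, YOLOv8 detection on the camera image, and a purely spatial LiDAR-camera association obtained by projecting foreground points into the detected 2D boxes (the semantic and temporal cues used in the actual association are omitted here). The calibration matrices, thresholds, and helper names are assumptions for illustration; only the YOLOv8 calls follow the open-source ultralytics API.

    # Minimal sketch of the per-node perception flow, assuming calibrated camera
    # intrinsics K (3x3), LiDAR-to-camera extrinsics T_cam_lidar (4x4), and a
    # pre-recorded static background point cloud. Thresholds and helper names are
    # illustrative, not the deployed implementation.
    import numpy as np
    from scipy.spatial import cKDTree
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")  # generic pretrained weights; the project uses a customized YOLOv8

    def remove_background(points, background_points, dist_thresh=0.3):
        """Keep LiDAR points farther than dist_thresh (m) from the static background map."""
        tree = cKDTree(background_points[:, :3])
        dist, _ = tree.query(points[:, :3], k=1)
        return points[dist > dist_thresh]

    def project_to_image(points, T_cam_lidar, K):
        """Project 3D LiDAR points (N x 3) into pixel coordinates."""
        pts_h = np.hstack([points[:, :3], np.ones((len(points), 1))])
        cam = (T_cam_lidar @ pts_h.T).T              # LiDAR frame -> camera frame
        in_front = cam[:, 2] > 0.1                   # keep points in front of the camera
        uv = (K @ cam[in_front, :3].T).T
        uv = uv[:, :2] / uv[:, 2:3]
        return uv, in_front

    def associate(boxes_xyxy, labels, points, T_cam_lidar, K):
        """Attach foreground LiDAR points to each 2D box to estimate a 3D position."""
        uv, in_front = project_to_image(points, T_cam_lidar, K)
        fg = points[in_front]
        detections = []
        for (x1, y1, x2, y2), label in zip(boxes_xyxy, labels):
            inside = (uv[:, 0] >= x1) & (uv[:, 0] <= x2) & (uv[:, 1] >= y1) & (uv[:, 1] <= y2)
            if inside.any():
                centroid = fg[inside, :3].mean(axis=0)   # rough object position in the LiDAR frame
                detections.append({"class": label, "position": centroid.tolist()})
        return detections

    def perceive(image, lidar_points, background_points, T_cam_lidar, K):
        """One perception cycle for a single node: returns a compact list of detections."""
        fg_points = remove_background(lidar_points, background_points)
        result = model(image, verbose=False)[0]
        boxes = result.boxes.xyxy.cpu().numpy()
        labels = [result.names[int(c)] for c in result.boxes.cls]
        return associate(boxes, labels, fg_points, T_cam_lidar, K)

The returned list (class label plus an approximate 3D position per object) is the kind of compact per-node result that would be transmitted to the cloud over 5G for fusion with the results from the other nodes.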

Experiments will be conducted using our WATonoBus to evaluate the performance of the 5G autonomous fleet mobility system and the impact of latency, signal dropout, and weather conditions on the performance, reliability, and safety of fleet mobility based on a 5G network and cloud computing.
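One simple way such an evaluation could quantify end-to-end latency is sketched below: each node timestamps its perception message before transmission, and the cloud computes the one-way delay on receipt and flags results that are too stale for planning. This assumes the node and cloud clocks are synchronized (e.g., via GPS or NTP); the message format and threshold are hypothetical, not the project's actual protocol.

    # Illustrative latency logging, not the project's actual message protocol.
    import json
    import time

    def package_node_message(node_id, detections):
        """Node side: attach a send timestamp so the cloud can measure one-way latency."""
        return json.dumps({
            "node_id": node_id,
            "detections": detections,
            "sent_at": time.time(),   # assumes clocks synchronized across node and cloud
        })

    def log_latency(raw_message, max_latency_s=0.1):
        """Cloud side: compute latency and flag messages too stale for planning/control."""
        msg = json.loads(raw_message)
        latency = time.time() - msg["sent_at"]
        stale = latency > max_latency_s   # e.g., de-weight or drop results older than 100 ms
        return msg["node_id"], latency, stale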

Figure 4: Outdoor Perception Framework