Technical Articles

Simulating Autonomous Driving Algorithms for the SAE AutoDrive Challenge

By Aaron Angert, Texas A&M University


The 2021 SAE AutoDrive Challenge™ was a multiyear competition in which eight student teams designed and developed autonomous vehicles capable of navigating in a typical urban setting. In the final challenge, held at the MCity testing facility at the University of Michigan, the competitor vehicles completed a test course that included traffic lights, street signs, a railway crossing, and other elements common to city driving (Figure 1).  

Two people sitting in the Texas A&M autonomous vehicle, which is parked in a lot. The car has purple and white stripes and displays the names of sponsors.

Figure 1. The Texas A&M autonomous vehicle used in the AutoDrive Challenge.

My team from Texas A&M University took first place in the final event of the 2021 AutoDrive Challenge. Critical to our success was the simulation environment we used to test our algorithms. Built using Simulink® and Unreal Engine®, this environment enabled us to test and debug our perception, planning, decision, and control algorithms in realistic conditions before verifying them on our vehicle, a 2017 Chevy Bolt EV provided by General Motors. It also enabled us to partition the development effort so that multiple teams could work in parallel and to continue our work uninterrupted during the pandemic, when in-person access to the vehicle was not possible.

Simulating Autonomous Vehicle Algorithms

The Texas A&M AutoDrive Challenge team is made up of more than 70 graduate and undergraduate students, led over the past two years by my faculty advisor, Dr. Dezhen Song. The students work in small groups, each focusing on a specific area of development, such as perception or simultaneous localization and mapping (SLAM). Working in a simulation environment enabled us to develop and test our algorithms without reliance on a fully functional vehicle. It also meant that the work didn’t stop when other groups needed to work on the vehicle. 

Our simulation environment accurately reflected the dynamics of the Chevy Bolt, its lidar and GPS sensors, and the urban setting in which it would operate. Within this environment, we used the Simulation 3D Lidar block from Automated Driving Toolbox™ to model the Velodyne® lidar on the actual vehicle, and we configured the Simulation 3D Camera block to have the same intrinsic parameters (focal length, optical center, image size, and so on) as the cameras mounted on the vehicle.
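
To give a concrete sense of that configuration step, the MATLAB® snippet below shows how camera intrinsics can be defined so that a simulated camera matches a physical one. The focal length, optical center, and image size values here are placeholders for illustration, not our vehicle's actual calibration; the resulting numbers are what would be entered in the Simulation 3D Camera block parameters.

% Placeholder intrinsic parameters (illustrative values, not the real calibration)
focalLength    = [800 800];     % [fx fy] in pixels
principalPoint = [640 360];     % optical center [cx cy] in pixels
imageSize      = [720 1280];    % [rows columns] of the camera image

% Intrinsics object describing the camera model used in simulation
intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize);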

In year two of the competition, MathWorks provided us with a 3D model of MCity, built with Unreal Engine. We integrated this model into our simulation environment and used Automated Driving Toolbox to enable cosimulation (Figure 2). As we developed and refined our simulation environment, we met regularly with a MathWorks application engineer, who provided guidance on best practices and answered our questions.
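
The sketch below shows one way co-simulation runs like ours can be scripted from MATLAB for repeatable testing. The model name mcity_cosim is hypothetical, and a real project would set additional block parameters for the Unreal Engine scene and sensors.

% Hypothetical Simulink model containing the 3D scene and sensor blocks
mdl = 'mcity_cosim';
load_system(mdl);

% Configure a 120-second run and execute it programmatically
in  = Simulink.SimulationInput(mdl);
in  = setModelParameter(in, 'StopTime', '120');
out = sim(in);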

Diagram of the simulation environment created by the Texas A&M AutoDrive Challenge team to test autonomous vehicle control algorithms.

Figure 2. Diagram of the simulation environment, which enables 3D visualizations while testing a variety of autonomous vehicle control algorithms. 

We used the simulation environment to test our algorithms in a variety of ways. For example, during simulations, we stored data from the lidar and GPS sensor models in ROS log files (rosbags). Individual groups used this recorded data to run open-loop unit tests on their algorithms. After verifying individual algorithms via unit testing, we used the simulation environment to run closed-loop integration tests to verify multiple algorithms working together and communicating via a ROS interface. We used MATLAB® to visualize test results (Figure 3). In one case, a group tested a new version of their path-tracking algorithm, based on the pure pursuit method, and used a MATLAB visualization to compare its performance with that of the previous version.
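
As a sketch of that open-loop workflow, the code below replays vehicle poses logged in a rosbag and feeds them to a pure pursuit controller; the file name, topic name, and controller gains are illustrative assumptions rather than our team's actual configuration.

% Read recorded odometry from a rosbag (file and topic names are assumptions)
bag  = rosbag('practice_run.bag');
sel  = select(bag, 'Topic', '/vehicle/odom');
msgs = readMessages(sel, 'DataFormat', 'struct');

% Extract the logged x-y positions
xy = zeros(numel(msgs), 2);
for k = 1:numel(msgs)
    p = msgs{k}.Pose.Pose.Position;
    xy(k,:) = [p.X p.Y];
end

% Pure pursuit controller tracking the recorded route (illustrative gains)
controller = controllerPurePursuit( ...
    'Waypoints', xy, ...
    'DesiredLinearVelocity', 4, ...
    'LookaheadDistance', 6, ...
    'MaxAngularVelocity', 1);

% Open-loop check: commanded velocities at the start of the logged route
pose   = [xy(1,:) 0];            % [x y theta]
[v, w] = controller(pose);

% Plot the recorded route for visual comparison with the planned waypoints
plot(xy(:,1), xy(:,2), 'b-'); axis equal
xlabel('x (m)'); ylabel('y (m)');
title('Route replayed from rosbag');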

A MATLAB visualization of an open-loop test of the lidar SLAM algorithm, verifying the SLAM and GPS algorithms working together in the simulation environment.

Figure 3. A MATLAB visualization showing SLAM results overlaid on the measured GPS position.
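
A minimal sketch of how an overlay like this might be produced is shown below, using the lidarSLAM object from Navigation Toolbox. It assumes that scans (a cell array of lidarScan objects) and gpsXY (an N-by-2 array of positions) have already been extracted from the simulation logs; those variables, along with the resolution and range settings, are assumptions for illustration.

% Build a pose graph from the recorded lidar scans
slamAlg = lidarSLAM(20, 100);        % 20 cells per meter, 100 m max lidar range
for k = 1:numel(scans)
    addScan(slamAlg, scans{k});      % scans{k} is an assumed lidarScan object
end

% Retrieve the optimized trajectory estimated by SLAM
[~, optimizedPoses] = scansAndPoses(slamAlg);

% Overlay the SLAM trajectory on the logged GPS positions
plot(optimizedPoses(:,1), optimizedPoses(:,2), 'r-'); hold on
plot(gpsXY(:,1), gpsXY(:,2), 'b--'); axis equal
legend('Lidar SLAM estimate', 'GPS measurement');
xlabel('x (m)'); ylabel('y (m)');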

Replicating Real-World Tests During the Competition

To prepare for the competition, we created and simulated driving scenarios like those we would encounter on the MCity test track and used these scenarios to test our algorithms extensively via simulation (Figure 4). Despite this rigorous testing, we were concerned that we might uncover some unforeseen issues once we began testing the vehicle on the competition course. We were correct.
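
The sketch below shows how a simple scenario of this kind can be authored programmatically with the drivingScenario object from Automated Driving Toolbox; the road geometry, waypoints, and speed are invented placeholders, not the actual MCity course.

% Author a simple urban scenario (placeholder geometry)
scenario = drivingScenario('SampleTime', 0.05);

% Two-lane road defined by its centerline
roadCenters = [0 0; 50 0; 50 50];
road(scenario, roadCenters, 'Lanes', lanespec(2));

% Ego vehicle following waypoints along the road at 8 m/s
ego = vehicle(scenario, 'ClassID', 1);
waypoints = [5 -2; 40 -2; 48 8; 48 45];
trajectory(ego, waypoints, 8);

% Step through the scenario and animate it
plot(scenario)
while advance(scenario)
    pause(scenario.SampleTime)
end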

Before the competition started, each team was allowed to run their vehicle through the test course and, if necessary, fine-tune their algorithms. During these practice runs, we discovered several defects—including segmentation faults—in our software. Later that day, we used data that we had captured during the practice runs to replay the exact routes the vehicle had taken in our simulation environment. We were then able to pinpoint and fix the problems in our code before we retested the next day.

A map shows the MCity test course at the University of Michigan, where college teams competed in the 2021 SAE AutoDrive Challenge.

Figure 4. A map of the MCity route.

Looking Ahead to AutoDrive Challenge II

AutoDrive Challenge II, launched in 2022, continues the competition series with a focus on collision avoidance, path following, and traffic light interactions. A team from Texas A&M has entered, with new members taking over from students who have moved on. The new team will use the same approach, grounded in extensive simulation in Simulink, as it builds on the success of our first team.

About the AutoDrive Challenge

Sponsored by SAE International and General Motors (GM®), the AutoDrive Challenge series invites student teams to develop and demonstrate a fully autonomous passenger vehicle capable of navigating an urban driving course in an automated driving mode as described by SAE Standard (J3016) Level 4. MathWorks is the official software supplier for the competition. 

About the Author

Aaron Angert is a Ph.D. student at Texas A&M University. He served as the lead architecture design engineer for the university’s AutoDrive Challenge team.  

Published 2022