Autonomous Emergency Braking with Sensor Fusion
This example shows how to implement autonomous emergency braking (AEB) using a sensor fusion algorithm.
Autonomous emergency braking is an advanced active safety system that helps drivers avoid or mitigate collisions with other vehicles.
The European New Car Assessment Programme (Euro NCAP®) has included the AEB city and interurban systems in its safety rating since 2014. The Euro NCAP continues to promote AEB systems for protecting vulnerable road users, such as pedestrians and cyclists.
Today, AEB systems mostly use radar and vision sensors to identify potential collision partners ahead of the ego vehicle. These systems often require multiple sensors to obtain accurate, reliable, and robust detections while minimizing false positives. To combine the data from various sensors, multiple sensor AEB systems use sensor fusion technology. This example shows how to implement AEB using a sensor fusion algorithm. In this example, you:
Explore the test bench model — The model contains the sensors and environment, sensor fusion and tracking, decision logic, controls, and vehicle dynamics.
Model the AEB Controller — Use Simulink® and Stateflow® to integrate a braking controller for braking control and a nonlinear model predictive controller (NLMPC) for acceleration and steering controls.
Simulate the test bench model — You can configure the test bench model for different scenarios based on Euro NCAP test protocols.
Generate C++ code — Generate C++ code and test the software-in-the-loop (SIL) simulation for the sensor fusion, decision logic, and control algorithms.
Explore additional scenarios — These scenarios test the system under additional conditions.
Explore Test Bench Model
In this example, you use a system-level simulation test bench model to explore the behavior of the controller for an AEB system.
To explore the test bench model, load the autonomous emergency braking project.
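If the project is installed on your MATLAB path, you can open it with the openProject function. The project name below is a placeholder; substitute the name of the downloaded AEB project on your system.

```matlab
% The project name here is a placeholder; use the actual name of the
% downloaded AEB project.
openProject("AutonomousEmergencyBraking");
```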
To reduce Command Window output, turn off model predictive controller (MPC) update messages.
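The mpcverbosity function in Model Predictive Control Toolbox controls these messages:

```matlab
mpcverbosity("off"); % suppress MPC update messages in the Command Window
```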
Open the system-level simulation test bench model.
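You can open the model by name with open_system:

```matlab
open_system("AEBTestBench")
```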
Opening this model runs the helperSLAEBSetup helper function, which initializes the scenario using the drivingScenario object in the base workspace. It runs the default test scenario, scenario_25_AEB_PedestrianTurning_Nearside_10kph, which contains an ego vehicle and a pedestrian. This setup function also configures the controller design parameters, vehicle model parameters, and Simulink bus signals required for defining the inputs and outputs of the model.
The test bench model contains these modules:
Sensors and Environment — Subsystem that specifies the road, actors, camera, and radar sensor used for simulation.
Sensor Fusion and Tracking — Algorithm model that fuses vehicle detections from the camera with those from the radar sensor.
AEB Decision Logic — Algorithm model that specifies the lateral and longitudinal decision logic, which provides most important object (MIO) information and ego vehicle reference path information to the controller.
AEB Controller — Algorithm model that specifies the steering angle and acceleration controls.
Vehicle Dynamics — Subsystem that specifies the dynamic model of the ego vehicle.
Metrics Assessment — Subsystem that assesses system-level behavior.
The Vehicle Dynamics subsystem models the ego vehicle using a Bicycle Model and updates its state using commands received from the AEB Controller model. For more details on the Vehicle Dynamics subsystem, see the Highway Lane Following example.
To plot synthetic sensor detections, tracked objects, and ground truth data, use the Bird's-Eye Scope. The Bird's-Eye Scope is a model-level visualization tool that you can open from the Simulink model toolbar. On the Simulation tab, under Review Results, click Bird's-Eye Scope. After opening the scope, click Update Signals to set up the signals. The dashboard panel displays these ego vehicle parameters: velocity, acceleration, AEB status, forward collision warning (FCW) status, and safety status.
The Sensors and Environment subsystem configures the road network, defines target actor trajectories, and synthesizes sensors. Open the Sensors and Environment subsystem.
open_system("AEBTestBench/Sensors and Environment")
The subsystem specifies the scenario and sensors of the ego vehicle using these blocks:
The Scenario Reader block reads the drivingScenario object from the base workspace, and then reads the actor data from that object. The block uses the ego vehicle information to perform a closed-loop simulation, and then outputs the ground truth information of the scenario actors and their trajectories in ego vehicle coordinates.
The Driving Radar Data Generator block generates radar sensor data from a driving scenario.
The Vision Detection Generator block generates detections and measurements from a camera mounted on the ego vehicle.
The Reference Path Info block provides a predefined reference trajectory for ego vehicle navigation.
The Sensor Fusion and Tracking algorithm model processes vision and radar detections and generates the position and velocity of the tracks relative to the ego vehicle. Open the AEBSensorFusion algorithm model.
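You can open the algorithm model by name:

```matlab
open_system("AEBSensorFusion")
```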
The AEBSensorFusion model contains these blocks:
Detection Concatenation — Combines the vision and radar detections onto a single output bus.
Multi-Object Tracker — Performs sensor fusion and outputs the tracks of stationary and moving objects. These tracks are updated at Prediction Time, specified by a Digital Clock block in the Sensors and Environment subsystem.
The AEBDecisionLogic algorithm model specifies lateral and longitudinal decisions based on the predefined ego reference path and the tracks. Open the AEBDecisionLogic algorithm model.
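You can open the algorithm model by name:

```matlab
open_system("AEBDecisionLogic")
```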
The AEB Decision Logic algorithm model contains these blocks:
Ego Reference Path Generator — Estimates the curvature, relative yaw angle, and lateral deviation of the ego vehicle using the current ego position and the reference path information from the Sensors and Environment subsystem. The block also determines whether the ego vehicle has reached its goal.
Find Lead Car — Finds the lead car, which is the MIO in front of the ego vehicle in the same lane. This block outputs the relative distance and relative velocity between the ego vehicle and the MIO.
Model AEB Controller
The AEBController algorithm model implements the main algorithm that specifies the longitudinal and lateral controls. Open the AEBController algorithm model.
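You can open the algorithm model by name:

```matlab
open_system("AEBController")
```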
The AEBController model contains these subsystems:
Controller Mode Selector — Releases the vehicle accelerator when AEB is activated.
NLMPC Controller — Reads the ego longitudinal velocity, curvature sequence, relative yaw angle, and lateral deviation, and then outputs the steering angle and acceleration for the ego vehicle. Open the NLMPC Controller referenced subsystem.
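Assuming the referenced subsystem sits at the top level of the AEBController model (the path below is an assumption), you can open it by path:

```matlab
open_system("AEBController/NLMPC Controller")
```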
This example uses a nonlinear MPC controller with a prediction model that has seven states, three output variables, and two manipulated variables. The states include the relative yaw angle and an output disturbance of the relative yaw angle, and the output variables include the sum of the yaw angle and the yaw angle output disturbance.
The controller models the product of the road curvature and the longitudinal velocity as a measured disturbance. The prediction horizon and control horizon are specified by the helperSLAEBSetup function. The state function for the nonlinear plant model and its Jacobian are specified by the helperNLMPCStateFcn and helperNLMPCStateJacFcn functions, respectively. The continuous-time prediction model for the NLMPC controller uses the output equation defined in the helperNLMPCOutputFcn function. The constraints for the manipulated variables and the weights in the standard MPC cost function are defined in the helperSLAEBSetup function when it creates the nlmpc object. In this example, the NLMPC controller does not support zero initial velocity for the ego vehicle.
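The pieces above can be assembled with the nlmpc object from Model Predictive Control Toolbox. This sketch shows the general shape only; the sample time, horizon lengths, and the ordering of the manipulated-variable and measured-disturbance channels are assumptions, not the values that helperSLAEBSetup actually uses.

```matlab
% Sketch of how helperSLAEBSetup might construct the controller.
% Channel ordering, sample time, and horizons are assumptions.
nlobj = nlmpc(7,3,"MV",[1 2],"MD",3);  % 7 states, 3 outputs, 2 MVs, 1 MD
nlobj.Ts = 0.1;                        % sample time (assumed)
nlobj.PredictionHorizon = 10;          % assumed horizon lengths
nlobj.ControlHorizon = 2;
nlobj.Model.StateFcn = "helperNLMPCStateFcn";
nlobj.Jacobian.StateFcn = "helperNLMPCStateJacFcn";
nlobj.Model.OutputFcn = "helperNLMPCOutputFcn";
```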
In this example, an extended Kalman filter (EKF) provides state estimation for the seven states. The state transition function for the EKF is defined in the helperEKFStateFcn function, and the measurement function is defined in the helperEKFMeasFcn function. For more details on designing a nonlinear MPC controller, see Lane Following Using Nonlinear Model Predictive Control (Model Predictive Control Toolbox).
The Braking Controller subsystem implements the FCW and AEB control algorithm based on a stopping time calculation approach.
Stopping time, τ_stop, refers to the time from when the ego vehicle first applies its brakes, with deceleration a_brake, to when it comes to a complete stop. You can find the stopping time by using this equation:
τ_stop = v_ego / a_brake
where v_ego is the velocity of the ego vehicle.
The FCW system alerts the driver about an imminent collision with a lead vehicle. The driver is expected to react to the alert and apply the brake with a delay time, τ_react.
The total travel time of the ego vehicle before colliding with the lead vehicle can be expressed as:
τ_FCW = τ_react + v_ego / a_driver
where a_driver is the deceleration applied by the driver when braking.
When the time-to-collision (TTC) with the lead vehicle is less than τ_FCW, the FCW alert activates.
If the driver fails to apply the brake in time, such as due to distraction, the AEB system acts independently of the driver to avoid or mitigate the collision. AEB systems typically apply cascaded braking, which consists of multi-stage partial braking followed by full braking.
Open the Braking Controller subsystem.
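Assuming the Braking Controller subsystem is at the top level of the AEBController model (the path below is an assumption), you can open it by path:

```matlab
open_system("AEBController/Braking Controller")
```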
The Braking Controller subsystem contains these blocks:
TTCCalculation — Calculates the TTC using the relative distance and velocity of the lead vehicle.
StoppingTimeCalculation — Calculates the stopping times for the FCW, first- and second-stage partial braking (PB), and full braking (FB).
AEBLogic — State machine that compares the TTC with the calculated stopping times to determine the FCW and AEB activations.
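The stopping-time comparison that AEBLogic performs can be sketched in plain MATLAB code. The deceleration levels, reaction delay, and kinematic values below are illustrative assumptions, not the calibrated parameters of this example.

```matlab
% Illustrative stopping-time based FCW/AEB decision (all values assumed).
vEgo     = 13.9;               % ego velocity, m/s (about 50 km/h)
relDist  = 30;  relVel = -8;   % range (m) and closing velocity (m/s) to the MIO
tauReact = 1.2; aDriver = 4.0; % driver reaction delay (s), driver braking (m/s^2)
aPB1 = 3.8; aPB2 = 5.3; aFB = 9.8; % assumed cascaded brake decelerations (m/s^2)

ttc    = -relDist/relVel;          % time to collision while closing
tauFCW = tauReact + vEgo/aDriver;  % travel time if the driver brakes after the alert

if ttc < vEgo/aFB                  % compare TTC against each stopping time
    state = "FullBrake";
elseif ttc < vEgo/aPB2
    state = "PartialBrake2";
elseif ttc < vEgo/aPB1
    state = "PartialBrake1";
elseif ttc < tauFCW
    state = "FCW";
else
    state = "Inactive";
end
```

With these illustrative numbers, the TTC is 3.75 s, below τ_FCW (about 4.7 s) but above all three braking thresholds, so the sketch reports the FCW state.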
The AEB Controller outputs the steering angle and acceleration commands that determine whether the ego vehicle accelerates or decelerates.
Explore Metrics Assessment
The Metrics Assessment subsystem enables system-level metric evaluations using the ground truth information from the scenario. Open the Metrics Assessment subsystem.
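Following the same path convention as the Sensors and Environment subsystem, you can open it by path:

```matlab
open_system("AEBTestBench/Metrics Assessment")
```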
In this example, you can assess the AEB system using these metrics:
Check Collision — Verifies whether the ego vehicle collides with the target actor at any point during the simulation.
Check Safety — Verifies that the ego vehicle is within the prescribed threshold safetyGoal throughout the simulation. Use the helperAEBSetup post-load callback function to define safetyGoal.
Simulate AEB Model
Simulate the test bench model with scenarios based on Euro NCAP test protocols. Euro NCAP offers a series of test protocols that test the performance of AEB systems in car-to-car rear (CCR) and vulnerable road user (VRU) scenarios.
This example uses a closed-loop simulation of these two scenarios. You then analyze the results.
Configure the AEBTestBench model for the scenario_23_AEB_PedestrianChild_Nearside_50width scenario. In this scenario, a child pedestrian is crossing from the right side of the road to the left. The ego vehicle, which is traveling forward, collides with the child pedestrian. At collision time, the pedestrian is 50% of the way across the width of the ego vehicle.
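Assuming helperSLAEBSetup accepts the scenario function name as a name-value argument (the scenarioFcnName argument is an assumption based on the variable name used for plotting), the configuration call might look like:

```matlab
helperSLAEBSetup(scenarioFcnName="scenario_23_AEB_PedestrianChild_Nearside_50width");
```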
The test bench model reads the
drivingScenario object and runs a simulation.
Simulate the model for 0.1 seconds.
sim("AEBTestBench",StopTime="0.1"); % Simulate for 0.1 seconds
The Bird's-Eye Scope shows the ground truth data of the vehicles and child pedestrian. It also shows radar detections, vision detections, and objects tracked by the multi-object tracker. At a simulation time of 0.1 seconds, the camera and radar sensor do not detect the child pedestrian, as the other vehicles obstruct their line of sight.
Simulate the model for 2.8 seconds.
sim("AEBTestBench",StopTime="2.8"); % Simulate for 2.8 seconds
Update the Bird's-Eye Scope. Notice that the sensor fusion and tracking algorithm detects the child pedestrian as the MIO, and that the AEB system applies the brake to avoid a collision.
The dashboard panel shows that the AEB system applies cascaded braking to stop the ego vehicle before the collision point. The color of the AEB indicator specifies the level of AEB activation.
Gray — AEB is not activated.
Yellow — First stage partial brake is activated.
Orange — Second stage partial brake is activated.
Red — Full brake is activated.
Simulate the scenario to the end. Then, plot the results by using the
helperPlotAEBResults helper function.
sim("AEBTestBench"); % Simulate to end of scenario
helperPlotAEBResults(logsout,scenarioFcnName);
TTC vs. Stopping Time — Compares the time-to-collision and the stopping times for the FCW, first stage partial brake, second stage partial brake, and full brake, respectively.
FCW and AEB Status — Displays the FCW and AEB activation status based on the comparison results from the first plot.
Ego Car Acceleration — Shows the acceleration of the ego vehicle.
Ego Car Yaw and Yaw Rate — Shows the yaw and yaw rate of the ego vehicle.
Ego Car Velocity — Shows the velocity of the ego vehicle.
Headway — Shows the headway between the ego vehicle and the MIO.
In the first 2 seconds, the ego vehicle speeds up to reach its specified velocity. At 2.3 seconds, the sensors first detect the child pedestrian. Immediately after the detection, the FCW system activates.
At 2.4 seconds, the AEB system applies the first stage of the partial brake, and the ego vehicle starts to slow down.
When the ego vehicle comes to a complete stop at 4.1 seconds, the headway between the ego vehicle and the child pedestrian is about 2.1 meters. The AEB system fully avoids a collision in this scenario.
Configure the AEBTestBench model for the scenario_25_AEB_PedestrianTurning_Nearside_10kph scenario. In this scenario, the ego vehicle makes a right turn at an intersection and collides with an adult pedestrian crossing the road from the opposite side of the intersection. At the time of collision, the pedestrian is 50% of the way across the width of the frontal structure of the ego vehicle.
Simulate the model and plot the results.
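Assuming helperSLAEBSetup accepts the scenario function name as a name-value argument (the scenarioFcnName argument is an assumption based on the variable name used for plotting), the steps might look like:

```matlab
helperSLAEBSetup(scenarioFcnName="scenario_25_AEB_PedestrianTurning_Nearside_10kph");
sim("AEBTestBench");                           % simulate to the end of the scenario
helperPlotAEBResults(logsout,scenarioFcnName); % plot the logged results
```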
For the first 9.5 seconds, the ego vehicle travels at its specified velocity. At 9.5 seconds, the sensors first detect the pedestrian in the intersection, after the ego vehicle has turned right. Despite the short headway between the ego vehicle and the pedestrian, the AEB system applies only the first partial brake due to the low velocity of the ego vehicle.
Generate C++ Code
If you have the licenses for Simulink Coder™ and Embedded Coder™, you can generate ready-to-deploy code for algorithm models such as AEB sensor fusion, AEB decision logic, and AEB controller for an embedded real-time (ERT) target.
You can verify that the compiled C++ code behaves as expected using software-in-the-loop simulation. To simulate the referenced models in SIL mode, enter these commands.
set_param("AEBTestBench/Sensor Fusion and Tracking", ...
    SimulationMode="Software-in-the-loop (SIL)")
set_param("AEBTestBench/AEB Decision Logic", ...
    SimulationMode="Software-in-the-loop (SIL)")
set_param("AEBTestBench/AEB Controller", ...
    SimulationMode="Software-in-the-loop (SIL)")
When you run the AEBTestBench model, code is generated, compiled, and executed for the AEBSensorFusion, AEBDecisionLogic, and AEBController models. This enables you to test the behavior of the compiled code through simulation.
Explore Additional Scenarios
In this example, you have explored the system behavior for the scenario_23_AEB_PedestrianChild_Nearside_50width and scenario_25_AEB_PedestrianTurning_Nearside_10kph scenarios. This example provides additional scenarios that are compatible with the AEBTestBench model. These scenarios were created using the Driving Scenario Designer app and then exported to scenario files. You can configure the AEBTestBench model and workspace to simulate these scenarios using the helperSLAEBSetup function. For example, to configure the simulation for the scenario_01_AEB_Bicyclist_Longitudinal_25width scenario, enter this command.
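Assuming helperSLAEBSetup accepts the scenario function name as a name-value argument (the scenarioFcnName argument is an assumption based on the variable name used for plotting), the call might be:

```matlab
helperSLAEBSetup(scenarioFcnName="scenario_01_AEB_Bicyclist_Longitudinal_25width");
```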
When you are finished with this example, enable MPC update messages once again.
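Use the same mpcverbosity function to restore them:

```matlab
mpcverbosity("on"); % restore MPC update messages
```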
Hulshof, Wesley, Iain Knight, Alix Edwards, Matthew Avery, and Colin Grover. "Autonomous Emergency Braking Test Results." In Proceedings of the 23rd International Technical Conference on the Enhanced Safety of Vehicles (ESV), Paper Number 13-0168. Seoul, Korea: ESV Conference, 2013.
European New Car Assessment Programme (Euro NCAP). Test Protocol – AEB Systems. Version 2.0.1. Euro NCAP, November 2017.
European New Car Assessment Programme (Euro NCAP). Test Protocol – AEB VRU Systems. Version 2.0.2. Euro NCAP, November 2017.
- Vehicle Body 3DOF (Vehicle Dynamics Blockset) | Driving Radar Data Generator | Vision Detection Generator
- Autonomous Emergency Braking with High-Fidelity Vehicle Dynamics
- Autonomous Emergency Braking with Vehicle Variants
- Autonomous Emergency Braking with RoadRunner Scenario
- Automate Testing for Autonomous Emergency Braking
- Automate Testing for Scenario Variants of AEB System
- Forward Collision Warning Using Sensor Fusion
- Adaptive Cruise Control with Sensor Fusion
- Euro NCAP Driving Scenarios in Driving Scenario Designer