laneBoundaries
Get lane boundaries relative to host vehicle from RoadRunner Scenario
Since R2023a
Syntax
lbdrys = laneBoundaries(sensorSim,sensorID)
lbdrys = laneBoundaries(sensorSim,sensorID,Name=Value)
Description
lbdrys = laneBoundaries(sensorSim,sensorID) gets the lane boundaries lbdrys of the lane in which the vehicle actor hosting the sensor specified by sensorID is traveling. The lane boundaries are in the coordinate system of the host vehicle. The SensorSimulation object sensorSim specifies the RoadRunner scenario that contains the host vehicle actor.
Note
If you modify the RoadRunner scene after creating the SensorSimulation object, you must follow these steps for the changes to reflect correctly in MATLAB®:
1. Save and close RoadRunner.
2. Clear the ScenarioSimulation object from the MATLAB workspace.
3. Reopen the RoadRunner scenario.
4. Recreate the ScenarioSimulation object and the SensorSimulation objects.
lbdrys = laneBoundaries(sensorSim,sensorID,Name=Value) specifies additional options using one or more name-value arguments.
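For example, a minimal sketch of both syntaxes, assuming sensorSim is the SensorSimulation object of a running scenario and a sensor with index 1 has already been added to the host vehicle using addSensors:

lbdrys = laneBoundaries(sensorSim,1);                                     % ego lane, host vehicle coordinates
lbdrysAdj = laneBoundaries(sensorSim,1,OutputOption="EgoAdjacentLanes");  % ego lane plus adjacent lanes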
Examples
Define sensor models in MATLAB®, and add them to vehicle actors in a RoadRunner scenario. Then, obtain ground truth measurements from RoadRunner Scenario and process them into detections for visualization.
Set Up RoadRunner Scenario — MATLAB Interface
Configure your RoadRunner installation and project folder properties. Open the RoadRunner app.
rrInstallationPath = "C:\Program Files\RoadRunner R2024a\bin\win64";
rrProjectPath = "D:\RR\TestProjects";
s = settings;
s.roadrunner.application.InstallationFolder.PersonalValue = rrInstallationPath;
rrApp = roadrunner(rrProjectPath);
To open the scenario this example uses, you must add the TrajectoryCutIn-longRun.rrscenario
file from the example folder to your RoadRunner project folder. Then, open the scenario.
copyfile("TrajectoryCutIn-longRun.rrscenario",fullfile(rrProjectPath,"Scenarios/"))
openScenario(rrApp,"TrajectoryCutIn-longRun")
Create a ScenarioSimulation
object to connect MATLAB to the RoadRunner Scenario simulation and set the step size.
scenarioSim = createSimulation(rrApp);
Connection status: 1
Connected to RoadRunner Scenario server on localhost:60730, with client id {761e01bc-376c-4b4b-8ffa-aa1490d7438d}
stepSize = 0.1;
set(scenarioSim,"StepSize",stepSize);
Create a SensorSimulation
object to control the sensor configuration for the RoadRunner Scenario simulation.
sensorSim = get(scenarioSim,"SensorSimulation");
To use the GPU on your device for hardware-accelerated raytracing, set the UseGPU property of the SensorSimulation object to "on". This enables supported sensors, such as lidar sensors, to use the GPU for raytracing.
sensorSim.UseGPU = "on";
If you modify the RoadRunner scene after creating the SensorSimulation object, you must follow these steps for the changes to reflect correctly in MATLAB:
1. Save and close the RoadRunner application.
2. Clear the ScenarioSimulation object from the MATLAB workspace.
3. Reopen the RoadRunner scenario.
4. Recreate the ScenarioSimulation object and the SensorSimulation object.
Configure Sensors and Add to RoadRunner Scenario
Configure sensor models for vision, radar, and lidar sensors to add to the ego vehicle using visionDetectionGenerator, drivingRadarDataGenerator, and lidarPointCloudGenerator objects, respectively. Specify a unique ID for each sensor.
visionSensor = visionDetectionGenerator(SensorIndex=1, ...
    SensorLocation=[2.4 0], MaxRange=50, ...
    DetectorOutput="Lanes and objects", ...
    UpdateInterval=stepSize);
radarSensor = drivingRadarDataGenerator(SensorIndex=2, ...
    MountingLocation=[1.8 0 0.2], FieldOfView=[80 5], ...
    AzimuthResolution=1, UpdateRate=1/stepSize);
lidarSensor = lidarPointCloudGenerator(SensorIndex=3,UpdateInterval=stepSize);
Add the sensors to the ego vehicle actor in the RoadRunner scenario. Specify the Actor ID property for the vehicle.
egoVehicleID = 1;
addSensors(sensorSim,{visionSensor,radarSensor,lidarSensor},egoVehicleID);
Simulate RoadRunner Scenario and Visualize Sensor Data
To visualize sensor data at each time step of the simulation, add an observer to the RoadRunner scenario.
The helperSensorObserver system object implements the observer behavior. At the first time step, the system object initializes the bird's-eye plot for visualization. Then, at each time step, the system object:
1. Retrieves target poses in the sensor range using the targetPoses function.
2. Processes the target poses into detections using the sensor models.
3. Visualizes detections and ground truth lane boundaries using birdsEyePlot.
observer = addObserver(scenarioSim,"VisualizeSensorData","helperSensorObserver");
Start the scenario.
set(scenarioSim,"SimulationCommand","Start");
Helper Functions
The helperSetupBEP function creates a bird's-eye plot and configures all the plotters for visualization.
The helperPlotLaneBoundaries function plots the lane boundaries on the bird's-eye plot.
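The helper functions themselves are not listed in this example. As a reference, this is a minimal sketch of what helperPlotLaneBoundaries might look like, assuming the plotter argument is the laneBoundaryPlotter created by helperSetupBEP and the boundaries are the structures returned by laneBoundaries:

function helperPlotLaneBoundaries(lbPlotter,gTruthLbs)
% Minimal sketch (assumed implementation, not the shipped helper).
% Gather the Coordinates of each ground-truth boundary into a cell array
% and plot them on the bird's-eye plot.
coords = arrayfun(@(lb)lb.Coordinates,gTruthLbs,UniformOutput=false);
plotLaneBoundary(lbPlotter,coords)
end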
The helperSensorObserver system object implements the visualization of sensor data during the RoadRunner scenario simulation.
type("helperSensorObserver.m")
classdef helperSensorObserver < matlab.System
    properties(Access=private)
        currSimTime
        scenarioSimObj
        sensorSimObj
        visionSensor
        radarSensor
        lidarSensor
        visionDetPlotter
        radarDetPlotter
        pcPlotter
        lbGTPlotter
        lbDetPlotter
        bepAxes
    end

    methods
        % Constructor
        function obj = helperSensorObserver()
        end
    end

    methods(Access=protected)
        function interface = getInterfaceImpl(~)
            import matlab.system.interface.*;
            interface = ActorInterface;
        end

        % Get ScenarioSimulation and SensorSimulation objects, set up Bird's-Eye-Plot
        function setupImpl(obj)
            obj.scenarioSimObj = Simulink.ScenarioSimulation.find('ScenarioSimulation','SystemObject',obj);
            obj.sensorSimObj = obj.scenarioSimObj.get('SensorSimulation');
            obj.visionSensor = evalin("base","visionSensor");
            obj.radarSensor = evalin("base","radarSensor");
            obj.lidarSensor = evalin("base","lidarSensor");
            [obj.visionDetPlotter,obj.radarDetPlotter,obj.pcPlotter,obj.lbGTPlotter,obj.lbDetPlotter,obj.bepAxes] = helperSetupBEP(obj.visionSensor,obj.radarSensor);
            legend(obj.bepAxes,"show")
            obj.currSimTime = 0;
        end

        function releaseImpl(obj)
            obj.lidarSensor.release;
            obj.radarSensor.release;
            obj.visionSensor.release;
        end

        function stepImpl(obj)
            % Get current Simulation Time
            obj.currSimTime = obj.scenarioSimObj.get("SimulationTime");

            % Get ground truth target poses and lane boundaries from the sensor
            tgtPoses1 = targetPoses(obj.sensorSimObj,1);
            tgtPoses2 = targetPoses(obj.sensorSimObj,2);
            gTruthLbs = laneBoundaries(obj.sensorSimObj,1,OutputOption="EgoAdjacentLanes",inHostCoordinate=true);

            if ~isempty(gTruthLbs)
                % Get detections from vision and radar sensors
                [visionDets,numVisionDets,visionDetsValid,lbDets,numLbDets,lbDetsValid] = obj.visionSensor(tgtPoses1,gTruthLbs,obj.currSimTime);
                [radarDets,numRadarDets,radarDetsValid] = obj.radarSensor(tgtPoses2,obj.currSimTime);

                % Get point cloud from lidar sensor
                [ptCloud,ptCloudValid] = obj.lidarSensor();

                % Plot ground-truth and detected lane boundaries
                helperPlotLaneBoundaries(obj.lbGTPlotter,gTruthLbs)

                % Plot vision and radar detections
                if visionDetsValid
                    detPos = cellfun(@(d)d.Measurement(1:2),visionDets,UniformOutput=false);
                    detPos = vertcat(zeros(0,2),cell2mat(detPos')');
                    plotDetection(obj.visionDetPlotter,detPos)
                end
                if lbDetsValid
                    plotLaneBoundary(obj.lbDetPlotter,vertcat(lbDets.LaneBoundaries))
                end
                if radarDetsValid
                    detPos = cellfun(@(d)d.Measurement(1:2),radarDets,UniformOutput=false);
                    detPos = vertcat(zeros(0,2),cell2mat(detPos')');
                    plotDetection(obj.radarDetPlotter,detPos)
                end

                % Plot lidar point cloud
                if ptCloudValid
                    plotPointCloud(obj.pcPlotter,ptCloud);
                end
            end
        end
    end
end
Input Arguments
sensorSim — Sensor simulation object
Sensor simulation object, specified as a SensorSimulation object. The sensor simulation object must correspond to the desired scenario created in RoadRunner Scenario.
sensorID — Unique index of the sensor
Unique index of the sensor in the RoadRunner scenario, specified as a positive scalar. This value must be the same as the SensorIndex property of the detection generator object that corresponds to the desired sensor.
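For example, a sketch of this correspondence, assuming an ego vehicle actor with ID 1 and a vision sensor configured with SensorIndex=1:

visionSensor = visionDetectionGenerator(SensorIndex=1);  % sensor index 1
egoVehicleID = 1;                                        % assumed ego actor ID
addSensors(sensorSim,{visionSensor},egoVehicleID);
lbdrys = laneBoundaries(sensorSim,1);                    % sensorID matches SensorIndex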
Name-Value Arguments
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.
Example: lbdrys = laneBoundaries(sensorSim,1,OutputOption="EgoAdjacentLanes") returns the boundaries for the current and adjacent lanes of the vehicle actor hosting sensor 1.
OutputOption — Lane boundaries to output
Lane boundaries to output, specified as one of these options:
"EgoLane" — The function returns the boundaries of the lane in which the host vehicle actor is traveling.
"EgoAdjacentLanes" — The function returns the boundaries of the adjacent left and right lanes, along with the lane in which the host vehicle actor is traveling.
"AllLanes" — The function returns the boundaries of all lanes on the road.
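For example, a sketch comparing the three options, assuming sensor 1 is hosted on the ego vehicle:

egoLbs = laneBoundaries(sensorSim,1,OutputOption="EgoLane");          % ego lane only
adjLbs = laneBoundaries(sensorSim,1,OutputOption="EgoAdjacentLanes"); % ego lane plus adjacent lanes
allLbs = laneBoundaries(sensorSim,1,OutputOption="AllLanes");         % all lanes on the road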
inHostCoordinate — Enable lane boundaries in host vehicle coordinates
Enable the lane boundaries to be returned in host vehicle coordinates, specified as a logical 1 (true) or 0 (false). Specify this argument as false or 0 to return the lane boundaries in the world coordinates of the RoadRunner scenario.
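For example, a sketch that returns the boundaries in the world coordinates of the scenario instead of host vehicle coordinates, assuming sensor 1 is hosted on the ego vehicle:

worldLbs = laneBoundaries(sensorSim,1,inHostCoordinate=false);  % world coordinates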
Output Arguments
lbdrys — Lane boundaries
Lane boundaries, returned as an array of lane boundary structures. This table shows the fields of each structure.
Field | Description
Coordinates | Lane boundary coordinates, returned as a real-valued N-by-3 matrix, where N is the number of lane boundary coordinates. Lane boundary coordinates define the position of points on the boundary at returned longitudinal distances away from the ego vehicle, along the center of the road. This matrix also includes the boundary coordinates at zero distance from the ego vehicle. These coordinates are to the left and right of the ego vehicle origin, which is located under the center of the rear axle. Units are in meters.
Curvature | Lane boundary curvature at each row of the Coordinates matrix, returned as a real-valued N-by-1 vector, where N is the number of lane boundary coordinates. Units are in radians per meter.
CurvatureDerivative | Derivative of lane boundary curvature at each row of the Coordinates matrix, returned as a real-valued N-by-1 vector, where N is the number of lane boundary coordinates. Units are in radians per square meter.
HeadingAngle | Initial lane boundary heading angle, returned as a real scalar. The heading angle of the lane boundary is relative to the ego vehicle heading. Units are in degrees.
LateralOffset | Lateral offset of the ego vehicle position from the lane boundary, returned as a real scalar. An offset to a lane boundary to the left of the ego vehicle is positive. An offset to the right of the ego vehicle is negative. Units are in meters. For example, an ego vehicle can be offset 1.5 meters from the left lane boundary and 2.1 meters from the right lane boundary.
BoundaryType | Type of lane boundary marking.
Strength | Saturation strength of the lane boundary marking, returned as a real scalar from 0 to 1. A value of 0 corresponds to a marking whose color is fully unsaturated (gray), and a value of 1 corresponds to a fully saturated marking color.
Width | Lane boundary width, returned as a positive real scalar. In a double-line lane marker, the same width is used for both lines and for the space between lines. Units are in meters.
Length | Length of dash in dashed lines, returned as a positive real scalar. In a double-line lane marker, the same length is used for both lines.
Space | Length of space between dashes in dashed lines, returned as a positive real scalar. In a dashed double-line lane marker, the same space is used for both lines.
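For example, a sketch that inspects the first returned boundary structure, assuming lbdrys was obtained for sensor 1 during a simulation step:

lbdrys = laneBoundaries(sensorSim,1);
lb = lbdrys(1);                                % first returned boundary structure
lb.LateralOffset                               % signed lateral offset from the host vehicle, in meters
plot(lb.Coordinates(:,1),lb.Coordinates(:,2))  % boundary points in host vehicle coordinates
axis equal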
Version History
Introduced in R2023a
See Also