Generate Lane Information from Recorded Data
This example shows how to generate lane information using recorded data. This workflow enables you to add lane specifications to a road network imported from standard definition (SD) map data using recorded data from a camera and a GPS sensor.
Overview
You can use virtual driving scenarios to recreate real scenarios from recorded vehicle data. Generating road networks is an important stage in creating a virtual driving scenario. Using a drivingScenario object or the Driving Scenario Designer app, you can import a road network from OpenStreetMap®, which provides SD map data. However, such road networks lack the detailed lane information that is essential for navigation in an autonomous system. In this example, you create a virtual driving scenario by generating a drivingScenario object from data recorded by a test (ego) vehicle and an OpenStreetMap file. The OpenStreetMap file describes the road network of the area in which the data was recorded.
The recorded data includes:
GPS data — Contains the latitude, longitude, and altitude of the ego vehicle at each timestamp.
Video data — MP4 file recorded from a forward-facing monocular camera mounted on the ego vehicle.
Track data — Contains the detected lane tracks at each timestamp of the ego trajectory in the local coordinate frame of the ego vehicle.
To create lane specifications and simulate a scenario, follow these steps:
Explore the recorded vehicle data.
Import an OpenStreetMap road network into a driving scenario.
Add ego vehicle data from the GPS to the driving scenario.
Identify roads on which the ego vehicle is traveling.
Create lane specifications.
Generate a new scenario.
Simulate and visualize the generated scenario.
This diagram shows how this example uses the recorded data. Note that you create the driving scenario from the GPS data and the OpenStreetMap file.
Explore Recorded Vehicle Data
The data has been captured using a sensor module containing a GPS sensor and a forward-facing camera. The GPS sensor is mounted in the middle of the dashboard of the ego vehicle. The camera in the sensor module returns lane detections in the form of parabolic parameters.
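In the parabolic model used by Automated Driving Toolbox, a lane boundary is described by three parameters [A B C] that define the lateral offset y = A·x² + B·x + C at a longitudinal distance x ahead of the ego vehicle, where A relates to curvature, B to heading angle, and C to the lateral offset of the marking. As a minimal sketch, with illustrative parameter values not taken from the recording:

```matlab
% Illustrative parabolic lane boundary; [A B C] are example values only.
lb = parabolicLaneBoundary([-0.001 0.01 1.8]);
xWorld = 0:10:30;                          % distances ahead of the ego vehicle, in meters
yWorld = computeBoundaryModel(lb,xWorld);  % lateral offset of the marking at each distance
disp([xWorld(:) yWorld(:)])
```

The recorded detections in this example supply curvature, heading angle, and offset values per timestamp, which the visualization code below casts into this [A B C] form.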
Load Data
Define the range of timestamps for the data.
startTimeStamp = 1461634426377778;
endTimeStamp = 1461634462779242;
Load the GPS data, track detections, and image data from their respective MAT files for the selected range of timestamps.
[gpsData,laneDetections,cameraImages] = helperLoadFile(startTimeStamp,endTimeStamp);
Visualize Ego Trajectory
Extract the latitude, longitude, time, and altitude values from the GPS data.
count = size(gpsData,2);
time = arrayfun(@(x) x.timeStamp,gpsData)';
lat = arrayfun(@(x) x.latitude,gpsData)';
lon = arrayfun(@(x) x.longitude,gpsData)';
alt = arrayfun(@(x) x.altitude,gpsData)';
Visualize the recorded GPS data using the geoplayer object.
zoomLevel = 17;
player = geoplayer(lat(1),lon(1),zoomLevel);
plotRoute(player,lat,lon);
for i = 1:length(lat)
    plotPosition(player,lat(i),lon(i));
end
Visualize Lane Detections
The recorded lane detections show information about the driving lane of the ego vehicle. Visualize these detections using a bird's-eye plot.
currentFigure = figure(Name="Lane Detections");
hPlot = axes(uipanel(currentFigure));
bep = birdsEyePlot(XLim=[0 60],YLim=[-35 35],Parent=hPlot);
bepPlotters.LaneLeft = laneBoundaryPlotter(bep, ...
    DisplayName="Left lane marking", ...
    Color="red",LineStyle="-");
bepPlotters.LaneRight = laneBoundaryPlotter(bep, ...
    DisplayName="Right lane marking", ...
    Color="red",LineStyle="-");
olPlotter = outlinePlotter(bep);
for i = 1:size(laneDetections,2)
    plotOutline(olPlotter,[0 0],0,4.7,1.8,OriginOffset=[-1.35 0],Color=[0 0.447 0.741]);
    % Draw left lane boundary
    egoLaneLeft = cast([laneDetections(i).left.curvature,laneDetections(i).left.headingAngle, ...
        laneDetections(i).left.offset],"single");
    bepPlotters.LaneLeft.LineStyle = drivingUtils.getLaneTypeBEV(laneDetections(i).left.boundaryType);
    lb = parabolicLaneBoundary(egoLaneLeft);
    plotLaneBoundary(bepPlotters.LaneLeft,lb);
    % Draw right lane boundary
    egoLaneRight = cast([laneDetections(i).right.curvature,laneDetections(i).right.headingAngle, ...
        laneDetections(i).right.offset],"single");
    bepPlotters.LaneRight.LineStyle = drivingUtils.getLaneTypeBEV(laneDetections(i).right.boundaryType);
    rb = parabolicLaneBoundary(egoLaneRight);
    plotLaneBoundary(bepPlotters.LaneRight,rb);
    pause(0.01);
end
Import OpenStreetMap Road Network into Driving Scenario
The road network file used to generate the virtual scenario has been downloaded from the OpenStreetMap (OSM) website. OpenStreetMap provides access to worldwide, crowd-sourced map data. The data is licensed under the Open Data Commons Open Database License (ODbL). For more information on the ODbL, see the Open Data Commons Open Database License site. Use the latitude and longitude data from the GPS to fetch an OpenStreetMap file containing the corresponding road network information, and use the roadNetwork function to import this road network into a driving scenario.
Create a driving scenario object and import the OSM road network into the generated scenario.
scenario = drivingScenario;
% Fetch SD map according to GPS coordinates
url = ['https://api.openstreetmap.org/api/0.6/map?bbox=' ...
    num2str(min(lon)) ',' num2str(min(lat)) ',' ...
    num2str(max(lon)) ',' num2str(max(lat))];
fileName = websave("drive_map.osm",url,weboptions(ContentType="xml"));
roadNetwork(scenario,"OpenStreetMap",fileName);
Add Ego Vehicle Data from GPS to Imported Scenario
The ego vehicle position data is collected from the GPS sensor and stored as a MAT file. This file specifies the latitude, longitude, altitude in meters, velocity in meters per second, and timestamp in POSIX format for each data instance recorded for the ego vehicle.
Compute the trajectory waypoints of the ego vehicle from the recorded GPS coordinates. Use the latlon2local function to convert the raw GPS coordinates to local east-north-up (ENU) Cartesian coordinates. The transformed coordinates define the trajectory waypoints of the ego vehicle.
origin = [(max(lat) + min(lat))/2,(min(lon) + max(lon))/2,0];
waypoints = zeros(count,3);
% Convert lat and lon to local coordinates to create waypoints for ego vehicle
[waypoints(:,1),waypoints(:,2)] = latlon2local(lat,lon,alt,origin);
% Filter to remove noise
window = round(count*0.2);
waypoints = smoothdata(waypoints,"rloess",window);
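As a rough sanity check of what latlon2local computes, over a small area the east and north displacements from the origin reduce approximately to scaled longitude and latitude differences. This flat-earth sketch is illustrative only; latlon2local performs the accurate geodetic computation:

```matlab
% Rough flat-earth approximation of the east-north conversion (illustrative).
R = 6378137;                                    % WGS84 equatorial radius, in meters
dEast  = R*cosd(origin(1)).*deg2rad(lon - origin(2));
dNorth = R*deg2rad(lat - origin(1));
```

For the small bounding box of a single recording, these values stay close to the toolbox conversion.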
Compute velocity of the ego vehicle.
distancediff = diff(waypoints);
timediff = cast(diff(time),"double")./1000000;
egoSpeed = zeros(count,1);
egoSpeed(2:end) = vecnorm(distancediff./timediff,2,2);
egoSpeed(1) = egoSpeed(2);
Add the ego vehicle to the imported scenario.
egoVehicle = vehicle(scenario,ClassID=1,Mesh=driving.scenario.carMesh);
trajectory(egoVehicle,waypoints,egoSpeed);
Identify Roads Intersecting Ego Path
The imported road network contains many roads. Extract the roads on which the ego vehicle is traveling.
[roads,roadData] = helperGetRoadsInEgoPath(scenario,waypoints);
The helperGetRoadsInEgoPath function extracts the roads on which the ego vehicle is traveling. The function returns the road IDs (roads) and the roadData structure, which contain information such as road centers, road widths, and road names for the relevant roads created in the imported scenario.
When junctions are present in the ego vehicle path, the helperGetRoadsInEgoPath function tries to create a sequence of roads without junctions by extending the roads that connect at each junction. When the function cannot do so, it returns an error.
Create Lane Specifications
This example uses recorded lane detections to create lane specifications. The recorded data provides driving lane information in the vehicle coordinate frame. Initialize these parameters to create lane specifications:
numOfLanes — Number of lanes in the road. This example assumes three lanes that have the same lane width.
startingLane — Lane ID for the first waypoint of the ego vehicle. By default, this example sets this value to 3.
Find Ego Lane
The helperGetEgoLanePosition helper function uses the startingLane argument to compute the lane in which the ego vehicle is traveling. Sudden changes in lane width at points where the ego vehicle changes lanes can cause the function to return an incorrect lane.
numOfLanes = 3;
startingLane = 3;
% Fetch ego pose from imported scenario
[yaw,pitch,roll] = drivingUtils.getEgoPose(scenario,waypoints);
egoPose = struct("yaw",yaw,"pitch",pitch,"roll",roll);
egoLanePosition = helperGetEgoLanePosition(laneDetections,egoPose,waypoints,startingLane,numOfLanes);
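The implementation of helperGetEgoLanePosition is not shown here, but the underlying idea can be sketched: the lateral offset of the detected left boundary changes smoothly while the ego vehicle stays in its lane, and jumps by roughly a lane width during a lane change. This hypothetical sketch tracks the lane index from such jumps; the threshold and the sign convention are assumptions:

```matlab
% Hypothetical sketch: infer the ego lane from jumps in the left-boundary offset.
laneId = startingLane;
prevOffset = laneDetections(1).left.offset;
for k = 2:numel(laneDetections)
    curOffset = laneDetections(k).left.offset;
    jump = curOffset - prevOffset;
    if abs(jump) > 2                      % assumed threshold, about half a lane width
        laneId = laneId - sign(jump);     % assumed convention: positive jump = moved left
    end
    laneId = min(max(laneId,1),numOfLanes);  % clamp to valid lane IDs
    prevOffset = curOffset;
end
```

This also shows why sudden lane-width changes at a lane-change point can fool such logic: they produce offset jumps that mimic or mask a genuine lane change.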
Create Lane Specifications from Lane Detections
Create lane specifications using the helperCreateLaneSpecification function. This function accepts these input arguments: lane detections, ego vehicle waypoints, ego vehicle pose, IDs of the roads on which the ego vehicle is traveling, number of lanes, and the lane in which the ego vehicle is present at each waypoint.
Optionally, you can also specify these name-value arguments:
showAllLanes — Set this argument to true to create lane markings for all of the lanes. Otherwise, set this argument to false. When set to false, the function creates lane markings for only the lane in which data is recorded, which is the ego vehicle lane. By default, the helper function sets this argument to true. When creating non-ego lane markings, the function uses the left-lane track detection for all lane markings between the current left-lane track detection and the left road edge, and the right-lane track detection for all lane markings between the current right-lane track detection and the right road edge.
RoadEdges — Lane marking styles for the left and right road edges, specified as a two-element vector of lane marking objects. By default, the helper function applies solid lane markings to both road edges.
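For reference, a fixed three-lane specification with the same road-edge styles can be built directly with the lanespec and laneMarking functions. The helper assembles a comparable specification per road from the detections; the lane width here is an illustrative value:

```matlab
% Fixed three-lane specification: yellow solid left edge, white solid right
% edge, dashed inner boundaries (three lanes need four markings).
markings = [laneMarking("Solid",Color="y")
            laneMarking("Dashed")
            laneMarking("Dashed")
            laneMarking("Solid")];
ls = lanespec(3,Width=3.5,Marking=markings);   % 3.5 m is an assumed lane width
```

Such a lanespec object is what the "Lanes" argument of the road function expects when the generated scenario is rebuilt later in this example.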
The helperCreateLaneSpecification function returns a structure that stores the lane specifications and IDs of the roads.
% Lane marking styles for road edges
roadEdges = [laneMarking("Solid",Color="y") laneMarking("Solid")];
% Create lane specifications with markings for all lanes
lanespecifications = helperCreateLaneSpecification(laneDetections,waypoints,egoPose,roads,numOfLanes, ...
    egoLanePosition,showAllLanes=true,RoadEdges=roadEdges);
Generate Scenario
Create a new driving scenario. Add the extracted roads, lane specifications, ego vehicle, and ego trajectory to the scenario.
newScenario = drivingScenario;
Add the roads with their computed lane specifications.
for i = 1:length(lanespecifications)
    roadId = lanespecifications{i}.RoadID;
    pos = arrayfun(@(x) x.ID == roadId,roadData);
    road(newScenario,roadData(pos).RoadCenters,"Lanes",lanespecifications{i}.specifications);
end
Waypoints generated from the GPS data are skewed to the left. Add a correction factor to shift the waypoints to the right, and smooth the waypoints to remove any noise.
correction = 3.5;
waypoints = mathUtils.shiftPoints(waypoints,correction,1);
window = round(length(waypoints)*0.2);
waypoints = smoothdata(waypoints,"rloess",window);
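mathUtils.shiftPoints is a utility supplied with the example. Conceptually, each waypoint moves along the local right-hand normal of the path by the correction distance, as in this hypothetical sketch:

```matlab
% Hypothetical sketch: shift waypoints laterally along the path normal.
d = diff(waypoints(:,1:2));              % local heading direction of each segment
d = [d; d(end,:)];                       % repeat last heading for the final point
n = [d(:,2) -d(:,1)]./vecnorm(d,2,2);    % unit right-hand normal of each segment
shifted = waypoints;
shifted(:,1:2) = waypoints(:,1:2) + correction*n;
```

The smoothing that follows the shift then removes any kinks the per-segment normals introduce at sharp heading changes.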
Add the ego vehicle and its waypoints.
egoVehicle = vehicle(newScenario, ...
    ClassID=1, ...
    Mesh=driving.scenario.carMesh, ...
    Length=2, ...
    Width=1);
trajectory(egoVehicle,[waypoints(:,1) waypoints(:,2) waypoints(:,3)],egoSpeed);
Simulate and Visualize Generated Scenario
Run the simulation to visualize the generated driving scenario. The ego vehicle follows the trajectories generated from the GPS data. Verify the lane information generated from the recorded lane detections.
% Visualization
currentFigure = figure(Name="Generated Scenario and Ground Truth Camera Images",Position=[0 0 1400 600]);
movegui(currentFigure,"center");
% Add the chase plot
hCarViewPanel = uipanel(currentFigure,Position=[0.5 0 0.5 1],Title="Chase Plot");
hCarPlot = axes(hCarViewPanel);
chasePlot(egoVehicle,Parent=hCarPlot,ViewPitch=90,ViewHeight=120,ViewLocation=[0 0]);
% Add the top view of the generated scenario
hViewPanel = uipanel(currentFigure,Position=[0 0 0.5 1],Title="Camera View");
hPlot = axes(hViewPanel);
i = 1;
% Camera timestamps relative to the first GPS timestamp, in seconds
cameraTime = cast([cameraImages(:).timeStamp]' - gpsData(1).timeStamp,"double")/10^6;
while advance(newScenario)
    while i <= length(cameraTime) && newScenario.SimulationTime >= cameraTime(i)
        image(cameraImages(i).mov.cdata,Parent=hPlot);
        i = i + 1;
    end
    pause(0.001);
end