Curb Detection in 3-D Lidar Point Cloud
This example shows how to detect curbs in lidar point clouds. A curb is a line of stone or concrete that connects the roadway to the sidewalk. Curbs act as delimiters for the drivable area of the road. In the absence of GPS signals, detecting the curb position is important for path planning and safe navigation in autonomous driving applications.
This example uses a subset of the PandaSet data set from Hesai and Scale. PandaSet contains lidar point cloud scans of various city scenes captured using the Pandar64 sensor.
Introduction
Curb detection in a lidar point cloud involves detecting the left and right curbs of a road with respect to the ego vehicle. The lidar sensor is mounted on top of the vehicle.
Use these steps to detect curbs in the point cloud data captured by the lidar sensor:
Extract a region of interest (ROI)
Classify on-road and off-road points
Identify the road shape using the off-road points
Detect candidate curb points using the on-road points
Download Lidar Data Set
The size of the data set is 5.2 GB, and it contains 2560 preprocessed organized point clouds. Out of these, this example uses a subset of 27 point clouds. Each point cloud is specified as a 64-by-1856-by-3 array. The ground truth data contains the semantic segmentation labels for 13 classes. The point clouds are stored in PCD format, and the ground truth data is stored in PNG format.
Download the PandaSet data set using the helperDownloadPandasetData helper function, defined at the end of this example.
% Set random seed to generate reproducible results
rng("default");
[ptCloudData,labelsData] = helperDownloadPandasetData;
Select the first frame of the data set for further processing. Visualize the point cloud.
ptCloud = ptCloudData{1};
labels = labelsData{1};
% Visualize the input point cloud with corresponding ground truth labels
figure
pcshow(ptCloud.Location,labels)
view(2)
title("Input Lidar Point Cloud")
Preprocess Data
As a preprocessing step for detecting curb points, first extract a region of interest from the point cloud and classify the points within it as on-road or off-road points.
Based on the installed position of the lidar sensor, the point cloud data is sparse beyond a certain distance. To ensure the point cloud you extract is dense enough for further processing, specify an ROI within a limited distance of the sensor.
% Define ROI in meters
xLim = [-35 20];
yLim = [-25 25];
roiIdx = ptCloud.Location(:,:,1) >= xLim(1) & ...
    ptCloud.Location(:,:,1) <= xLim(2) & ...
    ptCloud.Location(:,:,2) >= yLim(1) & ...
    ptCloud.Location(:,:,2) <= yLim(2);
Classify the point cloud into on-road and off-road points using these labels:
1 — Unlabeled
2 — Vegetation
3 — Ground
4 — Road
5 — Road Marking
6 — Sidewalk
7 — Car
8 — Truck
9 — Other Vehicle
10 — Pedestrian
11 — Road Barrier
12 — Signs
13 — Buildings
labels(~roiIdx) = nan;
% The on-road points consist of ground, road, and sidewalk points
onRoadLabelsIdx = (labels == 3 | labels == 4 | labels == 6);
% The off-road points consist of vegetation, road barriers, signs,
% and buildings
offRoadLabelsIdx = (labels == 2 | labels == 11 | ...
    labels == 12 | labels == 13);
% Visualize the point cloud, along with the on-road and off-road points
figure
subplot(1,3,1)
pcshow(select(ptCloud,roiIdx))
view(2)
title("Cropped Input Point Cloud")
subplot(1,3,2)
pcshow(select(ptCloud,onRoadLabelsIdx))
view(2)
title("On-road Point Cloud")
subplot(1,3,3)
pcshow(select(ptCloud,offRoadLabelsIdx))
view(2)
title("Off-road Point Cloud")
If semantic labels are not available with your data set, you can compute them using a deep learning network. For more information, see the Lidar Point Cloud Semantic Segmentation Using SqueezeSegV2 Deep Learning Network example.
Detect Road Shape
Apply the beam model to the off-road points to detect the road shape. For more information about the beam model, see [1] and [2].
Launch a sequence of beams from a launching point. In this example, the launching point is the location of the lidar sensor on the ego vehicle.
Divide the beams into beam zones according to the angular resolution of the sensor. In this example, the angular resolution is 1 degree.
Determine the beam angles and beam lengths with respect to the launching point.
For each beam zone, determine the distance between the launching point and the closest point in that zone, and the distance between the launching point and the farthest point in that zone.
Compute the normalized beam length of a beam zone as the ratio of the shortest to the longest distance between the launching point and a point in that zone, as formalized in the equation after these steps.
Apply a customized version of the toe-finding algorithm [3] to the normalized beam lengths to find the road shape.
The center angle of each toe slice is the road segmentation angle with respect to the launching point.
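For reference, the normalized beam length computed in these steps can be written compactly as follows, where $Z_k$ denotes the set of off-road points falling in the $k$th beam zone (the notation is ours, introduced only for this summary):

$$d_k = \frac{\min_{p \in Z_k} \sqrt{p_x^2 + p_y^2}}{\max_{p \in Z_k} \sqrt{p_x^2 + p_y^2}}, \qquad k = 1, 2, \ldots, 360$$

A value of $d_k$ close to 1 indicates an open road direction, where off-road points are absent or tightly clustered in range, and a small value indicates a direction blocked by off-road objects spread over a range of distances. The toe-finding algorithm searches for peaks in this profile.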
% Select the off-road point cloud
offRoadPtCloud = select(ptCloud,offRoadLabelsIdx);
% Use the helperRoadSegmentationAngles helper function, defined at the end
% of this example, to compute the road segmentation angles
[dk,roadAngles] = helperRoadSegmentationAngles(offRoadPtCloud);
roadAngles = sort(roadAngles);
roadAngles(end+1) = roadAngles(1);
% Visualize the off-road points and the toe slices indicating the road shape
figure
subplot(1,2,1)
pcshow(offRoadPtCloud,BackgroundColor=[1 1 1])
view(2)
title("Off-Road Point Cloud")
subplot(1,2,2)
polarplot(dk)
title("Road Shape")
Detect Road Curbs
Extract the feature curb points from the on-road points. Process each scan line of the on-road point cloud individually to extract the points satisfying the thresholds for these features:
Smoothness Feature [2] — This feature checks the smoothness of the area around a point. A higher smoothness value indicates that the point is an edge point, and a lower value indicates that the point is a plane point. Use the points with smoothness values greater than the specified smoothness threshold for further computation. The smoothness computation is formalized after this list.
Height Difference Feature [2] — This feature checks for the standard deviation and the maximum difference of height around a point. Use the points with these values within the specified bounds for further computation.
Horizontal and Vertical Continuity Feature [1] — This feature checks for the horizontal and vertical continuity of a point with respect to its immediate neighboring points. Use the points with horizontal and vertical distances greater than the specified threshold for further computation.
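As a point of reference, the smoothness value evaluated by the helperCheckSmoothness helper function (defined at the end of this example) can be written as follows, where $p_i$ is the current point and $\mathcal{N}(p_i)$ is the set of its $2N$ neighbors, $N$ on each side of it in the same scan line (notation ours):

$$s_i = \frac{\left\| \sum_{p_j \in \mathcal{N}(p_i)} (p_j - p_i) \right\|}{N \, \|p_i\|}$$

Points with $s_i$ greater than smoothnessThreshold pass the smoothness check.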
% Set up the parameters for detecting the curbs
numScanLines = 64;
numNeighborsOnEachSide = 5;
% Define the threshold for the smoothness feature
smoothnessThreshold = 0.001;
% Define the thresholds for the height difference feature (in meters)
heightThreshold.heightMin = 0.02;
heightThreshold.heightMax = 0.25;
heightThreshold.heightSDMin = 0.02;
heightThreshold.heightSDMax = 0.07;
% Define the threshold for the horizontal and vertical continuity feature
% (in degrees)
angleResolutionThreshold = 0.24;
ptInfluence = -2*numNeighborsOnEachSide;
Apply the height difference, smoothness, and horizontal and vertical continuity feature thresholds to the on-road points.
% Use the helperDetectFeatureCurbs helper function, defined at the
% end of this example, to detect the feature curb points
[featureCurbPoints,validSmoothnessIndices,validHeightDifferenceIndices, ...
    validHorizontalVerticalContinuityIndices] = helperDetectFeatureCurbs( ...
    onRoadLabelsIdx,numScanLines,ptCloud,numNeighborsOnEachSide, ...
    ptInfluence,smoothnessThreshold,heightThreshold,angleResolutionThreshold);
Visualize the detected features.
figure
subplot(2,2,1)
pcshow(select(ptCloud,onRoadLabelsIdx & ...
    validSmoothnessIndices))
view(2)
title("Smoothness Feature")
subplot(2,2,2)
pcshow(select(ptCloud,onRoadLabelsIdx & ...
    validHeightDifferenceIndices))
view(2)
title("Height Difference Feature")
subplot(2,2,3)
pcshow(select(ptCloud,onRoadLabelsIdx & ...
    validHorizontalVerticalContinuityIndices))
view(2)
title("Horizontal and Vertical Continuity Feature")
subplot(2,2,4)
pcshow(featureCurbPoints)
view(2)
title("Feature Curb Points")
Next, compute the candidate curb points from the feature curb points:
Classify the feature curb points into different segments based on the road segmentation angles.
In each segment, define the points closest to the road segmentation lines as seed points. These seed points can be noisy and contain false positive points.
To improve the results, and to compute the candidate curb points, parabolically model the seed curbs in each segment, and use the RANSAC algorithm to remove the false positives that do not match the parabolic model.
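To illustrate this RANSAC filtering step, the following minimal sketch fits a parabola to a hypothetical N-by-2 array of XY seed points and keeps only the inliers. The seedPoints array and the injected outlier are fabricated for illustration; in this example, the filtering happens inside the helperDetectCandidateCurbs supporting file.

% Hypothetical seed curb points for one road segment
x = linspace(0,20,50)';
seedPoints = [x, 3.5 + 0.01*x.^2 + 0.02*randn(50,1)];
seedPoints(5,:) = [4 8];    % inject a false positive
% Fit a second-degree polynomial with RANSAC and discard points farther
% than maxDistance from the fitted curve
maxDistance = 0.1;
[P,inlierIdx] = fitPolynomialRANSAC(seedPoints,2,maxDistance);
filteredSeedPoints = seedPoints(inlierIdx,:);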
% Define the threshold for RANSAC-based filtering
maxDistance = 0.1;
% Define the segment size to find the seed curb points along the road
% segmentation lines
stepSegment = 0.1;
% Define the length of the road segmentation lines to search for the seed
% curb points
roadLength = hypot(diff(featureCurbPoints.XLimits),diff(featureCurbPoints.YLimits))/2;
% Use the helperDetectCandidateCurbs helper function, attached to this
% example as a supporting file, to compute the candidate curb points from
% the feature curb points
candidateCurbPoints = helperDetectCandidateCurbs(roadLength,roadAngles, ...
    featureCurbPoints,stepSegment,maxDistance);
% Visualize the point cloud overlaid with the candidate curb points
figure
pcshow(select(ptCloud,roiIdx).Location,"w")
view(2)
hold on
pcshow(candidateCurbPoints,"r",MarkerSize=200)
view(2)
hold off
title("Candidate Curb Points Detected")
Track Curb Points
Loop through the lidar data to extract and track the candidate curb points. Tracking the curb points improves the robustness of curb detection. You can halt curb tracking on segmented roads, and restart it when the ego vehicle leaves them. Curb tracking involves fitting a second-degree polynomial, y = ax^2 + bx + c, to the XY-points, where a, b, and c are the polynomial parameters. Curb tracking is a two-step process:
Track curb polynomial parameters a and b to control the curvature of the polynomial.
Track curb polynomial parameter c to control the drift of the polynomial.
Update these parameters by using a Kalman filter with a constant acceleration motion model. Use the configureKalmanFilter function to initialize a Kalman filter.
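As a minimal sketch of one filter cycle, the snippet below initializes a constant-acceleration Kalman filter on the curvature parameters and runs one predict-correct step. The initial values a0 and b0 and the measurement aMeas and bMeas are hypothetical; the example wraps this logic, for both the curvature and drift parameters, in the helperTrackCandidateCurbs helper function.

% Hypothetical initial curvature parameters [a0 b0]
a0 = 1e-3; b0 = 1e-2;
curveFilter = configureKalmanFilter("ConstantAcceleration",[a0 b0], ...
    [1 1 1]*1e-1, ...    % initial estimate error
    [1 1 1]*1e-7, ...    % motion noise
    10);                 % measurement noise
% For each new frame: predict, then correct with the newly fitted values
predict(curveFilter);
aMeas = 1.2e-3; bMeas = 1.1e-2;    % hypothetical new measurement
abFiltered = correct(curveFilter,[aMeas bMeas]);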
player = pcplayer([-35 20],[-25 25],[-10 10],MarkerSize=6);
% Initial values for the Kalman filter
initParams.initialEstimateError = [1 1 1]*1e-1;
initParams.motionNoise = [1 1 1]*1e-7;
initParams.measurementNoise = 10;
leftTracker.drift = [];
leftTracker.filteredDrift = [];
leftTracker.curveSmoothness = [];
leftTracker.filteredCurveSmoothness = [];
rightTracker.drift = [];
rightTracker.filteredDrift = [];
rightTracker.curveSmoothness = [];
rightTracker.filteredCurveSmoothness = [];
isTracking = false;
isLeftCurbTrackingInitialised = false;
isRightCurbTrackingInitialised = false;
isPrevAngleSet = false;
for fileIdx = 1:numel(ptCloudData)
    % Load the point cloud and its ground truth labels
    ptCloud = ptCloudData{fileIdx};
    labels = labelsData{fileIdx};
    % Find the points lying within the ROI
    xLim = [-35 20];
    yLim = [-25 25];
    roiIdx = ptCloud.Location(:,:,1) >= xLim(1) & ...
        ptCloud.Location(:,:,1) <= xLim(2) & ...
        ptCloud.Location(:,:,2) >= yLim(1) & ...
        ptCloud.Location(:,:,2) <= yLim(2);
    labels(~roiIdx) = nan;
    % Classify the points into on-road points and off-road points
    onRoadLabelsIdx = (labels == 3 | labels == 4 | labels == 6);
    offRoadLabelsIdx = (labels == 2 | labels == 11 | labels == 12 | ...
        labels == 13);
    offRoadPtCloud = select(ptCloud,offRoadLabelsIdx);
    % Compute the road segmentation angles using the off-road points
    [dk,roadAngles] = helperRoadSegmentationAngles(offRoadPtCloud);
    if ~isempty(roadAngles)
        roadAngles = sort(roadAngles);
        roadAngles(end+1) = roadAngles(1);
        if ~isPrevAngleSet && numel(roadAngles) == 3
            prevAngle = roadAngles;
            isPrevAngleSet = true;
        end
    end
    % Compute the number of road segments in front of and behind the
    % ego vehicle
    numFrontRoadSegments = sum(roadAngles(1:end-1) < 95 | ...
        roadAngles(1:end-1) > 270);
    numRearRoadSegments = sum(~(roadAngles(1:end-1) < 95 | ...
        roadAngles(1:end-1) > 270));
    % Check if curb tracking should be enabled or disabled
    if ~isTracking
        if (numFrontRoadSegments < numRearRoadSegments) || ...
                (numel(roadAngles) == 3)
            isTracking = true;
        end
    else
        if numFrontRoadSegments > numRearRoadSegments
            isTracking = false;
            isLeftCurbTrackingInitialised = false;
            isRightCurbTrackingInitialised = false;
            isPrevAngleSet = false;
        end
        if numel(roadAngles) < 3 && isPrevAngleSet
            roadAngles = prevAngle;
        end
    end
    prevAngle = roadAngles;
    % Compute the candidate curb points using the on-road points
    featureCurbPoints = helperDetectFeatureCurbs(onRoadLabelsIdx, ...
        numScanLines,ptCloud,numNeighborsOnEachSide,ptInfluence, ...
        smoothnessThreshold,heightThreshold,angleResolutionThreshold);
    roadLength = hypot(diff(featureCurbPoints.XLimits), ...
        diff(featureCurbPoints.YLimits))/2;
    candidateCurbPoints = helperDetectCandidateCurbs(roadLength, ...
        roadAngles,featureCurbPoints,stepSegment,maxDistance);
    if isempty(candidateCurbPoints)
        continue;
    end
    % Divide the candidate curb points into front and rear curb points
    % based on the ego vehicle position
    frontCandidateCurbPoints = candidateCurbPoints( ...
        candidateCurbPoints(:,1) >= 0,:);
    rearCandidateCurbPoints = candidateCurbPoints( ...
        candidateCurbPoints(:,1) < 0,:);
    xVal = linspace(min(frontCandidateCurbPoints(:,1)),xLim(2),80);
    if isTracking
        leftCurb = [];
        rightCurb = [];
        model = pcfitplane(select(ptCloud,onRoadLabelsIdx),0.1, ...
            [0 0 1]);
        modelParams = model.Parameters;
        candidateAngles = atan2d(frontCandidateCurbPoints(:,2), ...
            frontCandidateCurbPoints(:,1));
        candidateAngles(candidateAngles < 0) = candidateAngles( ...
            candidateAngles < 0) + 360;
        % Divide the candidate curb points into left and right curb points
        if (roadAngles(1) >= 270 || roadAngles(1) < 90) && ...
                (roadAngles(2) >= 90 && roadAngles(2) < 270)
            idx = candidateAngles >= roadAngles(1) & ...
                candidateAngles < roadAngles(2);
        else
            idx = candidateAngles >= roadAngles(2) | ...
                candidateAngles < roadAngles(1);
        end
        leftCandidateCurbPoints = frontCandidateCurbPoints(idx,:);
        rightCandidateCurbPoints = frontCandidateCurbPoints(~idx,:);
        % Compute polynomials to track the left and right curbs
        leftPolynomial = fitPolynomialRANSAC( ...
            leftCandidateCurbPoints(:,1:2),2,0.1);
        rightPolynomial = fitPolynomialRANSAC( ...
            rightCandidateCurbPoints(:,1:2),2,0.1);
        if ~isempty(leftPolynomial)
            if ~isLeftCurbTrackingInitialised
                [leftFilter,leftCurb] = helperTrackCandidateCurbs( ...
                    leftPolynomial,xVal,modelParams,initParams);
                isLeftCurbTrackingInitialised = true;
            else
                [leftPolynomial,leftCurb,leftTracker] = helperTrackCandidateCurbs( ...
                    leftPolynomial,xVal,modelParams, ...
                    leftFilter,leftTracker);
            end
        end
        if ~isempty(rightPolynomial)
            if ~isRightCurbTrackingInitialised
                [rightFilter,rightCurb] = helperTrackCandidateCurbs( ...
                    rightPolynomial,xVal,modelParams,initParams);
                isRightCurbTrackingInitialised = true;
            else
                [rightPolynomial,rightCurb,rightTracker] = helperTrackCandidateCurbs( ...
                    rightPolynomial,xVal,modelParams, ...
                    rightFilter,rightTracker);
            end
        end
        ptCloud = select(ptCloud,roiIdx);
        % Use the helperVisualiseOutput helper function, attached to this
        % example as a supporting file, to visualize the final output
        outputPtCloud = helperVisualiseOutput(ptCloud,leftCurb,rightCurb, ...
            rearCandidateCurbPoints);
    else
        ptCloud = select(ptCloud,roiIdx);
        % Use the helperVisualiseOutput helper function, attached to this
        % example as a supporting file, to visualize the final output
        outputPtCloud = helperVisualiseOutput(ptCloud,candidateCurbPoints);
    end
    if isOpen(player)
        view(player,outputPtCloud);
        view(player.Axes,[0 90])
        camva(player.Axes,0)
    end
end
Results
To analyze the curb detection results, compare the measured curb polynomial parameters against the tracked ones by plotting them. Each plot compares the parameters with and without Kalman filtering. The first figure compares the drift of the curbs along the y-axis, and the second figure compares the smoothness of the curb polynomials, where smoothness is the rate of change of the slope of the curb curve.
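In terms of the tracked polynomial, both quantities correspond directly to polynomial coefficients:

$$y = ax^2 + bx + c, \qquad \frac{dy}{dx} = 2ax + b, \qquad \frac{d^2y}{dx^2} = 2a$$

So the smoothness plots track the leading coefficient a, and the drift plots track the constant term c, which sets the lateral offset of the curb at x = 0.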
figure
ax = subplot(2,1,1);
plot(leftTracker.drift)
hold on
plot(leftTracker.filteredDrift)
hold off
title("Left Curb Drift Along Y-axis")
ax.Position(2) = ax.Position(2) - 0.08;
ax = subplot(2,1,2);
plot(rightTracker.drift)
hold on
plot(rightTracker.filteredDrift)
hold off
title("Right Curb Drift Along Y-axis")
ax.Position(2) = ax.Position(2) - 0.06;
Lgnd = legend("Drift Values","Filtered Drift Values", ...
    "Orientation","Horizontal");
Lgnd.Position(1:2) = [0.29 0.9];
figure
ax = subplot(2,1,1);
plot(leftTracker.curveSmoothness)
hold on
plot(leftTracker.filteredCurveSmoothness)
hold off
title("Left Curve Smoothness")
ax.Position(2) = ax.Position(2) - 0.08;
ax = subplot(2,1,2);
plot(rightTracker.curveSmoothness)
hold on
plot(rightTracker.filteredCurveSmoothness)
hold off
title("Right Curve Smoothness")
ax.Position(2) = ax.Position(2) - 0.06;
Lgnd = legend("Curve Smoothness","Filtered Curve Smoothness", ...
    "Orientation","Horizontal");
Lgnd.Position(1:2) = [0.23 0.9];
Helper Functions
The helperDownloadPandasetData helper function loads the lidar data set into the MATLAB workspace.
function [ptCloudData,labelsData] = helperDownloadPandasetData()
lidarDataTarFile = matlab.internal.examples.downloadSupportFile("lidar", ...
    "data/Pandaset_LidarData.tar.gz");
filepath = fileparts(lidarDataTarFile);
outputFolder = fullfile(filepath,"Pandaset");
% Check if the tar.gz file is downloaded, but not uncompressed
if (~exist(fullfile(outputFolder,"Lidar"),"file")) ...
        && (~exist(fullfile(outputFolder,"semanticLabels"),"file"))
    untar(lidarDataTarFile,outputFolder);
end
lidarData = fullfile(outputFolder,"Lidar");
labelsFolder = fullfile(outputFolder,"semanticLabels");
ptCloudData = cell(27,1);
labelsData = cell(27,1);
count = 1;
for fileIdx = 1041:1067
    ptCloud = pcread(fullfile(lidarData,fileIdx+".pcd"));
    % This example follows the convention that ego vehicle motion is along
    % the positive x-axis, so transform the point cloud accordingly
    theta = -90;
    rot = [cosd(theta) sind(theta) 0; ...
        -sind(theta) cosd(theta) 0; ...
        0 0 1];
    trans = [0 0 0];
    tform = rigid3d(rot,trans);
    ptCloud = pctransform(ptCloud,tform);
    ptCloudData{count} = ptCloud;
    labelsData{count} = imread(fullfile(labelsFolder,fileIdx+".png"));
    count = count + 1;
end
end
The helperRoadSegmentationAngles helper function returns the road segmentation angles.
function [dk,angles] = helperRoadSegmentationAngles(offRoadPtCloud)
% Compute the angles of all the points in the off-road point cloud with
% respect to the launching point (origin)
roadAngles = atan2d(offRoadPtCloud.Location(:,2), ...
    offRoadPtCloud.Location(:,1));
roadAngles(roadAngles < 0) = roadAngles(roadAngles < 0) + 360;
idx = floor(roadAngles);
minBeamLength = ones(360,1);
maxBeamLength = ones(360,1);
for k = 1:360
    % Use the points in the kth zone to compute the normalized beam length
    zonePtsIdx = idx >= (k-1) & idx < k;
    if any(zonePtsIdx)
        % Shortest distance between the zone points and the launching point
        minBeamLength(k) = min(hypot( ...
            offRoadPtCloud.Location(zonePtsIdx,1), ...
            offRoadPtCloud.Location(zonePtsIdx,2)));
        % Longest distance between the zone points and the launching point
        maxBeamLength(k) = max(hypot( ...
            offRoadPtCloud.Location(zonePtsIdx,1), ...
            offRoadPtCloud.Location(zonePtsIdx,2)));
    end
end
% Compute the normalized beam length
dk = minBeamLength./maxBeamLength;
dk(dk < 0.25) = 0;
dk(dk < mean(dk)) = 0;
peaksStart = find(diff(dk == 1) == 1) + 1;
peaksEnd = find(diff(dk == 1) == -1);
% Find the indices of peaks to be merged
idx = (peaksStart - peaksEnd) < 15;
peaksStart(idx) = [];
peaksEnd(idx) = [];
if isempty(peaksStart) || isempty(peaksEnd)
    angles = [];
else
    if peaksStart(1) ~= 1
        peaksStart = [1; peaksStart];
    end
    if peaksEnd(end) ~= 360
        peaksEnd = [peaksEnd; 360];
    end
    peaks = [peaksStart peaksEnd];
    % Remove the outliers in the peaks computation
    peaks(diff(peaks,[],2) <= 3,:) = [];
    % Compute the road segmentation angles from the peaks
    angles = mean(peaks,2);
    if abs(angles(1) + angles(end) - 360) > 0 && ...
            abs(angles(1) + angles(end) - 360) < 20
        angles(1) = angles(1) + angles(end);
        if angles(1) > 360
            angles(1) = angles(1) - 360;
        end
        angles(end) = [];
    end
    % Update the peaks information
    dk = zeros(360,1);
    for i = 1:size(peaks,1)
        dk(peaks(i,1):peaks(i,2)) = 1;
    end
end
end
The helperCheckSmoothness helper function returns the indices of the points satisfying the smoothnessThreshold condition.
function isSmoothnessValid = helperCheckSmoothness(neighborPtCloud, ...
    currentPt,ptInfluence,numNeighborsOnEachSide,smoothnessThreshold)
isSmoothnessValid = false(size(neighborPtCloud,1),1);
diffX = ptInfluence*currentPt(:,1) + sum(neighborPtCloud(:,1,:),3);
diffY = ptInfluence*currentPt(:,2) + sum(neighborPtCloud(:,2,:),3);
diffZ = ptInfluence*currentPt(:,3) + sum(neighborPtCloud(:,3,:),3);
numerator = vecnorm([diffX diffY diffZ]')';
denominator = numNeighborsOnEachSide*vecnorm(currentPt')';
isSmoothnessValid(numerator./denominator > smoothnessThreshold) = true;
end
The helperCheckHeightDifference helper function returns the indices of the points satisfying the heightThreshold conditions.
function isHeightDifferenceValid = helperCheckHeightDifference( ...
    neighborPtCloud,heightThreshold)
isHeightDifferenceValid = false(size(neighborPtCloud,1),1);
heightDifference = max(squeeze(neighborPtCloud(:,3,:)),[],2) - ...
    min(squeeze(neighborPtCloud(:,3,:)),[],2);
heightStdDev = std(squeeze(neighborPtCloud(:,3,:)),0,2);
isHeightDifferenceValid( ...
    heightStdDev > heightThreshold.heightSDMin & ...
    heightStdDev <= heightThreshold.heightSDMax & ...
    heightDifference >= heightThreshold.heightMin & ...
    heightDifference <= heightThreshold.heightMax) = true;
end
The helperCheckHorizontalVerticalContinuity helper function returns the indices of the points satisfying the horizontalContinuityThreshold and verticalContinuityThreshold conditions.
function isHorizontalVerticalContinuityValid = helperCheckHorizontalVerticalContinuity( ...
    leftPt,currentPt,rightPt,angleResolutionThreshold)
isHorizontalVerticalContinuityValid = false(size(currentPt,1),1);
verticalAngle = atan2d(currentPt(:,3),hypot(currentPt(:,1), ...
    currentPt(:,2)));
horizontalContinuityThreshold = hypot(currentPt(:,1),currentPt(:,2))* ...
    angleResolutionThreshold*(pi/180);
verticalContinuityThreshold = horizontalContinuityThreshold.* ...
    sind(verticalAngle);
distanceLeftHorizontal = sqrt(sum((currentPt(:,1:2) - leftPt(:,1:2)).^2,2));
distanceRightHorizontal = sqrt(sum((currentPt(:,1:2) - rightPt(:,1:2)).^2,2));
distanceLeftVertical = abs(currentPt(:,3) - leftPt(:,3));
distanceRightVertical = abs(currentPt(:,3) - rightPt(:,3));
isHorizontalVerticalContinuityValid(distanceLeftHorizontal >= ...
    horizontalContinuityThreshold & ...
    distanceRightHorizontal >= horizontalContinuityThreshold & ...
    distanceLeftVertical >= verticalContinuityThreshold & ...
    distanceRightVertical >= verticalContinuityThreshold) = true;
end
The helperDetectFeatureCurbs helper function detects the feature curb points from the on-road points.
function [featureCurbPoints,validSmoothnessIndices, ...
    validHeightDifferenceIndices, ...
    validHorizontalVerticalContinuityIndices] = helperDetectFeatureCurbs( ...
    onRoadLabelsIdx,numScanLines,ptCloud,numNeighborsOnEachSide, ...
    ptInfluence,smoothnessThreshold,heightThreshold,angleResolutionThreshold)
% Declare and initialise the variables to store the indices of the points
% satisfying the feature thresholds
validSmoothnessIndices = false(size(onRoadLabelsIdx));
validHeightDifferenceIndices = false(size(onRoadLabelsIdx));
validHorizontalVerticalContinuityIndices = false(size(onRoadLabelsIdx));
for i = 1:numScanLines
    % Extract the points lying in a scan line
    scanLine = squeeze(ptCloud.Location(i,:,:));
    % Extract the on-road points from the scan line points
    onRoadPtsInScanLine = scanLine(onRoadLabelsIdx(i,:),:);
    validPoints = find(onRoadLabelsIdx(i,:));
    if size(onRoadPtsInScanLine,1) >= 2*numNeighborsOnEachSide + 1
        leftPoints = nan(size(onRoadPtsInScanLine,1),3, ...
            numNeighborsOnEachSide);
        rightPoints = nan(size(onRoadPtsInScanLine,1),3, ...
            numNeighborsOnEachSide);
        onRoadPtsInScanLineCyclic = [ ...
            onRoadPtsInScanLine(end:-1:end-numNeighborsOnEachSide+1,:);
            onRoadPtsInScanLine;
            onRoadPtsInScanLine(1:numNeighborsOnEachSide,:)];
        for j = numNeighborsOnEachSide+1: ...
                size(onRoadPtsInScanLineCyclic,1)-numNeighborsOnEachSide
            % Select the left neighbor points of each point
            leftPoints(j-numNeighborsOnEachSide,:,:) = reshape( ...
                onRoadPtsInScanLineCyclic(j-numNeighborsOnEachSide:j-1,:)', ...
                1,3,numNeighborsOnEachSide);
            % Select the right neighbor points of each point
            rightPoints(j-numNeighborsOnEachSide,:,:) = reshape( ...
                onRoadPtsInScanLineCyclic(j+1:j+numNeighborsOnEachSide,:)', ...
                1,3,numNeighborsOnEachSide);
        end
        leftPoints(isnan(leftPoints(:,1,1)),:,:) = [];
        rightPoints(isnan(rightPoints(:,1,1)),:,:) = [];
        % Use the helperCheckSmoothness helper function to find the points
        % satisfying the smoothness feature
        validSmoothnessIndicesInScanLine = helperCheckSmoothness( ...
            cat(3,leftPoints,rightPoints),onRoadPtsInScanLine, ...
            ptInfluence,numNeighborsOnEachSide, ...
            smoothnessThreshold).*validPoints';
        validSmoothnessIndices(i, ...
            validSmoothnessIndicesInScanLine( ...
            validSmoothnessIndicesInScanLine ~= 0)) = true;
        % Use the helperCheckHeightDifference helper function to find the
        % points satisfying the height difference feature
        validHeightDifferenceIndicesInScanLine = helperCheckHeightDifference( ...
            cat(3,leftPoints,rightPoints),heightThreshold).*validPoints';
        validHeightDifferenceIndices(i, ...
            validHeightDifferenceIndicesInScanLine( ...
            validHeightDifferenceIndicesInScanLine ~= 0)) = true;
        % Use the helperCheckHorizontalVerticalContinuity helper function
        % to find the points satisfying the horizontal and vertical
        % continuity feature
        validHorizontalVerticalContinuityIndicesInScanLine = ...
            helperCheckHorizontalVerticalContinuity( ...
            leftPoints(:,:,numNeighborsOnEachSide),onRoadPtsInScanLine, ...
            rightPoints(:,:,1),angleResolutionThreshold).*validPoints';
        validHorizontalVerticalContinuityIndices(i, ...
            validHorizontalVerticalContinuityIndicesInScanLine( ...
            validHorizontalVerticalContinuityIndicesInScanLine ~= 0)) = true;
    end
end
featureCurbPoints = select(ptCloud,onRoadLabelsIdx & ...
    validHorizontalVerticalContinuityIndices & ...
    validSmoothnessIndices & ...
    validHeightDifferenceIndices);
featureCurbPoints = pcdenoise(featureCurbPoints);
end
The helperTrackCandidateCurbs helper function tracks the candidate curb points using the Kalman filter.
function varargout = helperTrackCandidateCurbs(varargin)
polynomial = varargin{1};
xVal = varargin{2};
modelParams = varargin{3};
if numel(varargin) == 4
    initParams = varargin{4};
    curveInitialParameters = polynomial(1:2);
    driftInitialParameters = polynomial(3);
    % Configure the Kalman filter
    filter.curveFilter = configureKalmanFilter( ...
        "ConstantAcceleration",curveInitialParameters, ...
        initParams.initialEstimateError, ...
        initParams.motionNoise, ...
        initParams.measurementNoise);
    filter.driftFilter = configureKalmanFilter( ...
        "ConstantAcceleration",driftInitialParameters, ...
        initParams.initialEstimateError, ...
        initParams.motionNoise, ...
        initParams.measurementNoise);
    varargout{1} = filter;
else
    filter = varargin{4};
    tracker = varargin{5};
    predict(filter.curveFilter);
    predict(filter.driftFilter);
    tracker.drift = [tracker.drift;polynomial(3)];
    tracker.curveSmoothness = [tracker.curveSmoothness;polynomial(1)];
    % Correct the polynomial using the Kalman filter
    updatedCurveParams = correct(filter.curveFilter, ...
        polynomial(1:2));
    updatedDriftParams = correct(filter.driftFilter, ...
        polynomial(3));
    % Update the polynomial
    polynomial = [updatedCurveParams,updatedDriftParams];
    tracker.filteredDrift = [tracker.filteredDrift;polynomial(3)];
    tracker.filteredCurveSmoothness = [tracker.filteredCurveSmoothness;polynomial(1)];
    varargout{1} = polynomial;
    varargout{3} = tracker;
end
% Y-coordinate estimation
yVal = polyval(polynomial,xVal);
% Z-coordinate estimation
zVal = (-modelParams(1)*xVal - ...
    modelParams(2)*yVal - modelParams(4))/modelParams(3);
% Final curb
curb = [xVal' yVal' zVal'];
varargout{2} = curb;
end
References
[1] Zhang, Yihuan, Jun Wang, Xiaonian Wang, and John M. Dolan. "Road-Segmentation-Based Curb Detection Method for Self-Driving via a 3D-LiDAR Sensor." IEEE Transactions on Intelligent Transportation Systems 19, no. 12 (December 2018): 3981–91. https://doi.org/10.1109/TITS.2018.2789462.
[2] Wang, Guojun, Jian Wu, Rui He, and Bin Tian. "Speed and Accuracy Tradeoff for LiDAR Data Based Road Boundary Detection." IEEE/CAA Journal of Automatica Sinica 8, no. 6 (June 2021): 1210–20. https://doi.org/10.1109/JAS.2020.1003414.
[3] Hu, Jiuxiang, Anshuman Razdan, John C. Femiani, Ming Cui, and Peter Wonka. "Road Network Extraction and Intersection Detection From Aerial Images by Tracking Road Footprints." IEEE Transactions on Geoscience and Remote Sensing 45, no. 12 (December 2007): 4144–57. https://doi.org/10.1109/TGRS.2007.906107.