Build Map from 2-D Lidar Scans Using SLAM

This example shows how to implement the SLAM algorithm on a series of 2-D lidar scans using scan processing and pose graph optimization (PGO). The goal of this example is to estimate the trajectory of the robot and build a map of the environment.

SLAM stands for simultaneous localization and mapping.

  • Localization — Estimating the pose of a robot in a known environment.

  • Mapping — Building the map of an unknown environment from a known robot pose and sensor data.

In the SLAM process, a robot creates a map of an environment while localizing itself. SLAM has a wide range of applications in robotics, self-driving cars, and UAVs.

In offline SLAM, a robot steers through an environment and records the sensor data. The SLAM algorithm processes this data to compute a map of the environment. The map is stored and later used for localization and path planning during actual robot operation.

This example uses a 2-D offline SLAM algorithm. The algorithm incrementally processes recorded lidar scans and builds a pose graph to create a map of the environment. To overcome the drift accumulated in the estimated robot trajectory, the example uses scan matching to recognize previously visited places and then uses this loop closure information to optimize poses and update the map of the environment. To optimize the pose graph, this example uses a 2-D pose graph optimization function from Navigation Toolbox™.

In this example, you learn how to:

  • Estimate robot trajectory from a series of scans using scan registration algorithms.

  • Optimize the drift in the estimated robot trajectory by identifying previously visited places (loop closures).

  • Visualize the map of the environment using scans and their absolute poses.

Load Laser Scans

This example uses data collected in an indoor environment using a Jackal™ robot from Clearpath Robotics™. The robot is equipped with a SICK™ TiM-511 laser scanner with a maximum range of 10 meters. Load the wareHouse.mat file containing the laser scans into the workspace.

data = load("wareHouse.mat");
scans = data.wareHouseScans;
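Before processing, you can verify the loaded data by inspecting and plotting an individual scan. This is an optional check and assumes each element of scans is a lidarScan object, which supports the plot function.

```matlab
% Check the number of loaded scans and plot the first one
numScans = numel(scans)
figure
plot(scans{1})
title("First Lidar Scan")
```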

Estimate Robot Trajectory

Create a lidarscanmap object. Using this object, you can:

  • Store and add lidar scans incrementally.

  • Detect, add, and delete loop closures.

  • Find and update the absolute poses of the scans.

  • Generate and visualize a pose graph.

Specify the maximum lidar range and grid resolution values. You can modify these values to fine-tune the map of the environment. Use these values to create the lidar scan map.

maxLidarRange = 8;
gridResolution = 20;
mapObj = lidarscanmap(gridResolution,maxLidarRange);

Incrementally add scans from the input data to the lidar scan map object by using the addScan function. This function rejects a scan if it is too close to the previously accepted scan.

for i = 1:numel(scans)
    isScanAccepted = addScan(mapObj,scans{i});
    if ~isScanAccepted
        continue
    end
end
Reconstruct the scene by plotting the scans and poses tracked by the lidar scan map.

hFigMap = figure;
axMap = axes(Parent=hFigMap);
show(mapObj,Parent=axMap);
title(axMap,"Map of the Environment and Robot Trajectory")

Notice that the estimated robot trajectory drifts over time. The drift can be due to any of these reasons:

  • Noisy scans from the sensor, without sufficient overlap.

  • Absence of significant features in the environment.

  • An inaccurate initial transformation, especially when the rotation between scans is significant.

Drift in the estimated trajectory results in an inaccurate map of the environment.

Drift Correction

Correct the drift in the trajectory by accurately detecting loop closures, which occur when the robot returns to a previously visited place. Add loop closure edges to the lidarscanmap object to correct the drift in the trajectory during pose graph optimization.

Loop Closure Detection

Loop closure detection determines whether, for a given scan, the robot has previously visited the current location. The search matches the current scan against previous scans within the specified search radius loopClosureSearchRadius around the current robot location. Accept a scan as a match if its match score is greater than the specified threshold loopClosureThreshold.

You can detect loop closures by using the detectLoopClosure function of the lidarscanmap object, and then add them to the map object by using the addLoopClosure function.

You can increase the loopClosureThreshold value to avoid false positives in loop closure detection, but the function can still return bad matches in environments with similar or repeated features. To address this, increase the loopClosureSearchRadius value to search a larger radius around the current scan for loop closures, though this increases computation time.

You can also specify the number of loop closure matches loopClosureNumMatches. All of these parameters help in fine-tuning the loop closure detections.

loopClosureThreshold = 110;
loopClosureSearchRadius = 2;
loopClosureNumMatches = 1;
mapObjLoop = lidarscanmap(gridResolution,maxLidarRange);
for i = 1:numel(scans)
    isScanAccepted = addScan(mapObjLoop,scans{i});
    % Detect loop closure if scan is accepted
    if isScanAccepted
        [relPose,matchScanId] = detectLoopClosure(mapObjLoop, ...
            MatchThreshold=loopClosureThreshold, ...
            SearchRadius=loopClosureSearchRadius, ...
            NumMatches=loopClosureNumMatches);
        % Add loop closure to map object if relPose is estimated
        if ~isempty(relPose)
            addLoopClosure(mapObjLoop,relPose,matchScanId);
        end
    end
end
Optimize Trajectory

Create a pose graph object from the lidar scan map with loop closures by using the poseGraph function. Use the optimizePoseGraph (Navigation Toolbox) function to optimize the pose graph.

pGraph = poseGraph(mapObjLoop);
updatedPGraph = optimizePoseGraph(pGraph);
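As a quick check before relying on the optimized result, you can inspect the pose graph to confirm that loop closure edges were actually added. This sketch assumes the poseGraph object exposes the NumNodes, NumEdges, and NumLoopClosureEdges properties from Navigation Toolbox.

```matlab
% Confirm the pose graph contains nodes, edges, and loop closure edges.
% If NumLoopClosureEdges is 0, revisit the loop closure parameters.
disp(pGraph.NumNodes)
disp(pGraph.NumEdges)
disp(pGraph.NumLoopClosureEdges)
```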

Extract the optimized absolute poses from the pose graph by using the nodeEstimates (Navigation Toolbox) function, and update the trajectory to build an accurate map of the environment.

optimizedScanPoses = nodeEstimates(updatedPGraph);
% Update the absolute scan poses stored in the map object
updateScanPoses(mapObjLoop,optimizedScanPoses);

Visualize Results

Visualize the change in robot trajectory before and after pose graph optimization. The red lines represent loop closure edges.

hFigTraj = figure(Position=[0 0 900 450]);

% Visualize robot trajectory before optimization
axPGraph = subplot(1,2,1,Parent=hFigTraj);
axPGraph.Position = [0.04 0.1 0.45 0.8];
show(pGraph,Parent=axPGraph);
title(axPGraph,"Before PGO")

% Visualize robot trajectory after optimization
axUpdatedPGraph = subplot(1,2,2,Parent=hFigTraj);
axUpdatedPGraph.Position = [0.54 0.1 0.45 0.8];
show(updatedPGraph,Parent=axUpdatedPGraph);
title(axUpdatedPGraph,"After PGO")
axis([axPGraph axUpdatedPGraph],[-6 10 -7 3])
sgtitle("Robot Trajectory",FontWeight="bold")

Visualize the map of the environment and robot trajectory before and after pose graph optimization.

hFigMapTraj = figure(Position=[0 0 900 450]);

% Visualize map and robot trajectory before optimization,
% using the first map object, which contains no loop closures
axOldMap = subplot(1,2,1,Parent=hFigMapTraj);
axOldMap.Position = [0.05 0.1 0.44 0.8];
show(mapObj,Parent=axOldMap);
title(axOldMap,"Before PGO")

% Visualize map and robot trajectory after optimization
axUpdatedMap = subplot(1,2,2,Parent=hFigMapTraj);
axUpdatedMap.Position = [0.56 0.1 0.44 0.8];
show(mapObjLoop,Parent=axUpdatedMap);
title(axUpdatedMap,"After PGO")
axis([axOldMap axUpdatedMap],[-9 18 -10 9])
sgtitle("Map of the Environment and Robot Trajectory",FontWeight="bold")
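As an optional extension beyond this example, you can convert the optimized result into an occupancy grid by using the buildMap (Navigation Toolbox) function, which takes the scans, their absolute poses, the grid resolution, and the maximum lidar range. This sketch assumes scans is a cell array of lidarScan objects and optimizedScanPoses is an N-by-3 matrix of [x y theta] poses.

```matlab
% Build an occupancy grid map from the scans and their optimized poses
occMap = buildMap(scans,optimizedScanPoses,gridResolution,maxLidarRange);
figure
show(occMap)
title("Occupancy Grid Map of the Environment")
```

An occupancy grid in this form can be passed directly to path planners such as those in Navigation Toolbox.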
