
People Tracking Using TI mmWave Radar

This example shows you how to use data captured using the Texas Instruments (TI) mmWave radar for tracking people in an indoor environment. You learn how to use a processing chain to process radar point clouds and estimate the number of people in the scene as well as their states such as position and velocity.

Scenario and Data

In this example, you use a pre-recorded data set captured by a TI AWR6843 mmWave radar mounted on a stationary platform. At each instant, the radar data includes a point cloud and a timestamp in seconds. The data is recorded using the mmWaveRadar class. The point cloud is described using a cell array of objectDetection (Sensor Fusion and Tracking Toolbox) objects, where each objectDetection represents a point. The Measurement property of the objectDetection object in this example represents the azimuth (deg), range (m), and range-rate (m/s) of the detected point. The MeasurementNoise property describes the uncertainty (noise statistics) in the measurement. By default, the mmWaveRadar class assumes that the standard deviations of the noise are the same as the radar resolutions in the azimuth, range, and range-rate dimensions. In addition, each objectDetection contains the mounting information of the radar. In this example, the radar is assumed to be at the origin with its orientation aligned with the X-axis. For more information on how to represent mounting information using objectDetection, refer to the Convert Detections to objectDetection Format (Sensor Fusion and Tracking Toolbox) example.
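
For illustration, the snippet below sketches how a single radar point with mounting information could be constructed manually as an objectDetection. The measurement values and noise variances here are hypothetical placeholders and are not taken from the recorded data set.

% Hypothetical example of a single radar point as an objectDetection.
% The radar is mounted at the origin and aligned with the X-axis.
mountingParams = struct( ...
    'Frame','spherical', ...            % azimuth-range-range-rate measurement frame
    'OriginPosition',[0;0;0], ...       % radar position relative to the tracking frame
    'Orientation',eye(3), ...           % radar aligned with the X-axis
    'HasElevation',false, ...
    'HasVelocity',true);

pointDetection = objectDetection(0, ...              % timestamp (s)
    [10; 5; -0.8], ...                                % [azimuth (deg); range (m); range-rate (m/s)]
    'MeasurementNoise',diag([1 0.04 0.01]), ...       % illustrative resolution-based variances
    'MeasurementParameters',mountingParams);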

The radar data is recorded using the configuration file HelperIndoorRadarConfiguration.cfg, attached with this example. You can download the recorded data set from the following location.

https://ssd.mathworks.com/supportfiles/timmwaveradar/data/IndoorTIRadarData.zip

You can also visualize the scenario used to record the data by playing the video file IndoorTIRadarReferenceVideo.mp4, included in the download above. To record new data using the same configuration file, use the code snippet shown below.

% Configuration file
cfgFile = fullfile(pwd,'HelperIndoorRadarConfiguration.cfg');

% Create sensor 
rdr = mmWaveRadar("TI AWR6843ISK",...
    ConfigFile=cfgFile,...
    ReadMode="oldest",...
    DetectionCoordinates="sensor spherical",...
    EnableDopplerGroups=true,...
    EnableRangeGroups=false,...
    RemoveStaticClutter=false);

% Allocate memory
detectionLog = cell(1,0); 
timeLog = zeros(1,0); 

% Duration of recording in seconds
stopTime = 50; 

% Record until stop time
while isempty(timeLog) || timeLog(end) < stopTime
    [detectionLog{end+1},timeLog(end+1)] = rdr();
end

Download the data files into the current working directory. If you want to place the files in a different folder, change the directory name.

% Download and unzip the data set if not already available.
if ~exist('IndoorTIRadarData','dir')
    dataURL = "https://ssd.mathworks.com/supportfiles/timmwaveradar/data/IndoorTIRadarData.zip";
    datasetFolder = pwd;
    unzip(dataURL, datasetFolder);
end

% Load radar data
load(fullfile(pwd,'IndoorTIRadarData','IndoorTIRadarData.mat'),'detectionLog','timeLog');

% Load video data
vidReader = VideoReader(fullfile(pwd,'IndoorTIRadarData','IndoorTIRadarReferenceVideo.mp4'));

To visualize the radar data, the estimated tracks, and the intermediate steps of the processing chain used in this example, you use the HelperTIRadarTrackingDisplay class attached with this example. The image below shows a sample of the radar point cloud and the camera reference image at a time instant.

display = HelperTIRadarTrackingDisplay('XLimits',[0 15],...
    'YLimits',[-8 8],...
    'MaxRange',25,...
    'CameraReferenceLines',zeros(2,0),...
    'RadarReferenceLines',zeros(2,0));

refImage = read(vidReader, 140);
display(detectionLog{140}, {}, objectTrack.empty(0,1), refImage);

Radar Processing Chain

You use the following processing chain to process raw radar point clouds into tracks.

The static reflections are first removed by inspecting the observed range-rate of each reflection. Estimated tracks are used to avoid filtering out reflections from dynamic objects that stop for short durations. The DBSCAN algorithm then groups the dynamic point cloud from the same object into a cluster. Centroid estimation converts the point cloud from each object into a single point observation with an associated uncertainty. The centroid estimates at each time instant are then tracked by a joint integrated probabilistic data association (JIPDA) tracker. The tracks represent the objects as points in the environment and provide position and velocity estimates.

To describe the noise statistics of the measurements for a point object model, you assume that the azimuth accuracy is about 3 degrees and the range accuracy is about 0.6 meters. These accuracies are intended to describe the error between the true object measurement and the observed measurements. You update the recorded data log to provide the accuracy information using the MeasurementNoise property of the objectDetection. The range-rate accuracy is assumed to be the same as the range-rate resolution of the radar.

for k = 1:numel(detectionLog)
    detections = detectionLog{k};
    for j = 1:numel(detections)
        detections{j}.MeasurementNoise(1,1) = 9; % 3^2 deg^2
        detections{j}.MeasurementNoise(2,2) = 0.36; % 0.6^2 m^2
    end
    detectionLog{k} = detections;
end

You specify the minimum absolute range-rate for a measurement to be declared dynamic.

minRangeRate = 0.5;

Next, you specify the parameters of the DBSCAN algorithm. You use the Mahalanobis distance to cluster measurements into groups with the DBSCAN algorithm. You choose the epsilon threshold for DBSCAN as 3 and the minimum number of points per cluster as 1. Later, you use the partitionDetections (Sensor Fusion and Tracking Toolbox) function to partition the detections at each time instant into groups.

epsilon = 3;
minNumPts = 1;
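
To get a feel for what the partitioning step produces, the short sketch below applies partitionDetections with these parameters to one recorded frame. The choice of frame 140 is arbitrary, reused from the earlier visualization; in the processing loop you apply the same call to the dynamic detections only.

% Illustrative only: partition one recorded frame into clusters.
% The output assigns a cluster index to every detection in the frame.
sampleClusters = partitionDetections(detectionLog{140},epsilon,minNumPts,'Algorithm','DBSCAN');
numClusters = numel(unique(sampleClusters)); % number of groups found in this frame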

You will obtain the centroid of each group using the mergeDetections (Sensor Fusion and Tracking Toolbox) function, which merges all the measurements and their noise statistics in a group using a standard Gaussian mixture merging algorithm.
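
As a rough illustration of what this merging amounts to for a single cluster, the sketch below applies an equal-weight moment-matching merge to three hypothetical points from the same person. The numeric values are made up for illustration; in the processing loop you call mergeDetections directly on the detections and cluster indices.

% Hypothetical cluster of three points from the same person.
z = [ 9.8 10.1 10.3;     % azimuth (deg)
      5.0  5.1  4.9;     % range (m)
     -0.8 -0.7 -0.9];    % range-rate (m/s)
R = repmat(diag([9 0.36 0.01]),1,1,3);  % per-point measurement noise

% Equal-weight Gaussian merge: centroid plus noise and spread of the points.
zMerged = mean(z,2);
RMerged = zeros(3);
for i = 1:size(z,2)
    e = z(:,i) - zMerged;
    RMerged = RMerged + (R(:,:,i) + e*e')/size(z,2);
end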

Next, you create a JIPDA tracker using the trackerJPDA (Sensor Fusion and Tracking Toolbox) System object™. You set the TrackLogic property to enable the 'Integrated' logic. For the filter in the tracker, you use an extended Kalman filter (trackingEKF (Sensor Fusion and Tracking Toolbox)) with a 2-D constant velocity motion model defined in the initPeopleTrackingFilter supporting function. The initPeopleTrackingFilter function defines a low uncertainty in the cross range-rate when initializing the filter and a low process noise to model objects that do not maneuver rapidly.

tracker = trackerJPDA(TrackLogic="Integrated");
tracker.FilterInitializationFcn = @initPeopleTrackingFilter;

Further, specify the clutter density, new target (birth) density, and detection probability of the radar in the tracker. In this example, you use prior knowledge to calculate these numbers. You calculate the volume of the sensor measurement space using the measurement limits in each dimension. You assume that, on average, there are 8 false alarms per step and 1 new target appears in the field of view every 100 steps.

% Volume of measurement space
azSpan = 60;
rSpan = 25;
dopplerSpan = 5;
V = azSpan*rSpan*dopplerSpan;

% Number of false alarms per step
nFalse = 8;

% Number of new targets per step
nNew = 0.01;

% Probability of detecting the object
Pd = 0.9;

tracker.ClutterDensity = nFalse/V;
tracker.NewTargetDensity = nNew/V;
tracker.DetectionProbability = Pd;

Lastly, you specify the track management properties of the tracker to allow the tracker to discriminate between false and true objects and to create and delete tracks when they enter and exit the field of view, respectively.

% Confirm a track with more than 95 percent
% probability of existence
tracker.ConfirmationThreshold = 0.95; 

% Delete a track with less than 0.0001
% probability of existence
tracker.DeletionThreshold = 1e-4;

Now, you loop over the recorded data and run the processing chain to get estimates about the number of objects and their states. You visualize the point cloud, the clustered centroid measurements, as well as the position and velocity of each track.

tracks = objectTrack.empty(0,1);

for k = 1:numel(detectionLog)
    % Timestamp
    time = timeLog(k);

    % Radar detections at the current timestamp
    detections = detectionLog{k};
    
    % Remove static returns
    isDynamic = false(1,numel(detections));
    for d = 1:numel(detections)
        isDynamic(d) = abs(detections{d}.Measurement(3)) > minRangeRate;
    end
    detectionsDynamic = detections(isDynamic);

    % Camera image
    refImage = read(vidReader, k);

    % Cluster detections
    if isempty(detectionsDynamic)
        clusters = zeros(0,1,'uint32');
    else
        clusters = partitionDetections(detectionsDynamic,epsilon,minNumPts,'Algorithm','DBSCAN');
    end
    
    % Centroid estimation
    clusteredDets = mergeDetections(detectionsDynamic, clusters);

    % Track centroid returns
    if isLocked(tracker) || ~isempty(clusteredDets)
        tracks = tracker(clusteredDets, time);
    end

    % Update display
    display(detections, clusteredDets, tracks, refImage);

    if abs((time - 15)) <= 0.05
        im = getframe(gcf);
    end
end

Results

f = figure;
imshow(im.cdata,'Parent',axes(f));

In the display, you can visualize the raw data as well as the results produced by the processing chain. Notice that static reflections are filtered out by the processing chain, and dynamic objects around the radar are tracked: their identities are preserved across time, and their estimated positions and velocities stay close to the point clouds returned by the radar.

Summary

In this example, you learned how to use a processing chain to process point cloud returns from a radar and how to track people in an indoor environment.

Supporting Functions

function filter = initPeopleTrackingFilter(detection)
% Create 3-D filter first
filter3D = initcvekf(detection);

% Create 2-D filter from the 3-D
state = filter3D.State(1:4);
stateCov = filter3D.StateCovariance(1:4,1:4);

% Reduce uncertainty in the cross range-rate direction to a standard deviation of 1 m/s
velCov = stateCov([2 4],[2 4]);
[v, d] = eig(velCov);
D = diag(d);
D(2) = 1;
stateCov([2 4],[2 4]) = v*diag(D)*v';

% Process noise in a slowly changing environment
Q = 0.25*eye(2);

filter = trackingEKF(State = state,...
    StateCovariance = stateCov,...
    StateTransitionFcn = @constvel,...
    StateTransitionJacobianFcn = @constveljac,...
    HasAdditiveProcessNoise = false,...
    MeasurementFcn = @cvmeas,...
    MeasurementJacobianFcn = @cvmeasjac,...
    ProcessNoise = Q,...
    MeasurementNoise = detection.MeasurementNoise);

end