Histogram-based object tracking
The histogram-based tracker incorporates the continuously adaptive mean shift (CAMShift) algorithm for object tracking. It uses the histogram of pixel values to identify the tracked object.
To track an object:
1. Create the vision.HistogramBasedTracker object and set its properties.
2. Call the object with arguments, as if it were a function.
To learn more about how System objects work, see What Are System Objects? (MATLAB).
hbtracker = vision.HistogramBasedTracker returns a tracker that tracks an object by using the CAMShift algorithm. It uses the histogram of pixel values to identify the tracked object. To initialize the tracking process, you must use the initializeObject function to specify an exemplar image of the object.
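The create-initialize-track workflow might look like the following minimal sketch. The image file name and the object region are placeholders, not values from this page:

```matlab
% Create the tracker.
hbtracker = vision.HistogramBasedTracker;

% Initialize it with an exemplar of the object: here, the hue channel of a
% frame that contains the object, plus a region in [x y width height] form.
frame = im2single(imread('firstFrame.png'));   % placeholder file name
hsvFrame = rgb2hsv(frame);
initializeObject(hbtracker, hsvFrame(:,:,1), [40 45 25 25]);

% On later frames, call the object like a function to get the bounding box.
bbox = hbtracker(hsvFrame(:,:,1));
```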
hbtracker = vision.HistogramBasedTracker(Name,Value) sets properties using one or more name-value pairs. Enclose each property name in quotes. For example, hbtracker = vision.HistogramBasedTracker('ObjectHistogram',h), where h is a normalized histogram vector.
Unless otherwise indicated, properties are nontunable, which means you cannot change their
values after calling the object. Objects lock when you call them, and the
release function unlocks them.
If a property is tunable, you can change its value at any time.
For more information on changing property values, see System Design in MATLAB Using System Objects (MATLAB).
ObjectHistogram— Normalized pixel value histogram
[] (default) | N-element vector
Normalized pixel value histogram, specified as an N-element
vector. This vector specifies the normalized histogram of the object's pixel values.
Histogram values must be normalized to a value between 0 and 1. You can use the initializeObject function to set the property.
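Instead of calling initializeObject, you can assign ObjectHistogram directly. The sketch below uses placeholder hue data, and it assumes that when you set the histogram yourself, you establish the initial search window with the initializeSearchWindow function:

```matlab
% Stand-in for the hue values of the object region (real code would crop
% the hue channel of a video frame).
hueROI = rand(25, 25);

% 16-bin histogram over the hue range [0, 1], normalized to [0, 1].
counts = histcounts(hueROI(:), linspace(0, 1, 17));

hbtracker = vision.HistogramBasedTracker;
hbtracker.ObjectHistogram = counts / max(counts);

% Set the initial search window in [x y width height] form.
initializeSearchWindow(hbtracker, [1 1 25 25]);
```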
bbox = hbtracker(I) returns a bounding box of the tracked object. Before using the tracker, you must identify the object to track and set the initial search window. Use the initializeObject function to do this.
I— Video frame
Video frame, specified as a grayscale image or any 2-D feature map that distinguishes the object from the background. For example, I can be the hue channel of the HSV color space.
bbox— Bounding box
Bounding box, returned as a four-element vector in the format [x y width height].
orientation— Object orientation
Orientation, returned as an angle between –pi/2 and pi/2. The angle is measured from the x-axis to the major axis of the ellipse that has the same second-order moments as the object.
score— Confidence score
Score, returned as a scalar in the range [0, 1]. A value of 1 corresponds to the maximum confidence.
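The orientation and score outputs are requested as additional return values from the same call. A sketch, using a placeholder frame in place of real video data:

```matlab
hbtracker = vision.HistogramBasedTracker;

% Placeholder RGB frame; real code would read a video frame instead.
hsvFrame = rgb2hsv(im2single(rand(240, 320, 3)));
initializeObject(hbtracker, hsvFrame(:,:,1), [40 45 25 25]);

% Request the bounding box, orientation, and confidence score together.
[bbox, orientation, score] = hbtracker(hsvFrame(:,:,1));
```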
To use an object function, specify the System object™ as the first input argument. For example, to release system resources of a System object named obj, use this syntax:
release(obj)
Track and display a face in each frame of an input video.
Create System objects for reading and displaying video and for drawing a bounding box of the object.
videoReader = VideoReader('vipcolorsegmentation.avi');
videoPlayer = vision.VideoPlayer();
shapeInserter = vision.ShapeInserter('BorderColor','Custom', ...
    'CustomBorderColor',[1 0 0]);
Read the first video frame, which contains the object. Convert the image to HSV color space. Then define and display the object region.
objectFrame = im2single(readFrame(videoReader));
objectHSV = rgb2hsv(objectFrame);
objectRegion = [40, 45, 25, 25];
objectImage = shapeInserter(objectFrame, objectRegion);
figure
imshow(objectImage)
title('Red box shows object region')
(Optionally, you can select the object region using your mouse. The object must occupy the majority of the region. Use the following command.)
figure
imshow(objectFrame)
objectRegion = round(getPosition(imrect))
Set the object based on the hue channel of the first video frame.
tracker = vision.HistogramBasedTracker;
initializeObject(tracker, objectHSV(:,:,1), objectRegion);
Track and display the object in each video frame. The while loop reads each image frame, converts the image to HSV color space, then tracks the object in the hue channel where it is distinct from the background. Finally, the example draws a box around the object and displays the results.
while hasFrame(videoReader)
    frame = im2single(readFrame(videoReader));
    hsv = rgb2hsv(frame);
    bbox = tracker(hsv(:,:,1));
    out = shapeInserter(frame,bbox);
    videoPlayer(out);
end
Release the video player.
release(videoPlayer);
Bradski, G.R. "Computer Vision Face Tracking For Use in a Perceptual User Interface." Intel Technology Journal. January 1998.
Usage notes and limitations:
See System Objects in MATLAB Code Generation (MATLAB Coder).