MATLAB Answers


Ambiguous Disparity Map and Inadequate 3D Scene Reconstruction

Asked by YILDIRAY YILMAZ on 3 May 2016
Latest activity Commented on by YILDIRAY YILMAZ on 6 May 2016
I'm trying to measure the distance from a camera to an object using stereo images. I used 40 image pairs for calibration with the MATLAB Stereo Camera Calibrator app; the overall mean reprojection error was 1.49 pixels. I applied all of the techniques in MATLAB's "Depth Estimation from Stereo Video" tutorial.
The actual distance from the camera to the object is 2.97 meters, but my program computes 4.32 meters. I suspect something is wrong in my disparity map and 3-D scene reconstruction, because the disparity map is very ambiguous and the point cloud is inadequate. I would appreciate any suggestions. I also tried applying techniques such as median and Wiener filtering to remove noise from the images, as suggested in another Answers topic.
These are my outputs:
This is my code:
Read and Rectify Images
imageLeft = imread('D:\stereo\imgPairLeft1.png');
imageRight = imread('D:\stereo\imgPairRight1.png');
imageLeft = undistortImage(imageLeft, stereoParams.CameraParameters1, ...
    'OutputView', 'same');
imageRight = undistortImage(imageRight, stereoParams.CameraParameters2, ...
    'OutputView', 'same');
[imageLeftRect, imageRightRect] = ...
    rectifyStereoImages(imageLeft, imageRight, stereoParams);
imshow(stereoAnaglyph(imageLeftRect, imageRightRect));
title('Rectified Images');
imageLeftGray = rgb2gray(imageLeftRect);
imageRightGray = rgb2gray(imageRightRect);
imageLeftGrayHisteq = histeq(imageLeftGray);
imageRightGrayHisteq = histeq(imageRightGray);
Compute Disparity
disparityRange = [0 80];
disparityMap = disparity(imageLeftGrayHisteq, imageRightGrayHisteq, ...
    'DisparityRange', disparityRange, 'BlockSize', 15);
imshow(disparityMap, disparityRange);
title('Disparity Map');
colormap jet;
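One thing that often makes a displayed disparity map look noisier than it really is: `disparity` marks pixels where no reliable match was found with `-realmax('single')`. A small sketch (reusing the variable names from the code above) that masks those pixels before display:

```matlab
% Pixels with no reliable match are marked with -realmax('single') by disparity().
% Setting them to NaN keeps them from skewing the displayed range.
unreliable = disparityMap == -realmax('single');
maskedDisparity = disparityMap;
maskedDisparity(unreliable) = NaN;
imshow(maskedDisparity, disparityRange);
title('Disparity Map (unreliable pixels masked)');
colormap jet;
```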
Reconstruct the 3-D Scene
points3D = reconstructScene(disparityMap, stereoParams);
% Convert to meters and create a pointCloud object
points3D = points3D ./ 1000;
ptCloud = pointCloud(points3D, 'Color', imageLeftRect);
title('Point Cloud');
binaryImage = imageLeftGrayHisteq > 0 & imageLeftGrayHisteq < 60;
binaryImage = imfill(binaryImage, 'holes');
% Assign each blob different color
labeledImage = bwlabel(binaryImage, 8);
coloredLabels = label2rgb(labeledImage, 'hsv', 'k', 'shuffle');
Blob Analysis
blobMeasurements = regionprops(labeledImage, imageLeftGrayHisteq, 'all');
numberOfBlobs = size(blobMeasurements, 1);
% Sort the blobs by area; after sorting, the last one is the largest.
[~, index] = sort([blobMeasurements.Area]);
blobMeasurements = blobMeasurements(index);
tvUnitBoundingBox = blobMeasurements(end).BoundingBox;
Determine the distance of the TV unit to the camera. Find the centroid of the TV unit.
centroid = [round(tvUnitBoundingBox(:,1) + tvUnitBoundingBox(:,3) / 2) ...
    round(tvUnitBoundingBox(:,2) + tvUnitBoundingBox(:,4) / 2)];
% Find 3-D world coordinates of the centroids.
centroidsIdx = sub2ind(size(disparityMap), centroid(:,2), centroid(:,1));
X = points3D(:, :, 1);
Y = points3D(:, :, 2);
Z = points3D(:, :, 3);
centroids3D = [X(centroidsIdx)'; Y(centroidsIdx)'; Z(centroidsIdx)'];
% Find the distances from the camera in meters
distanceFromTvUnitToCam = sqrt(sum(centroids3D .^ 2));
% Display the tv unit and its distance.
label = sprintf('%0.2f meters', distanceFromTvUnitToCam);
imshow(insertObjectAnnotation(imageLeftRect, 'rectangle', tvUnitBoundingBox, label));
title('Detected Object');
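As a back-of-the-envelope sanity check on results like this, depth for a rectified stereo pair follows Z = f·B/d (focal length in pixels times baseline, divided by disparity). A toy computation with hypothetical numbers, not taken from this camera rig:

```matlab
% Hypothetical values, for illustration only (not from this setup):
f = 700;      % focal length in pixels
B = 120;      % baseline in mm
d = 28;       % disparity in pixels
Z = f * B / d / 1000   % depth in meters -> 3.0
```

Since depth scales linearly with f and B, a calibration that overestimates either one inflates every measured distance by the same factor, which is consistent with reading 4.32 m for a 2.97 m object.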



1 Answer

Answer by Dima Lisin on 3 May 2016 (edited 3 May 2016)
Accepted Answer

Hi Yildiray,
There are several problems with your calibration:
  1. The checkerboard is not really flat. You have to glue it to a hard flat surface, like a flat sheet of plastic or glass, which does not bend easily.
  2. There seems to be a long delay between taking the two images of each stereo pair. If you look at the calibration images, you can see that your face had time to change expression between the two shots. That means the checkerboard also moved noticeably between the shots, because a person cannot hold anything perfectly still for any length of time. To fix this, either get a "real" stereo camera with hardware triggering, which takes the two images nearly simultaneously, or lean your checkerboard against something so that it does not move between the shots.
  3. You need to have a greater variation of 3D orientations of the checkerboard. In most of your calibration images you are rotating the checkerboard in-plane, but it is always more or less parallel to the image plane. In other words, most of your checkerboards are almost parallel to each other, which does not give you much information for the calibration. You need to slant the checkerboard forward and backward, and left and right.
  4. Finally, when you calibrate you have to look at the reprojection errors graph and exclude the images with really high reprojection errors. Ideally, you want your average reprojection error over all the images to be less than 0.5 pixels.
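For point 4, a sketch of how the per-image errors can be inspected programmatically and high-error pairs excluded, assuming `imagePoints` and `worldPoints` are still available from the checkerboard detection step (the 1.0-pixel threshold is an arbitrary choice for illustration):

```matlab
% Visualize the mean reprojection error per image pair.
showReprojectionErrors(stereoParams);

% Mean error magnitude per image for camera 1 (ReprojectionErrors is M-by-2-by-numImages).
errs = stereoParams.CameraParameters1.ReprojectionErrors;
meanErrPerImage = squeeze(mean(hypot(errs(:,1,:), errs(:,2,:))));

% Keep only the pairs below the threshold and re-calibrate.
keep = meanErrPerImage < 1.0;
stereoParams = estimateCameraParameters(imagePoints(:,:,keep,:), worldPoints);
```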

  1 Comment

Hi Dima,
It works. Thanks to your suggestions, I calculated the distance with almost 100% accuracy. Thank you very much again.
Best wishes,
