Track Colored Objects Using Pixy2 Vision Sensor

This example shows how to implement an object tracking system using LEGO® MINDSTORMS® EV3™ hardware and a Pixy2 vision sensor.

Introduction

Object tracking is a robotics application that involves interacting with a colored object. Relatively easy to build, an object tracking robot can estimate the trajectory of a colored moving object with the help of a vision sensor, without any human intervention.

This example uses a two-wheeled robot built with LEGO MINDSTORMS EV3 hardware. The robot tracks a colored moving object using a Pixy2 vision sensor, aligning itself with the trained colored object while maintaining a threshold distance.
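The threshold distance behavior rests on a simple geometric idea: the farther away the object, the narrower it appears in the sensor frame. The model handles this internally; as a rough illustration of the underlying geometry only, here is a pinhole-camera sketch in Python. All constants (object width, focal length, threshold) are assumptions for illustration, not values from the model.

```python
# Illustrative sketch (not part of the Simulink model): estimating the
# distance to a tracked object from its apparent width in the Pixy2 frame,
# using a pinhole-camera approximation.

def estimate_distance_cm(pixel_width, real_width_cm=5.0, focal_length_px=260.0):
    """Approximate distance to an object of known physical width.

    pixel_width     -- width of the detected object in pixels (from the sensor)
    real_width_cm   -- assumed physical width of the tracked object
    focal_length_px -- assumed focal length of the camera in pixel units
    """
    if pixel_width <= 0:
        raise ValueError("pixel_width must be positive")
    return real_width_cm * focal_length_px / pixel_width

def too_close(pixel_width, threshold_cm=20.0):
    """Return True when the object is nearer than the threshold distance."""
    return estimate_distance_cm(pixel_width) < threshold_cm
```

With these assumed constants, an object spanning 65 pixels is estimated at 20 cm, so the robot would stop advancing once the object's reported width grows past that point.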

Prerequisites

Note: This example supports Pixy2 LEGO firmware version 3.0.14.

Required Hardware

  • LEGO MINDSTORMS EV3 Brick

  • Pixy2 Vision Sensor

  • Two LEGO MINDSTORMS EV3 Medium motors

  • Micro USB Cable

  • Pixy2 LEGO Adaptor

Task 1: Set up Robot

  1. Build a two-wheeled robot with a Pixy2 vision sensor mounted on it.

  2. Set up a connection between the EV3 brick and your host machine. For details on how to set up the connection, see Task 2 in the Getting Started with LEGO® MINDSTORMS® EV3™ Hardware example.

Task 2: Set up Pixy2 Vision Sensor Through PixyMon Utility

PixyMon is an application that acts as an interface to help you view what the Pixy2 vision sensor sees as either unprocessed or processed video.

Configure I2C Address: Open the PixyMon utility and navigate to File > Configure > Pixy Parameters (saved on Pixy) > Interface. Set the I2C address to a value between 0x54 and 0x57.

Configure Sensor to Detect Objects: You can use the PixyMon utility to configure the sensor for seven unique colored objects. For more information on how to train the sensor for Color Signature 1, refer to Train Pixy2.

Configure Line Tracking Mode: Open the PixyMon utility and navigate to File > Configure > Pixy Parameters (saved on Pixy) > Expert. Select the Delayed turn check box to enable intersection detection.

Task 3: Configure Model and Calibrate Parameters

This support package provides a preconfigured model that you can use to train the robot for object tracking.

To open the model, run this command at the MATLAB command prompt:

open_system('ev3_pixy2_tracking')

You can configure the following parameters in the Pixy2 Vision Sensor block:

1. Data Source: Refers to configuring the input parameters

  • Select the input port connected to the Pixy2 vision sensor in the EV3 brick input port parameter on the Pixy2 Vision Sensor block.

  • Set the I2C address parameter to the same I2C address that you configured in PixyMon.

  • The default Tracking mode for the Pixy2 vision sensor is Color Signature. You can also use other tracking options, such as Color Code Signature, Color and Color Code Signature (Mixed), and Line.

  • The default Color Signature is 1. You can use ALL to track all the color signatures.

  • Set the Sample time parameter to specify how often the block reads values from the Pixy2 vision sensor.

2. Algorithm: Contains the Object Tracking subsystem, which uses the Pixy2 X Frame Reference and Reference Motor Power values from the Parameter Tuning area.

open_system('ev3_pixy2_tracking/Object Tracking')

  • Use the Pixy2 X Frame Reference slider in the Parameter Tuning area to adjust the frame of view for the Pixy2 sensor. The default alignment is center. The central view point of the sensor shifts to the left as you move the slider to the left, and vice versa. The xy-coordinates, width, and height of the object are computed using the frame of view of the sensor.

  • Use the Reference Motor Power slider in the Parameter Tuning area to control (slow down or speed up) the motor movement. The default value is 7. This value is an input to the Alignment Control block.
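The steps above amount to a simple steering rule: the horizontal offset between the object's x-coordinate and the Pixy2 X Frame Reference biases the two motor powers so the robot turns toward the object. The following Python sketch illustrates that idea only; the actual control is implemented by the Alignment Control block in the model, and the gain value here is a hypothetical tuning constant.

```python
# Illustrative sketch of proportional alignment: bias the left and right
# motor powers by the horizontal tracking error so the robot turns toward
# the tracked object. The gain is an assumption for illustration.

def alignment_control(object_x, x_reference, base_power=7, gain=0.05):
    """Return (left_power, right_power) that steer toward the object.

    object_x    -- x-coordinate of the tracked object reported by the sensor
    x_reference -- Pixy2 X Frame Reference value (frame center by default)
    base_power  -- Reference Motor Power value (default 7, as in the model)
    gain        -- proportional gain (hypothetical tuning constant)
    """
    error = object_x - x_reference      # positive when the object is to the right
    left = base_power + gain * error    # speed up the left wheel to turn right
    right = base_power - gain * error   # slow down the right wheel
    return left, right
```

When the object sits exactly at the reference, both motors receive the base power and the robot drives straight; any offset produces a differential that rotates the robot back toward the object.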

3. Actuators: Refers to controlling the LEGO motors connected to the output ports

open_system('ev3_pixy2_tracking/Motors')

Ensure that you select the correct EV3 brick output port parameter for the LEGO motors, labeled Left Motor and Right Motor.

Task 4: Run Model on Robot

  1. On the Hardware tab of your Simulink model, click Monitor & Tune to run the model on the LEGO MINDSTORMS EV3 hardware.

  2. Place the colored object anywhere in the field of view of the Pixy2 vision sensor. Verify that the robot follows the object and corrects its trajectory whenever it loses track of the object.

Other Things to Try

  1. Try training the sensor with objects of different color signatures. You can observe the results in real time.

  2. Object Tracking Using PID Controller: Use the PID Controller block to further improve how the robot tracks the moving object. A PID controller improves the response time and provides smoother, more accurate motor control. To open the model, run this command at the MATLAB command prompt:

    open_system('ev3_pixy2_tracking_pidcontrol')
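To illustrate what the PID Controller block adds over plain proportional steering, here is a minimal discrete PID sketch in Python. The gains and sample time are assumptions for illustration, not values taken from the model.

```python
# Illustrative discrete PID controller for the tracking error, in the
# spirit of the PID Controller block used by the model above. All gains
# and the sample time are hypothetical tuning values.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0     # accumulated error (integral term state)
        self.prev_error = 0.0   # last error (derivative term state)

    def step(self, error):
        """Advance the controller one sample and return the control output."""
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Hypothetical usage: derive a steering correction from the x-position error.
pid = PID(kp=0.1, ki=0.01, kd=0.005, dt=0.1)
correction = pid.step(20.0)  # object 20 pixels right of the reference
```

The integral term removes steady-state offset (the robot settles exactly on the reference) and the derivative term damps overshoot, which is why the PID variant tracks a moving object more smoothly than proportional control alone.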

See Also

LEGO EV3 Object Tracking System Using Pixy2 Vision Sensor and PID Controller