
Detect Object Collisions

You can use collision detection to accurately model the physical constraints of real-world objects, so that no two objects occupy the same space at the same time. You can use collision detection node outputs to:

  • Change the state of other virtual world nodes.

  • Apply MATLAB® algorithms to collision data.

  • Drive Simulink® models.

For example, you can use geometric sensors for robotics modeling. For examples of using collision detection, see Collision Detection Using Line Sensor and Differential Wheeled Robot in a Maze.

Set Up Collision Detection

To set up collision detection, define collision (pick) sensors that detect when they collide with targeted surrounding scene objects. The virtual world sensors resemble real-world sensors, such as ultrasonic, lidar, and touch sensors. The Simulink 3D Animation™ sensors are based on X3D sensors (also supported for VRML), as described in the X3D picking component specification. For descriptions of pick sensor output properties that you can access with VR Source and VR Sink blocks, see Use Collision Detection Data in Models.

  • PointPickSensor — Point clouds that detect which of the points are inside colliding geometries

  • LinePickSensor — Ray fans or other sets of lines that detect the distance to the colliding geometries

  • PrimitivePickSensor — Primitive geometries (such as a cone, sphere, or box) that detect colliding geometries
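Each sensor type pairs with a matching picking-geometry node. This VRML sketch shows typical pairings; the DEF names and coordinates are illustrative, not taken from a shipped example:

```
# Illustrative pairings of pick sensors with picking-geometry nodes.
DEF Point_Sensor PointPickSensor {
  pickingGeometry PointSet {
    coord Coordinate { point [ 0 0 0, 1 0 0 ] }
  }
}
DEF Line_Sensor LinePickSensor {
  pickingGeometry IndexedLineSet {
    coord Coordinate { point [ 0 0 0, 5 0 0 ] }
    coordIndex [ 0 1 -1 ]
  }
}
DEF Box_Sensor PrimitivePickSensor {
  pickingGeometry Box { size 1 1 1 }
}
```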

To add a collision detection sensor, use these general steps. For an example that reflects this workflow, see Collision Detection Using Line Sensor.

  1. In the 3D World Editor tree structure pane, select the children node of the Transform node to which you want to add a pick sensor.

  2. To create the picking geometry to use with the sensor, add a geometry node. Select Nodes > Add > Geometry and select a geometry appropriate to the type of pick sensor (for example, Point Set).

  3. Add a pick sensor node by selecting Nodes > Add > Pick Sensor Node.

  4. In the sensor node, right-click the pickingGeometry property and select USE. Specify the geometry node that you created for the sensor.

  5. Also in the sensor node, right-click the pickingTarget property and select USE. Specify the target objects for which you want the sensor to detect collisions.

    Instead of specifying the picking geometry with a USE, you can define the picking geometry directly. However, the directly defined geometry is invisible.

  6. Optionally, change default property values or specify other values for sensor properties. For information about the intersectionType, see Sensor Collisions with Multiple Object Pick Targets. For descriptions of output properties that you can access with a VR Source block, see Use Collision Detection Data in Models.
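The steps above produce a node structure along these lines. This is a sketch only: the DEF names (Robot, Sensor_Lines, Robot_Sensor, Walls) are hypothetical, and the Walls node referenced by USE would be defined elsewhere in the scene:

```
DEF Robot Transform {
  children [
    Shape {                              # robot body geometry
      appearance Appearance { material Material { } }
      geometry Box { size 1 0.5 1 }
    }
    DEF Sensor_Lines IndexedLineSet {    # step 2: picking geometry
      coord Coordinate { point [ 0 0 0, 2 0 0 ] }
      coordIndex [ 0 1 -1 ]
    }
    DEF Robot_Sensor LinePickSensor {    # step 3: pick sensor node
      pickingGeometry USE Sensor_Lines   # step 4: reuse the geometry with USE
      pickTarget USE Walls               # step 5: objects to test against
    }
  ]
}
```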

Here is an example of the key nodes for defining a collision detection sensor for the robot in the vrcollisions virtual world:

  • The Robot_Body node has the Line_Set node as one of its children. The Line_Set node defines the picking geometry for the sensor.

  • The Collision_Sensor defines the collision detection sensor for the robot. The sensor node pickingGeometry specifies to use the Line_Set node as the picking geometry and the Walls_Obstacles node as the targets for collision detection.

Sensor Collisions with Multiple Object Pick Targets

To control how a pick sensor behaves when it collides with a pick target geometry that consists of multiple objects, use the intersectionType property. Possible values are:

  • GEOMETRY – The sensor collides with the union of the individual bounding boxes of all objects defined in the pickTarget field. In general, this setting produces more exact results.

  • BOUNDS – (Default) The sensor collides with one large bounding box constructed around all objects defined in the pickTarget field.

In the vrcollisions example, the LinePickSensor has its intersectionType field set to GEOMETRY. With this setting, a sensor that is inside the colliding geometry (the room walls) does not collide with the union of the walls; a collision occurs only when the sensor rays touch one of the walls. If intersectionType is set to BOUNDS, collision detection works only for a sensor that approaches the room from the outside, because the whole room is wrapped in one large bounding box that interacts with the sensor.
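In VRML terms, the choice looks like this sketch; the DEF names are hypothetical, and the comments restate the two behaviors described above:

```
DEF Maze_Sensor LinePickSensor {
  intersectionType "GEOMETRY"   # test against each target's own bounding box (more exact)
  # intersectionType "BOUNDS"   # default: one large box around all targets together
  pickingGeometry USE Sensor_Lines
  pickTarget USE Walls_Obstacles
}
```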

Make Picking Geometry Transparent

You can make the picking geometry used for a pick sensor invisible in the virtual world. For the picking geometry, in its Material node, set the Transparency property to 1. For example, in the Collision Detection Using Line Sensor virtual world, for the Collision_Sensor picking geometry node (Line_Set), in the Material node, change the Transparency property to 1.
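You can also set the transparency from MATLAB, provided the Material node has a DEF name that vrnode can reach. This is a sketch: SensorMaterial is a hypothetical DEF name, not one defined in the shipped example:

```matlab
w = vrworld('vrcollisions');
open(w);
% 'SensorMaterial' is a hypothetical DEF name for the picking geometry's Material node.
mat = vrnode(w, 'SensorMaterial');
mat.transparency = 1;   % 1 = fully transparent, 0 = fully opaque
```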

Avoid Impending Collisions

To avoid an impending collision (before the collision actually occurs), you can use the pickedRange output property for a LinePickSensor. As part of the line set picking geometry, define one or more long lines that reflect your desired amount of advance notice of an impending collision. You can make those lines transparent. Then create logic based on the pickedRange value.
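The logic can be sketched in MATLAB as follows. The threshold value is illustrative, and the assumption that rays with no hit report a nonpositive reading is ours; check the actual pickedRange values your sensor produces before relying on it:

```matlab
w = vrworld('vrcollisions');
open(w);
sensor = vrnode(w, 'Collision_Sensor');
warnDist = 0.5;                 % hypothetical advance-notice distance
ranges = sensor.pickedRange;    % range readings along the sensor lines
if any(ranges > 0 & ranges < warnDist)
    % An obstacle is within warnDist of the sensor: steer or slow down here.
end
```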

Use Collision Detection Data in Models

The isActive output property of a sensor becomes TRUE when a collision occurs. To associate a model with the virtual reality scene, you can use a VR Source block to read the sensor isActive property and the current position of the object for which the sensor is defined. You can use a VR Sink block to define the behavior of the virtual world object, such as its position, rotation, or color.

For example, the VR Source block in the top left of the Collision Detection Using Line Sensor Simulink model gets data from the associated virtual world.

In the model, select the VR Source block, and then in the Simulink 3D Animation Viewer, select Simulation > Block parameters. This image shows some of the key selected properties.

For the LinePickSensor, you can select these output properties for a VR Source block:

  • enabled – Enables node operation.


    The enabled property is the only property that you can select with a VR Sink block.

  • isActive – Indicates when the intersecting object is picked by the picking geometry.

  • pickedPoint – Displays the points on the surface of the underlying PickGeometry that are picked (in local coordinate system).

  • pickedRange – Indicates range readings from the picking. For details, see Avoid Impending Collisions.

For a PointPickSensor, you can select the enabled, isActive, and pickedPoint outputs. For the PrimitivePickSensor, you can select the enabled and isActive outputs.

The Robot Control subsystem block includes the logic to change the color and position of the robot.

Based on the Robot Control subsystem output, the VR Sink block updates the virtual world to reflect the color and position of the robot.


Consider adjusting the sample time of the blocks to increase the precision of collision detection.

Use Collision Detection in MATLAB

You can use collision detection in a virtual world that you define in MATLAB. This example is based on the vrcollisions virtual world. It does not use a Simulink model.

  1. Open and view the vrcollisions virtual world.

    w = vrworld('vrcollisions');
    fig = view(w, '-internal');
  2. Get the collision sensor and robot nodes of the virtual world.

    col = vrnode(w,'Collision_Sensor')
    rob = vrnode(w,'Robot')
    color = vrnode(w,'Robot_color')
  3. Move the robot, based on collision detection (when the isActive property is TRUE). At the default position, no collision is detected.

    for ii = 1:30
        % Move the robot.
        rob.translation = rob.translation + [0.05 0 0];
        % If a collision is detected, change the robot color to red.
        if col.isActive
            color.diffuseColor = [1 0 0];
        end
        pause(0.2);   % give the viewer time to refresh and the sensor time to update
    end

Use Collision Detection Data in Virtual Worlds

You can use collision detection to manipulate virtual world objects, independently of a Simulink model or a virtual world object in MATLAB.

The Differential Wheeled Robot in a Maze virtual world defines two green IndexedLineSet pick sensors (Sensor1 and Sensor2) for the purple robot (Robot node).

The VRML code includes ROUTE statements for each of the pick sensors.

The ROUTE statements use logic defined in a Script node called ChangeColor.
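The wiring can be sketched as follows. The event names, the Script body, and the Robot_Color target node are illustrative, not the exact code from the example world:

```
DEF ChangeColor Script {
  eventIn  SFBool  leftActive
  eventIn  SFBool  rightActive
  eventOut SFColor color_changed
  url "vrmlscript:
    function leftActive(active)  { if (active) color_changed = new SFColor(1, 0, 0); }
    function rightActive(active) { if (active) color_changed = new SFColor(1, 0, 0); }"
}
ROUTE Sensor1.isActive TO ChangeColor.leftActive
ROUTE Sensor2.isActive TO ChangeColor.rightActive
ROUTE ChangeColor.color_changed TO Robot_Color.set_diffuseColor
```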
