NAOqi Vision - Overview


This module is experimental. To improve its performance, the detection algorithm is likely to change in the next version, which may result in API changes.

What it does

The ALMovementDetection extractor detects movement around the robot using its camera.

How it works

Movement is detected by computing the optical flow of feature points between two successive images returned by the camera.
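To make the idea concrete, here is a minimal sketch of the underlying computation: a tracked point's angular velocity is its angular displacement between two successive frames divided by the frame interval. The pixel-to-angle conversion below assumes a simple linear model based on the camera's field of view; the function names, field-of-view values, and sign conventions are illustrative assumptions, not the robot's actual calibration or the module's implementation.

```python
def pixel_to_angle(px, py, width, height, hfov, vfov):
    """Map pixel coordinates to angles (radians) from the image center,
    assuming a linear field-of-view model (illustrative only)."""
    x = (px / float(width) - 0.5) * hfov
    y = (py / float(height) - 0.5) * vfov
    return (x, y)

def angular_velocity(p0, p1, dt):
    """Velocity of a point between two frames taken dt seconds apart,
    in radians/s, from its angular positions p0 and p1."""
    return ((p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt)
```

A point that moves from the image center by 0.1 rad over a 0.1 s frame interval would thus be reported with a 1 rad/s angular velocity.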

ALMemory key

Each time movement is detected, the ALMemory key MovementDetection/MovementInfo is updated and an ALMemory event, MovementDetection/MovementDetected, is raised.

This key is organized as follows:

MovementInfo = [ GeneralInfo ]

GeneralInfo contains information about the moving points in the image. It has the following structure:

GeneralInfo = [ MeanPosition, MeanVelocity, PositionsList, VelocitiesList ]

  • MeanPosition = [x,y] contains the angular coordinates (in radians) of the center of gravity of the cluster.
  • MeanVelocity = [vx,vy] is the mean velocity of the cluster, computed from the velocities of all the moving points. It is an angular velocity, expressed in radians/s.
  • PositionsList = [[x_0,y_0], ..., [x_n,y_n]] lists the coordinates of all the moving points; each pair [x,y] contains the angular coordinates of a point (in radians).
  • VelocitiesList = [[vx_0,vy_0], ..., [vx_n,vy_n]] lists the velocities of all the moving points; each pair [vx,vy] contains the angular velocity of a point (in radians/s).

Thus, the final structure of the ALMemory key MovementDetection/MovementInfo is:

MovementInfo = [[ [x,y], [vx,vy], [[x_0,y_0],...,[x_n,y_n]], [[vx_0,vy_0],...,[vx_n,vy_n]] ]]
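The structure above can be unpacked in a few lines. The helper below is a hypothetical convenience, not part of the NAOqi API; on a robot, the value would be obtained with ALMemory's getData("MovementDetection/MovementInfo").

```python
def unpack_movement_info(movement_info):
    """Split each GeneralInfo entry of a MovementInfo value into its
    four documented fields."""
    results = []
    for general_info in movement_info:
        mean_position, mean_velocity, positions, velocities = general_info
        results.append({
            "mean_position": mean_position,  # [x, y] in radians
            "mean_velocity": mean_velocity,  # [vx, vy] in radians/s
            "positions": positions,          # [[x_0, y_0], ..., [x_n, y_n]]
            "velocities": velocities,        # [[vx_0, vy_0], ..., [vx_n, vy_n]]
        })
    return results
```

Iterating over the outer list keeps the code correct if the key ever reports more than one cluster.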


Getting and setting parameters

ALMovementDetectionProxy::setSensitivity() adjusts the sensitivity of the detection.

Sensitivity is a float value between 0 and 1:

  • 0 means only large movements are detected,
  • 1 means even small movements are detected.

The default value is 0.8.

The current value of the sensitivity can be accessed with the function ALMovementDetectionProxy::getSensitivity().
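A minimal sketch of adjusting the sensitivity over a proxy: setSensitivity() and getSensitivity() are the documented calls, but the clamping helper, the "ROBOT_IP" placeholder, and the port number are assumptions (this requires the naoqi Python SDK and a reachable robot).

```python
def clamp_sensitivity(value):
    """Keep a requested sensitivity inside the documented [0, 1] range
    (hypothetical helper, not part of the NAOqi API)."""
    return max(0.0, min(1.0, value))

def set_detection_sensitivity(proxy, value):
    """Set the sensitivity on an ALMovementDetection proxy and read back
    the value actually in effect."""
    proxy.setSensitivity(clamp_sensitivity(value))
    return proxy.getSensitivity()

if __name__ == "__main__":
    from naoqi import ALProxy  # NAOqi Python SDK
    movement = ALProxy("ALMovementDetection", "ROBOT_IP", 9559)
    print(set_detection_sensitivity(movement, 0.8))
```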

Performances and limitations

This module does not know whether the robot is moving (while walking, for example). Make sure ALMovementDetection events are ignored when the robot is moving. See the Movement Tracker box in Choregraphe for an example of how to deal with head movements.
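One way to honor this limitation is to gate the event callback on a "robot is moving" flag maintained by your motion code. The gating helper below is a hypothetical sketch; on a robot, the callback would be wired up with ALMemory's subscribeToEvent, which requires a broker and a registered module.

```python
def should_handle(event_value, robot_moving):
    """Handle a MovementDetection/MovementDetected event only when the
    robot is stationary, since the robot's own motion produces optical
    flow that looks like external movement."""
    return bool(event_value) and not robot_moving

# On a robot, roughly (assumed wiring, simplified):
#   memory = ALProxy("ALMemory", "ROBOT_IP", 9559)
#   memory.subscribeToEvent("MovementDetection/MovementDetected",
#                           "MyModule", "onMovementDetected")
```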

Because the detection algorithm is CPU-intensive, this module cannot run at the same time as many other modules. To address this limitation, another algorithm will likely be used in the next version, which may imply some API changes.