Aldebaran documentation: What's new in NAOqi 2.4.3

ALBasicAwareness

NAOqi Interaction engines


What it does

ALBasicAwareness is a simple way to make the robot establish and keep eye contact with people.

How it works

ALBasicAwareness is an autonomous ability, enabled by default when ALAutonomousLife is in the solitary or interactive state. For further details, see: Autonomous Abilities.

The ALBasicAwareness module enables the robot to be aware of the stimuli coming from its surrounding environment. For further details, see: Types of stimulus.

The robot does not actively search for stimuli, but when it receives one (with its associated position), it processes it by looking at the origin of the stimulus and checking whether a human is there.

  • If so, it tracks the human: the robot is now engaged with the user (the engaged person).
  • Otherwise, it goes back to its previous occupation:
    • if the robot was already tracking somebody before the stimulus was detected, it resumes tracking;
    • else, its head goes back to the standard position.
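The decision flow above can be sketched as a small behavioural model (illustrative pseudologic only, not the actual ALBasicAwareness implementation; `human_at` stands for a hypothetical perception check):

```python
# Behavioural model of the stimulus-processing flow described above.

def process_stimulus(stimulus_position, human_at, previously_tracked):
    """Decide what the robot does after looking at a stimulus origin.

    human_at(pos) -> person or None is a hypothetical perception check.
    previously_tracked is the person tracked before the stimulus, if any.
    """
    person = human_at(stimulus_position)
    if person is not None:
        # A human is there: track it; the robot is now engaged.
        return ("track", person)
    if previously_tracked is not None:
        # No human found: resume tracking the previous person.
        return ("resume_tracking", previously_tracked)
    # Nobody to track: the head goes back to the standard position.
    return ("head_to_standard_position", None)
```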

Types of stimulus

The detection of each type of stimulus can be enabled or disabled independently.

It is also possible to trigger stimuli manually (using ALBasicAwarenessProxy::triggerStimulus).
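As a sketch, this could be driven from the Python SDK along these lines. ALBasicAwareness::setStimulusDetectionEnabled and ALBasicAwareness::triggerStimulus are the API calls; the stimulus names shown, ROBOT_IP and the helper function are illustrative assumptions:

```python
# Sketch: toggling stimulus detection and raising a stimulus manually.
# setStimulusDetectionEnabled / triggerStimulus are ALBasicAwareness API
# calls; the stimulus names, ROBOT_IP and this helper are assumptions.

def configure_stimuli(awareness, enabled=(), disabled=()):
    """Toggle detection for each given stimulus type by name."""
    for name in enabled:
        awareness.setStimulusDetectionEnabled(name, True)
    for name in disabled:
        awareness.setStimulusDetectionEnabled(name, False)

if __name__ == "__main__":
    from naoqi import ALProxy  # NAOqi Python SDK
    awareness = ALProxy("ALBasicAwareness", "ROBOT_IP", 9559)
    configure_stimuli(awareness, enabled=("People", "Sound"),
                      disabled=("Movement",))
    # A stimulus can also be triggered by hand, passing a 3D position:
    awareness.triggerStimulus([1.0, 0.0, 0.8])
```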

Engagement Modes

To allow a wider range of behaviors, ALBasicAwareness provides three engagement modes that specify how “focused” the robot is on the engaged person.

  • “Unengaged”: (Default mode) when the robot is engaged with a user, it can be distracted by any stimulus, and engage with another person.
  • “FullyEngaged”: as soon as the robot is engaged with a person, it stops listening to stimuli and stays engaged with the same person. If it loses the engaged person, it will listen to stimuli again and may engage with a different person.
  • “SemiEngaged”: when the robot is engaged with a person, it keeps listening to the stimuli, and if it gets a stimulus, it will look in its direction, but it will always go back to the person it is engaged with. If it loses the person, it will listen to stimuli again and may engage with a different person.
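The three modes' reactions to a stimulus received while a person is already engaged can be summarised as a behavioural model (pure-Python sketch, not the actual implementation; the action names are illustrative):

```python
# Behavioural model of the three engagement modes described above.

def on_stimulus(mode, engaged_person, new_person):
    """Return (action, engaged_person_after) for an incoming stimulus."""
    if engaged_person is None:
        # Not engaged yet: every mode investigates and may engage.
        return ("look_and_engage", new_person)
    if mode == "FullyEngaged":
        # Stimuli are not even listened to while engaged.
        return ("ignore", engaged_person)
    if mode == "SemiEngaged":
        # Glance at the stimulus, then come back to the engaged person.
        return ("glance_then_return", engaged_person)
    # "Unengaged" (default): the robot can be distracted and switch.
    return ("look_and_engage", new_person)
```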

Tracking Modes

  • “Head”: the tracking only uses the head
  • “BodyRotation”: the tracking uses the head and the rotation of the body
  • “WholeBody”: the tracking uses the whole body, but doesn’t make it rotate
  • “MoveContextually”: the tracking uses the head and autonomously performs small moves such as approaching the tracked person, stepping backward, rotating, etc.
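Both modes are selected with a single string. A minimal configuration sketch using the Python SDK (ALBasicAwareness::setEngagementMode and ALBasicAwareness::setTrackingMode are the API calls; ROBOT_IP and the helper name are assumptions):

```python
# Sketch: selecting the engagement and tracking modes.
# setEngagementMode / setTrackingMode are ALBasicAwareness API calls;
# ROBOT_IP and this helper are assumptions for illustration.

def set_awareness_modes(awareness, engagement="SemiEngaged",
                        tracking="BodyRotation"):
    """Configure how focused the robot stays and how it tracks."""
    # "Unengaged", "FullyEngaged" or "SemiEngaged"
    awareness.setEngagementMode(engagement)
    # "Head", "BodyRotation", "WholeBody" or "MoveContextually"
    awareness.setTrackingMode(tracking)

if __name__ == "__main__":
    from naoqi import ALProxy  # NAOqi Python SDK
    aw = ALProxy("ALBasicAwareness", "ROBOT_IP", 9559)
    set_awareness_modes(aw, engagement="FullyEngaged", tracking="Head")
```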

Pausing and Resuming

ALBasicAwareness can be paused, which means it stops making the robot move until it is resumed. When resumed, if ALBasicAwareness was tracking someone before being paused, it tries to retrieve the last tracked person and resume tracking. These pauses can be triggered:

  • automatically: when the activity uses the head motors in any way (e.g. when launching an animation, or running a function that moves the head joints such as ALMotionProxy::angleInterpolationWithSpeed). ALBasicAwareness is then considered of lower priority, and paused. It will be automatically resumed when the head motors are freed.
  • manually: through ALBasicAwarenessProxy::pauseAwareness and ALBasicAwarenessProxy::resumeAwareness. If the application needs to freeze the robot's head for some time (e.g. when taking a picture), manually pausing is preferable to stopping ALBasicAwareness, which loses the currently tracked person.

Of course this is only valid when ALBasicAwareness is enabled.
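A sketch of the manual pattern (ALBasicAwareness::pauseAwareness and ALBasicAwareness::resumeAwareness are the API calls; the helper, ROBOT_IP and the sleep standing in for the head-freezing activity are assumptions):

```python
import time

# Sketch: freeze the head (e.g. for a photo) without losing the engaged
# person. pauseAwareness / resumeAwareness are ALBasicAwareness API
# calls; this helper is an assumption for illustration.

def with_awareness_paused(awareness, action):
    """Run `action` while ALBasicAwareness is paused, then resume it."""
    awareness.pauseAwareness()
    try:
        return action()
    finally:
        # On resume, ALBasicAwareness tries to re-acquire the last
        # tracked person, so the engagement is not lost.
        awareness.resumeAwareness()

if __name__ == "__main__":
    from naoqi import ALProxy  # NAOqi Python SDK
    awareness = ALProxy("ALBasicAwareness", "ROBOT_IP", 9559)
    # e.g. hold still for two seconds while taking a picture:
    with_awareness_paused(awareness, lambda: time.sleep(2.0))
```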

Performances and Limitations

The ALBasicAwareness module is a “meta-module”: it builds on other NAOqi perception and motion modules to provide its features.

Thus, parallel calls to these underlying modules from another program or from a Choregraphe box while ALBasicAwareness is running should be made with care.