
Interactive features - Getting started

This page goes over the robot’s main interactive features, and explains how, and whether, to include them in your application. It assumes you are familiar with Choregraphe basics. If not, see: Your first steps in Choregraphe.

Voice

When to use it? Almost all applications need some voice content, and only very specific cases (like dances) can do without.

A few words

If you simply want the robot to say a few words, use the Speech > Creation > Animated Say box.
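Outside Choregraphe, the same behavior is available by calling the ALAnimatedSpeech service directly. Below is a minimal Python sketch, assuming the qi Python SDK and a robot reachable at pepper.local (replace the address with your robot's):

    import qi

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    # ALAnimatedSpeech speaks the text and plays matching gestures,
    # just like the Animated Say box.
    animated_speech = session.service("ALAnimatedSpeech")
    animated_speech.say("Hello! Nice to meet you.")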

Vocal interaction

However, if you want to design an interaction with the robot, we recommend that you create Dialog boxes, using QiChat, a language for describing what the robot will listen for and how he will answer.

For further details, see: Using Dialog topics in Choregraphe.
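As a rough illustration, a QiChat topic can also be loaded programmatically through the ALDialog service. The sketch below assumes the qi Python SDK; the topic content and the subscriber name are example values:

    import qi
    import time

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    dialog = session.service("ALDialog")
    dialog.setLanguage("English")

    # A tiny QiChat topic: the robot listens for "hello" and answers.
    topic_content = (u"topic: ~greeting()\n"
                     u"language: enu\n"
                     u"u:(hello) Hello! Nice to meet you.\n")

    topic_name = dialog.loadTopicContent(topic_content)
    dialog.activateTopic(topic_name)
    dialog.subscribe("greeting_example")

    try:
        time.sleep(30)  # leave the dialog running long enough to try it
    finally:
        # Clean up so the topic does not linger after the script ends.
        dialog.unsubscribe("greeting_example")
        dialog.deactivateTopic(topic_name)
        dialog.unloadTopic(topic_name)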

Sounds & Music

When to use it? Most applications should use a few sounds to mark relevant moments.

When it makes sense and the robot is not speaking, adding sounds can help the user understand what is going on.

You can use the Play Sound box. For further details see: the Playing music tutorial.

Advanced users can use ALAudioPlayer.
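For example, a minimal sketch with ALAudioPlayer (the file path is only an example; point it at a sound file that actually exists on your robot):

    import qi

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    # playFile blocks until the sound has finished playing.
    audio_player = session.service("ALAudioPlayer")
    audio_player.playFile("/home/nao/my_sounds/notification.wav")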

The Tablet

When to use it? You don’t need to, but it will often make your application easier to understand. Be careful not to overuse it.

Pepper’s tablet can help the user understand what the robot is doing, and it can also be a way of getting user input when voice recognition fails.

You can package tablet content (images, webpages, videos) for your robot in an html folder in your application, and access it with the tablet boxes of the box library.
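Advanced users can do the same without boxes by calling ALTabletService directly. In the sketch below, the package id "my-app" is a placeholder, and 198.18.0.1 is the address at which the tablet reaches content served by the robot’s head:

    import qi

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    tablet = session.service("ALTabletService")

    # Show a page packaged in the html folder of your application;
    # "my-app" is a placeholder for your package id.
    tablet.showWebview("http://198.18.0.1/apps/my-app/index.html")

    # ... and give the screen back when you are done:
    # tablet.hideWebview()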

To get user input through the tablet, a couple of solutions exist:

  • You can use the TabletTouched box from the box library (see the sketch after this list)
  • Or you can display a webpage with some buttons on it, as seen in [TBD template]
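As a sketch of the first option, the same touch information can be read at the service level through ALTabletService’s onTouchDown signal (assuming the qi Python SDK; the 30-second wait is just to keep the script alive):

    import qi
    import time

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    tablet = session.service("ALTabletService")

    def on_touch_down(x, y):
        # Coordinates are in tablet screen pixels.
        print("Tablet touched at (%s, %s)" % (x, y))

    # onTouchDown fires each time the user presses the tablet screen.
    connection_id = tablet.onTouchDown.connect(on_touch_down)

    time.sleep(30)  # touch the tablet during this time
    tablet.onTouchDown.disconnect(connection_id)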

Advanced users can try displaying complex webpages that directly call robot modules using the JavaScript SDK.

Animations

When to use it? The robot needs to be alive during your application, unless there is a good reason not to; however, your application may not require dedicated animations.

Automatic

Thanks to ALAutonomousLife, ALAutonomousMoves and ALBasicAwareness, which are activated by default, the robot has a breathing motion, looks around and focuses on people, reacts to sounds, etc.

In addition, the robot usually talks using Animated Speech, which will make him move his arms as he speaks.
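For illustration, both services can be reached directly: the sketch below checks whether basic awareness is running and asks Animated Speech for contextual body language (assuming the qi Python SDK; "contextual" is one of the documented bodyLanguageMode values, along with "disabled" and "random"):

    import qi

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    # ALBasicAwareness drives the "looks around and focuses on people" part.
    awareness = session.service("ALBasicAwareness")
    print("Basic awareness enabled: %s" % awareness.isEnabled())

    # Animated Speech adds gestures to plain text; bodyLanguageMode controls
    # how freely the robot gestures while speaking.
    animated_speech = session.service("ALAnimatedSpeech")
    animated_speech.say("I move while I talk.", {"bodyLanguageMode": "contextual"})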

Customized

These are enough for the robot to keep looking alive during your application. If you need specific animations, you have two major choices:

  • Use animations from the existing boxes from the Animation library.
  • Make your own animations. For further details, see: Mastering movements.

You will also need to understand the basics of Stiffness and Self-collision avoidance.

Advanced users can make the robot move with the NAOqi motion services.
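For instance, a short sketch using ALAnimationPlayer and ALMotion (the animation path is one of the standard Pepper animations; adjust it to what is installed on your robot, and make sure the robot is standing):

    import qi

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    # ALAnimationPlayer runs installed animations by name.
    animation_player = session.service("ALAnimationPlayer")
    animation_player.run("animations/Stand/Gestures/Hey_1")

    # For joint-level control, ALMotion exposes lower-level commands,
    # e.g. turning the head half a radian in one second:
    motion = session.service("ALMotion")
    motion.angleInterpolation("HeadYaw", 0.5, 1.0, True)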

Touching

When to use it? Only if it makes sense in your application.

Pepper has tactile sensors; you can use the Choregraphe boxes in Sensing > Touch to detect when one of these sensors has been touched. For further details, see: ALTouch.
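Beyond the boxes, touch events can also be watched through ALMemory. The sketch below subscribes to the TouchChanged event (assuming the qi Python SDK; the 30-second wait is just to keep the script alive):

    import qi
    import time

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    memory = session.service("ALMemory")

    def on_touch(value):
        # value is a list of [bodyPartName, isTouched] pairs.
        for pair in value:
            if pair[1]:
                print("Touched: %s" % pair[0])

    # TouchChanged is raised whenever a tactile sensor changes state.
    subscriber = memory.subscriber("TouchChanged")
    connection_id = subscriber.signal.connect(on_touch)

    time.sleep(30)  # touch the robot during this time
    subscriber.signal.disconnect(connection_id)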

Advanced users can also use ALMotion to detect when Pepper’s limbs have been moved by the user (by comparing their present position with their expected position).
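A rough sketch of that comparison with ALMotion.getAngles, read once without and once with the sensors (the joint names and the 0.1 rad threshold are arbitrary example values):

    import qi

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    motion = session.service("ALMotion")

    names = ["LShoulderPitch", "RShoulderPitch"]
    commanded = motion.getAngles(names, False)  # angles the robot tries to hold
    sensed = motion.getAngles(names, True)      # angles the joints actually measure

    # A large gap between the two suggests the limb is being pushed.
    for name, cmd, real in zip(names, commanded, sensed):
        if abs(cmd - real) > 0.1:
            print("%s seems to be moved by the user" % name)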

Moving around

When to use it? Only if it makes sense in your application.

Pepper is also able to move from one point to another and avoid obstacles, using dedicated Choregraphe boxes from the Movement > Navigation library.

Advanced users can use ALNavigation.
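For example, a minimal sketch calling ALNavigation.navigateTo (coordinates are in meters in the robot’s own frame; make sure the robot has free space around it before trying this):

    import qi

    # Connect to the robot; replace "pepper.local" with your robot's address.
    session = qi.Session()
    session.connect("tcp://pepper.local:9559")

    # navigateTo moves to a point (x, y) relative to the robot while
    # avoiding the obstacles it detects along the way.
    navigation = session.service("ALNavigation")
    navigation.navigateTo(1.0, 0.0)  # one meter straight ahead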