SoftBank Robotics documentation

Sample 1: a first dance application

Here is a typical, simple, well-designed dance application.


Content outline: Pepper dances to “Date” by “I’m fresh! You’re pretty!”.


Guided Tour

Let’s discover how it works.

The main behavior of this application contains 2 boxes: Init & Reset and Dance Timeline.

Init & Reset


This box ensures that Pepper:

  • is in the right state before starting the dance,
  • is set back to its initial state after the dance is finished.

How it works

This is the first box triggered, because it is directly linked to the onStart input of the Application.

Double-click the Init & Reset box to open it.

Let’s read the 2 main methods.

The onInput_onStart method is executed when the application is started, to:

  • Deactivate ALAutonomousMoves and ALBasicAwareness.

  • Go to the Stand posture.

  • Depending on the success or failure of the goToPosture call, stimulate either the success or the failure output.

    If successful, the Dance Timeline box is started; otherwise, the application quits.

    def onInput_onStart(self):
        try:
            self.autonomousMoves.setBackgroundStrategy("none")  # proxy name assumed from the box's onLoad
        except Exception as exc:
            self.log("Unable to disable autonomous moves: %s" % str(exc))
        try:
            self.awareness.stopAwareness()  # ALBasicAwareness proxy, name assumed
        except Exception as exc:
            self.log("Unable to stop awareness: %s" % str(exc))
        result = self.postureProxy.goToPosture("Stand", 0.8)
        if result: self.success()
        else: self.failure()

The onUnload method of the box is executed when the application quits, to:

  • Reactivate ALAutonomousMoves and ALBasicAwareness.

    def onUnload(self):
        try:
            self.autonomousMoves.setBackgroundStrategy("backToNeutral")  # proxy name assumed from the box's onLoad
        except Exception as exc:
            self.log("Unable to enable breathing: %s" % str(exc))
        try:
            self.awareness.startAwareness()  # ALBasicAwareness proxy, name assumed
        except Exception as exc:
            self.log("Unable to start awareness: %s" % str(exc))

Dance Timeline


The Dance Timeline box mixes:

  • body movements,
  • music from a sound file,
  • an image displayed on Pepper’s tablet.

How it works

This box is triggered when the Init & Reset box ends successfully, thanks to its onStart input linked to the success output of the Init & Reset box.

Double-click the Dance Timeline box to open it.


Let’s read the 3 main layers.

  • Motion: The motion keyframes of the dance animation, which represent the position of the robot and its body parts at a certain time.
  • music behavior layer: A Play Sound box. It plays an attached sound file.
  • tablet behavior layer: A Show Image box. It displays an image on the tablet.
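Outside Choregraphe, the two behavior layers map onto plain NAOqi calls: ALAudioPlayer.playFile plays a sound file, and ALTabletService.showImage displays a picture that the tablet fetches over HTTP from the robot's head (reachable from the tablet at the internal address 198.18.0.1). The sketch below is a minimal illustration of that mapping, not the sample's actual code; the application id and file names are made up.

```python
def tablet_image_url(app_id, image_name):
    """Build the URL the tablet uses to fetch an image shipped with an app.

    Resources installed with an application are served by the robot's head
    under /apps/<app-id>/; the tablet reaches the head at 198.18.0.1.
    """
    return "http://198.18.0.1/apps/%s/%s" % (app_id, image_name)


# On a real Pepper (hypothetical app id, file names, and path):
#
#   from naoqi import ALProxy
#   tablet = ALProxy("ALTabletService", "<robot-ip>", 9559)
#   audio = ALProxy("ALAudioPlayer", "<robot-ip>", 9559)
#   tablet.showImage(tablet_image_url("date-dance", "cover.png"))
#   audio.playFile("/home/nao/music.ogg")  # hypothetical path on the robot
```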

Try it!

Try the application.

Step  Action

  1. In the Robot applications panel, click the Package and install the current project on the robot button.

  2. Make sure Autonomous Life is on.

     If not, click the Turn autonomous life on button.

  3. Launch the application.


Through the dialog

  • Make sure Pepper’s language is set to English.

  • Say one of the Trigger sentences:

    “date dance”, “Pepper’s date dance”, “dance I’m fresh you’re pretty”, “Fresh and pretty dance”.

    You can also use the Application title by saying:

    “start Date Dance”.

Through the tablet

  • In the App Launcher, press the application’s icon.

You may also try the behavior only, by clicking the Play button.
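A behavior installed as part of a package can also be started from a script through ALBehaviorManager.startBehavior, which takes the full “<package-id>/<behavior-name>” name. A minimal sketch, assuming Choregraphe's usual default behavior name (“behavior_1”) and a made-up package id:

```python
def behavior_full_name(package_id, behavior_name="behavior_1"):
    """Full name as ALBehaviorManager expects: '<package-id>/<behavior-name>'.

    'behavior_1' is Choregraphe's usual default; both names here are
    assumptions, not taken from the sample itself.
    """
    return "%s/%s" % (package_id, behavior_name)


# On a real robot:
#
#   from naoqi import ALProxy
#   manager = ALProxy("ALBehaviorManager", "<robot-ip>", 9559)
#   manager.startBehavior(behavior_full_name("date-dance"))
```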

Note that this example only works correctly on a real Pepper, since ALTabletService is not present on a virtual robot.
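One way to keep such an application from failing on a virtual robot is to check whether ALTabletService is running before using it, and simply skip the tablet steps when it is absent. A minimal sketch, assuming the list of running service names would come from ALServiceManager (or a qi session) on a real robot:

```python
def tablet_available(service_names):
    """Return True when ALTabletService is among the running services.

    On a real robot, service_names would come from ALServiceManager or a
    qi session; here it is just a list of strings so the check is local.
    """
    return "ALTabletService" in service_names


# Skip the Show Image step instead of raising on a virtual robot:
#
#   if tablet_available(names):
#       tablet.showImage(url)
```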

Make it yours!

Edit the part of the content you want to customize: let’s say the music.

Want to package it?

Step  Action

  1. Customize its properties.

     You can keep most of the properties as they are, but the following ones must be adapted:

       • Icon
       • Application title
       • Application ID
       • Application description
       • Trigger sentences
       • Loading responses

  2. Package it.