
ALMood Tutorials



Attention

In this example, you will discover ALMood’s attention signal.

Basic Usage

Run Attention_Example.crg in Choregraphe and try the following:

  • Look the robot in the eyes. The robot should say “hey” as long as you look at him. We say the human is fullyEngaged.
  • Look away from the robot, using only your eyes: for example, gaze up or to the right while still facing the robot. The robot will stop saying “hey”. We say the human is semiEngaged.
  • Turn your head completely away from the robot. The robot will stop saying “hey”. We say the human is unengaged.

To accomplish these behaviors, we simply write the following in QiChat:

u:(e:ALMood.attention $ALMood.attention == "fullyEngaged") Hey
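
You can also read the same attention level outside of QiChat: the e:ALMood.attention syntax refers to an ALMemory event. Here is a minimal Python sketch, under the assumption that the last raised value of this event can be read back with ALMemory.getData:

#! /usr/bin/env python
# -*- encoding: UTF-8 -*-

"""Sketch: poll the attention level used by the QiChat rule above.

Assumes the ALMood.attention event is raised through ALMemory, so its
last value can be read back with getData."""

import qi
import time

session = qi.Session()
session.connect("tcp://127.0.0.1:9559")  # replace with your robot's address
memory = session.service("ALMemory")

for _ in range(10):
    # Expected values: "fullyEngaged", "semiEngaged" or "unengaged"
    print(memory.getData("ALMood.attention"))
    time.sleep(1)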

Advanced Usage

ALMood becomes a particularly powerful tool when combined with ALExpressionWatcher. In the following example, the robot will say “ahem” if you do not pay attention to him for a while.

Run Attention_Example.crg in Choregraphe and try the following:

  • Look away from the robot for 10s. The robot will say “ahem”.

Accomplishing this takes three steps.

  1. Add the following expression to an Expression Watcher box (available in the Choregraphe Box Library):
(ALMood.attention == "semiEngaged" || ALMood.attention == "unengaged") ~ 10

This means that the human was either semiEngaged or unengaged for 10 seconds; in other words, the human was not giving the robot their full attention. Note that the Expression Watcher “report mode” is set to Rising Edge, so the event fires once, at the moment the expression becomes true.

  2. Connect the Expression Watcher box to a Raise Event box (available in the Choregraphe Box Library), with the following event:
"AttentionEvent/NotFullyEngaged10"
  3. Create the following rule in QiChat:
u:(e:AttentionEvent/NotFullyEngaged10) Ahem
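
The event raised in step 2 is an ordinary ALMemory event, so you can also react to it outside of QiChat. Here is a minimal Python sketch using the standard ALMemory subscriber pattern (the event name must match the one configured in the Raise Event box):

#! /usr/bin/env python
# -*- encoding: UTF-8 -*-

"""Sketch: react from Python to the event raised by the Raise Event box."""

import qi
import time

session = qi.Session()
session.connect("tcp://127.0.0.1:9559")  # replace with your robot's address
memory = session.service("ALMemory")

def on_not_engaged(value):
    # Called each time AttentionEvent/NotFullyEngaged10 is raised
    print("The human has not been fully engaged for 10 seconds")

# Keep references to the subscriber and the connection alive, otherwise
# the callback will no longer be called.
subscriber = memory.subscriber("AttentionEvent/NotFullyEngaged10")
connection_id = subscriber.signal.connect(on_not_engaged)

time.sleep(60)  # keep the script running to receive events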

Try it!

Similarly, you can check whether the human has been looking at the robot for a long time. Open the same Attention_Example.crg in Choregraphe and modify it as follows.

  1. Write the following ALExpressionWatcher expression in the Expression Watcher box, with the “report mode” set to Rising Edge:
(ALMood.attention == "fullyEngaged") ~ 5
  2. Connect the Expression Watcher box to a Raise Event box:
"AttentionEvent/FullyEngaged5"
  3. Create the following rule in QiChat:
u:(e:AttentionEvent/FullyEngaged5) Why are you staring at me?

Empathy with Body Language

In this example, you will discover how to use ALMood’s valence signal with ALExpressionWatcher.

Run Empathy_Example.crg in Choregraphe and try the following:

Look at the robot, then:

  • Smile
  • Smile for 2s
  • Smile for 5s
  • Smile for 7s
  • Smile for 10s

The robot will give non-verbal, positive feedback (like animated “emoticons”), increasing in intensity as time passes.

  • Frown for 3s

The robot will give non-verbal, empathetic feedback.

How it Works - Basic Usage

In the above example, the robot plays a happy animation when he first sees you smile.

In QiChat, simply write the following:

# This rule is triggered when the robot first sees you are positive, or when you change from neutral/negative to positive
u:(e:ALMood.valenceChanged $ALMood.valenceChanged == "positive") ^run(empathy_example/animations/Interested_01)
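
The same transition can be caught outside of QiChat. The e:ALMood.valenceChanged syntax above names an ALMemory event, so a subscriber sketch along these lines should work; the payload is assumed to be the new valence label:

#! /usr/bin/env python
# -*- encoding: UTF-8 -*-

"""Sketch: react from Python when the perceived valence changes.

Assumes ALMood.valenceChanged is an ALMemory event carrying the new
valence label, as the QiChat rule above suggests."""

import qi
import time

session = qi.Session()
session.connect("tcp://127.0.0.1:9559")  # replace with your robot's address
memory = session.service("ALMemory")

def on_valence_changed(value):
    # Assumed payload: the new valence label, e.g. "positive"
    if value == "positive":
        print("The human just became positive")

subscriber = memory.subscriber("ALMood.valenceChanged")
connection_id = subscriber.signal.connect(on_valence_changed)

time.sleep(60)  # keep the script running to receive events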

How it Works - Advanced Usage

The continuous empathetic feedback (from 2 to 10s) is accomplished with three steps:

  1. Add the following expressions to Expression Watcher boxes, respectively (available in the Choregraphe Box Library):
(ALMood.valence == "positive") ~ 2
(ALMood.valence == "positive") ~ 5
(ALMood.valence == "positive") ~ 7
(ALMood.valence == "positive") ~ 10
(ALMood.valence == "negative") ~ 3

These check that the human was positive for 2, 5, 7, or 10 seconds, or negative for 3 seconds.

  2. Connect the Expression Watcher boxes to Raise Event boxes, respectively (available in the Choregraphe Box Library), with the following events:
"MoodEvent/Positive2"
"MoodEvent/Positive5"
"MoodEvent/Positive7"
"MoodEvent/Positive10"
"MoodEvent/Negative3"
  3. Add the following in QiChat:
u:(e:MoodEvent/Positive2)
    ^runSound(Aldebaran/enu_ono_laugh_02)

u:(e:MoodEvent/Positive5)
    ^run(empathy_example/animations/Content_01)

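# ^start launches the animation without waiting, so the sound can play
# in parallel; ^wait then blocks until the animation has finished.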
u:(e:MoodEvent/Positive7)
    ^start(empathy_example/animations/Content_01)
    ^startSound(Aldebaran/enu_ono_mmh_pleasure_02)
    ^wait(empathy_example/animations/Content_01)

u:(e:MoodEvent/Positive10)
    ^start(empathy_example/animations/Loving_01)
    ^startSound(Aldebaran/enu_ono_shy_04)
    ^wait(empathy_example/animations/Loving_01)

# This rule is triggered when the human has been negative for 3s.
u:(e:MoodEvent/Negative3)
    ^run(empathy_example/animations/Shocked_01)

Empathy in a Dialog

In this example, you will discover how to make an empathetic dialog by combining all of the techniques we’ve learned so far:

  • ALMood.valence
  • ALMood.attention
  • Expression Watcher
  • QiChat

Combining Dialog Input and Valence

Run Empathetic_Listener_Example.crg in Choregraphe and try the following:

  • When the robot says “Hey! How’s it going?”, reply “Fine” while smiling.

The robot should reply positively, based on the following QiChat:

u1:(~fine "e:ALMood.valence $ALMood.valence == "positive"") ~joyful Great! $onStopped=1

Run the behavior again:

  • When the robot says “Hey! How’s it going?”, reply “Fine” while frowning.

The robot should detect that you are not fine, based on your emotional state:

u1:(~fine "e:ALMood.valence $ALMood.valence == "negative"") ~low Hmm, you don't seem ok.

Combining Valence and Expression Watcher

Run the above behavior again:

  • When the robot says “Hey! How’s it going?”, look downwards sadly for at least 2 seconds.

The robot should detect that you are sad, make an empathetic sound, and offer to talk about it.

To accomplish this, we added an Expression Watcher box that raises the event MoodEvent/Negative2 based on this expression:

(ALMood.valence == "negative") ~ 2

And we wrote the following QiChat rule:

u1:(e:MoodEvent/Negative2)
    ^startSound(Aldebaran/enu_ono_oh_sad) ^waitSound(Aldebaran/enu_ono_oh_sad) ~low What's wrong? Do you want to talk about it?

Combining Valence and Attention

  • When the robot says “Hey! How’s it going?”, look downwards sadly for at least 2 seconds.

The robot should detect that you are sad, make an empathetic sound, and offer to talk about it. Say yes, and the robot will begin empathetic listening.

  • Say anything to the robot and look at him. The robot will ask questions.
u3:(e:Dialog/NotUnderstood "e:ALMood.attention $ALMood.attention=="fullyEngaged"")
["~low Oh really? ~neutral"
" ~low Why is that? ~neutral "
" ~low How do you feel about it? ~neutral " ] ^stayInScope
  • Say anything sadly to the robot and look away, e.g. upwards. In this case, the robot will not interrupt you with words, because it appears you are thinking. Instead, it will give non-verbal feedback that matches your mood, in this case sad, empathetic sounds.
u3:(e:Dialog/NotUnderstood "e:ALMood.attention $ALMood.attention <> "fullyEngaged"" "e:ALMood.valence $ALMood.valence=="negative"")
    ~mm_low ^stayInScope
  • Say anything happily to the robot and look away, e.g. upwards. In this case, the robot will not interrupt you with words, because it appears you are thinking. Instead, it will give non-verbal feedback that matches your mood, in this case happy sounds. (A Python sketch of this attention-plus-valence test follows this list.)
u3:(e:Dialog/NotUnderstood "e:ALMood.attention $ALMood.attention <> "fullyEngaged"" "e:ALMood.valence $ALMood.valence=="positive"")
    ~mm_high ^stayInScope
  • You may exit the dialog at any time by saying “What should I do?” and thanking the robot, or by saying “That’s enough.”
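
For reference, the attention-plus-valence test that drives the rules above can be sketched in Python, under the same assumption as before that the last values of the ALMood.attention and ALMood.valence events can be read back through ALMemory:

#! /usr/bin/env python
# -*- encoding: UTF-8 -*-

"""Sketch: reproduce the attention + valence test used by the rules above.

Assumes the last values of the ALMood.attention and ALMood.valence
events can be read back through ALMemory."""

import qi

session = qi.Session()
session.connect("tcp://127.0.0.1:9559")  # replace with your robot's address
memory = session.service("ALMemory")

attention = memory.getData("ALMood.attention")
valence = memory.getData("ALMood.valence")

if attention != "fullyEngaged" and valence == "negative":
    print("Human looks away and seems negative: play a sad sound")
elif attention != "fullyEngaged" and valence == "positive":
    print("Human looks away and seems positive: play a happy sound")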

Python

Get current valence

almood_tutorial_getvalence.py

#! /usr/bin/env python
# -*- encoding: UTF-8 -*-

"""Example: Use currentPersonState Method"""

import qi
import argparse
import sys


def main(session):
    """
    This example uses the currentPersonState method.
    """
    # Get the service ALMood.
    moodService = session.service("ALMood")
    print(moodService.currentPersonState()["valence"]["value"])


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--ip", type=str, default="127.0.0.1",
                        help="Robot IP address. On robot or Local Naoqi: use '127.0.0.1'.")
    parser.add_argument("--port", type=int, default=9559,
                        help="Naoqi port number")

    args = parser.parse_args()
    session = qi.Session()
    try:
        session.connect("tcp://" + args.ip + ":" + str(args.port))
    except RuntimeError:
        print ("Can't connect to Naoqi at ip \"" + args.ip + "\" on port " + str(args.port) +".\n"
               "Please check your script arguments. Run with -h option for help.")
        sys.exit(1)
    main(session)

To execute this script, type:

python almood_tutorial_getvalence.py --ip <robot IP address> --port 9559

Get emotional reaction after a provocation

almood_tutorial_reaction.py

#! /usr/bin/env python
# -*- encoding: UTF-8 -*-

"""Example: Use getEmotionalReaction Method"""

import qi
import argparse
import sys


def main(session):
    """
    This example uses the getEmotionalReaction method.
    """
    # Get the services ALMood and ALTextToSpeech.
    moodService = session.service("ALMood")
    tts = session.service("ALTextToSpeech")

    # The robot tries to provoke an emotional reaction by complimenting you
    tts.say("You look great today!")
    # The robot then analyzes your reaction during the next 3 seconds
    print(moodService.getEmotionalReaction())

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--ip", type=str, default="127.0.0.1",
                        help="Robot IP address. On robot or Local Naoqi: use '127.0.0.1'.")
    parser.add_argument("--port", type=int, default=9559,
                        help="Naoqi port number")

    args = parser.parse_args()
    session = qi.Session()
    try:
        session.connect("tcp://" + args.ip + ":" + str(args.port))
    except RuntimeError:
        print ("Can't connect to Naoqi at ip \"" + args.ip + "\" on port " + str(args.port) +".\n"
               "Please check your script arguments. Run with -h option for help.")
        sys.exit(1)
    main(session)

To execute this script, type:

python almood_tutorial_reaction.py --ip <robot IP address> --port 9559