Why urbiscript?

urbiscript is an interpreted dynamic programming language with specific constructs to handle event-based and concurrent programming.

It is well-suited for programming robotic behaviors, which often require running and synchronizing multiple concurrent tasks:

// Blocks until the event is triggered
waituntil(events.MovementDetection.MovementInfo?);
// Speak while standing up
tts.say("Finally something is happening!") &
robot.StandUp();
// When both tasks are finished, go on

// Create a tagged task that looks around
var lookAround = Tag.new();
detach({ // run it in the background
  lookAround: robot.headYaw = 0 sin:2s ampli:45deg
});

// Monitor for an event in the background
at(events.DarknessDetected?)
{
  tts.say("Nap time.");
  lookAround.stop(); // stop lookAround task
  robot.posture = robot.Posture.Sit;
};

A generic urbiscript tutorial is available at http://www.gostai.com/downloads/urbi/doc/urbiscript-user-manual.html

And the reference manual is here: http://www.gostai.com/downloads/urbi/doc/urbi-sdk-reference-manual.html

Installing and running urbiscript for nao

Getting urbi

Simply download the urbi-for-nao package that matches your computer’s operating system and architecture, and decompress it anywhere.

This will give you the urbi client GUI urbi-lab, and an urbi interpreter that runs locally, but can control a remote NAOqi.

You can also have urbi run directly on the robot, by using the urbi application that you can install from the store.

Run

If you chose to run urbi on the robot using the store application, connect to the urbi application web page at http://ROBOT_NAME/apps/urbi and click the "start urbi" button.

Alternatively, to run a local urbi that connects to a remote NAOqi, use the wrapper script urbi4nao.sh or urbi4nao.bat depending on your system:

$ bin/urbi4nao.sh ROBOT_NAME

replacing ROBOT_NAME with the name or IP address of your robot. This will start an interactive urbiscript interpreter, connect it to your robot, and set up the nao-specific objects.

If you have rlwrap installed, you can prefix it to the command for a better command-line experience:

$ rlwrap bin/urbi4nao.sh ROBOT_NAME

For reference, the script starts the following command:

bin/urbi-launch -s urbi4qimessaging -- -i --port 54000 nao.u ROBOT_NAME

Interact

The simplest way to start sending urbi commands is to use urbi-lab, provided in the urbi package as bin/GostaiLab.

You can find a quick introduction to urbi-lab here: Using urbi-lab with nao

You can also type urbiscript commands in urbi-launch standard input, or connect to the interpreter with a simple rlwrap netcat localhost 54000.

Compatibility

If you need to run urbiscript code written for previous versions of NAOqi, you must load the naocompat.u file which provides the compatibility layer:

load("naocompat.u");
CameraFormat.YCbCr; // this no longer exists in the new API
[00000001] 2

General object layout

The initialization script initializes a tree of objects representing the robot’s sensors and actuators, conforming to http://www.gostai.com/downloads/urbi-sdk/3/doc/urbi-sdk.htmldir/gostai-standard-robotics-api.html .

Each object is available through a long name, through the robot hierarchy:

robot.body.head.eye[left].led[0].val = 0;

...and through a short name, in Global:

robot.body.head.eye[left].led[0].compactName;
[00249869] "eyeLedL0"
Global.eyeLedL0.val = 1;

You can use the commands robot.dump(); and robot.flatDump(); to display the whole hierarchy.

Generic service interface

The NAOqi multi-language API consists of services, each with its own set of methods. This API is of course available in urbiscript.

A proxy to a given service, for instance ALTextToSpeech, can be obtained from the robot.proxy object. Each proxy is created on demand and cached. The methods defined by the associated service can be called on this object:

robot.proxy.ALTextToSpeech.say("Hello");
// Or:
var tts = robot.proxy.ALTextToSpeech;
tts.say("Hello again");

You can use the standard urbiscript function localSlotNames() to list the available methods:

tts.localSlotNames().sort();
[00000001] ["getVolume", "say", "setLanguage"]

ALMemory interface

ALMemory can be accessed using the NAOqi API:

var mem = robot.proxy.ALMemory;
mem.insertData("foo", "bar");
mem.getData("foo");
[00000001] "bar"

Specialized interfaces are also provided.

ALMemory events

All the ALMemory events are present in robot.events as urbiscript events. Event names that contain a slash are put in sub-objects:

mem.raiseEvent("test/event", 0); // create the event
t: at(robot.events.test.event?) echo("trigger");
// reminder: the previous line will automatically create a tag t in toplevel interactive mode only.
mem.raiseEvent("test/event", 1);
[00000001] *** trigger
robot.events.test.event!(1); // trigger the event
mem.getData("test/event");
[00000002] 1
t.stop();

As you can see, newly created events are inserted in robot.events, and the urbiscript Event object works both for emitting and receiving the event.

Mapping ALMemory data

Any ALMemory data can be mapped to an urbiscript slot using robot.memory.mapData:

setSlot("x", robot.memory.mapData("testval"));
x = 1 | mem.getData("testval");
[00000001] 1
mem.insertData("testval", 2) | x;
[00000001] 2

Motion interface

In addition to the NAOqi robot.proxy.ALMotion standard service proxy, specialized APIs are provided to move the robot from urbiscript.

We recommend you familiarize yourself with the ALMotion API before reading this section.

Posture

The robot.posture slot can be used to query the current robot posture, or set a target posture. The enums robot.Posture and robot.PostureFamily are available:

robot.posture;
[00000001] "StandUp"
robot.Posture.localSlotNames();
[00060550] ["StandZero", "StandInit", "Stand", "Sit", "Crouch"]
// You can use either the string or the enum value:
robot.posture = "Sit";
robot.posture = robot.Posture.Crouch;

Walk

The high level robot.go and robot.stop functions are provided:

// Arguments: x,y,theta in meters and radians.
robot.go(100, 0, 0) | echo("finished"),
sleep(1s);
"obviously the robot is still walking at this point";
[00000001]  "obviously the robot is still walking at this point"
robot.stop();
[00000002] *** finished

The (x, y, theta) walk speed can also be controlled using robot.walk.speed, which accepts:

  • A vector on its val slot
  • floats on its x, y and yaw slots
robot.walk.speed.val = <0.1, 0, 0>; // forward 10cm/s
robot.walk.speed.yaw = 0.2; // also turn at 0.2 rad/s, so spiral movement.
robot.stop(); // stop moving
robot.walk.speed.y = 0 sin:5s ampli:0.5; // simple goalkeeper behavior :)

Cartesian control

All robot chains (which you can enumerate with ALMotion.getBodyNames("Chains")) can have their end point controlled in position through a pos object in the associated component of robot.body:

  • val accepts a vector of six elements (x, y, z, yaw, pitch, roll)
  • x, y, z, yaw, pitch, roll accept a float

All chains are controlled in the torso frame, except body.pos itself which is expressed in the nao frame:

robot.body; // will enumerate subdevices and components
[00000001] <head, arm, leg>
robot.posture = robot.Posture.Crouch;
robot.body.pos.val = robot.body.pos.val + <0, 0, 0.05, 0, 0, 0> time:1s;
robot.body.pos.pitch = 30deg time:2s; // bow

Let’s have one arm free (0 stiffness) so the user can move it, and mirror it with the other arm:

robot.body.arm[left].stiffness = 0;
robot.body.arm[right].stiffness = 1;
t: every(0.1s) robot.body.arm[right].pos.val = robot.body.arm[left].pos.val * <1,-1, 1, -1, 1, -1>,
// play with it!
t.stop();

The pos object is also provided on all joints in robot.body, but it is read-only except for the head, arms and legs.
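For instance, reading the current Cartesian position of a joint (a sketch; the returned values depend on the current posture):

robot.body.head.yaw.pos.val; // six-element vector (x, y, z, yaw, pitch, roll), torso frame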

Whole-body motion

Convenient wrappers on top of the whole-body motion API are present in the robot.body.wb object:

var wb = robot.body.wb;
wb.enable = true;
wb.gotoBalance(wb.Legs.Left, 1s);
wb.balance = wb.Legs.Left;
wb.footState(wb.FootState.Fixed, wb.FootState.Plane);
wb.armLEnabled = true;
wb.armL = <0.15, 0.1, 0.5>;

In case the whole-body motion subsystem gets stuck in an unfeasible motion, you can disable it, return to a standard posture, and re-enable it:

wb.enable = false;
robot.posture = robot.Posture.Crouch;
wb.enable = true;

Writing to the following slots will also use whole-body motion, automatically enabling and disabling it when needed:

robot.body.arm[left].wbPos;
robot.body.arm[right].wbPos;
robot.body.head.wbPos;
robot.body.leg[left].wbPos;
robot.body.leg[right].wbPos;
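For instance, a sketch of moving the left arm's end point through whole-body motion; the target coordinates below are illustrative, assuming the same six-element (x, y, z, yaw, pitch, roll) convention as pos:

// whole-body motion is enabled for the move and disabled afterwards
robot.body.arm[left].wbPos = <0.15, 0.1, 0.3, 0, 0, 0> time:2s;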

Individual joint control

Objects for individual joints are instantiated in robot.body. Each has two gettable/settable slots, val and stiffness:

robot.body.head.yaw.stiffness = 1;
timeout(5s) robot.body.head.yaw.val = 0 sin:2s ampli:45deg;

If you use joint control intensively, you can optimize it by calling nao.Joint.batchMode(). This will try to factor multiple joint write operations into a single NAOqi call, and use regular polling to fetch all joint positions instead of fetching them on demand. You can tweak System.period to control the joint polling period.
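A sketch of what this could look like; the 20ms period is an illustrative value, and the head.pitch joint object is assumed to exist by analogy with head.yaw:

nao.Joint.batchMode();
System.period = 20ms; // joint positions are now polled every 20ms
// Concurrent writes may be factored into a single NAOqi call:
robot.body.head.yaw.val = 0.3 &
robot.body.head.pitch.val = 0.1;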

Joint chain control

Additionally, each joint chain can be read and written as a vector of the individual joints composing the chain, using the chain slot of arm, head and leg:

// Mirror one arm on the other.
t: every|(20ms) robot.body.arm[right].chain = <1,-1,-1,-1,-1,1> * robot.body.arm[left].chain,
t.stop(); // stop mirroring

Video interface

An object representing the robot camera is available as robot.camera. The video stream parameters can be controlled by the framerate, resolution and colorspace slots. Use the load slot to start and stop streaming, and the val slot to access the last received image:

robot.camera.load = 0; // stop before changing parameters
robot.camera.resolution = 1; // 0 is max resolution, 1 divides by two, 2 by four,...
robot.camera.framerate = 5;
robot.camera.colorspace = robot.camera.Colorspace.RGB; // see ALVideoDevice for recommendations
robot.camera.load = 1;
waituntil(robot.camera.val->changed?); // wait for first image
echo("%s x %s" % [robot.camera.width, robot.camera.height]);
[00000001] *** 640 x 480
assert(robot.camera.val.data.size == 640*480*3);

Sensors

Sensor values can be read using the associated ALMemory key given in their documentation. For convenience, some sensors are also mapped into the component tree:

Long name                  Short name      Description
body.gyro[x]               gyroX           Gyroscope, X axis
body.gyro[y]               gyroY           Gyroscope, Y axis
body.accel[x]              accelX          Accelerometer, X axis
body.accel[y]              accelY          Accelerometer, Y axis
body.accel[z]              accelZ          Accelerometer, Z axis
body.leg[left].foot.touch  footTouchL      Left foot touch sensor
body.leg[right].foot.touch footTouchR      Right foot touch sensor
body.head.touch[front]     headTouchF      Head front touch sensor
body.head.touch[middle]    headTouchM      Head middle touch sensor
body.head.touch[back]      headTouchB      Head back touch sensor
body.sonar[left]           sonarL          Left sonar
body.sonar[right]          sonarR          Right sonar
body.battery.current       battery.current Current battery draw
body.temperature           temperature     Body temperature

Each of those objects has a val slot with the current sensor value.
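For example (a sketch; it assumes the touch sensors report 0 or 1 in their val slot):

gyroX.val; // short names are available in Global
t: at (robot.body.head.touch[front].val == 1)
  echo("head touched"),
t.stop();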

Leds

In addition to the standard ALLeds API, leds are mapped into the component tree at the appropriate location:

On the left side:

  • body.head.ear[left][0] to body.head.ear[left][9] (earLedL0 to earLedL9)
  • body.head.eye[left][0] to body.head.eye[left][7] (eyeLedL0 to eyeLedL7)
  • body.leg[left].foot.led (footLedL)

Right-side leds are named in a similar manner.

  • body.led is the chest led.

Each led can be controlled in multiple ways:

  • Writing a value in the r, g or b slot in the range [0,1] to set one RGB component.
  • Writing the val slot to set led intensity.
  • Writing the rgb slot with a 24-bit value (8 bits per color component).
// Switch off all left ear leds.
robot.body.head.ear[left].val = 0;
// Cool rotation effect
var vals = [0, 0.25, 0.5, 0.75, 1, 0.75, 0.5, 0.25, 0, 0];
var offset = 0;
t: every|(200ms)
{
  offset++;
  for (var i: 10)
    robot.body.head.ear[left].get(i).val = vals[(i+offset)%10]
},

Audio interface

Speaker

The speaker exposes two useful slots: deviceSampleRate, to get or set the sample rate used by the device, and val, which accepts binary data and plays it:

// Suppose you have a my.wav sound file that is stereo 16000 Hz
// Get its content in the snd variable
var snd = File.new("my.wav").content|;
// Notice the use of '|;' to avoid displaying the wav on the console
robot.speaker.deviceSampleRate = 16000;
// Remove the wav header
snd.data = snd.data[44, -1]|;
// Play it on robot speaker
robot.speaker.val = snd|;

Binary data can be sent in chunks of any size, from tens of milliseconds to multiple seconds.

Microphone

The microphone is handled by the robot.micro object:

  • Set parameters using the rate, channels and deviceSampleRate slots.
  • Start the audio stream by setting load to 1.
  • Monitor val for incoming raw audio data in the requested format.
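Putting it together, a sketch of capturing audio; the parameter values are illustrative, and the data.size expression assumes val carries the same binary payload type as the speaker:

robot.micro.deviceSampleRate = 16000;
robot.micro.load = 1; // start streaming
t: at (robot.micro.val->changed?)
  echo("got %s bytes" % robot.micro.val.data.size),
// ...
t.stop();
robot.micro.load = 0; // stop streaming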

Writing services in urbiscript

Accessing the session

The middleware Session API is implemented in urbiscript using the QiSession and QiObject objects:

var s = QiSession.new();
s.connect("tcp://localhost:9559");
var tts = s.service("ALTextToSpeech");
s.listen("tcp://0.0.0.0:0"); // start listening

The session used by the initialization is stored as robot.session.
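For instance, you can reuse this shared session to obtain a service proxy without opening a new one:

var tts = robot.session.service("ALTextToSpeech");
tts.say("Hello from the shared session");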

Creating a service

The steps involved in creating a service accessible to other NAOqi clients are:

  • Instantiate a new QiObject.
  • Use advertiseMethod(name, sig, sigreturn) to declare your methods. Since urbiscript is dynamically typed, you must give the function signature explicitly, as it cannot be deduced.
  • Use session.registerService(name, obj).
var session = robot.session;
var pinger = QiObject.new();
var pinger.increment = 1;
function pinger.ping(v) { v + increment };
pinger.advertiseMethod("ping", "i", "i");
session.registerService("ping", pinger);

// test it
var client = QiSession.new();
// connect a new session to the same service directory
client.connect(session.url());
var pingerProxy = client.service("ping");
pingerProxy.ping(12);
[00000001] 13
pinger.increment = 2;
pingerProxy.ping(12);
[00000001] 14