
Written by

PROF. KISUNG SEO


Edited by

ALDEBARAN ROBOTICS
& NT RESEARCH, INC

Using NAO:
Introduction to Interactive Humanoid Robots
Words From the Author

Robots are both replacing and assisting people in various fields, including manufacturing, hazardous jobs, and service sectors, and they will have an even wider range of applications in the near future. Humanoid robots attract the most attention compared to other robots because 1) they look similar to people, so they seem friendlier and are recognized as a better fit for helping (or replacing) humans in certain tasks, 2) much like humans, they can walk on two legs and perform jobs using both hands, and 3) they mimic the most evolutionarily refined human form and function.

Furthermore, humanoid robots are getting a lot of attention from educators and researchers because they pose challenging problems, including difficulties in walking and general motion control, efficiency issues in processing recognition sensor data, and the implementation of intelligence. Progress has been difficult because there are not many commercialized robots that can be used to develop new control and intelligence algorithms and to fully utilize the advanced features of humanoid robots in actual educational and research settings. Fortunately, more humanoid robots are being released, and among them, NAO from Aldebaran Robotics is the world's most widely known humanoid robot used for education and research. In August 2007, it was designated as the official platform for RoboCup (Robot Soccer World Cup), replacing Sony's Aibo (a dog-like robot), and it has been used since the 2008 competition in Suzhou, China.

The humanoid NAO has 25 joints that make walking and general motion control possible. Diverse interactions are possible through wired/wireless network communication, cameras, infrared sensors, microphones, speakers, and LEDs. The software is based on open-source embedded Linux and supports programming languages such as C, C++, URBI, Python, and the .NET Framework. It also provides a graphics-based programming tool called Choregraphe.

This book focuses on Aldebaran's humanoid NAO robot to explain the environment and tools, programming techniques, and basic theory and applications for the educational and research purposes of vocational high schools, universities, and the general public.

This book is largely divided into two parts: Chapters 1-3 for beginners and
Chapters 4-6 for advanced users. Chapters 1-3 introduce Choregraphe and
Python necessary for basic NAO robot usage. Chapters 4-6 handle information
for professional use. I would like to advise anyone just learning about the NAO
robot and people who are unfamiliar with C and Python to become familiar with
the information in Chapters 1-3. Chapters 2 and 4-6 are recommended for anyone
with previous experience in robot programming or anyone who wants to perform
specialized algorithms and control commands.

Chapter 1 introduces the NAO robot and the Monitor program that can be used
to verify NAO’s internal memory and image processing. It will also explain how to
do the initial setup for the system. Because this chapter discusses NAO’s special
features, it would be good for readers who are not quite familiar with NAO.

Chapter 2 will teach you how to use Choregraphe, a graphics-based programming tool, to operate NAO. It explains how to program using Choregraphe's program module, called a Diagram, and how to set NAO's movements in a Timeline. Additionally, it provides a description of how to use box libraries and FTP in Choregraphe.

Chapter 3 will have a short introduction to Choregraphe scripts and Python for
NAOqi. There is a basic description of Python syntax and a discussion about
creating and editing Choregraphe script boxes. This would be a good chapter if you
are already familiar with Python.

Chapter 4 explains the NAOqi framework, which forms the foundation of the NAO robot, and the DCM, which is used for controlling all the devices. It covers the NAOqi framework's special characteristics, including its structure, file structure, and the Broker, and shows how the framework is used to control NAO. It also explores how to load modules into NAO using Linux, C++, and cross-compiling, as well as how timed commands are handled when several commands are received. There is also an introduction to the structures of DCM-controlled devices and how to synchronize them using the DCM's synchronization methods.

Chapter 5, Robot Kinematics, explains NAO's joint structure and provides information for each joint. The Denavit-Hartenberg (DH) method is used to explain the forward kinematics calculation. In addition, Python will be used to create an actual forward kinematics calculation program. This chapter also describes inverse kinematics calculations and uses Python to implement the inverse kinematics calculation program for NAO's right arm. You will need quite a bit of mathematical and robotics knowledge to understand the contents of Chapter 5.

Chapter 6, Comprehensive Exercises, uses the information presented thus far to look at different methods and examples for implementing NAO's applications. Advanced Choregraphe features and expansion methods are used here, and you will be able to practice using the Timeline Editor. In addition, landmark recognition is used to create a path-finding program, and the multiplication-table example helps you learn techniques for Python and the NAOqi API. Last but not least, image recognition is used to classify objects, and inverse kinematics and NAOqi usage are explained.

It was considerably difficult to write this book because of the disadvantage of dealing with such a specific model of humanoid robot. There wasn't much material about it, and what was available was quite disorganized. I was also conflicted about how to address the variety of readers, given the content and general difficulty of the subject matter. I sincerely hope that this book will serve as a good introduction to humanoid robots.

I would like to express my sincere gratitude to the people at NT Research Inc. who gave me both material and emotional support.

I would especially like to thank Jae-young Jang, Byung-yong Hyun, Su-hwan Hyun, Oh-sung Kwon, Jae-min Lee, and Young-kyun Kim of the Intelligent Systems Laboratory for conducting the series of experiments with the NAO robot to help me verify the information in this book. Although every effort has been put into gathering information for this book, I am sure that there is still room for improvement, and I acknowledge that this is wholly due to the fact that I still have a lot to learn about this vast and amazing field.

April 2011
Ki-sung Suh

How to Use
This Curriculum

You are allowed to reproduce the content of this book and to share it with your classroom only.

Aldebaran Robotics does not warrant the accuracy of the provided content, which shall be used at your own risk and under your control. Aldebaran Robotics disclaims all liability related to the use as well as the content. All rights not specifically granted herein are reserved to Aldebaran Robotics. Aldebaran Robotics and/or its licensor shall retain all rights, title, interest, and ownership in and to the book and its content.

This curriculum was written for version 1.8.16 of Choregraphe, our programming software. However, most of the features are compatible with newer versions. The screenshots of the software included in this curriculum may differ depending on the version of Choregraphe you have.

Table of Contents

>> Words from the Author
>> How to Use This Curriculum

>> 1 - Introduction 8
NAO is… 10
Preparation 17
Connecting NAO 18
Monitor (Former name: Telepathe) 22

>> 2 - Choregraphe 26
Introduction and Interface 28
Choregraphe-NAO Connection 33
Box 35
Event and Time Centered Programming 39
Box Library 46

>> 3 - Python 92
Before Getting Started 94
Overview 95
Data Types and Operators 98
Control Statements 105
Functions 109
Class 112
Module 120
Comprehensive Practice Through Choregraphe Script Modification 124
References 133

>> 4 - NAOqi & DCM 134
NAOqi Overview 136
Structural Overview 138
Using NAOqi 146
Cross Compiling for Loading Modules (Using C++, Linux) 153
DCM Introduction 163
Upper Level Architecture 164
Low Level Architecture 168
Preference Files and Sub Preference Files 172
DCM Bound Methods 173
DCM Synchronization Methods 181

>> 5 - NAO Kinematics 182
Overview 184
Transformation Matrix 185
NAO Structure 186
Kinematics 193
Inverse Kinematics 205

>> 6 - Comprehensive Examples 216
Choregraphe Application 218
Motion Control – Timeline Editor 235
Getting Directions Using Landmarks – Using Choregraphe 246
Memorizing the Multiplication Table – Python and NAOqi Application 251
Combining Recognition and Movement – Using Images for Object Recognition and Grabbing Motion 261

1 - Introduction

LEARNING

Chapter 1 introduces the NAO robot and the Monitor program that can be used to verify NAO's internal memory and image processing. It will also explain how to do the initial setup for the system.

Because this chapter discusses NAO's special features, it would be good for readers who are not yet familiar with NAO.

Content

1.1 NAO is… 10
  1.1.1 Common Features 11
  1.1.2 Configuration 12
  1.1.3 Joint Configuration 14
  1.1.4 Vision System 15
  1.1.5 Audio 16
  1.1.6 Software 16

1.2 Preparation 17
  1.2.1 Package Configuration 17
  1.2.2 Requirements 17
  1.2.3 Software Installation 17

1.3 Connecting NAO 18
  1.3.1 Wired Connection Using Ethernet 18
  1.3.2 Wireless Connection Using Wi-Fi 18
  1.3.3 Using Web Service for Default Settings 20
  1.3.4 File Transfer Using FTP 21

1.4 Monitor (Former name: Telepathe) 22

1.1 NAO is…

Image 1.1 - Humanoid robot NAO from Aldebaran Robotics

Humanoid robot NAOs from Aldebaran Robotics are medium-sized, open-architecture robots (Image 1.1). NAOs are used all over the world for educational and research purposes in over 480 universities. Aldebaran Robotics designed the NAO technology to be used in both secondary and higher education programs. Because of the user-friendly programming environment, anyone can use NAO regardless of programming skill level, and high-level functions can be implemented using the open architecture.

Project NAO started in 2005, and NAO was chosen to replace Sony's Aibo (a dog-like robot) as the official platform of RoboCup (Robot Soccer World Cup) in August 2007. It has been used since 2008 at the competition in Suzhou, China. In 2009, 24 teams from all over the world entered the RoboCup competition using a total of 100 NAOs.

NAO can communicate with a PC via both cable and wireless networks. Multiple NAOs can interact with each other using infrared sensors, the wireless network, cameras, microphones, and speakers. User input can be given through the contact sensors, cameras, and microphones. Status output can be delivered to the user through multiple LEDs and the speakers.

The NAO software is based on Gentoo Linux and supports multiple programming languages: C, C++, URBI, Python, and the .NET Framework. Choregraphe can be used for graphics-based programming.

1.1.1 Common Features

NAO is 57.3 cm in height, 27.3 cm wide, and weighs less than 4.3 kg. The body is made from a special plastic material and has a 21.6 V, 2 Ah lithium-ion battery that allows up to 90 minutes of use. There are diverse sensors, such as the 2-axis gyro sensor and ultrasonic sensors, and multimedia functions are implemented through cameras, microphones, and speakers.

The NAO system uses the Gentoo Linux operating system. NAO's overall operation is managed by the NAOqi Framework as the user and system communicate. The DCM (Device Communication Manager) manages the communication between NAO's devices, such as the actuators and sensors.

Open architecture is enthusiastically reflected in NAO's development environment. Software and software development tools are provided for the Windows, Mac OS X, and Linux operating systems.

While version 1.6 supported C, C++, Python, and URBI for programming, version 1.8 adds support for C# and the .NET Framework.

Image 1.2 - NAO H25

1.1.2 Configuration

Image 1.3 - NAO Configuration

Image 1.3 shows NAO's components. There are 25 joints in total, largely divided into the head (2), arms (12), waist (1), and legs (10). The arm and leg joints are left/right symmetrical. There are Hall-effect sensors (32), contact sensors (3), infrared sensors (2), ultrasonic sensors (2), a 2-axis gyro sensor (1), 3-axis acceleration sensors (2), pressure sensors (8), and bumpers (2). There are also cameras (2), microphones (4), and speakers (2) for image and voice processing.

Image 1.4

Image 1.4 shows NAO's device structure. NAO's head contains an embedded system for overall control, and there is an ARM microcontroller in the chest to control the motors and power. The embedded system runs embedded Linux (32-bit x86 ELF), and its hardware is composed of an x86 AMD Geode 500 MHz CPU, 256 MB SDRAM, and flash memory. Ethernet (wired) and Wi-Fi (wireless, IEEE 802.11g) are supported, and the 2011 model is Bluetooth enabled.

1.1.3 Joint Configuration

Image 1.5 uses the roll-pitch-yaw convention to show NAO's joints. HipYawPitch, one of the waist joints, has only one degree of freedom, since the left and right joints are operated by a single actuator. The actuators that operate NAO's joints are categorized into four types depending on motor performance and reduction ratio.

Detailed information for each of the motors can be found in the Advanced-Hardware section. The Hall-effect sensor that measures motor rotation has 12-bit precision, meaning an angular accuracy of about 0.1 degrees (360°/4096 ≈ 0.09°). Chapter 5 will discuss NAO's kinematics.

Image 1.5 - Configuration and names of NAO’s joints

1.1.4 Vision System

Image 1.6 shows NAO's camera locations and angles. There are two cameras attached to NAO; the top camera looks forward while the bottom camera looks down toward the feet. The vision system can be used to implement mark recognition, face recognition, object recognition, image recording, etc. If the C++ programming language is used to develop vision functions, the OpenCV library can be used; the current version 1.8 supports OpenCV version 2.0. With Choregraphe, the cameras can be used for face and landmark recognition. With the Monitor program, you can view the images and videos processed by NAO's cameras and perform basic setup. Section 1.4 will discuss the Monitor program. The following lists the features of the vision system:

Resolution:
- VGA (640 x 480)
- QVGA (320 x 240)
- QQVGA (160 x 120)

Color Space:
- YUV422 (default format of the camera)
- YUV (24 bits)
- Y (8 bits)
- RGB (24 bits)
- BGR (24 bits)
- HSY (24 bits)

Image 1.6 - NAO’s camera configuration
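These formats can also be requested programmatically. As a hedged sketch, the following Python code asks NAOqi's ALVideoDevice module for a single QVGA RGB frame; the robot address is a placeholder, and the numeric resolution and color-space constants are assumptions following the NAOqi conventions of this era, to be verified against your version's documentation.

from naoqi import ALProxy

NAO_IP = "xxx.xxx.xxx.xxx"   # replace with NAO's actual IP address
video = ALProxy("ALVideoDevice", NAO_IP, 9559)

resolution = 1    # assumed kQVGA constant (320 x 240)
color_space = 11  # assumed kRGB constant (RGB, 24 bits)
fps = 15

# Reserve a video stream in the requested format
client = video.subscribe("curriculum_example", resolution, color_space, fps)
try:
    # getImageRemote returns a container: [0] width, [1] height, ..., [6] raw bytes
    frame = video.getImageRemote(client)
    print "Received a %d x %d image (%d bytes)" % (frame[0], frame[1], len(frame[6]))
finally:
    video.unsubscribe(client)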

1.1.5 Audio

Image 1.7 shows the configuration of NAO's microphones. There are four microphones: one in each ear, one in the front, and one in the back of the head. There are two speakers, one attached to each ear. The speakers can be used to play music and to read texts entered by the user. The four microphones can be used for simple voice recordings, and they also provide a function that can recognize the location of a sound. Choregraphe supports recording and playback, sound detection, text reading, etc.

Image 1.7 - NAO’s microphone configuration
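The text-reading function mentioned above can also be driven from a Python script. A minimal sketch, assuming the NAOqi Python SDK is installed and substituting your robot's real IP address:

from naoqi import ALProxy

NAO_IP = "xxx.xxx.xxx.xxx"   # replace with NAO's actual IP address

# Create a proxy to the text-to-speech module and have NAO read a string
tts = ALProxy("ALTextToSpeech", NAO_IP, 9559)
tts.say("Hello, I am NAO.")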

1.1.6 Software
NAO provides Choregraphe, NAOqi, and Monitor as development software. Choregraphe is an easy-to-use
graphics-based development tool even novice users can easily handle. Chapter 2 will provide more details
about Choregraphe.

NAOqi is the programming framework used to program NAO, and it was designed to satisfy the requirements
needed in robotics. Main features include parallelization, resource management, synchronization, and event,
and it allows communication between other modules like motion, audio, and video. Chapter 5 will provide
more details regarding NAOqi.

Monitor is a program that receives feedback from NAO so you can easily verify joint or sensor values.
The feedback data includes image processing data, so if the image processing algorithm is being
implemented, you can verify the image result on the PC.

You can also set the resolution and color. There is a memory viewer for verifying the joint and sensor values
and a camera viewer used for setting up the camera and saving images. Laser sensor can be used for
monitoring if it is used as an external device.

1.2 Preparation

1.2.1 Package Configuration


The following lists NAO’s package configuration:

- NAO Robot
- Battery (Lithium-ion)
- Battery charger (100-240V)
- Power adapter
- Installation CD

1.2.2 Requirements
The following lists the minimum requirements for NAO's development environment.

This book uses the Windows XP 32-bit operating system as its reference. Due to the nature of the robot, there may be constraints on its actions when operating with a wired connection, so a wireless environment is highly recommended.

Hardware:
- 1.5 GHz CPU
- 512 MB RAM
- Certified OpenGL graphics card
- LAN with DHCP
- Wi-Fi card (for wireless)

Software:
- Windows XP 32 bits
- Mac OS X Snow Leopard 10.6
- Linux Ubuntu 32 bits
- Python 2.6
- CMake (C++ programming)
- Visual Studio 2005 or 2008 (C++ programming)

1.2.3 Software Installation

NAO's development environment is largely categorized into either Choregraphe or an SDK for a programming language. You can install them from NAO's installation CD or from the Aldebaran Robotics home page.

When installing Choregraphe on the Windows operating system, both NAOqi and Monitor are installed together. This book uses Choregraphe version 1.8.16.

For text-based programming, you need a toolchain appropriate for each programming language. In order to program using C and C++, you need Visual Studio 2005 (or 2008) or GCC 4.4 (or a more recent version). You also need CMake.

When using Python, version 2.6 is recommended. The installation method for the Software Development Kit is explained in detail in the reference section (Advanced-SDK).

1.3 connecting Nao

As introduced above, NAO can communicate with PCs through either a wired connection using an Ethernet cable or a wireless connection using Wi-Fi. To use the wireless connection, you must set it up first. After connecting the wired/wireless router (which must support DHCP) to the NAO robot and pressing the power button, NAO will announce its own status ("Hello, I'm NAO. My internet address is xxx.xxx.xxx.xxx, my battery is fully charged."). The wired connection takes priority over the wireless connection for the internet address.

NAO also supports an FTP service. The user can use FTP to send files to and receive files from NAO. When using the web service or FTP, a login is needed. The initial ID and password are nao/nao.

1.3.1 Wired Connection Using Ethernet

Image 1.8 shows the NAO robot with a wired connection using the Ethernet cable. NAO automatically sets its IP address using DHCP. When the power button is pressed after NAO is connected, it will say the following: "Hello, I'm NAO. My internet address is xxx.xxx.xxx.xxx, battery is fully charged." Here, xxx.xxx.xxx.xxx is NAO's active IP address.

Image 1.8 - Wired connection using the Ethernet cable

1.3.2 Wireless Connection Using Wi-Fi

In order to use Wi-Fi to connect wirelessly to NAO, you must first set up the network. NAO is web-enabled for simple configuration and status verification. Here, we will only look at the wireless network configuration.

Image 1.9 - NAO’s default web service screen

A - Enter NAO’s IP address into the web browser. Image 1.9 shows a normal default screen.

B - The ID and password needed for login is nao/nao.

C - When Network tab is selected, Connections and Available Network will come up.
In Connections, the IP address of the wired/wireless connection and the Mac address
is shown. In Available Network, the list of networks currently availablwe for use is shown.

D - Select the wireless router you wish to use in Available Network.

E - If Connect is selected, the wireless IP address is set and the network status will change. to “ready”.

Image 1.10 shows processes (3)-(5) from above. NAO can use the wireless IP address without a cabled
connection to communicate with other systems once the wireless network configuration is complete

Image 1.10 - Wireless network setting

1.3.3 Using web service for default settings
As shown in the wireless network configuration process, the web service can be used to set up and verify the basic parts of NAO.

The menu in the web browser is largely divided into About, Network, Settings, and Advanced. Advanced
is divided into NAOqi, Memory, Process, Hardware, Remote Controls, Bluetooth, and Log. The following
explains each menu option:

• About:
Shows basic information regarding NAOqi and the network. The NAOqi information shows the version, status, language, modules, and behaviors. The network information shows the IP addresses of the Ethernet and Wi-Fi connections.

• Network:
Shows NAO’s network connection information and can set the connection.
This function can be used in network configuration.

• Settings:
NAO’s basic settings can be done here. ID and password can be set and NAO’s icon can be changed.
The changed icon shown here is the one shown in Choregraphe’s connection list.
You can also set the language, time zone and the volume. The current version supports Chinese,
English, French, German, Italian, Japanese, Korean, Portuguese and Spanish.

• Advanced:
You can set or verify advanced functions.

- NAOqi: You can see the status of NAOqi’s behavior and can control it using ‘start,’
‘pause,’ and ‘restart.’

- Memory: Search specific variables by name and see the current variable values.

- Process: Shows the list of processes NAO is currently executing. Process information includes
the process no. (Pid), terminal connected to the process (TTY), time, and process name.

- Hardware: Shows NAO's device and joint information, configuration values, and temperatures. Here, a device refers to a board that controls the battery, LEDs, sensors, or joints. Joint information includes the temperature, the joint rotation angle measured by the sensors, and the motor command value. The configuration value refers to NAO's version information, and temperature refers to the temperature of the silicon (silicium) in the head and of the board.

- Bluetooth: Shows the Bluetooth devices connectable with NAO and provides the function
for setting the connection.

- Log: Shows NAO’s system log.

1.3.4 File Transfer Using FTP

FTP can be used for NAO's file transfers.

The file transfer function was added in Choregraphe version 1.8.16. FTP programs like WinSCP can be used with earlier versions. When WinSCP is used, you can access the robot using NAO's IP address, ID, and password.

Image 1.11 - Using WinSCP to access NAO's folder

NAO uses Gentoo Linux, and the folder structure is the same as a standard Linux folder structure. The files related to NAO are stored in "/home/nao." Image 1.11 shows access to the NAO folder. The video shown in Image 1.11 was recorded using Monitor's camera viewer. Chapter 2 will show you how to use FTP in Choregraphe.
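Any standard FTP client will work the same way. As an illustration, the sketch below uses Python's built-in ftplib to log in with the default account and download a recording; the file name test.avi is purely hypothetical.

from ftplib import FTP

ftp = FTP("xxx.xxx.xxx.xxx")   # NAO's IP address
ftp.login("nao", "nao")        # default ID and password
ftp.cwd("/home/nao")           # NAO-related files are stored here

# Download a hypothetical recording from the robot to the PC
with open("test.avi", "wb") as local_file:
    ftp.retrbinary("RETR test.avi", local_file.write)
ftp.quit()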

1.4 Monitor
(former name: Telepathe)

As introduced above, Monitor is a program that shows real-time information about NAO’s camera
and memory. Image 1.12 is Monitor’s default screen.

Image 1.12 - Monitor default screen

In order to receive feedback from NAO using Monitor, your PC and NAO must communicate with
each other. Both Memory Viewer and Camera Viewer set the connection information using Browse
Robots (Image 1.13). This is also how Choregraphe connects with NAO. The connection can be set using
the port number or IP address. After entering NAO’s wired/wireless IP address, press the “Connect to”
button to connect NAO with Monitor.

Image 1.13 - Connection settings of Monitor and NAO

NAO’s camera can be set using Monitor’s Camera Viewer. image 1.14 shows Camera Viewer’s video
configuration screen. You can set the functions for frame rate, resolution, video conversion, mark and
face detection, vision recognition, bottom camera selection, adjustment of color channel value, symmetry
conversion, and etc. The video stream is received from NAO when Camera Viewer’s playback button
is pressed (the frame rate can be, it depends on the network).

image 1.14 - Setting the video in Camera Viewer

Camera Viewer can be used to save images inside the robot and to transmit the saved files to your PC. Image 1.15 shows how the video recorder is used to save and transmit images. After selecting the "Video Recorder" tab, enter the name of the video file to be saved. Afterward, when the record button is pressed, NAO will save the video to its secondary storage. After it is saved, the corresponding file will be sent to your PC. The images are saved in "/home/nao/naoqi/share/naoqi/vision/" or, in the newest versions, "/home/nao/recordings/cameras/". Camera Viewer's video recorder does not itself have a configuration section; the images have the following characteristics:

- Codec Format: MJPEG (24 bits)
- Size: 320 x 240
- Frame Rate: 15 fps

Image 1.15 - Executing the video recorder

Memory Viewer can be used to view the variable values used by NAO's system. Values can be displayed as a time graph, and simultaneous output of multiple variables is also possible. Image 1.16 shows the screen for selecting the variables to observe in Memory Viewer. When using Memory Viewer for the first time, you can create a new configuration file, open an existing configuration file, or choose not to use one.

Image 1.16 - Setting the parameters for Memory Viewer

The graph is activated if you select "Watch" and "Graph" from the variable list and press "Start/Stop graph" located below the graph output screen. Variable values are sampled in 100 ms units by default, and if the Subscription Mode is set to "No subscription," "Refresh all" must be pressed to update the screen. "Every nb ms" updates the screen per specified unit of time; the time used here must be greater than 100 ms.

Image 1.17 - Memory Viewer for the pressed left bumper
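The values that Memory Viewer graphs can also be polled directly from the ALMemory module. A minimal sketch, assuming the NAOqi Python SDK; the key below is a typical joint sensor key, used here only as an illustration:

import time
from naoqi import ALProxy

memory = ALProxy("ALMemory", "xxx.xxx.xxx.xxx", 9559)  # NAO's IP address

# Poll a sensor value every 100 ms, matching Memory Viewer's default unit
KEY = "Device/SubDeviceList/HeadYaw/Position/Sensor/Value"  # illustrative key
for _ in range(50):
    print "%s = %f" % (KEY, memory.getData(KEY))
    time.sleep(0.1)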

2 - Choregraphe

LEARNING

Chapter 2 will teach you how to use Choregraphe, a graphics-based programming tool, to operate NAO. It explains how to program using Choregraphe's program module, called a Diagram, and how to set NAO's movements in a Timeline.

Additionally, it provides a description of how to use box libraries and FTP in Choregraphe.

Content

2.1 Introduction and Interface 28
  2.1.1 Menu 29
  2.1.2 Box Library 30
  2.1.3 Diagramming Space 30
  2.1.4 3D NAO 31
  2.1.5 Predefined Position Library 32
  2.1.6 Video Monitor 32

2.2 Choregraphe-NAO Connection 33
  2.2.1 Connection Settings 33
  2.2.2 Enslaving 34
  2.2.3 File Transfer 34

2.3 Box 35
  2.3.1 Structure 35
  2.3.2 Box Generation 36

2.4 Event and Time Centered Programming 39
  2.4.1 Event-Based Programming 39
  2.4.2 Time-Based Programming 42

2.5 Box Library 46
  2.5.1 LED Library 46
  2.5.2 Sensors Library 49
  2.5.3 Logic Library 53
  2.5.4 Tool Library 60
  2.5.5 Math Library 63
  2.5.6 Motion Library 65
  2.5.7 Walk Library 72
  2.5.8 Audio Library 77
  2.5.9 Video Library 83
  2.5.10 Tracker Library 88
  2.5.11 Communication Library 89
2.1 Introduction
and Interface

Choregraphe is a cross-platform application that implements NAO's actions through graphics-based programming. Unlike text-based programming, graphics-based programming has less focus on grammar, and programming is mostly done using a mouse rather than a keyboard to create code.

Choregraphe runs on Windows, Linux, and Mac OS and provides some of the FTP and Monitor functions introduced in Chapter 1. Implementing NAO's actions in Choregraphe is a task of connecting the elements of movement (boxes) into one group centered around time or events.


Image 2.1 - Choregraphe interface

Image 2.1 shows the Choregraphe interface, which is divided into the four areas listed below. Pose Library and Video Monitor functions are also provided.

(1) Menu Screen
(2) Box Library
(3) Diagramming Space
(4) 3D NAO

2.1.1 Menu

In the Menu, there is a drop-down menu with File, Edit, Connection, Behaviors, View, and Help, and the icon menu shows New Diagram, Open, Save, Previous/Next, Connection, Run, Cancel, Debug View, Loading, and Motor Lock. The following describes each of the functions:

File Menu
- New project: Create a new project.
- Open project: Open a project.
- Open recent project: Open the project that was most recently worked on.
- Save project: Save the project.
- Save project as: Save the project under another name.
- Import: Import a workspace or a version 1.6 project.
- Exit: Exit Choregraphe.

Edit Menu
- Undo: Cancel the latest action.
- Redo: Perform the canceled action again.
- Preferences: Configure the Choregraphe environment.

Connection Menu
- Connect to: Connect NAO to Choregraphe.
- Disconnect: Disconnect from NAO.
- Play: Send and execute the program connected to NAO.
- Stop: Stop a program that is currently being executed.
- Debug/Errors output: If there is an error while operating
in debugging mode, error information will be presented.
- Connect to local NAOqi: Connect to NAO in simulation.
- Advanced: Used during an update, unusable in any version
older than 1.3

Behaviors Menu
- Manage the behaviors saved in the robot.
- Add behaviors saved in your PC to the robot.
- Add behaviors you are currently working on to the robot.
- Delete saved behaviors or save them to your PC.
- Stop all behaviors.

View Menu
- Robot View: Enable 3D NAO in Choregraphe's work environment
and set the point of view for the screen.
- Enable the Box List, Pose library, Video monitor, Project content,
Script editor, Debug Window, and Undo Stack necessary
for Choregraphe operation.
- Reset Views: Return Choregraphe’s task screen to the initial state.

2.1.2 Box Library
The boxes that can be used by Choregraphe are listed by function. Here, a box refers to an icon with a function. The box library is largely divided into 11 types depending on function and is made up of 70 or so boxes. These boxes can be used to control NAO's various movements.

You can create a new box library or import a saved box library at the top of the Box Library panel, and a search function is provided. In the lower part, there is a short explanation of the selected box. The default is the built-in box library, and a new tab appears when a different box library is imported.

Section 2.5, Box Library, explains the default boxes provided.

2.1.3 Diagramming Space

This is the place where NAO's movements are created; the user can drag and drop boxes from the box library here. The library provides boxes configured as combinations of several other boxes. If you double-click such a box, a diagramming space corresponding to that box will open.

Image 2.2 - Box connections

Image 2.2 shows a program that turns on the ear LEDs, using the Battery box and the Ear LED box, when the battery is low. Programming inside the diagramming space can be done with just the mouse.

2.1.4 3D NAO
As the name implies, 3D NAO is a screen showing a three-dimensional NAO. It provides three main functions. First, it simulates the movements programmed by the user. This works only for joint movements; simulation of other elements like sensors, LEDs, and cameras is not possible. Second, the user can enter values for the joints that the robot can actually move. The mirroring function is used when left/right joints are manipulated symmetrically. Third, the actual movements of the robot are shown on this screen.

You can set the point of view in Choregraphe's View Menu, or the user can use the mouse to control it. The screen can be moved left/right and up/down using the left mouse button, and the right mouse button can be used for rotation. The mouse wheel can be used to zoom in and out.

There are two methods for generating NAO's posture. The first is to manipulate the joints in simulation to generate the pose (Image 2.3), and the second is to manually move the joints of the actual robot to acquire the joint information.

Image 2.3 - Joint manipulation using 3D NAO

2.1.5 Predefined Position Library

This is a menu that can be enabled and used from Choregraphe's View Menu.

It is a function that saves the joint values of a specific pose so that it can be used like a box. If you register common poses, such as standing and the default pose, you can program more conveniently.

The pose library is configured similarly to the box library (Image 2.4). Choregraphe provides three default poses. Zero is the pose with all joint values at 0, Init is a good pose for transitioning to the next movement, and Stand is a standing pose that uses the least amount of battery. The pose library loads the predetermined joint values to generate NAO's movements. The user can add frequently used poses to the pose library, which can be done using the pose library's File Menu.

Image 2.4 - Pose library

2.1.6 Video Monitor

Monitor's Camera Viewer, introduced in Chapter 1, supports NAO's camera configuration. Choregraphe's Video Monitor (Image 2.5) provides the functions necessary for implementing movements that require the cameras.

Instructions on how to use the Video Monitor are given in Section 2.5.9 through the Video Library's practice exercise.

Image 2.5 - Video monitor

2.2 Choregraphe-NAO
Connection

2.2.1 Connection Settings

When "Connect to" is selected from Choregraphe's Connection Menu, the Browse Robots window will appear as shown in Image 2.6. You can use this the same way as in Section 1.4, Monitor.

Image 2.6 - Connection setting for Choregraphe and NAO

The left side of Image 2.6 shows NAO's connection list. The picture of NAO shown in the list displays the condition of the robot. There are three types of NAO conditions:

- NAOqi is active, and a wired or wireless connection is available. If the right mouse button is clicked, a menu appears that leads to LED tests and the web page.

- NAOqi is in a stationary state, and neither wired nor wireless connection is possible. However, connection is possible over a wired link if you force-set the port (9559).

- NAO is available for simulation only. Because only simulation is possible, LEDs and sensors cannot be used, and only motion control is possible.

2.2.2 Enslaving
Enslaving refers to the locking and unlocking of joints. The robot may otherwise collapse into strained poses when testing behaviors. Unlocking can also be used when the user manually poses the robot. The enslaving function can be used through three different methods.

The first method is to use the menu: select either "Enslave all motors on/off" from the Connection Menu or the icon in the menu window. The second method is to select the "Enslave chain on/off" button in the lower section of the joint configuration window after a 3D NAO joint is clicked. The third method is to use the Stiffness box from Motion in the box library. Caution is necessary if the joints are locked for a long time, because battery consumption is high and the temperature can rise rapidly.

Depending on the enslaving condition, the color of the icon in the menu window changes:

- Green: Un-enslaved condition; the stiffness value is 0. The motor doesn't move even if there is a command.
- Yellow: Set to the configured stiffness value. Stiffness is explained in Section 2.5.6, Motion Library.
- Red: Enslaved condition; the stiffness value is 1. The motor moves according to commands.

2.2.3 File Transfer

As introduced in Chapter 1, Choregraphe supports NAO's FTP service. In order to use this service, the connection between Choregraphe and NAO must be established first. If File Transfer is selected from the Connection Menu, the authentication process is performed. It will connect to NAO's default folder once the ID and password (nao/nao) are entered (Image 2.7). At this point, the default folder is "/home/nao/."

Image 2.7 - Authentication process (left) and folder list (right) for the FTP Service

2.3 Box

2.3.1 Structure

Choregraphe's best feature is the fact that it provides graphics-based programming. Boxes and their icons are an important element of Choregraphe and are placed in the diagram. Graphics-based programming is done by connecting these boxes. Choregraphe itself provides 70 or so boxes, but the user can create others as well.

A box is configured with inputs, outputs, and a Parameter button (Image 2.8). Some boxes don't have an output or a Parameter button. The interface of a box is as follows:

Image 2.8 - Box

onStart input: An input for activating the functions of the box, located in the upper left corner.
It is usually connected to the signals produced by other boxes.

onStop input: An input for stopping the box functions, located second from the top in the upper left
corner.

onStopped output: An output that shows the value result when the box function stops, located
in the upper right corner. It is usually used as the input signal of other boxes.

onLoad input: This is used when connecting a Timeline's internal boxes. When the box behavior is executed, the Timeline's internal box behavior is loaded.

Input connected to ALMemory: Connected to ALMemory; a specific memory variable's value is delivered to the box.

onEvent input: This input brings data from an external source into the box.

Punctual output: This output sends out data from inside the box.

Parameter: Box parameters can be set, located in the lower left corner of the box.

The box inputs/outputs can have different colors depending on the type of connection signal.
The form of signal and color used by Choregraphe is as follows:

- “Bang”: Black, a common input/output, it is a signal that doesn’t have any information.
- Number: Yellow, it is a signal that has the number information.
- String: Blue, it is a signal that has the text information.
- Dynamic: Gray, it can use all three aforementioned signals, the type of signal
is determined when the signal is applied.

Image 2.9 - Box structure

There are three types (Script, Timeline, Diagram) inside a box (Image 2.9). A Script uses Python to implement the functionality of the box. A Timeline programs NAO's movements according to the passage of time. A Diagram programs NAO's movements depending on the flow of events. The Timeline, which is time-based, programs NAO's movements by frame. The Diagram, which is event-based, programs NAO's movements by box.

Programs with joint movements must use both Timeline and Diagram together. Some of the boxes provided by Choregraphe use all three types at the same time.
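For reference, a Script box contains a Python class whose methods are called when the box's inputs fire. The skeleton below is a minimal sketch of what Choregraphe generates (GeneratedClass is provided by the Choregraphe environment, so this only runs inside a box); Chapter 3 covers script boxes in detail.

class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)

    def onInput_onStart(self):
        # The box's functionality goes here;
        # signal the onStopped output when the work is done
        self.onStopped()

    def onInput_onStop(self):
        self.onStopped()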

2.3.2 Box Generation

As previously mentioned, the user can generate a new box to implement unavailable functions. Also, programs that execute complicated functions consist of a combination of boxes, so the readability of a program may decrease if too many boxes are placed in just one diagram. If this is the case, separate the program by function. This will improve the readability and maintainability of the program, as long as the separated boxes are kept in the diagram of a new box.

Here, we will look at how to generate Timeline and Diagram boxes. Chapter 3, Python, will explain how to generate Script boxes. The following shows how to generate a box and set up its interface.

Image 2.10 - Adding a new box

A - If you press the right mouse button in the diagram space, the edit menu will open (Image 2.10). If "Add a new box" is selected from this menu, a setup window for a new box will open.

B - You can configure the new box through the setup window (Image 2.11).
The following is the description of the image:


Image 2.11 - Setup window for a new box

(1) General Description: Set the name and description of the box.

(2) Image: The shape of the box can be set, and you can select up to four images.

(3) Inputs/Outputs/Parameters: The input, output, and parameter lists of the box can be set. There are control buttons on the right side of the list. The "–" button on the left removes the corresponding element. The setup button in the middle opens the IO description window (6), where you can configure the corresponding element. The "+" button on the right adds a new element and opens the IO description window just like the setup button does. When adding an element to the parameters, a Parameter button is created on the box interface.

(4) Offspring: Determines the box type, as mentioned above. No Offspring creates a script box, Timeline creates a time-based box, and Flow Diagram creates a diagram box.

(5) Plugin: A function that uses the library of some of the boxes provided by Choregraphe.

(6) IO description: Sets the detailed information for the inputs, outputs, and parameters: the element's name, description, signal type, and input/output form. The signal type and input/output form are the same as what was introduced earlier. The number to the right of Type refers to the number of data items. If it is greater than 1, the signal is transmitted in array form.

Image 2.12 - Box for Timeline (left) and Diagram (right)

C - If you double-click the new box, a window like the one shown in Image 2.12 will open.

In this section, we looked at how to create a new box and generate its interface. The next section explains how to use boxes to program.

2.4 Event-Based and Time-Based Programming

There are two ways for Choregraphe to program NAO's movements. The first method is event-based programming, which deals primarily with box placement and connection. The second method is time-based programming, which defines the robot's joint movements according to time. Dance moves, for example, can be created using time-based programming. A diagram can also be generated according to the flow of time, meaning both time- and event-based programming can be used together. This section provides a simple overview of how to use Choregraphe for the two methods.

2.4.1 Event-Based Programming

Box placement and connection are the two most important things in event-based programming. The boxes are placed using drag-and-drop, and they become one program when you connect the box inputs/outputs. The following is an exercise in using boxes to program and execute.

Example 2.1 Diagramming Method (Example File: Ex2.1_diagram.crg)

Example 2.1 uses the Say box to have NAO speak strings entered by the user. We will use two Say boxes to explore how to connect boxes and the process executed by NAO. A detailed description of the box is in Section 2.5, Box Library.

Images 2.13 and 2.14 show the box placements, parameter settings, and box connections. Image 2.15 shows how to execute programs with NAO. Image 2.16 shows the process of how NAO executes a program. The following is an example of how to program and execute NAO.

Image 2.13 - Box placement and parameter setup

A - After selecting the Say box from the Box List, place it in the diagram. Repeat this process one more time.

B - After pressing the Set button of the Say box, enter the string to be spoken by NAO in the Text entry.

Image 2.14 - Box Connection

C - Connect the box inputs/outputs as shown in Image 2.14. A signal is generated at the output of a Say box when NAO finishes speaking. Descriptions of each connection are as follows:

- Connection (1): The start input of the root diagram and the start input of the box are connected. When NAO executes a program, the start input of the root diagram becomes active. If these two inputs are not connected, the program will not run automatically; in that case, it will run when you double-click the start input of the first Say box.

- Connection (2): The output of the first Say box and the input of the second Say box are connected. The second Say box will run once the first Say box is completed.

- Connection (3): The output of the second Say box and the output of the root diagram are connected. A signal going into the output of the root diagram signals the end of the program.

Image 2.15 - Connecting and executing Choregraphe and NAO

D - To send the program created in Choregraphe to NAO, Choregraphe has to be connected to NAO first. As you can see in Image 2.15, if you click the antenna-shaped button, the Browse Robots window will open (as previously explained in Section 2.2). Since it was explained in the previous section, we will omit how to connect to NAO using the Browse Robots window.

E - After connecting Choregraphe with NAO, the program must be sent to NAO and then executed. When the Play button is clicked, the blue bar on the right will move. This signals that the generated program is being sent to NAO. The bar will be full when the program has been sent completely. This can take a long time if a program uses a lot of boxes.

Image 2.16 - Example 2.1 result

F - When NAO receives a new program, it is immediately executed. (1) in Image 2.16 shows that the program has started. (2) shows the termination signal of the first Say box entering the input of the second Say box.
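For comparison, the chain of two Say boxes in Example 2.1 corresponds to two sequential NAOqi calls in a Python script, since each call returns only when NAO has finished speaking. A minimal sketch, with the robot address as a placeholder:

from naoqi import ALProxy

tts = ALProxy("ALTextToSpeech", "xxx.xxx.xxx.xxx", 9559)

# Like connection (2) above: the second "Say box" starts only
# after the first finishes, because say() blocks until speech ends
tts.say("Hello, I am the first Say box.")
tts.say("And I am the second Say box.")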

2.4.2 Time-Based Programming

Time-based programming primarily consists of NAO movements that occur over time; a program can be built because each frame defines NAO's posture. Programming done using a Timeline doesn't define the posture for every frame but instead defines the posture for selected keyframes.

Joint movements between two frames with defined postures are interpolated in a consistent pattern. An example will be shown to demonstrate how a Timeline programs and manipulates joints.

Example 2.2 How to Create a Timeline (Example File: Ex2.2_Timeline.crg)

Frame 0 / Frame 5 / Frame 15 / Frame 25 / Frame 35 / Frame 45

Image 2.17 - Using Timeline to create movements

Example 2.2 shows the creation of a new box in a Timeline and how to move the arms (Image 2.17).

Since Section 2.3 already explained how to generate new boxes, an explanation of the parameters of new boxes will be omitted here. The following is a description of the Timeline window.

A - When a Timeline box is created and double-clicked, the Timeline window will open (Image 2.18). The following explains the Timeline window:

(1) Motion: Composed of the Timeline editor and playback button. The Timeline Editor can be used to define the movement pattern of the frames; Chapter 6 will explain how to use it. Motion playback occurs when the playback button is pressed.

(2) Behavior layers: Action layers with keyframes can be generated. A keyframe has a diagram where boxes can be placed. This diagram is used to generate an event at a particular time or to implement a process for when an event occurs. The keyframe diagram's start signal is applied from the set frame (point).

An edit menu opens when the right mouse button is pressed on a keyframe. This menu can be used to add and delete keyframes, change names, and set the start frame. The mouse can also be used to drag a keyframe to set its start frame.

Image 2.18 - Timeline window

(3) Edit: Set the frames per second (FPS) for motion playback, the number of frames, and the playback mode. NAO will move faster if you increase the FPS. Depending on how resources are obtained, playback mode can be Passive mode, Waiting mode, or Aggressive mode. Here, resources refer to the elements necessary to implement programs in NAO's system. The following is a description of each mode:

- Passive mode: Basic mode; motion playback occurs even when the necessary resources are not ready.
- Waiting mode: Waits until the necessary resources are ready.
- Aggressive mode: Terminates the movements being executed in order to acquire the necessary resources.
- In general, leave the default value, Passive mode.

(4) Select the desired frame. You can use the left mouse button to select one frame or drag the mouse to select an entire section of frames. The chosen frame number is displayed next to the Edit button. A green line marks frame 0, a purple line marks the selected frame, and a red line marks the end frame.

(5) You can manipulate the joints you wish to move. Movement is defined in the chosen frame if you select a frame and move a joint.

B - Use the Edit button to set up the Timeline; the following are the setup values:

- FPS: 10
- Size: 360
- Resource Acquisition: Passive mode

Image 2.19 - Defined posture of Frame 5

C - To generate the ready position, select Frame 5 and NAO's right arm. Place a check on mirroring and manipulate the joints in the joint manipulation window (Image 2.19). Mirroring lets the left/right joints move in identical form. When the joints are manipulated, you will see a dark gray bar in Frame 5. This means that the movements have been defined in Frame 5.

When the joints are manipulated, you will see the circle next to the joint value become red. This means that the corresponding joint is locked. Sometimes, the user must select whether a particular joint is locked. For example, if a left arm movement is defined in Frame 10 and a right arm movement is defined in Frame 20, the right arm will actually move from Frame 1 to Frame 20. This happens because the joints are automatically interpolated for the undefined frames. To make sure a joint does not move until a specific time, you must lock it.

A more convenient function is to save the joint information by segment. If you press the right mouse button inside a frame where movement is defined, a sub-menu called "Store joints in keyframe" will appear. In this sub-menu, there are Whole body (all joint information is saved), Head (the head joint information is saved), Arms (the arm joint information is saved), Legs (the leg joint information is saved), and Forearms (the wrist joint information is saved).

D - Define the postures for Frames 15, 25, 35, and 45 using the same method (Image 2.20). Frames 15 and 35 have the same posture, and Frames 5 and 45 have the same posture. To define identical postures, copy and paste an existing frame (where the posture has already been defined) rather than manipulating the joints again.

Image 2.20 - Defined postures for Frame 15 (above) and Frame 25 (below)

E - You can observe the movements by clicking the Motion playback button in the Timeline window.
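Timeline keyframes have a close script analog in NAOqi's ALMotion module, which interpolates a joint through timed positions much as the Timeline interpolates between frames. A minimal sketch, assuming the NAOqi Python SDK; at 10 FPS, Frame 5 corresponds to t = 0.5 s and Frame 45 to t = 4.5 s (the angle values are illustrative):

from naoqi import ALProxy

motion = ALProxy("ALMotion", "xxx.xxx.xxx.xxx", 9559)  # NAO's IP address
motion.setStiffnesses("RArm", 1.0)  # enslave the right arm so it follows commands

# Move RShoulderPitch through keyframe-like positions (radians) at key times (seconds)
names = ["RShoulderPitch"]
angles = [[1.0, 0.0, -1.0, 0.0, 1.0]]  # postures at Frames 5, 15, 25, 35, 45
times = [[0.5, 1.5, 2.5, 3.5, 4.5]]    # frame / FPS, with FPS = 10
motion.angleInterpolation(names, angles, times, True)  # True = absolute angles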

So far, we have looked at how to program NAO's actions through a brief introduction to Choregraphe, the box structure, and a programming exercise. Section 2.5, Box Library, explores the box library that will be used for actual programming.

2.5 Box Library

In general programming, a function refers to an independent program unit that executes a particular task. A box is like a function, and the box library provided by Choregraphe contains a total of about 70 boxes that are mainly divided into 11 types. The box library configuration is as follows:

- LEDs: Provides the boxes that control NAO's LEDs.
- Sensors: Provides the boxes that obtain values from NAO's sensors.
- Logic: Provides the boxes for logical programming.
- Math: Provides math-related boxes and boxes that generate random numbers.
- Motion: Provides the boxes for many different motions.
- Walk: Provides the boxes for walking.
- Audio: Provides the boxes for input/output of voice and sound.
- Video: Provides the boxes for object recognition using the cameras.
- Tracker: Provides the boxes for following specific objects.
- Tool: Provides the boxes for adjusting the input from other boxes.
- Communication: Provides the boxes for e-mail, Bluetooth, and infrared communication.

2.5.1 LED Library

There are five LED-related boxes provided by default, and their execution can only be verified using the actual robot.

A - Ear LED

Image 2.21 - Ear LED box and the Parameters screen

Image 2.21 shows the Ear LED box (left) and the Parameters screen (right). The Ear LED box controls the ear LEDs by adjusting the location (left, right), duration, intensity of the blue color, and the angle. The following describes the parameters:

- Side: Selects the LED to be used. (Left, Right)
- Intensity: Adjusts the intensity of the blue color. (0.0-1.0)
- Duration: Adjusts the duration of brightness. (0.0-5.0 seconds)
- Angle: Adjusts the LED angle. (0-360 degrees)

B - Eyes LEDs

Image 2.22 - Eyes LEDs box and the Parameters screen

Image 2.22 shows the Eyes LEDs box (left) and the Parameters screen (right). The Eyes LEDs box controls the LEDs of both eyes; it turns on NAO's eye LEDs for a set period of time. The following shows the box parameter:

- Duration: Adjusts the duration. (0.0-5.0 seconds)

Image 2.23 - Internal configuration of the Eyes LEDs box

Image 2.23 appears when you double-click the Eyes LEDs box. There are Color Edit and Eyes LEDs boxes inside. The color selection window opens when you double-click a color in Color Edit; this window can be used to choose the desired color. A number array is produced by the Color Edit box. Section 2.5.4, Tool Library, has detailed information about the Color Edit box. The Eyes LEDs box in Image 2.22 and the Eyes LEDs box in Image 2.23 are different boxes. The Script Editor will open if you double-click the Eyes LEDs box in Image 2.23, and Python can be used to edit it. Chapter 3, Python, deals with script editing methods.

C - RandomEyes

Image 2.24 shows the RandomEyes box, where the LED colors of the eyes are randomly selected. The colors are set to change at specific time intervals. The time interval and range of colors can be changed by editing the script.

Image 2.24 - RandomEyes box

D - Switch LEDs

Image 2.25 - Switch LEDs box and the Parameters screen

Image 2.25 shows the Switch LEDs box and the Parameters screen. The LEDs of both ears and feet can be controlled using the Switch LEDs box. The following shows the parameters:

- LEDs group: Selects the LED group. (LeftEarLeds, RightEarLeds, LeftFootLeds, RightFootLeds)
- Time: Sets the LED activation time. (0.0-60.0 seconds)
- Min. intensity: The intensity of the light when a signal is applied to the stop input. (0.0-1.0)
- Max. intensity: The intensity of the light when a signal is applied to the start input. (0.0-1.0)

E - Water Clock

Image 2.26 - Water Clock box and the Parameters screen

Image 2.26 shows the Water Clock box and the Parameters screen. The Water Clock box turns on the LEDs of NAO's ears and eyes in blue for a set period of time. The intensity of the LEDs weakens as the set time approaches. The following shows the parameter:

- Duration: Sets the LED activation time. (0.0-120.0 seconds)
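All of these boxes ultimately drive the ALLeds module, which a script can also call directly. A minimal sketch, assuming the NAOqi Python SDK; the group names follow those shown in the Switch LEDs parameters, and the robot address is a placeholder:

from naoqi import ALProxy

leds = ALProxy("ALLeds", "xxx.xxx.xxx.xxx", 9559)

# Roughly what the Eyes LEDs box does: fade the eye LEDs to a color, then off
leds.fadeRGB("FaceLeds", 0x0000FF, 1.0)   # fade to blue over 1 second
leds.fadeRGB("FaceLeds", 0x000000, 1.0)   # fade back to off

# Roughly what the Switch LEDs box does for one group
leds.setIntensity("LeftEarLeds", 1.0)     # maximum intensity
leds.setIntensity("LeftEarLeds", 0.0)     # minimum intensity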

2.5.2 Sensors Library

There are numerous sensors on the NAO robot, including infrared, bumper, and contact sensors. The sensors can be used to identify the presence of obstacles or to detect user contact. Just like the LEDs, functions that use sensors can only be verified using the actual robot.

A - Battery

Image 2.27 - Battery box and internal configuration

Image 2.27 shows the Battery box and its internal configuration. The isLow box inside has two inputs: one for the Battery box input and the other for connecting to the memory's BatteryLowDetected variable. The second output of the Battery box activates when the power of the battery drops below a certain level. There is a high risk of the robot breaking if excessive movements are carried out when the battery is low. The Endless Walk box in the Walk library shows an example of how to use the Battery box.

B - Bumper

Image 2.28 - Bumper box and internal configuration

Image 2.28 shows the Bumper box and internal configuration. Bumpers are installed in the front of both feet, and they detect whether or not they are pressed to recognize obstacles in front of the feet. The two outputs of the Bumper box produce information on whether the left and right bumpers are being pressed. The inside of the box is configured with left and right boxes, and each is connected to the LeftBumperPressed and RightBumperPressed variables. The output value is true if the variable has a value greater than 0, and false if the value is 0.

C - Foot Contact

Image 2.29 - Foot Contact box and internal configuration

Image 2.29 shows the Foot Contact box and internal configuration. The pressure sensors (FSRs:
Force Sensitive Resistors) attached to NAO’s soles are used to determine whether or not the soles
are in contact with the floor (Image 2.30). The Foot Contact box determines the output signal according
to the footContactChanged variable value.

A signal will occur from the second output if it is in contact with the floor, and if there is no contact, a signal
will occur from the third output. You can determine when NAO is falling by using the pressure sensor
attached to the sole. Here, the protective function is activated to improve system safety. Detailed information
regarding the pressure sensor is in the “Hardware” section of the reference.

Image 2.30 - Location of the pressure sensors (FSRs)

D - Robot Pose

Image 2.31 - Robot Pose box and internal configuration

Image 2.31 shows the Robot Pose box and internal configuration. Robot Pose box produces the strings
for NAO’s current posture. The blue inside the output indicates a string signal. There are no other boxes
inside and the robotPoseChanged variable and the output are directly connected. Strings provided as output
values include “Unknown,” “Stand,” “Sit,” “Crouch,” “Knee,” “Frog,” “Back,” “Belly,” “Left,” “Right,” and
“Headback.”

E - Sonar

Image 2.32 - Sonar box and internal configuration

Image 2.32 shows the Sonar box and internal configuration. The Sonar box uses the ultrasonic sensors
located in NAO’s chest to detect whether or not there are obstacles in front. Out of the three punctual outputs, the upper two will work when there are no obstacles detected by the left and right ultrasonic sensors.
When an obstacle is detected by the ultrasonic sensor, the blue punctual output produces a string output
with information regarding the direction of the obstacle.

The punctual output of the other Sonar box (located inside the Sonar box) is connected to ultrasonic
sensor related variables. They are each connected to the SonarLeftNothingDetected, SonarLeftDetected,
SonarRightDetected, and SonarRightNothingDetected variables.

F - TactilTouch

Image 2.33 - TactilTouch box and internal configuration

Image 2.33 shows the TactilTouch box and internal configuration. This touch sensor is attached to the crown of NAO's head and is divided into three parts. It becomes active on user contact and produces information about whether or not each part has been activated. The three outputs show whether the front, center, and back sensors are being touched. The inside is configured with the if 0 box, which consists of Python code; this box receives the value of a memory variable, and the output is activated when this value is greater than 0. From the top down, the boxes are connected to the FrontTactilTouched, MiddleTactilTouched, and RearTactilTouched variables. The user can send signals to the robot using this sensor; for example, the user can create forward, stop, and reverse behaviors depending on which of the three parts is active.
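As a rough sketch of what the internal if 0 boxes do, the script below polls the same memory variables directly through ALMemory; the robot address is a placeholder.

from naoqi import ALProxy

memory = ALProxy("ALMemory", "nao.local", 9559)  # placeholder address

# Each key holds a value greater than 0 while that part of the head
# is being touched -- the same test the internal boxes perform.
for key in ["FrontTactilTouched", "MiddleTactilTouched", "RearTactilTouched"]:
    if memory.getData(key) > 0:
        print key, "is active"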

G - Tactile L.Hand, Tactile R.Hand

Image 2.34 - Tactile L(R).Hand box and internal configuration

Image 2.34 shows the Tactile L(R).Hand box and internal configuration. Unlike the TactilTouch box, the contact sensors here are attached to the arms. The outputs and functions are identical to the TactilTouch box.

H - Fall Detector

Image 2.35 - Fall Detector box and internal configuration

Image 2.35 shows the Fall Detector box and internal configuration. This box detects whether the robot has fallen and also implements internal system protection. In the internal diagram of the Fall Detector box, the robotHasFallen variable is connected through a Wait box, so the output is activated by the Wait box after a period of time. This time delay ensures the completion of the safety process. The default delay time for the Fall Detector box is 0.5 seconds.

2.5.3 Logic Library


The Logic library provides boxes with functions like the loops and conditional statements found in text-based programming languages. These functions are a big help when implementing complicated programs.

A - Dispatcher

Image 2.36 - Dispatcher box

The Dispatcher box has one input and several outputs (Image 2.36) and plays a similar role as the switch
statement in C. If the signal coming in from the input matches the box element, the element will be
sent to the corresponding output. The first output becomes active when there is no matching element.
Input/Outputs are gray and both numbers and texts can be used.
Text must be entered with double quotation marks (“”).

The Dispatcher box can identify the different signal types. You can add and delete list entries through the menu (Insert row, Remove row) that appears when you right-click on top of the box. Also, when you finish creating an entry, an empty entry will automatically be added.

■ Example 2.3 (Example File: Ex2.3_dispatcher.crg)

Image 2.37 shows a program that uses Random Int and Dispatcher boxes to control LEDs.
Random Int box creates random integers. The following example shows how LEDs are controlled.

- When the random number is 1: Turn off Ear LEDs, Turn on Eye LEDs.
- When the random number is 2: Turn off Ear LEDs, Turn off Eye LEDs.
- When the random number is neither 1 nor 2: Turn on Ear LEDs, Turn off Eye LEDs.

Image 2.37 - Example of how to use the Dispatcher box

B - Choice

Image 2.38 - Choice box and the Parameters screen

The Choice box detects the user’s voice, and if the detected voice matches a word on the list, a string is produced on the corresponding output (Image 2.38). In other words, answers can be classified depending on the question. The first input executes the Choice box; if this input receives a string, the box is executed after the string is read aloud. This string must be a question that matches the answer NAO will hear from the user. The second input receives a list of words from an external source; this signal must be received before the box starts. The list of words can be generated above the box. The first output of the box produces a string describing the status of the Choice box. The Choice box statuses are as follows:

- “timeout”: When there is no response from the user for a set period of time.
- “notUnderstood”: When there are no words that match the user’s voice.
- “onStop”: When a signal is applied to the stop.
- “wordRecognised”: When there is a word that matches the user’s voice.
- “onTactileSensor”: When the user has touched the contact sensor.

The user can issue a command using the head’s contact sensor in addition to speech recognition.
The front part of the contact sensor increases the index of the words while the back decreases it.
When the index changes, the word from the corresponding index is read. The center part delivers
the chosen word to the command. The following explains the Choice box parameters:

- Activate head: Selects whether or not the head joints are active while the Choice box is operating.
- Activate arms: Selects whether or not the arms are active while the Choice box is operating.
- Activate legs: Selects whether or not the legs are active while the Choice box is operating.
- Minimum threshold to understand: Sets the minimum threshold value for speech recognition. (0.0-1.0)
- Minimum threshold to be sure: Sets the threshold value for accepting the answer to the question (0.0-1.0). If the recognition value is lower than this threshold, NAO will ask the question again.
- Speech recognition timeout when confirmation: Sets the time limit for confirming the success of speech recognition. If this time passes without a reply from the user, NAO decides that the speech recognition was successful.
- Speech recognition timeout: Determines when to stop speech recognition.
- Maximum number of repetition when no reply: Sets the number of times the question is repeated if the user does not respond. (1-20)
- Fun animations activated: Determines whether or not the accompanying movements are still performed if speech recognition fails.
- Repeat validated choice: Determines whether the chosen word is output again. If this parameter is selected, the chosen word is output once more when the box terminates.
- Activate ears light: Determines whether or not the ear LEDs will be activated.
- Activate eyes light: Determines whether or not the eye LEDs will be activated.
- Activate brain light: Determines whether or not the contact sensor LEDs will be activated.
- Tactile sensor menu timeout: Determines the time limit of the contact sensor menu.
- Maximum number of repetition when failure: Determines how many times the question is repeated when speech recognition fails.
- Activate help when failure: Determines whether or not help will be activated when speech recognition fails.
- Activate help command: Determines whether voice-prompted help is available. NAO will explain the help when the user says “help.”
- Activate repeat command: Determines whether the question can be repeated by voice. NAO will repeat the question when the user says “repeat.”
- Activate exit command: Determines whether the box can be closed by voice. NAO will terminate the Choice box when the user says “exit.”

Image 2.39 - Example of how to use the Choice box

Image 2.39 is an example of a program that uses the Choice box. The question is entered by using the Text Edit box. The expanded menu will open if you click the + button of the Choice box. You can hear the pronunciation of a word if you click the playback button next to it.

C - Loop

Image 2.40 - Loop box and the Parameters screen

The Loop box has three inputs and two outputs (Image 2.40) and has a similar function as the ‘for’ statement in C. The index increases by one when a signal enters the start input, and the index output is activated. The Loop max parameter sets the number of iterations and can have a value between 0 and 500. The Loop box index is initialized to 0 and is set to increase by one.

This is set in the script that appears when the Loop box is double-clicked. You only need basic knowledge
of Python to make this change.
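The exact script varies between Choregraphe versions, but its structure is roughly like the sketch below. The method and output names here are illustrative placeholders, not guaranteed to match your version.

class MyClass(GeneratedClass):
    def onLoad(self):
        self.index = 0                     # the index initializes with 0

    def onInput_start(self):
        self.index += 1                    # change this line to alter the step size
        if self.index > self.getParameter("Loop max"):
            self.stopLoop()                # iteration limit reached (placeholder name)
        else:
            self.indexOutput(self.index)   # activate the index output (placeholder name)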

D - Multiplexer

Image 2.41 - Multiplexer box

The Dispatcher box compares one input with the box list to activate just one output.
The Multiplexer box is the opposite of the Dispatcher box and has several inputs but just one output
(Image 2.41).

When a signal enters the first input, it activates the “apple” string, and when a signal enters the second input, it activates the “orange” string. When a signal enters both inputs, both “apple” and “orange” strings are activated.

■ Example 2.4 (Example File: Ex2.4_multiplexer.crg)

In the example shown in Image 2.42, the contact sensor of the head and the Multiplexer box are used
to speak where the user has touched the robot. As explained earlier, because each contact sensor consists
of three parts, each box has three outputs.

These outputs can be connected to the Multiplexer box inputs. As shown in Image 2.42, a signal enters the first input of the Multiplexer box when the front sensor is touched, and then “front” will be output.

Image 2.42 - Example of how to use the Multiplexer box

E - Wait for Signals

Image 2.43 - Wait for Signals box

Wait for Signals box consists of two inputs and one output (Image 2.43). The output is activated only when
the signal enters both inputs of the Wait for Signals box. The output is not activated if only one of the inputs
receives the signal. However, the two signals don’t have to come in at the same time, and they are initialized
when the output is activated.

■ Example 2.5 (Example File: Ex2.5_Wait for Signals.crg)

Image 2.44 - Example of how to use the Wait for Signals box

Image 2.44 is an example of how to use the Wait for Signals box to determine whether or not both left/right
bumpers are being pressed. When both bumpers are pressed, the robot will say “Two Bumper.” The following
shows the parameter setting.

- Say box: Text (Two Bumper)

F - Timer

Image 2.45 - Timer box and the Parameters screen

The Timer box has two inputs and two outputs (Image 2.45). Period is the timer cycle, with seconds as the unit of measure and 0.0 to 5000.0 as the range. After the timer is activated and the set period of time passes, a signal occurs on the second output. Please note that a signal also occurs on the second output immediately when a start signal is first applied to the Timer box.

■ Example 2.6 (Example File: Ex2.6_Timer.crg)

Image 2.46 - How to use the Timer box

Image 2.46 shows an example of how to use the Timer box to say “Ten Second” in 10-second intervals. After “Ten Second” is initially spoken, it is repeated every 10 seconds thereafter. The following shows the parameter settings:

- Timer box: Period (10.0)
- Say box: Text (Ten Second)

G - Wait

Image 2.47 - Wait box and the Parameters screen

The Wait box has two inputs and one output (Image 2.47). It waits a set period of time before activating the output. The inputs/outputs are black, meaning they cannot carry information (like strings or numbers). Timeout is the time delay in seconds, with 0.0 to 5000.0 as the range.

■ Example 2.7 (Example File: Ex2.7_Wait.crg)

Image 2.48 - How to use the Wait box

As explained in the Timer box section, the Timer box immediately activates its output as soon as the start signal is received. Example 2.7 uses the Wait box and Timer box together to remove this behavior (Image 2.48). Unlike Example 2.6, even if the start signal is applied, NAO will not speak immediately.

The following shows the parameter settings for each box:

- Wait box: Timeout (10.0)
- Timer box: Period (10.0)
- Say box: Text (Ten Second)

2.5.4 Tool Library


The Tool library provides two kinds of functions. The first is used to control time-based movements (jumping between frames and starting/stopping playback). For example, a Tool library box can be used to program NAO to halt, based on a sensor reading, while still in motion.

Image 2.49 - Goto And Play and Goto And Stop boxes and the Parameters Screen

The second kind is the constant function. In a text-based programming language, the user defines constants in order to use specific information. The constants provided by Choregraphe cover angles, RGB colors, numbers, and strings. A box is provided for each kind of constant; if a constant box is not used, you must edit the Python script instead.

A - Goto And Play, Goto And Stop

The Goto And Play and Goto And Stop boxes each have only one input and identical parameters (Image 2.49). As previously introduced, time-based motion is generated from frames. Frame number specifies the frame to jump to and can be between 0 and 10000.

This box is not used in the root diagram; it is used inside Timeline generated boxes (like the boxes
in the Motion library). After the Goto And Play box moves to the specified frame, the movement starts
from this frame. These two boxes can reflect event-based elements in time-based movements.

B - Play, Stop

Image 2.50 - Play box and Stop box

The Play box and Stop box have no parameters and only one input each (Image 2.50). The Play box is similar to the Goto And Play box, but it starts playing from the current frame without jumping frames. The Stop box likewise stops at the current frame without jumping frames.

C - Angle Edit

Image 2.51 - Angle Edit

The Angle Edit box consists of two inputs and one output (Image 2.51). The user can enter the angle value in either degrees or radians. The Angle Edit box first converts the entered angle into a radian value, and this radian value is then used for the output.

D - Color Edit

Image 2.52 - Color Edit box and color picker

Color Edit box consists of one input and one output (Image 2.52). The color picker opens when you click a
color from Color Edit. Once a desired color is selected in the color picker, the color information will be sent
out as R, G, and B. You can see an example of how to use Color Edit
in the Eyes LEDs box.

The Eyes LEDs box consists of a Color Edit box and an inner Eyes LEDs box (Image 2.53). Information for the selected color is transmitted in the order of R, G, and B. The input of the inner Eyes LEDs box receives the three signals as a number array. Image 2.53 shows that the Color Edit output [61, 74, 255] is a number array.

Image 2.53 - How to use Color Edit

E - Number Edit, Text Edit

Image 2.54 - Number Edit box and Text Edit box

The Number Edit box and Text Edit box consist of one input and one output (Image 2.54).
Number Edit box produces numeric signals while Text Edit box produces text signals.

2.5.5 Math Library


The Math library provides math-related boxes for dividing, multiplying, and generating random numbers. Boxes for adding and subtracting are not provided; Chapter 3 explains how to create an Add box in Python.
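As a preview, a hand-made Add box script might look like the sketch below. It follows the usual Choregraphe script-box structure, but the input and output names are illustrative and must match whatever you define on the box.

class MyClass(GeneratedClass):
    def onLoad(self):
        self.first = None                 # holds the first operand

    def onInput_onFirst(self, p):
        self.first = p                    # remember it until the second arrives

    def onInput_onSecond(self, p):
        if self.first is not None:
            self.onSum(self.first + p)    # send the sum to the numeric output
            self.first = None             # reset for the next pair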

A - Divide

Image 2.55 - Divide box

The Divide box has two inputs and one output (Image 2.55). The color of the inputs/outputs shows that all three send and receive numeric signals. The Divide box divides the two numbers that come in through the inputs and then produces the result. The first input is the dividend and the second input is the divisor.

Image 2.56 uses the Divide box to compute “0.0 ÷ 1.0” and “1.0 ÷ 0.0.” An error occurs when the divisor is 0.

Image 2.56 - How to use the Divide box Example: 0.0 ÷ 1.0 (left) and 1.0 ÷ 0.0 (right)

B - Multiply

The Multiply box, like the Divide box, has two inputs
and one output (Image 2.57). It uses the two numbers
from the input to perform the multiplication
and then produces the result.
Image 2.57 - Multiply Box

C - RandomInt

Image 2.58 - RandomInt box and the Parameters Screen

The RandomInt box has one input and one output (Image 2.58). It generates random integers where the minimum value is 0 and the maximum value is determined by the parameter. Max is the maximum value of the random number, with a range of 0 to 1×10⁹. Shuffle increases the complexity of the random sequence. The RandomInt box is activated only once, so the Timer box or Loop box must be used in order to continuously generate random numbers.

D - RandomFloat

Image 2.59 - RandomFloat box and Parameters screen

The RandomFloat box, like the RandomInt box, has one input and one output (Image 2.59). The differences are that it produces a floating-point value instead of an integer and that there is no Shuffle parameter. The maximum value can be set by adjusting Max and has a range of 0.0 to 9.9×10¹³.

2.5.6 Motion Library


The Motion library consists of boxes for simple movements. Several movements are provided as examples: arm movements, bowing, an empty Timeline box, sitting, standing, tai chi, etc. Sitting and standing are used quite frequently.

A - Arms Example

The Arms Example box has two inputs and one output and performs arm movements. A total of five frames are defined and Frame 93 is the end frame. Image 2.61 shows an example of this box’s motion.
Image 2.60 - Arms Example box

Frame 0 Frame 26 Frame 51 Frame 72 Frame 93

Image 2.61 - Movement from the Arms Example box

B - Hello

Image 2.62 - Internal configuration of the Hello box

The Hello box is a Timeline and Script box (Image 2.62). Internally, the FaceLeds layer is applied starting from Frame 0; this layer has an _AskForAttentionEyes script box. Frame 115 is the end frame with 25 FPS, which is faster than the default value.

The Hello box performs an arm-waving greeting motion while the colors of the eye LEDs change (Image 2.63). Intervals of two frames are defined between poses for a more natural motion.

Frame 52 Frame 60 Frame 66 Frame 75 Frame 83 Frame 93 Frame 111

Image 2.63 - Hello Box movements

C - Empty Timeline

The Empty Timeline box in Image 2.64 has a timeline structure, as suggested by the name, but no frames with defined movements. You can do Timeline programming in this box without generating a new box. Since Section 2.4 explains how to use the Timeline for time-based programming, further explanation of the Empty Timeline box is omitted here.
Image 2.64 - Empty Timeline box

D - Sit Down

Image 2.65 - Sit Down Box and the Parameters screen

The Sit Down box executes the sitting motion, and Maximum of tries is the number of times you can attempt
to sit (Image 2.65). You can make a maximum of 10 attempts. There are three outputs; the first output
becomes active when sitting is successful and the second output becomes active when it fails to sit.
If it is impossible to execute the sitting motion, the third output becomes active.

Image 2.66 - Internal configuration of the Sit Down box

The Sit Down box consists of one layer, and the sitDownBehavior layer has five keyframes (Image 2.66). The keyframe starting in Frame 1 is the DetectRobotPose frame, and it decides which keyframe to move to. The DetectRobotPose frame handles the count of sitting attempts, the acquisition of the pose, and playback for the acquired pose. The Get Robot Pose box is used to acquire the robot’s pose, and the pose acquired through the Dispatcher box is used to move to another keyframe.

The following shows the keyframes for each acquired position.

When Standing

When Lying Facedown

When Lying on its Back

When Lying on its Side

The keyframes for standing and for lying on the back or front are configured with the Stiffness box, Timeline box, Increase Count box, and DetectRobotPose box. When lying on its side, the robot detects which side it is facing and executes the corresponding movement. A pose for standing up is already defined in the FromStand, FromBelly, FromBack, and RotateSide boxes.

E - Stand up

Image 2.67 - Stand Up box and the Parameters screen

Stand Up box provides the standing motion and has the same input/output and parameters
as the Sit Down box (Image 2.67).

While the Sit Down box categorizes the robot’s pose into four types, the Stand Up box categorizes it into five, adding the crouching pose (Image 2.68). The configuration of the box is omitted here since it is similar to the Sit Down box.

Image 2.68 - Internal configuration of the Stand Up box

F - Stiffness

The Stiffness box provides the Enslaving and Un-enslaving functions, which were previously introduced as the locking and unlocking of the motors. Stiffness refers to the force of the lock, that is, how much force will be used to execute the command given to a motor. When power is first applied to NAO, the Stiffness of all joints is 0.

When Stiffness is 0, the joint will not move even if there is a command. When it is 1, all available force will be used to execute the command. The greater the Stiffness, the more battery power is used, and the greater the risk of damage from external shocks.

Image 2.69 - Stiffness box and the Parameters screen

The Stiffness box consists of two inputs and one output (Image 2.69). The start input sets the Stiffness to Max, and the stop input sets it to Min. The parameters of the Stiffness box are as follows:

- Min stiffness: Stiffness value when a signal is applied to the stop input. (0.0-1.0)
- Max stiffness: Stiffness value when a signal is applied to the start input. (0.0-1.0)
- Duration: Time over which the new stiffness value is applied (0.0-1.0 Seconds); until it elapses, the previous stiffness value still applies.
- Head, Left arm, Right arm, Left leg, Right leg: Where the Stiffness will be applied.
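In script form, the same enslaving and un-enslaving can be done through ALMotion, roughly as in the sketch below; the robot address is a placeholder.

from naoqi import ALProxy

motion = ALProxy("ALMotion", "nao.local", 9559)  # placeholder address

# Enslave the whole body: ramp stiffness up to 1.0 over 0.5 seconds.
motion.stiffnessInterpolation("Body", 1.0, 0.5)
# ... motions would run here ...
# Un-enslave: drop stiffness back to 0.0 so the joints turn freely.
motion.stiffnessInterpolation("Body", 0.0, 0.5)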

G - Tai Chi Chuan

Image 2.70 - Tai Chi Chuan box and the Parameters screen

Tai Chi Chuan box defines the motions that enable NAO to execute Tai Chi Chuan (Image 2.70).
Use legs enables you to select whether or not the legs will be used. The internal configuration of Tai Chi
Chuan is shown in Image 2.71, and the movements will change depending on whether or not the legs
are used.

Image 2.71 - Internal configuration of the Tai Chi Chuan box

If the legs are used, Tai Chi Chuan is executed only in “Stand” and “Crouch” positions, and if the legs
are not used, Tai Chi Chuan is executed only in “Sit,” “Unknown,” “Stand,” “Crouch,” and “Knee” positions.

H - LeftHand, RightHand

Image 2.72 - LeftHand box and RightHand box

The LeftHand box and RightHand box shown in Image 2.72 are boxes that move NAO’s hands.
The hand opens when a signal enters the start and closes when a signal enters the stop. Since it is a script
box, you must edit the script in order to execute a more precise manipulation of the hands.

2.5.7 Walk Library


It is quite a difficult task to make NAO walk using a timeline-based programming method.
The Walk Library consists of several boxes related to walking.

A - OmniWalk

Image 2.73 - OmniWalk box

The OmniWalk box consists of four inputs and one output (Image 2.73). It is a script box where
four numeric signals are used to make NAO walk. The following explains the input:

- X: Determines the direction and speed of the forward and reverse movements. The value can be from -1.0 to 1.0; a negative number means reverse and a positive number means forward.
- Y: Determines the direction and speed for left and right. The value can be from -1.0 to 1.0; a negative number represents the right side and a positive number represents the left side.
- Theta: Determines the direction and speed of the rotation. The value can be from -1.0 to 1.0; a negative number represents a right turn and a positive number represents a left turn.
- Step Frequency: Determines the frequency of the steps. The value can be from 0.0 to 1.0; 0 is the stationary state and 1 is the maximum speed.
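Internally the box drives NAOqi's walk engine. A minimal script sketch using the same (X, Y, Theta, frequency) convention might look like this, assuming the NAOqi 1.x ALMotion walk calls and a placeholder robot address.

from naoqi import ALProxy

motion = ALProxy("ALMotion", "nao.local", 9559)  # placeholder address

motion.stiffnessInterpolation("Body", 1.0, 0.5)  # joints must be enslaved first
motion.walkInit()                                # move to a stable start pose
# Walk forward at half speed while turning slightly left.
motion.setWalkTargetVelocity(0.5, 0.0, 0.2, 1.0)
# A zero command stops the walk:
# motion.setWalkTargetVelocity(0.0, 0.0, 0.0, 1.0)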

B - DemoOmni

Image 2.74 - DemoOmni box and the Parameters screen

We have just explored the OmniWalk box. The DemoOmni box is a program example built around the OmniWalk box.

The four OmniWalk input signals become the parameters of the DemoOmni box (Image 2.74). The DemoOmni box combines OmniWalk’s regular walking motion with swinging the arms while walking.

In addition, a safety feature prepares NAO for dangerous situations that may occur while walking. The safety feature triggers when the Force Sensing Resistors (FSRs) detect no contact; in other words, when both feet are off the ground, NAO stops walking.

The following shows the parameters:

- Left arm enabled: determines whether or not the left arm will move.
- Right arm enabled: determines whether or not the right arm will move.
- Stop walk when foot contact is lost: determines whether or not the safety feature will be activated.

Image 2.75 - Internal configuration of the Demo Omni box

The Demo Omni box consists of OmniWalk, Joystick, Protection, and EnableArms boxes (Image 2.75).
Joystick box uses the output to send the parameters of the Demo Omni box.
The Protection box activates “ENABLE_FOOT_CONTACT_PROTECTION” of the motion module.
The safety feature is explained in the reference (NAOqi Guide/Motion/Safety).

C - Endless Walk

Image 2.76 - Endless Walk Box and the Parameters Screen

The Endless Walk box, as the name suggests, makes NAO walk without stopping. The OmniWalk box is used for walking, and the ultrasonic, bumper, and battery sensors are monitored. It is also possible to use the camera to implement face tracking. Walking stops when the touch sensor is used, and when NAO falls, the program ends after the robot gets back up. The SpeedX, SpeedY, and SpeedRotation parameters are the same as the DemoOmni box parameters, and faceTracking determines whether or not face detection and tracking are activated.

Image 2.77 - Internal configuration of the Endless Walk box

Image 2.77 shows the internal configuration of the Endless Walk box. A simple explanation regarding walking, sensors, and face tracking is provided in the TextEdit box. The Endless Walk box uses many boxes because it has many diverse functions. They are divided into four areas by function, and the areas are not connected to each other.

The program operates normally even though the areas are not connected because the Endless Walk box memory (neverEndingWalk) is used. This type of programming is efficient when movements are controlled by the ultrasonic sensors (Areas 2 and 4). The following provides an explanation for each area.

- Area 1: This is where face recognition and tracking are executed. The faceTracking parameter of the Endless Walk box is verified through the ifFaceTracking box. Face recognition is executed when faceTracking is selected. The Face Coord box uses FaceDetected, a memory variable, to acquire and produce the angle of the face. The structure of the FaceDetected variable is explained in the reference (NAOqi API/ALFaceDetection).

When the Face Coord box produces the angle of the face, the Tracker box uses this information to move the head. The Random Int box generates a random number when this signal enters. If the random number is 1, NAO will say “Hello,” and if it is 2, NAO will execute the greeting motion.

- Area 2: This is where the ultrasonic sensors are used to detect obstacles; it consists of the Sonar box and SonarDemo box. The outputs of the Sonar and SonarDemo boxes do not connect to other boxes. Note that the Sonar box used here is different from the Sonar box in the Sensors library.

The Sonar box processes the value of the ultrasonic sensor and saves the distance between NAO and the object in the memory. The SonarDemo box determines NAO’s action when there is an object. The action here is not avoidance but rather rotating the head toward the direction of the object.

- Area 3: This is NAO’s safety feature for when the battery is low. Because Endless Walk executes a constant walking motion, the robot could fall while walking if the battery is low. If the battery power is insufficient, the DemoOmni box in Area 4 stops first. Even when a stop signal is applied to the DemoOmni box, it takes a moment for it to completely stop; to guarantee this time, a delay of three seconds is generated through the Sleep box. The Movement box makes NAO sit down, and the Stiffness box sets NAO’s joints to the Un-enslaved state.

- Area 4: This is where walking is executed and the robot is driven using the DemoOmni box. Before starting to walk, the Init box sets the language, sound volume, and Stiffness, and makes the robot stand up. The Init pose box is used to adjust the posture. The poolCommand box gets the information (X, Y, and Theta) necessary for OmniWalk from the memory (neverEndingWalk) and sends these values to the DemoOmni box. The DemoOmni box is the box that actually commands the walk.

The MarcheInfinie box determines how the robot will be controlled. This is where you can
generate the information that will be used by the poolCommand box. It determines the actions
of the ultrasonic, contact, bumper, and FSR sensors.

D - WalkTo

Image 2.78 - WalkTo box and the Parameters screen

The WalkTo box determines where to walk to (Image 2.78). The parameters include X and Y (distance values) and Theta (angle of rotation) for the destination; the other parameters are the same as the DemoOmni box. The following explains Parameters X, Y, and Theta:

- X: Forward/backward distance in meters. Range is from -2.0 to 2.0, and the default value is 0.2.
- Y: Sideways (left/right) distance in meters. Range is from -2.0 to 2.0, and the default value is 0.
- Theta: Angle of rotation in radians. Range is from -3.14 to 3.14, and the default value is 0.
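The box corresponds to ALMotion's walkTo call; a minimal sketch follows, with a placeholder robot address.

from naoqi import ALProxy

motion = ALProxy("ALMotion", "nao.local", 9559)  # placeholder address

# Walk 0.2 m straight ahead, then rotate 90 degrees (about 1.57 rad) left,
# following the same X/Y/Theta convention as the box parameters.
motion.walkTo(0.2, 0.0, 0.0)
motion.walkTo(0.0, 0.0, 1.57)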

2.5.8 Audio Library
NAO has four microphones and two speakers. Audio is an important medium for communication between NAO and people, and the Audio library provides diverse functions to facilitate smooth communication.

A - setVolume

Image 2.79 - setVolume box and the Parameters screen

The setVolume box sets the audio volume (Image 2.79). The parameter is volume percentage,
and 0 means it has been muted.

B - setLanguage

Image 2.80 - setLanguage box and the Parameters screen

The setLanguage box can set the language used by NAO (Image 2.80). As introduced in Chapter 1, Chinese,
English, French, German, Italian, Japanese, Korean, Portuguese and Spanish can currently be used.

C - Music

Image 2.81 - Music box and the Parameters screen

Music box plays the audio files (Image 2.81). Supported audio formats include .wav, .mp3 and .ogg.
The following explains the parameter:

- Play in loop: Repeat playback function. The default is a single playback.
- Begin position: Where in the audio file playback starts, measured in seconds. Maximum value is 600 seconds.
- Volume: Loudness of the audio; range is from 0.0 to 1.0, and the default value is 1.0.
- Panorama: Ratio of the left/right speaker volume. Range is from -1.0 to 1.0; only the left speaker is used at -1 and only the right speaker at 1. Default value is 0.0.

Image 2.82 - Internal configuration of the Music box

The Music box consists of a Music File box that determines the location of the file you wish to play and a PlayMusic box that actually plays the file (Image 2.82). When you click the folder button of the Music File box, the “Select a file” window opens where you can choose a file. The list in this window shows files registered in Choregraphe; these files are sent to the robot together with the program. If the purpose is to send a program rather than to play music, it is best to delete unused music files from this list.

D - Say

Image 2.83 - Say box and the Parameters screen

The Say box lets NAO read the Text (Image 2.83).
This is used to let NAO read specific texts or sentences. The parameter is as follows:

- Text: String of text to be read by NAO; double quotation marks (“”) are not used.
- Voice Shaping: A value for the depth of the voice; range is from 50 to 150 with 100 as the default
value. Lower number means a deeper voice; 75 is recommended for a male voice.
- Speed: A percentage value of the speaking rate; range is from 50 to 200 with 100 as the default
value. The lower the value, the slower the speaking rate.
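The box is a thin wrapper around ALTextToSpeech; a minimal sketch follows. The inline shaping tags mirror what the box builds from its parameters, though the exact tag syntax can vary by NAOqi version; the robot address is a placeholder.

from naoqi import ALProxy

tts = ALProxy("ALTextToSpeech", "nao.local", 9559)  # placeholder address

tts.say("Hello, I am NAO")
# Rough equivalent of Voice shaping 75 and Speed 100 via inline tags:
tts.say("\\vct=75\\ \\rspd=100\\ Hello, I am NAO")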

E - Say Text

Image 2.84 - Say Text box and the Parameters screen

The Say Text box has a similar function as the Say box, but it receives the Text parameter from its input. Since the other parameters are the same as the Say box, they are omitted here.

Image 2.85 - Example of the Say box (top) and Say Text box (bottom)

Image 2.85 is an example of how to use the Say and Say Text boxes. When there is an obstacle, the last output of the Sonar box produces a string giving the location of the obstacle. In order to use the Say box, the Dispatcher box has to be used to determine whether it is the left or the right, and two Say boxes have to be used to set the Text for each case. With the Say Text box, however, NAO can directly read the string received from the Sonar box.

F - SpeechReco

Image 2.86 - SpeechReco box and the Parameters screen

The SpeechReco box determines whether or not the voice acquired from the microphone matches the set
word (Image 2.86). The first output is activated when speech recognition starts. The second output sends the
matching word when speech recognition is successful. The third output is activated when there is no matching
word. The following shows the parameters of the SpeechReco box:

- Word list: Words used for speech recognition; semicolon is used to separate the words.
- Language: Sets the language of the voice. Currently Chinese, English, French, German,
Italian, Japanese, Korean, Portuguese and Spanish are supported.
- Threshold: Numerical value that represents the precision of the speech recognition;
if the recognition of the word is lower than the Threshold value, it is assumed that the word
was not recognized. Range is from 0 to 1, and the default value is 0.4.
- Visual expression: Determines whether or not LEDs are used during speech recognition.
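A script-level sketch of the same recognition flow, using the NAOqi 1.x ALSpeechRecognition API (the vocabulary call and subscriber name are assumptions; the robot address is a placeholder):

from naoqi import ALProxy
import time

asr = ALProxy("ALSpeechRecognition", "nao.local", 9559)  # placeholder address
memory = ALProxy("ALMemory", "nao.local", 9559)

asr.setLanguage("English")
asr.setWordListAsVocabulary(["yes", "no", "hello"])  # the box's Word list
asr.subscribe("MySpeechReco")     # start the recognition engine
time.sleep(5)                     # listen for a few seconds
# "WordRecognized" holds (word, confidence) pairs; the box compares the
# confidence against its Threshold parameter.
print memory.getData("WordRecognized")
asr.unsubscribe("MySpeechReco")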

G - Pronounce

Image 2.87 - Pronounce box and the Parameters screen

The Pronounce box reads the word chosen by the user. Unlike the Say box, it reads all the registered
pronunciations when there are several different pronunciations available. The Pronounce box can be used to
improve the accuracy of the Speech Reco box. A good example is the French word “bonjour” where four different
pronunciations are provided. The Parameters include Word and Language.

H - Record

Image 2.88 - Record box and the Parameters screen

The Record box records the sound from the microphone (Image 2.88). There are four inputs;
the first starts the recording and the second stops it. The third input plays the recorded file
and the fourth stops it. The following explains the parameters.

- Duration: Time to be recorded in seconds. Range is from 0 to 60, and the default value is 5 seconds.
- Filename: File name used when saving the recorded content as an audio file. Audio files are saved in the “/home/nao” folder.
- Number of channels: Sets the number of microphones; 1 or 4 can be selected. 1 uses only the front microphone and saves in .ogg format, while 4 uses all four microphones and saves in .wav format.

I - Sound Location

Image 2.89 - Sound Location box and the Parameters screen

The Sound Location box detects nearby sounds and produces the angle of the sound source (Image 2.89). The second output (soundLocation) produces two radian values for the direction of the sound source: the first is the azimuth and the second is the elevation angle. The third output (headPosition) contains information about the head and produces six values: the first three indicate the position of the head and the remaining three indicate the head’s angles of rotation. Trust threshold is the confidence value that determines whether or not a detected sound is reported; the range is from 0.5 to 1.0, and the default value is 0.5. Enable move determines whether to move the head toward the sound source.

Image 2.90 - Internal configuration of the Sound Location box

Sound Location box acquires the information regarding the sound source from the soundLocated variable
(Image 2.90). The Head Track box moves the robot based on the information acquired from Sound Location.

2.5.9 Video Library
NAO has a high definition camera that looks straight ahead and a camera that looks downward.
NAO’s vision system is configured to obtain the necessary information from the video gathered through
the cameras. The Video Library provided by Choregraphe consists of boxes that use this vision system.

A - Select Cam

Image 2.91 - SelectCam box

Select Cam box chooses the camera to activate (Image 2.91). The first input activates the front-facing
camera, and the second input activates the downward-facing camera.

B - Face Detection

Image 2.92 - Internal configuration of the Face Detection box

The Face Detection box uses the cameras to detect faces and produces the number of people present (Image 2.92). The second output produces the number of detected faces, and the third output is activated when no faces are detected.

There is a detection box inside that reads the FaceDetected variable from the memory; the number of faces is calculated from the size of the FaceDetected structure.
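A script sketch of the same idea: start the face extractor and read the FaceDetected variable from ALMemory. Its exact structure is documented in the ALFaceDetection reference; the robot address and subscriber name are placeholders.

from naoqi import ALProxy
import time

faces = ALProxy("ALFaceDetection", "nao.local", 9559)  # placeholder address
memory = ALProxy("ALMemory", "nao.local", 9559)

faces.subscribe("MyFaceDetection")      # start the extractor
time.sleep(2)                           # give it time to process frames
data = memory.getData("FaceDetected")   # empty when no face is visible
if data:
    print "At least one face detected"
faces.unsubscribe("MyFaceDetection")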

C - Add/Del Faces

Image 2.93 - Internal configuration of the Add/Del Faces box

The Add/Del Faces box builds the database necessary for face recognition (Image 2.93). The first input receives a name string that identifies the face; when a string signal enters, the face and name are added. The second input initializes the database. The following explains how faces are added to this database.

When a string signal enters the first input, there is a five-second delay because of the Wait box. During these five seconds, the face is detected and added to the database. The facial data is added when the LearnForgetFace box receives it from the Delay Msg box. The eye LEDs turn green if the facial data is properly added to the database; if it fails, the LEDs turn red.

D - Face Reco

Image 2.94 - Internal configuration of the Face Reco box

The Face Reco box uses the cameras and the faces in the database to recognize faces (Image 2.94); it identifies who the detected faces belong to, if any. When face recognition is successful, the second output produces the name of the face. When several faces are recognized at once, the names are output in order.

E - NAOMark

Image 2.95 - Internal configuration of the NAOMark box

The NAOMark box recognizes predefined marks (Image 2.95). If a mark is recognized, its number is produced on the second output; if it is not recognized, the third output is activated.

Choregraphe provides 10 marks; the mark images and numbers can be found under “media/NAOmark.pdf”
in the installation CD (Image 2.96).

Image 2.96 - Predefined mark images and numbers

F - Vision Recognition

Image 2.97 - Vision Recognition box and the Parameters screen

Vision Recognition box compares the images acquired from the cameras with the images saved
in the database to determine the existence of an object (Image 2.97). This database is different from the one
used by the Face Recognition box. The database used in Vision Recognition can be generated by using the
video monitor we previously introduced. The second output produces the object’s name if there is an object
in the image. The third output is activated if the object does not exist.

■ Example 2.8 (Example File: Ex2.8_vision reco.crg)

Image 2.98 - Example of how to use the Vision Reco box

Image 2.98 shows an example of how the Video monitor is used for database generation, how the Vision Reco box is used for detecting a cell phone, and how to determine whether the cell phone is open or closed. The following explains how the Video monitor is used to generate this database.

1. Run the Video monitor
2. Standby for studying
3. Acquire the image
4. Designate the area to be studied
5. Set the area to be studied
6. Register the studied image

Image 2.99 - Using Video Monitor for the learning process

Image 2.99 shows how the Video monitor is used to extract and register the object within an image. First, Select Camera should be used to choose the downward-facing camera.

1. The image acquired from NAO’s camera is shown when you click the Video monitor’s playback button.

2. A five-second delay occurs when the Video monitor’s study button is pressed. This time is for acquiring a stable image.

3. When the delay is over, the stopped screen appears.

4. Click the mouse to select the outline of the object you wish to study. Here, it is advisable to select the entire object; if you select too narrow an area, a message will appear informing you that there is not enough information.

5. The study area is set when the object’s outline becomes a closed curve.

6. There are three categories you can register to: Book, Object, and Location. It has been registered to Object in this example. The registration is saved in the Choregraphe database; when the send button is clicked, Choregraphe’s database is sent to NAO.

Image 2.100 - Results from the example (left) and an image of object detection using Monitor (right)

Image 2.100 shows the result of running Example 2.8. You will see that [“closed,” “cellphone”] is output on the second output of the Vision Reco box when object detection is successful.

The Monitor program introduced in Chapter 1 can verify the result of NAO’s image processing. The right side of Image 2.100 shows how Monitor was used to detect an object. You can tell that object detection was successful because “closed,” “cellphone” is displayed on the cell phone.

2.5.10 Tracker Library
In order for NAO to follow a specific object, it needs information about the object, recognition, and an accurate gait. Choregraphe provides functions for using the camera to track a red ball and faces.

A - WB Tracker

Image 2.101 - WB Tracker box and the Parameters screen

The WB (Whole Body) Tracker box tracks an object while maintaining the initial pose (Image 2.101). If tracking is successful, the second output is activated. If tracking is unsuccessful (if the object is not in the image), the third output is activated. Target choice refers to the object you wish to track; only human faces or red balls with a radius greater than 6 cm can be tracked. Time before lost is the time allowed for finding the object; in other words, if the object is not found within this time, the tracking is deemed unsuccessful. It is measured in seconds and the range is from 0.0 to 5.0, with 1.0 as the default value.

B - Walk Tracker

Image 2.102 - Walk Tracker box and the Parameters screen

The Walk Tracker box makes NAO walk after the tracked object (Image 2.102). The outputs are the same as the WB Tracker, and Target choice and Time before lost are also identical. Threshold for walk forward/backward sets how far the tracked object may be from NAO; it is measured in meters and the range is from 0.0 to 1.0.

These two parameters determine NAO’s direction of movement. NAO will move forward if the distance
to the tracked object is greater than the Threshold for walk forward value, and it will move backward
if the value is less than the Threshold for walk backward. If the value is in between the two threshold values,
NAO will remain in its place.

Image 2.103 - Internal configuration of Walk Tracker

The Walk Tracker box has a Tracker box and a WalkToTarget box inside (Image 2.103). The Tracker box recognizes the object and calculates the direction. The WalkToTarget box uses the angle and direction information to move NAO.

2.5.11 Communication Library


Communication library sends and receives e-mail. This function may or may not be available depending
on the mail server being used, and you might have to set this within the mail server.

A - Send E-mail

Image 2.104 - Send E-mail box and the Parameters screen

Send E-Mail box is used to send e-mail (Image 2.104). The output is activated if the e-mail is sent successfully.
The following shows the parameters:

- From/To: Enter the e-mail address of the sender (recipient).
- Subject: Enter the e-mail title.
- Contents: Enter the e-mail content.
- Attachment: Enter the file path of the attachment.
- Password: Enter the sender’s e-mail server password.
- SMTP address: Enter the SMTP address of the mail server.
- SMTP port: Enter the SMTP port number.
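Under the hood this amounts to ordinary SMTP; a rough equivalent using Python's standard smtplib is sketched below, with every address and credential a placeholder.

import smtplib
from email.mime.text import MIMEText

msg = MIMEText("Hello from NAO")          # Contents
msg["Subject"] = "Test"                   # Subject
msg["From"] = "nao@example.com"           # From (placeholder)
msg["To"] = "user@example.com"            # To (placeholder)

server = smtplib.SMTP("smtp.example.com", 587)  # SMTP address and port
server.starttls()                               # if the server requires TLS
server.login("nao@example.com", "password")     # sender's password
server.sendmail(msg["From"], [msg["To"]], msg.as_string())
server.quit()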

Image 2.105 - Sent mail

Image 2.105 shows the e-mail sent by NAO, as captured on the web mail page.

B - Fetch E-mail

Image 2.106 - Fetch E-mail box and the Parameters screen

Fetch E-mail box is used to receive e-mail (Image 2.106). These e-mails are saved in various formats.
The most common are text (.txt), webpage (.html), image (.jpg), and audio (.wav) files.
The parameters are:

- POP address: Enter the POP address of the incoming mail server.
- E-mail address: Enter the e-mail address of the recipient.
- Password: Enter the e-mail account password.
- SSL port: Enter the SSL port number.

Image 2.107 - Output message after receiving e-mail

If you place the mouse on the output after an e-mail is received, a message will appear (Image 2.107). Information regarding the e-mail is at the bottom. The e-mail is saved in “/var/volatile/tmp/.” If you use an FTP program like WinSCP, you can read the e-mails saved on NAO from your PC (Image 2.108).

Image 2.108 - Where e-mail is saved

3 Python
LEARNING
Chapter 3 will have a short introduction
to Choregraphe scripts and Python for NAOqi.
There is a basic description of Python syntax
and a discussion about creating and editing
Choregraphe script boxes.

This chapter can be skimmed if you are already familiar with Python.

Contents

3.1 Before Getting Started 94
3.2 Overview 95
3.2.1 Determining the Dynamic Data Type 95
3.2.2 Platform-Independent Language 96
3.2.3 Simple and Easy Syntax 96
3.2.4 Built-in Data Structures like Strings, Lists, Tuple, and Dictionary 96
3.2.5 Automatic Memory Management 97
3.2.6 Diverse Libraries 97
3.2.7 Scalability 97
3.3 Data Types and Operators 98
3.3.1 Variable Name 98
3.3.2 Operator, Numeric and String Representation 98
3.3.3 Lists, Tuple, Dictionary 101
3.4 Control Statements 105
3.4.1 The “if” Statement 105
3.4.2 The “while” Statement 106
3.4.3 The “for” Statement 107
3.4.4 Range Method 107
3.5 Functions 109
3.5.1 Definition 109
3.5.2 Return Value 110
3.5.3 Parameter 110
3.5.4 Pass 111
3.6 Class 112
3.6.1 Declaration 112
3.6.2 Relationship between Class and Instance 113
3.6.3 Constructor and Destructor 114
3.6.4 Static Method 115
3.6.5 Operator Overloading 115
3.6.6 Inheritance 117
3.7 Module 120
3.7.1 Module Usage 120
3.7.2 Module Creation 121
3.8 Comprehensive Practice Through Choregraphe Script Modification 124
3.8.1 Random Eyes Box Script 124
3.8.2 Using Python to Create New Choregraphe Boxes 126
3.9 References 133
3.1 Before Getting Started

Most functions provided by the NAO humanoid robot can be used through Choregraphe. Diverse tasks like turning the LEDs on, repeating set movements, and making sounds are supported by Choregraphe boxes, which can be used to perform various kinds of robotic tasks. However, it is difficult to execute tasks that are not supported by a box, and it is extremely tough to use boxes alone to construct complicated algorithms like image processing.

This is why you must be able to edit the parameters and algorithms of existing boxes and know how to create new ones. This can be done using C/C++ and Python; the latter is discussed in this chapter. As an object-oriented, interpreted language, Python has the advantage of letting you quickly view the results of the code being tested.

Image 3.1 - Using Python to set the LogoDetect box parameters

NAO’s Choregraphe environment uses Python to edit various things like the parameter settings of the box
and default flow (Image 3.1). You will be able to use Python in the future to directly control NAO’s hardware
through linking NAOqi or DCM, thereby making it possible to make desired changes to existing functions. If
you know how to work with Python, you can take advantage of NAO’s advanced functions.

This chapter will not provide a detailed explanation of Python. This chapter will introduce Python’s basic
functions and explain some of the functions needed to operate NAO. The main focus will be on becoming
familiar with the functions shown in the Choregraphe box examples.

3.2 Overview

Python is an object-oriented, interpreted language.

It is a simple language in which code can be quickly tested and verified, unlike compiled languages where you have to compile, execute, and debug. This greatly reduces the time spent on testing. Also, since data types are determined dynamically, you can easily write code that does not depend on specific types.

The following sections summarize Python’s most powerful features.

3.2.1 Determining the Dynamic Data Type

Image 3.2 - Determining Python’s dynamic data type (ex01.py)

Image 3.2 above shows what happens when ‘hello’ and 1234 are each assigned to the variable ‘a’: the data type changes automatically.

If you were using C, a declaration like ‘int a’ would be necessary to specify the variable’s type up front, and assigning “hello” to a variable declared as ‘int a’ would generate an error.
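For reference, the interpreter session in Image 3.2 runs along the following lines:

>>> a = 'hello'
>>> type(a)
<type 'str'>
>>> a = 1234        # the same variable now holds an integer
>>> type(a)
<type 'int'>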

The most typical advantage of dynamic typing is that it enables generalized code.

Suppose you are creating a function called ‘add’ that adds two variables. Assuming that integer and floating-point values may be passed in, function overloading in C++ requires code like the following:

int add(int a, int b)
{
    return a + b;
}

float add(int a, float b)
{
    return (float)a + b;
}

float add(float a, int b)
{
    return a + (float)b;
}

float add(float a, float b)
{
    return a + b;
}

However, in Python, the short definition below will suffice:

def add(a, b):
    return a + b

Generalized code can be created thanks to dynamic typing, which increases productivity compared to C and C++.

3.2.2 Platform-Independent Language

Python runs on most operating systems including Linux, Unix, Windows, and Mac OS. Since it is platform-independent and compiles to byte code like Java, programs can quickly be run on other operating systems.

3.2.3 Simple and Easy Syntax

Unlike other languages, Python does not separate code blocks by using { } or ‘begin ... end.’ Python uses only indentation to separate blocks, which imposes consistency on programmers and improves program readability and comprehensibility.

3.2.4 Built-in Data Structures like Strings, Lists, Tuple, and Dictionary

The default data structures provided by Python include strings, lists, tuples, and dictionaries. A list can obtain the element at a given position using the [ ] operator, just like a conventional array. However, unlike an array, other data can be inserted in the middle, and the elements can have completely different types (lists, strings, etc.). For example, the list shown below contains other lists along with strings and numbers (Image 3.3).

Image 3.3 - Example of Python’s list (ex02.py)
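A session along the lines of ex02.py (the exact values in the figure may differ):

>>> mixed = ['NAO', 25, [1.0, 2.0]]   # strings, numbers, and a nested list
>>> mixed[2]                          # [ ] returns the element at that index
[1.0, 2.0]
>>> mixed.insert(1, 'new')            # data can be inserted in the middle
>>> mixed
['NAO', 'new', 25, [1.0, 2.0]]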

3.2.5 Automatic Memory Management
Since Python uses garbage collection like Java, the user does not have to worry about dynamic memory allocation and deallocation. If necessary, Python automatically allocates memory and automatically frees it when it is no longer used. It can also grow or shrink the amount of memory as needed.

3.2.6 Diverse Libraries

Python’s standard library is very extensive and covers many different areas including the Internet, mathematics, and string processing, as well as sending and receiving e-mail. Beyond the standard library, you can find and use many others from the Internet.

3.2.7 Scalability
Python is very compatible with other languages. Other languages and Python can call each other’s modules.
Even for codes without a source with only a library interface, you can use them by employing a simple
interface function.

Image 3.4 - IDLE, Python’s simple development tool


For NAO, Python is installed when Choregraphe is installed, and most of the examples and source code used
here can be tested using IDLE, a simple Python development environment (Image 3.4). You can also edit
and test some Choregraphe modules with it.

3.3 Data Types and Operators

3.3.1 Variable Name


Python variable names can use letters, numbers, and the underscore ( _ ). A number cannot be used as the
first character, and variable names are case-sensitive. Examples of valid variable names include hello, abcd,
NAO_robot, temp_, and number1. Examples of names that cannot be used include hello! and 1number.
The reserved words shown in Graph 3.1 below also cannot be used as variable names.

Graph 3.1 - Reserved words that cannot be used as variable names

and del from not while

as elif global or with

assert else if pass yield

break except import print

class exec in raise

continue finally is return

def for lambda try

An error like the one shown in Image 3.5 will occur if one of the reserved words is used as a variable name.

Image 3.5 - Error that occurs when using a reserved word as a variable name

3.3.2 Operator, Numeric, and String Representation


Python's data types largely comprise numbers and strings; the numeric types include int, long, float,
and complex. Int refers to an integer; if you enter an integer as shown in Image 3.6 below, it is recognized
as a decimal integer and the type is assigned dynamically.

Image 3.6 - Integer variable (ex03.py)

To enter numbers in other bases, prefix the value with '0b' for binary, '0o' for octal, or '0x' for
hexadecimal. The bin, oct, and hex functions can be used to convert a value to the corresponding base
representation. Image 3.7 is an example of this.

Image 3.7 - Numeric data entry method for other bases (ex04.py)
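
A sketch of the conversions described above (Python 2 session; note that Python 2 prints octal values
in the older '012' style):

>>> 0b1010, 0o12, 0xA      # binary, octal, and hexadecimal literals
(10, 10, 10)
>>> bin(10)
'0b1010'
>>> oct(10)
'012'
>>> hex(10)
'0xa'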

In Python, the float type handles real values. Real numbers can be entered directly, like 3.14 and 2.71,
or in exponent form, like 314e-2 and 271e-2 (Image 3.8). Image 3.8 also shows a small error below the decimal
point in the result; this is the usual floating-point rounding error of real-number representation, so it
will not be explained in detail here.

Image 3.8 - Two methods for real number input (ex05.py)

Python also supports complex numbers. The imaginary part is indicated with 'j,' and you can enter
and process complex numbers as shown in Image 3.9 below.

Image 3.9 - How to use complex number representation (ex06.py)

Operators are provided for numeric values: basic arithmetic operations (+, -, *, /), remainder (%),
power (**), and integer division (//). The power operator has higher priority than the basic arithmetic
operations, and integer division keeps only the integer part of the quotient (Image 3.10).

Image 3.10 - Power operator and integer division (ex07.py)

Python strings can be written using either single quotation marks (') or double quotation marks (").
They can be used as shown in Image 3.11 below, but the two kinds cannot be mixed within one string. There
are also triple quotation marks for writing long, multi-line strings, and special characters like newline
and tab are supported as well (Graph 3.2).

Image 3.11 - String and multi-string input methods (ex08.py)

Graph 3.2 - Special characters supported by Python

\n    Newline character (line break)

\t    Tab

\r    Carriage return

\0    NULL character

\\    The '\' character itself

\'    The ' character

\"    The " character

The addition operator (+) is used to merge strings, and the multiplication operator (*)
is used to repeat them. The [ ] operator can be used to access each character of a string,
and indexing starts from 0.

A slice operator is provided to extract several characters at once, used in [start:end] form. The [ ]
operator can be used to read elements, but the elements cannot be modified. Negative indices are also
supported; if the positive indices run from 0 to 4, the same positions can be addressed as -5 to -1
(Image 3.12).

Image 3.12 - String operators, and the error that occurs when an element is changed (ex09.py)
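
A short sketch of these string operations (the sample string is an assumption):

>>> s = 'hello'
>>> s + ' NAO'       # + merges strings
'hello NAO'
>>> s * 2            # * repeats strings
'hellohello'
>>> s[0]             # indexing starts at 0
'h'
>>> s[1:3]           # slice in [start:end] form
'el'
>>> s[-5], s[-1]     # negative indices -5..-1 mirror 0..4
('h', 'o')
>>> s[0] = 'H'       # elements cannot be modified
Traceback (most recent call last):
  ...
TypeError: 'str' object does not support item assignment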

3.3.3 Lists, Tuple, Dictionary


a - Lists

Lists have a data structure similar to an array, but with the advantage of flexible insertion and deletion.
Python does not provide arrays; instead it supports lists as a basic data structure, and the data stored
within one list may be of different types. Lists are created using the [ ] operator, and the main methods
include append and insert for insertion, remove for deletion, and index, count, and sort for searching
and sorting.

Image 3.13 below shows how to generate a list. It forms one list from strings corresponding to each day
of the week, and each element can be accessed using the [ ] operator.

Image 3.13 - List generation and index access (ex10.py)

The following adds an element to the list using the append method. If the append method is used,
you will see that the content has been added as the very last element of the list (Image 3.14).

Image 3.14 - List insertion (using the append method, ex10.py)

The following adds an element to the list using the insert method. The insert method first selects where
the element will be inserted; in Image 3.15 below you can verify that 'Sunday' has been inserted at
position 0.

Image 3.15 - Using insert for list insertion (ex10.py)

The remove method removes the corresponding element, as can be verified in Image 3.16 below.

Image 3.16 - List removal (ex10.py)

The index method returns the position of the corresponding element, and the count method tells you how
many times the element occurs in the list. Also, strings are sorted in ascending alphabetical order by
the sort method, and the order is reversed (descending) by the reverse method (Image 3.17).

Image 3.17 - Index, Count, Sort, and Reverse methods (ex10.py)
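
A combined sketch of the list methods discussed above (the day names follow the description of ex10.py;
the exact values are assumptions):

>>> days = ['Monday', 'Tuesday', 'Wednesday']
>>> days.append('Thursday')      # added as the last element
>>> days.insert(0, 'Sunday')     # inserted at position 0
>>> days.remove('Tuesday')       # removes the matching element
>>> days
['Sunday', 'Monday', 'Wednesday', 'Thursday']
>>> days.index('Monday')         # position of the element
1
>>> days.count('Monday')         # number of occurrences
1
>>> days.sort()                  # ascending alphabetical order
>>> days
['Monday', 'Sunday', 'Thursday', 'Wednesday']
>>> days.reverse()               # reversed (descending) order
>>> days
['Wednesday', 'Thursday', 'Sunday', 'Monday']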

b - Tuple

Tuple is similar to a list, but it is a read-only data structure. While a list is generated and processed
with the [ ] operator, a tuple is generated with the ( ) operator; the [ ] operator is then used only
to read the data (Image 3.18). Since a tuple is read-only and cannot be edited, only the count and index
methods are supported. Otherwise it behaves just like a list.

Image 3.18 - Tuple generation and reading data (ex11.py)
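
A small sketch of tuple behavior (values assumed):

>>> t = ('Monday', 'Tuesday', 'Wednesday')
>>> t[0]                 # [ ] reads elements
'Monday'
>>> t.index('Tuesday')
1
>>> t.count('Monday')
1
>>> t[0] = 'Sunday'      # tuples are read-only
Traceback (most recent call last):
  ...
TypeError: 'tuple' object does not support item assignment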

c - Dictionary

Dictionary is a data structure that consists of key-value pairs. You can get a value by using its key
(Image 3.19); numeric indexing is not supported, and an error occurs if you use a key that does not exist
(Image 3.20). You can assign a new key to add a new value, and the items, keys, and values methods
are supported for retrieving the contents.

Image 3.19 - Defining and using the dictionary (ex12.py)

Image 3.20 - Error that occurs when you use a key that does not exist (ex12.py)

The items method returns all the dictionary key-value pairs as tuples, while the keys method returns
the keys and the values method returns the values (Image 3.21).

Image 3.21 - Items, Keys, and Values methods (ex12.py)
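
A sketch of the dictionary operations described above (keys and values are assumptions; the order
of the returned items is not guaranteed):

>>> robot = {'name': 'NAO', 'legs': 2}
>>> robot['name']                  # look up a value by key
'NAO'
>>> robot['maker'] = 'Aldebaran'   # assigning a new key adds a value
>>> robot.keys()
['name', 'legs', 'maker']
>>> robot.values()
['NAO', 2, 'Aldebaran']
>>> robot.items()
[('name', 'NAO'), ('legs', 2), ('maker', 'Aldebaran')]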

3.4 Control Statements

Python executes its statements one by one in sequential order. Control statements, namely conditions
and loops, change this sequential flow. Loops are used to execute the same or similar tasks several times,
while conditions determine whether or not to execute a task. The most common loops are the 'for'
and 'while' statements, and the most common condition is the 'if' statement.

3.4.1 The 'if' Statement


The 'if' statement evaluates a condition and decides whether or not to execute code according to the result.
The code shown in Image 3.22 below is a good example: it evaluates the variable score and outputs
"Good job." if the score is 90 or greater; otherwise it outputs "You might have to try harder."

Image 3.22 - Example of a control statement using the 'if' statement (ex13.py)
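
A sketch of what ex13.py likely contains (Python 2 print syntax, as used throughout the book):

score = 95
if score >= 90:
    print "Good job."
else:
    print "You might have to try harder."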

The 'if' statement is defined in the order shown below:

if (condition) : (Processing Syntax 1)
else : (Processing Syntax 2)

If the condition is true, Processing Syntax 1 is executed; otherwise, Processing Syntax 2 is executed.
Unlike other languages, Python does not wrap the processing syntax in an explicit block. This is why all
the code belonging to Processing Syntax 1 must have the same indentation; if it does not, an error may
occur, or the code may not behave the way you want (Image 3.23). Even though the indentation of the 'if'
and 'else' branches may look different from one another in Image 3.22, if you count from the first
character of each statement, you can see that they actually have the same indentation.

Image 3.23 - An error that occurs when the indentation changes

Additionally, it is possible to test several different conditions in sequence using 'elif.' The 'if'
and 'elif' statements can be combined to expand the program above to output a message for each score band,
as shown in Image 3.24. In C/C++, part of this condition would have to be expressed as
90 <= score && score <= 100, but here it can be expressed as 90 <= score <= 100.

Image 3.24 - Testing several conditions with 'if' and 'elif'

3.4.2 The ‘while’ Statement


The 'while' statement repeatedly executes the code in its internal block as long as the condition
is true. It is written in the same form as the 'if' statement, and if the condition is false to begin with,
the internal block is skipped without being executed. Image 3.25 below shows code that obtains the sum
of the numbers 1 to 5; 'number += 1' plays the same role as number = number + 1.

Image 3.25 - A code that obtains the total for 1-5 using the 'while' statement (ex15.py)
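
A sketch of the loop described above:

number = 1
total = 0
while number <= 5:
    total += number    # same as total = total + number
    number += 1        # same as number = number + 1
print total            # prints 15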

3.4.3 The ‘for’ Statement
The ‘for’ statement, like the ‘while’ statement, is a typical repeat statement; it’s used differently than C/C++.
The following shows the structure of Python’s ‘for’ statement.

for Element I in Sequential Form Object S :


Processing Syntax

Here, object S has a sequential form and has lists, strings, tuple, and dictionary.
Please refer to another book to use the iter method to create iterators.

Image 3.26 below shows an example of the 'for' statement. Each element of the number list is assigned to
the 'i' variable in order, and each element is printed sequentially through the 'print i' command.

Image 3.26 - Example of using the ‘for’ statement (ex16.py)

3.4.4 Range Method


If the 'range' function is used, you can create a list of regularly spaced numeric values.
When the 'range' function and the 'for' statement are combined, they can be used like the counting 'for'
statement of C/C++. The syntax of the 'range' function is shown below:

range(start value, end value, increment)

Image 3.27 below shows the result of using the 'range' function to generate a list of numeric values.

Image 3.27 - Example of using the 'range' function

As shown in Image 3.28 below, combining the 'range' function and the 'for' statement creates the same
1-to-5 adding program that the 'while' statement produced.

Image 3.28 - Example of using 'for' and 'range' for adding (ex18.py)
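
A sketch of the 'for' and 'range' combination (equivalent to the 'while' version above):

total = 0
for i in range(1, 6):   # produces 1, 2, 3, 4, 5 (the end value is excluded)
    total += i
print total             # prints 15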

3.5 Functions

Functions are used to wrap several statements into one unit for processing. The type, range, and print
functions we have used thus far are typical examples; they are already defined in Python to play those
roles. In this chapter we will explore how the user can define and use functions. Python functions actually
support far more complicated and diverse features than what is explained here, so we will describe only
the most basic information. For more in-depth information, please refer to a specialized book about Python.

3.5.1 Definition
Python declares functions in a slightly different manner than other existing languages. A function
declaration starts with 'def' and ends with a colon (:), and the beginning and end of the body are denoted
by indentation. Please keep this in mind, because it is different from C/C++ or BASIC, where the beginning
and end are generally marked explicitly.

Function declaration syntax is as shown below:

def Function Name ( Argument 1, Argument 2, … Argument N ) :
    Processing Syntax
    return Return Value

'def' is the keyword that declares a function, and 'Function Name' is the name used when calling the
function. 'Argument 1, Argument 2, ... Argument N' inside the parentheses records the external values
delivered to the function, and the colon (:) ends the declaration.

The processing syntax is the body that does the function's work; it can be written freely, and other
functions can be called from it. 'return' returns the resulting value of the processed function; a function
can also end without 'return,' in which case the 'None' value is returned.

Image 3.29 below shows a function that adds two arguments (a and b) and returns the value.

Image 3.29 - Example of how to declare and use the 'add' function (ex19.py)

The result is returned after arguments a and b are added. Since Python has no type declarations,
the argument types are determined at the moment of delivery. Hence any types that support the + operation
can be passed, and addition is possible for both numbers and strings.

3.5.2 Return Value
As shown in Image 3.29 above, values are returned with the 'return' statement. When a 'return' statement
is encountered while executing a function, the function closes and control returns to where it was
called from.

The function will also end if 'return' is not used, or if 'return' is written by itself; in these cases
the object 'None' is delivered as the return value (Image 3.30).

Image 3.30 - When nothing is written for 'return' (ex20.py)

'return' can only return one object. However, a function can effectively return several objects by packing
multiple values into a tuple.

The next example uses a 'calc' function to return the results of all the arithmetic operations at once
(Image 3.31).

Image 3.31 - How to return multiple objects (ex21.py)
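
A sketch of a 'calc' function along these lines (the exact operations in ex21.py are assumptions):

def calc(a, b):
    return a + b, a - b, a * b, a / b   # packed into one tuple

print calc(8, 2)    # (10, 6, 16, 4)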

3.5.3 Parameter
Python passes parameters to functions by reference. This is a bit different from C/C++; in Python,
what matters is whether or not the parameter object itself can be modified.

For plain numeric values, even if the data is modified within the function, the change is not reflected
outside the function. However, for parameters that are lists, if you change the data within the function,
you will see that the modified content is reflected outside the function (Image 3.32).

Image 3.32 - Difference in processing functions between modifiable and non-modifiable types (ex22.py)
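
A sketch that makes the difference visible (variable names assumed):

def modify(n, lst):
    n = n + 1          # rebinds only the local name
    lst.append(99)     # mutates the caller's list in place

number = 10
data = [1, 2, 3]
modify(number, data)
print number    # 10 - the number is unchanged outside the function
print data      # [1, 2, 3, 99] - the list change is visible outside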

3.5.4 Pass
The 'pass' statement is used to write code that performs no action. Image 3.33 below shows code that
does nothing; executing it produces no results.

Image 3.33 - Result from using 'pass' (ex23.py)

'pass' is used fairly often when writing code. For example, when creating a temporary function, module,
or class during a project, you can assign a name without creating any content for it yet.

This is where 'pass' is used. As explained for Image 3.30 above, you can use 'return' instead of 'pass'
in a function, but you have to use 'pass' for a class because a class body has no return value.

3.6 Class

Object-oriented programming is possible in Python through classes. As in C++ or Java, functionality can be
implemented as classes of equal standing, and these classes are used for abstract programming. This chapter
will explore class declaration and the methods needed for object-oriented implementation. However, we will
not go into details about the concepts behind object orientation: inheritance, polymorphism,
and information hiding.

3.6.1 Declaration
A class declaration defines data and methods together. You can define a simple class without any content
by using 'pass.' In the code below, a 'classobj' object is created at the moment the class is declared
(they happen at the same time). You can then create an instance by calling the constructor and binding
the result to a name. We will not go into details about constructors here; they are the method called
first when an instance is created.

Image 3.34 - Class declaration and creating an instance (ex24.py)

Image 3.35 demonstrates how general classes are used. 'import math' loads the module containing
the 'sqrt' function, which obtains the square root; modules are explained in detail in section 3.7.
In general, a module is a collection of functions with a specific purpose. The Point class declaration
is indicated by 'class Point:', and it initializes and assigns the internal variables x and y.
The 'def distance(self)' method is declared here, and 'self' plays the same role as 'this' in C++ and Java;
every class method in Python must have 'self' as its first argument by default.

'self' points to the instance object itself and accesses the x and y variables within the class through
'self.x' and 'self.y.' The statement 'p1 = Point()' creates an instance of the 'Point' class, and internal
variables and methods are accessed using the dot operator, as in 'p1.x' and 'p1.y.'

Image 3.35 - General class (defining variables and methods and instance call) (ex25.py)
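
The exact ex25.py listing is not reproduced here; a sketch consistent with the description
(the x and y values are assumptions):

import math

class Point:
    x = 3.0             # class variables initialized in the declaration
    y = 4.0
    def distance(self):
        # 'self' accesses the variables of this instance
        return math.sqrt(self.x * self.x + self.y * self.y)

p1 = Point()            # create an instance of Point
print p1.x, p1.y        # 3.0 4.0
print p1.distance()     # 5.0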

3.6.2 Relationship Between Class and Instance
A class instance is the memory space used for storing the actual class content.
In Image 3.36 below, the class structure is declared by 'class ...,' and 'x1 = secondClass()'
and 'x2 = secondClass()' each create an instance of the class.

Both 'x1.name' and 'x2.name' store the 'hi' string because no changes were made to the initial data,
but if 'x2.name = "hello"' is used to change the data, you will see that the content of 'x2.name' changes.
Even though it belongs to the same class, 'x1.name' keeps the value 'hi.' This is because 'x1' and 'x2'
each have their own independent instance memory.

Image 3.36 - Independent variable space of class instance (ex26.py)

Another unique feature of Python is that you can dynamically and independently add instance and class
variables. Let's say that 'x1' and 'x2' are instances of the same class and you add a new variable named
'age' to 'x1.' This variable is now unique to 'x1' and does not affect 'x2' in any way. Although this is
one of Python's distinctive features, it is best not to use it: if you don't maintain consistency within
a class, it becomes very difficult to figure out where an error occurred as the program grows.

Image 3.37 - Feature that adds a variable unique to an instance (ex27.py)

The 'isinstance' method can be used to examine the relationship between a class and an instance.

An instance is placed in the first argument, and the name of a class in the second argument.
You can use this to determine whether the instance was created from the corresponding class.

Image 3.38 - The 'isinstance' method used to examine the relationship between class and instance (ex28.py)

3.6.3 Constructor and Destructor

As explained in 3.6.1, the constructor method is used for initialization when creating a class instance,
and it is automatically called when an instance object is created. Conversely, the destructor
is called when an instance is destroyed.

In Python, the constructor is defined as '_ _ init _ _()' and the destructor as '_ _ del _ _().'
When '_ _' is attached to the front and back of a variable or function name in Python, it means the name
is predefined for a special purpose. With the constructor method, you can deliver the member variables
to initialize when creating an instance, much like delivering arguments when a function is called.

Constructor and destructor are methods inside a class, so their first argument must point to the instance
itself. The argument conventionally written as 'self' can have a different name, but it is recommended
to keep it as 'self' since that is how it is generally used.

Image 3.39 - Simple example using the constructor and destructor (ex29.py)
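
A minimal sketch matching the behavior described below ('conClass' is the name used in the text):

class conClass:
    def __init__(self):            # constructor
        print "Constructor Called"
    def __del__(self):             # destructor
        print "Destructor Called"

c1 = conClass()    # prints "Constructor Called"
c1 = 0             # the instance loses its last reference,
                   # so "Destructor Called" is printed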

Image 3.39 shows a simple example of both constructor and destructor. When the constructor is called in
this example, the "Constructor Called" message is produced, and when the destructor is called,
"Destructor Called" is produced.

The constructor is called when you create an instance with 'c1 = conClass(),' and you can verify that
the destructor is called when 'c1 = 0' rebinds the name and releases the object. The destructor is also
called when you erase the object from memory with the 'del(c1)' command, through the '_ _ del _ _()' method.

Constructors and destructors are generally used where an object's initialization process must be defined;
for example, when exchanging emails using the NAO robot, the constructor is used to deliver an email
address or IP address to the object.

3.6.4 Static Method


A static method can be called from the outside without creating an instance. The code is shown in
Image 3.40. First, the method is defined with the 'def print_hello():' syntax inside 'staticClass.'
A general method has 'self' as its first parameter, pointing to its own object, but a static method does
not need this first argument.

Then 'staticmethod' is used to explicitly declare, via 'static_print,' that 'print_hello' is a static
method. A static method is called in 'class name.static method name' form, without creating an instance.

Image 3.40 - Declare and call a static method (ex30.py)
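
A sketch of such a static method declaration (Python 2 class syntax):

class staticClass:
    def print_hello():                           # no 'self' first argument
        print "hello"
    static_print = staticmethod(print_hello)     # declare it static

staticClass.static_print()    # called without creating an instance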

3.6.5 Operator Overloading

Operator overloading refers to assigning new roles to the operators (+, -, *, /) for the classes you
create. In a class for numbers, + can be the operator for addition, while in a class for strings, + can be
defined as joining text. This is operator overloading, and Python provides predefined method names
for it (Graph 3.3).

Graph 3.3 - Predefined methods

Method Name                              Operator    Example

_ _ add _ _ (self, other)                +           A + B

_ _ sub _ _ (self, other)                -           A - B

_ _ mul _ _ (self, other)                *           A * B

_ _ div _ _ (self, other)                /           A / B

_ _ floordiv _ _ (self, other)           //          A // B

_ _ mod _ _ (self, other)                %           A % B

_ _ divmod _ _ (self, other)             divmod()    divmod(A, B)

_ _ pow _ _ (self, other[, modulo])      **          A ** B

_ _ lshift _ _ (self, other)             <<          A << B

_ _ rshift _ _ (self, other)             >>          A >> B

_ _ and _ _ (self, other)                &           A & B

_ _ xor _ _ (self, other)                ^           A ^ B

_ _ or _ _ (self, other)                 |           A | B

_ _ abs _ _ (self)                       abs()       abs(A)

_ _ pos _ _ (self)                       +           +A

_ _ neg _ _ (self)                       -           -A

_ _ invert _ _ (self)                    ~           ~A

A lot of other methods are also supported, but the graph above displays the most commonly used methods.
To see more, please refer to the Python reference.

The most typical example of operator overloading is the addition of strings (Image 3.41).
A string-processing class named TString is defined so that the + operator joins the strings.

Image 3.41 - String addition through operator overloading (ex31.py)
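
The exact ex31.py listing is not reproduced here; one way TString might overload + (a sketch):

class TString:
    def __init__(self, text):
        self.text = text
    def __add__(self, other):      # invoked for the + operator
        return TString(self.text + other.text)

a = TString('NAO ')
b = TString('robot')
c = a + b                          # calls a.__add__(b)
print c.text                       # NAO robot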

3.6.6 Inheritance
Inheritance is one of the most important techniques in object orientation. Inheritance refers to passing
all the properties of a parent class down to a child class. Using class inheritance prevents writing
identical code in each class, and you can increase the consistency of the code by letting child classes
inherit the properties they have in common from the parent class. Easy maintenance is another advantage.
Additionally, if you call a child class through the interface defined by the parent class, you can access
the specialized functions of each class through that common interface.

Image 3.42 below is an example of an inheritance relationship. Robots are first divided by the number
of legs they have, largely into 'quadruped walking' and 'biped walking.' Aibo and Bioloid are quadruped
robots, while Hubo and NAO are biped robots. Although Aibo and Bioloid can be referred to as quadruped
robots, they are also simply robots.

This is because classes in a sub-layer subsume the characteristics of those in the upper layer.
More specific differences are recorded in the sub-layer, while common properties, rather than specific
differences, belong to the upper layer as variables. Object-oriented programming implements the
parent-child inheritance relationship by using these hierarchical elements, starting from the most
abstract part.

[Diagram: Robot is divided into Quadruped Walking (Aibo, Bioloid) and Biped Walking (Hubo, NAO)]

Image 3.42 - Example of inheritance relationship

Image 3.43 shows how the inheritance relationship can be represented in code. The robot has 'move'
and 'operating' methods, and each biped walking robot in the sub-layer has a 'leg' variable for its legs.

Here, the inheritance relationship is implemented by defining 'class ChildClass(ParentClass):'.
When inheritance is used this way, instances created from Biped_Robot come with both the 'move'
and 'operating' methods of the parent class. Although the parent's variables aren't automatically
initialized, you can resolve this by using Biped_Robot. _ _ init _ _(self, leg) to call the constructor
of Biped_Robot.

Image 3.43 - Inheritance relationship code (ex32.py)
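
The full ex32.py listing is not reproduced here; a sketch consistent with the description
(the class contents are partly assumed):

class Robot:
    def move(self):
        print "moving"
    def operating(self):
        print "operating"

class Biped_Robot(Robot):          # Biped_Robot inherits from Robot
    def __init__(self, leg):
        self.leg = leg             # number of legs

class NAO(Biped_Robot):
    def __init__(self, leg):
        self.company = "Aldebaran Robotics"
        Biped_Robot.__init__(self, leg)   # call the parent constructor

nao = NAO(2)
nao.move()      # inherited from Robot: prints "moving"
print nao.leg   # 2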

The 'issubclass' method can be used to verify the relationship between a child class and a parent class
(Image 3.44). It returns 'True' only when there is a parent-child relationship; otherwise 'False'
is returned.

Image 3.44 - The 'issubclass' method for verifying the relationship between child class and parent class (ex32.py)

To add the Hubo robot instead of NAO, create the code shown in Image 3.45.

>>> class Hubo(Biped_Robot):
        def __init__(self, leg):
            self.company = "Kaist"
            self.os = "nothing, 16bit microprocessor"
            Biped_Robot.__init__(self, leg)

image 3.45 - Using inheritance to add a new class (ex32.py)

Much like NAO, the 'move' and 'operating' methods can be called on Hubo. If you would like NAO
and Hubo to have different movements, you can use a technique called method overriding.

Method overriding works much like operator overloading: you redefine parts of the parent class so that
they execute a different function in the child class.

To define new 'move' and 'operating' methods in Hubo and NAO through method overriding, use the same
syntax as for defining any other method. When method overriding and inheritance are used together,
the parent class can define the interface, and programming can then be done against the sub-classes
through it.

Image 3.46 - Example of method overriding (ex32.py)

There are other functions used in object-oriented programming, but please refer to another book for more
detailed information.

3.7 Module

A module is a collection of code with specific functions that can be reused in many different places.
We previously used 'import math' and 'math.sqrt' in 3.6.1 to give a brief introduction to modules while
explaining classes. There, 'math' is the module, and it contains the functions used for various tasks
related to mathematics. The 'sqrt' function obtains the square root, but other functions, including log,
sin, and cos, are also built in. To verify what functions exist, you can use the 'dir' function
(Image 3.47). By default, Python provides approximately 200 modules, and these can be combined to easily
create code with the functions you want.

Image 3.47 - Using the 'math' module and the 'dir' function to verify the functions within the module (ex33.py)

3.7.1 Module Usage

A module is used through 'import.' If you make a declaration of the form 'import module name,' you can use
the functions that exist in the corresponding module. It plays the same role as #include in C/C++,
and after the declaration, the functions are called in 'module name.function name' form (for example,
math.sqrt).

In addition to functions, you can also use the constants that a module defines. The most typical example
of a constant is 'pi.' If 'math.pi' or 'math.e' is executed, you can use the predefined value of pi
and of the natural constant e (Image 3.48).

Image 3.48 - Math module's pi value and natural constant value (ex34.py)

In addition, there are many other useful modules, like 'random' for creating random numbers
and 'time' and 'datetime' for calculating and managing dates and times (Image 3.49).

Image 3.49 - Random module (ex35.py)
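
A small sketch of the random module (the output values vary from run to run):

>>> import random
>>> random.randint(0, 255)      # random integer between 0 and 255
137
>>> random.uniform(0.0, 3.0)    # random real number between 0.0 and 3.0
1.8421300443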

When you call a certain function of a module often, you can use just the function name rather than
'module name.function name.' You make the declaration as shown below:

from module name import function name

You can then call the random function of the random module as shown in Image 3.50 below.

Image 3.50 - Simplified module function calls (ex36.py)
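
A sketch of the simplified form (using randint as the example function):

>>> from random import randint
>>> randint(1, 6)     # callable without the 'random.' prefix
4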

3.7.2 Module Creation

Python's modules are saved inside the Lib folder located in the folder where Python was installed.
A module is generally saved as a 'module name.py' file, and the user can create the same kind of module.
As an example, let's create a module that executes simple arithmetic operations.

First, generate a calculate.py file containing the following content by using Notepad, a text editor like
Wordpad, or IDLE's file editing functions (File -> New Window in IDLE).

def add(a, b):
    return a + b

def sub(a, b):
    return a - b

def mul(a, b):
    return a * b

def div(a, b):
    return a / b

Save this inside the Lib folder where Python was installed. Then import and call the module from Python
as shown in Image 3.51.

Image 3.51 - Call for a user-created module (ex37.py)
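
A sketch of what the session in Image 3.51 amounts to:

>>> import calculate
>>> calculate.add(3, 4)
7
>>> calculate.div(10, 2)
5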

If you want to save the module under a different path instead of Python's Lib folder, you must adjust
a system environment variable. In Windows, go to Control Panel - System and Security - System,
or right-click My Computer to access Advanced System Settings (Image 3.52).

Image 3.52 - Advanced System Settings

The System Properties window appears when you click Advanced System Settings; click Environment Variables
there (Image 3.53).

If you edit the system variables and register the folder where the module is located, Python will search
that folder, in addition to the default folder, when the module is imported to verify whether the module
actually exists.

As shown in Image 3.54, click New in the Environment Variables window, set the variable name to PYTHONPATH,
and set the folder containing the module as the variable value (for example, c:\modules).

Image 3.53 - System Properties window
Image 3.54 - Setting the user module folder

For Linux, you have to add the line shown below to the shell profile (.bash_profile for the commonly used
bash shell):

export PYTHONPATH=$PYTHONPATH:(module location)

Example: export PYTHONPATH=$PYTHONPATH:/home/root/modules

3.8 Comprehensive Practice Through Choregraphe Script Modification

Choregraphe boxes are composed mostly of Python scripts, so you can edit box behavior with just a little
knowledge of Python. Also, if you operate the NAOqi framework from Python, you can program in Python
directly instead of using Choregraphe boxes; Chapter 4 will have more information on this. Here, we will
discuss how the Python scripts used by Choregraphe boxes are employed and observe some different examples.

3.8.1 Random Eyes Box Script

Image 3.55 - Random Eyes box and the script screen

The Random Eyes box continuously changes the eye colors to random colors. If you execute the box, you will
see that the LED colors of NAO's eyes keep changing. This box uses the random module provided by Python,
and the user can edit it to change the eye colors.

First, drag the Random Eyes box into Choregraphe's task window, then right-click it and select
Edit Box Script; a script editor window will appear as shown in Image 3.55.

The content is the same kind of source code we have seen thus far in IDLE, and you will see that the box
script is defined as MyClass, which inherits GeneratedClass. 'import random' is declared in order to use
the random module, and you can see the module used in 'rRandTime = random.uniform(0.0, 2.0)'
in the onInput_onStart(self) method (Image 3.56).

Image 3.56 - onInput_onStart method for Random Eyes box script

The 'onInput_onStart' method shown in Image 3.56 becomes active when the Random Eyes box is executed,
and the block of code right after 'while True:' is the code that makes the random LED color changes.

If you want the eye color to change at a fixed interval instead, change random.uniform(0.0, 3.0)
in time.sleep(random.uniform(0.0, 3.0)) to a fixed number.

The random.uniform method produces a random value with uniform distribution, and (0.0, 3.0) means
producing a real number between 0.0 and 3.0. Therefore, if you change the call to time.sleep(1.0)
and execute the program, you will see that the color of the eyes changes every second.

To make the box repeat one particular eye color, you have to edit the value
256*random.randint(0,255) + 256*256*random.randint(0,255) + random.randint(0,255) inside
ALLeds.fadeRGB("FaceLeds", 256*random.randint(0,255) + 256*256*random.randint(0,255) + random.randint(0,255),
rRandTime).

The fadeRGB method takes the RGB value packed in 256*256*R + 256*G + B form, so you can light the LEDs
with a color of your choice by supplying the corresponding value (please refer to the reference
documentation for more details).
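
Putting the two edits together, the core of the modified loop might look like the sketch below
('ALLeds' is the proxy already created by the box script; the fixed color and interval are assumptions):

while True:
    rRandTime = random.uniform(0.0, 2.0)
    # fixed red: R=255, G=0, B=0, packed as 256*256*R + 256*G + B
    ALLeds.fadeRGB("FaceLeds", 256*256*255 + 256*0 + 0, rRandTime)
    time.sleep(1.0)    # change (fade) every second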

3.8.2 Using Python to Create New Choregraphe Boxes
Until now, we have programmed in Choregraphe using the default boxes. Most functions NAO can execute are
provided as Choregraphe boxes. For functions that aren't provided, or that you would like to change,
you can edit an existing box as previously explained. However, if you edit an existing box for every new
function, you will run into problems when the task has to be reused in several different places.

Here, we will explain how Choregraphe and Python are used together to create boxes. For our example,
Python will be used to create an adder, and Choregraphe will be used to register it as a new box.

A - Registering a New Box in Choregraphe

First, choose to add a new box as shown in Image 3.57. When you right-click inside the Choregraphe task
window, 'Add a new box' will appear. As shown in Image 3.58, when you select this menu item, a dialog
will appear that lets you add a new box.

Image 3.57 - Adding a new box
Image 3.58 - 'Create a new box' window

'Name' sets the name of the box the user is creating; here, we will enter 'Adder.' 'Tooltip' is a simple
explanation of the box; we will simply enter "the two numbers received will be added."

Then, press '-' to delete the onStart, onStop, and onStopped entries that are registered as default values
for Inputs and Outputs. Image 3.59 shows how this will look. If you get a message telling you that
a default image will be used because no bitmap image is available, ignore it and move on.

Image 3.59 - Adder box setting and the Adder box in Choregraphe

Now add the Adder box we are creating to a library. If you try to add it to the default library,
you will get a message stating that it cannot be added because that library is read-only. This is why you
should create a new library titled MyLibrary; you can add it as shown in Image 3.60 below.

Image 3.60 - Creating a new library

As shown in Image 3.61, you can add the Adder box to MyLibrary.

Image 3.61 - Add a box to the library and verify

B - Add Box Variable

Although it now has an appearance, the new Adder box does not have any internal functions yet. The Adder
box should receive two external inputs, add them, and send the sum out as an output. Let's set up the
input variables to receive the two inputs. First, select the Edit Box menu as shown in Image 3.62 to edit
the Adder box. Once this is chosen, a dialog identical to the one shown in Image 3.59 will appear.

Image 3.62 - Edit Adder box
Image 3.63 - Button for adding input variables

Press the '+' button to open the window for adding input variables, as shown in Image 3.63.

As shown in Image 3.64 below, enter 'A' as the Name. Set 'Type' to 'Dynamic' and 'Nature' to 'onEvent.'
There are four kinds of 'Type' ('Bang,' 'Number,' 'String,' and 'Dynamic') and four kinds of 'Nature'
('onEvent,' 'onStart,' 'onStop,' and 'ALMemory input').
Each function is shown in Graph 3.4; 'onEvent' and 'onStart' play substantially similar roles.

Graph 3.4 - Functions and roles of input variables

Type
  Bang            Does not deliver anything other than the start signal for the input.
  Number          Only delivers numbers for the input.
  String          Only delivers text for the input.
  Dynamic         Determines the form according to the signal type.

Nature
  onEvent         Processed when a signal enters the box.
  onStart         Processed when the start signal is delivered.
  onStop          Processed when a signal occurs to stop the box.
  ALMemory input  Receives data from NAO's memory at set time intervals and processes it.

Image 3.64 - Setting the new input variable

When you have finished adding variable A, add variable B with the same parameters as variable A.
A and B are the two input variables (Image 3.65).

Image 3.65 - Edit Box window after the add is complete

Now, add variable R under Outputs in the same way. Variable R is used to output the result of adding
variables A and B. Although the output variable window looks identical to the input window above,
the 'Nature' choices differ: there are two, 'punctual' and 'onStopped.' Regardless of timing, 'punctual'
sends the output immediately, as soon as the box produces it. 'onStopped' behaves similarly, but it sends
the result only after the lower-level boxes have finished processing.

C - Editing Python Script Codes

Steps A and B above added input variables A and B and output variable R to the Adder box. The Python code
will now be edited so the box adds inputs A and B and outputs the result as variable R.

Image 3.66 - Script Editor

First, select 'Edit box script' to open the script editor.

The script editor appears as shown in Image 3.66. The structure is similar to the Random Eyes box:
there is the __init__(self) constructor plus the onLoad, onUnload, onInput_A, and onInput_B methods.
We will edit the __init__, onInput_A, and onInput_B methods and add an extra method that performs
the addition.

First, edit the code as shown below. When emitting the value of variable R in the process method
(Line 17 onward), note that, unlike A and B, R is called like a constructor; be careful with this part.

1.  class MyClass(GeneratedClass):
2.      def __init__(self):
3.          GeneratedClass.__init__(self)
4.          self.bA = False
5.          self.bB = False
6.      def onInput_A(self, p):
7.          # input A has arrived
8.          self.bA = True
9.          self.A = p
10.         if self.bA and self.bB:
11.             self.process()
12.     def onInput_B(self, p):
13.         self.bB = True
14.         self.B = p
15.         if self.bA and self.bB:
16.             self.process()
17.     def process(self):
18.         self.R(self.A + self.B)   #~ activate output of the box
19.         self.bA = False
20.         self.bB = False

To explain the variables: the class uses the instance variables A, B, bA, bB, and R.

Here, A and B are temporary storage spaces that receive data from the A and B inputs, respectively.
bA and bB are flags recording whether the A and B inputs have arrived. R is the output value.

The constructor (Lines 2-5) explicitly calls the parent class constructor and initializes the variables bA
and bB to False. bA and bB record whether the corresponding values have been properly delivered when the
A and B inputs of the Adder box receive data.

When input is delivered to A or B, onInput_A (Lines 6-11) or onInput_B (Lines 12-16) is executed.
The 'self' in onInput_A(self, p): refers to the instance itself, and 'p' is the input signal delivered
from the outside. If input is delivered to A, the onInput_A method is called, so the value delivered to A
arrives in the parameter 'p.'

The onInput_A method has a line that saves the value 'p' into the internal A variable (Line 9)
and a line, bA = True, that records that the value has been received (Line 8). Additionally, when
'if self.bA and self.bB:' finds that both values have arrived normally, the process method that adds
A and B is called. Both input methods contain this check to make up for the fact that onInput_A
and onInput_B can each be executed before the other.

This part exists in both the onInput_A and onInput_B methods because, when Choregraphe runs,
all boxes can execute in parallel and inputs/outputs can occur simultaneously.

The 'process' method (Lines 17-20) takes the A and B values, adds them, and delivers the result to output
R. It then resets the bA and bB variables to False so the box can be executed again. Here, R emits the
value by being called like a constructor instead of through assignment: when Choregraphe sends the next
output, a new instance must be created so the changed value is properly delivered.

3.9 Reference

About Python - http://www.python.org/about/


Python Reference - http://docs.python.org/reference/

4 NAOqi & DCM
LEARNING
Chapter 4 explains the NAOqi framework,
which forms the foundation of the NAO robot,
and the DCM, which is used for controlling all the devices.
It covers special characteristics such as the NAOqi
framework structure, the file structure, and the Broker,
and shows how the NAOqi framework is used
to control NAO.

It also explores how to load modules into NAO
using Linux, C++, and cross-compiling, as well
as what happens when several commands
are received in a Time Command. There is
also an introduction to the structure of the devices
controlled by the DCM and to synchronization using
the DCM's synchronization methods.

content

4.1 NAOqi Overview 136
    4.1.1 About NAOqi 136
    4.1.2 NAOqi Term Definitions 137
4.2 Structural Overview 138
    4.2.1 File Structure 140
    4.2.2 Broker 142
4.3 Using NAOqi 146
    4.3.1 Setting the Environment for Using Python 146
    4.3.2 Project Setup for NAOqi C++ Programming 149
    4.3.3 Simple Example Using NAOqi 150
    4.3.4 NAOqi Option 151
    4.3.5 Remote Option 152
    4.3.6 NAOqi in NAO 152
4.4 Cross Compiling for Loading Modules (Using C++, Linux) 153
    4.4.1 Preparing the Cross-compile Tools 153
    4.4.2 Cross Compiling Process Using CMake and Make 154
    4.4.3 NAO Robot Setup 159
    4.4.4 Module Execution 162
4.5 DCM Introduction 163
4.6 Upper Level Architecture 164
    4.6.1 Structural Overview 164
    4.6.2 Time Command 164
    4.6.3 Time Command Linear Interpolation 165
4.7 Low Level Architecture 168
    4.7.1 Device Overview 168
    4.7.2 Devices and Definitions of the Auxiliary Devices 168
    4.7.3 List of Communication Bus 170
    4.7.4 Device Type and List 170
    4.7.5 Auxiliary Device Type and List 171
4.8 Preferences Files and Sub Preference Files 172
    4.8.1 Introduction 172
    4.8.2 Structural Overview 172
4.9 DCM Bound Methods 173
    4.9.1 getTime 173
    4.9.2 Set 173
    4.9.3 createAlias 177
    4.9.4 setAlias 178
    4.9.5 Special 180
4.10 DCM Synchronization Methods 181
4.1 NAOqi Overview

4.1.1 About NAOqi


Robotics research is being conducted in many different fields, and robots are actually already being utilized
for specific applications. There are a large number of software frameworks currently available to control
these robots. These frameworks can be specialized for automation applications, or specifically designed to
facilitate software integration, with object-oriented architectures. They can be further divided into categories
depending on the chosen particular programming language, or the fact that they comply with specific
purposes like real-time processing. Some of these frameworks are available as free software, while others
are commercial products.

OROCOS (Open Robot Control Software), YARP (Yet Another Robotic Platform), and Urbi (Universal Real-Time
Behavior Interface) are some of the currently widely known frameworks. Since the release of Aibo, many
research institutes have started developing their own frameworks.

NAOqi is a framework developed by Aldebaran Robotics specifically for the NAO robot; it includes elements
generally required in robotics, like parallel processing, resource management, synchronization, and event
processing. Although NAOqi is built from general layers similar to other frameworks, these layers are
created and processed on NAO itself, and this approach is well suited to controlling the robot. NAOqi also
enables information sharing and programming through ALMemory and communication between heterogeneous
modules, such as motion, audio, and video, that serve different roles.

NAOqi is an SDK created in C++. It includes functions like simulation execution; calling Python, Urbi,
or C++ from Choregraphe; calling C++ functions from Python; and programming, simulation, and control
functions.

4.1.2 NAOqi Term Definitions
Graph 4.1 explains the general terms used in NAOqi.

Graph 4.1 - NAOqi Terms

Broker Broker is a program that receives and executes commands from specific IP addresses
and ports. NAOqi ($AL_DIR/bin) is called the "main broker." Audioout (TextToSpeech)
is an independent broker connected to NAOqi.

Module (specialized class for ALModule) Module is a class that includes functions for robot behaviors
(motion, TextToSpeech, leds, etc.). A library called from $AL_DIR/modules/lib/autoload.ini is also called
a module. When a library is called from NAOqi, the module's objects are systematically instantiated.
Modules are always linked to brokers.

Proxy Proxy is used to access the module. In order to call a method from the module, you must
create a proxy for the module.

CMake CMake creates the appropriate project for the desired OS (OSX, Linux, Win32) and IDE (Visual
Studio, Eclipse, etc). NAOqi requires CMake Version 2.6 or higher.

Remote Remote functions refer to functions executed in other executable modules.

Cross compile Compiles the module used within the robot.

Choregraphe Aldebaran tool that creates upper layer motions.

Monitor Aldebaran tool that visualizes NAO’s cameras, memory, etc.

Critical section Code that cannot be executed by two threads at the same time.

Extractor Converts NAO’s sensor values into data that can be used by NAO’s memory.

ALMemory NAO’s memory can be accessed by all modules, remote modules, remote tools, and other NAOs.

LPC Local Procedure Call.

IPC Inter-Process Communication.

IPPC Inter-Process Procedure Call.

RPC Remote Procedure Call.

Smart pointer Pointer whose memory allocation and deletion are handled automatically.

Mutex Manages the critical section.

4.2 Structural Overview

This chapter explains the theory behind the NAOqi structure. First, we will define the components
of the distribution and the role of each module, and explain some of the ways they interact with one
another. Image 4.1 below shows NAOqi's framework structure.

[Diagram: the NAOqi framework linking Choregraphe, Monitor, the Motion module, the Audio module, and others]

Image 4.1 - NAOqi Framework

The NAOqi framework works by having Choregraphe, Monitor, the Motion module, and the Audio module pass
information to each other. NAOqi is executed by having the Broker deliver information and commands.
All the elements in Image 4.1 operate together to execute a variety of movements. The following explains
the different elements that make up the NAOqi framework.

• Module: A module is both a class and a library that uses the functions and API defined in ALModule
to obtain information about, or exercise control over, each part of the robot.

• Communication: Communication uses Local Procedure Call (LPC) or Remote Procedure Call
(RPC) to connect to NAO and exchange information.

• ALMemory: ALMemory is the robot’s memory. Any module can use or read this data
and can monitor events. It can be called when an event occurs. ALMemory is an array of ALValue.

• Introspection: Introspection is the default element that monitors the robot's API functions, the amount
of memory, monitoring, and motions. The robot recognizes all the usable API functions, and unloading
a library automatically deletes its associated API functions. Functions defined in a module can be added
to the API by using BIND_METHOD, which is defined in almodule.h.

• Python interpreter: It is an interpreter used for interpreting and processing Python commands
in NAOqi.

• Python wrapper: Python wrapper allows you to use functions with the same name in both C++
and Python.

• XML: XML is a form used to save compatible data.

• Proxy: All Aldebaran modules have been modularized. Rather than directly referencing other
module files, you can request the Proxy to find the corresponding module. If the module doesn’t
exist, an exception occurs. The user can call the corresponding function or module through the
Proxy from two independent brokers, mainBroker(local call) and myBroker(remote call).

• ALValue: In order to be compatible, some NAOqi modules or methods are saved as a specific
data type in ALValue.

• Logger: ALLogger allows you to inspect software information and logs through the web, SSH, or the robot itself.

• Exception: All NAOqi errors are processed through exceptions. All user commands are encapsulated
in try-catch blocks so that exceptions can be handled.

• Thread Pool: NAOqi and the NAOqi modules are configured so there is no interference between threads.
There may be some interference in modules that use the module generator. Although user functions can be
called in parallel, you must implement them so there is no interference between the threads. The user
can use the critical section to protect such functions.

• Smart pointer: A smart pointer is a class that helps with dynamic memory management.
Even though the user does not have to use smart pointers, all methods of the framework
return values through smart pointers. The class structure is not private, so the user can create
a proxy that uses general pointers. (You must then include the corresponding directives like 'new'
and 'delete.')
4.2.1 File Structure
The NAOqi framework was built for the diverse functions and complex processing (parallel processing,
data processing, etc.) needed to run the robot, and many libraries are used. This section organizes
and explains the libraries used to configure NAOqi and the headers of each library.

• Tinyxml - This is a library used to manage XML config files. For detailed information please refer to
http://www.grinninglizard.com/tinyxml/

• Libcore - A library with basic functions like type, smart pointer, and error.

Graph 4.2 - Libcore header configuration

alerror.h Refers to alerror.h to generate ALError

alnetworkerror.h Refers to alnetworkerror.h to generate a network error

alptr.h Refers to alptr.h to use the encapsulation of Boost smart pointer

alsignal.hpp Refers to alsignal.hpp to use the encapsulation of the Boost signal

altypes.h Refers to altypes.h to use NAOqi types

• Libtools - This is a library for managing files and times.

Graph 4.3 - Libtools header configuration

alfilesystem.h Refers to alfilesystem.h to encapsulate the Boost file system

tools.h Header for conversion functions

• libfactory - This is a library for concurrent code creation for several developers; instantiates according to
the name of each function. (Please refer to the factory design pattern.)

• libsoap - This is a library for web service provision and usage. (gsoap 2.7.12)

• rttools - This is a definition header for the real-time communication management tool between devices.

• libaudio - This is a header with audio extractor definition.

• libvision - This is a library with image processing functions and image and screen definitions.

• libthread - This is a header related to the encapsulation of pthread.

Graph 4.4 - Libthread header configuration

alcriticaltrueiflocked.h Encapsulation of the pthread critical section. NAOqi imposes no interference or
constraints between threads; if necessary, client applications must manage their own multi-threading.
The Mutex does not block other threads that weren't created by the critical section.

alcriticalsection.h Encapsulation of the pthread critical section. Only one thread at a time can enter
the section.

alcriticalsectionread.h Read and write Mutex.

almutex.h Encapsulation of the pthread Mutex.

altask.h Threadpool executes the tasks. Task is created in altask and can be added
into the threadpool queue.

alcriticalsectionwrite.h Reads/writes critical section

alcriticalsectionread.h Reads/writes critical section

almonitor.h Adjusts threadpool size

• alcommon - This is a header that defines general NAOqi elements like module, proxy, and broker.

Graph 4.5 - Alcommon header configuration

albroker.h Every executable module creates at least one broker in main.cpp.
The broker waits for http requests or remote C++ requests from PC applications.

alfunctor.h Pointer management

almodule.h ALModule (module with functions) definition

alproxy.h Enables Proxy creation within the module and calls the bound method. If the method
is located within the same executable module, Proxy will choose the quickest local call.
If the method is located in another executable module, it will choose the slower remote call.

alsharedlibrary.h Manages the dynamically loaded library (a library that loads at a time other than
when the program is starting)

alsingleton.h Singleton design pattern – makes it so there is only one instance of a specified class

altaskmonitor.h Task monitor used to see whether a task is being executed, has ended, or is waiting
to be closed.

althreadpool.h Threadpool manages the thread lists.

• liblauncher - Liblauncher manages autoload.ini, which is set to start automatically when NAO starts.

Graph 4.6 - Liblauncher header configuration

alvalue.h Definition header for ALValue. ALValue is configured with common unions.

• iNAOqi - iNAOqi enables the Python wrapper and allows C++ functions to be used in Python.

4.2.2 Broker

#shell example
./bin/naoqi -b 127.0.0.1 -p 9559 #listen on ip 127.0.0.1 and port 9559

The module generator (address) creates the project for the user. It generates a library for connecting
executable files with the robot and for connecting with the main broker (it must be added inside the
autoload.ini setup file). The module generator manages connections and locations, allowing the user
to focus on implementing the desired functions. (Please refer to the reference [advanced/SDK] for more
detailed information regarding module generators.)

image 4.2 - Broker configuration

Image 4.2 above shows a simple example using a user-generated module called myModule.

MyModule is executed in a remote broker called myBroker, which communicates with the main broker
through IP 127.0.0.1:9559.

The broker is represented by a pink box.

A broker is also a launcher that takes an IP and port number as parameters; IP 0.0.0.0 can be used
to accept commands from all usable IPs.

#shell command
# IP 127.0.0.1 and port 9559 by default
./bin/naoqi -b 0.0.0.0

A web browser can be used to see the description of a broker's entire API (http://127.0.0.1:9559
in Image 4.2). In Image 4.2 above, myBroker is connected to the main broker. Use the -pip option to
connect with other brokers.

#shell command
./modules/bin/myBroker -pip 127.0.0.1 # connect to mainBroker 127.0.0.1

Modules are represented by green circles. They can be called at the ‘launcher module’ runtime
or from the setup file autoload.ini ($AL_DIR/modules/lib).

#autoload.ini sample
[core] # required modules
albase

[extra] # removable modules
launcher # module that can launch other modules
devicecommunicationmanager # interface with hardware
motion # module that manages motion
pythonbridge # embedded python interpreter

[remote] # executables run from ($AL_DIR/modules/bin)
audioout


A broker is both an executable object and a server. This server receives remote commands,
and ALProxy opens the connection for sending commands to the broker.

// C++ sample
ALProxy p = ALProxy("module name", parentIP, parentPort);
p.info("display it on remote broker parentIP and parentPort");

Modules can access the local broker through the getParentBroker function.

// C++ sample
getParentBroker()->getIP(); // Returns the parent broker's IP address.

Graphs 4.7 and 4.8 describe the module information fields and the broker methods used in NAOqi.

Graph 4.7 - ALModuleInfo: Configuration of modules and brokers

ALModuleInfo field Description

std::string name; module/broker name

int architecture; Operating system (linux/win32/macOSX)

std::string ip; broker IP address (if it is a broker)

int port; broker port (if it is a broker)

int processId; Deprecated

int modulePointer; Module address

bool isABroker; Determines whether it’s a broker(true) or module(false)

bool keepAlive; Automatically closes the broker if the parent broker closes

Graph 4.8 - ALBroker

int getModuleByName(const std::string& pModuleName, ALModuleInfo& pModInfo)
    Searches the module network for the desired module.

int getLocalModuleByName(const std::string& pModuleName, ALModuleInfo& pModInfo)
    Searches for the module in the local process.

int registerModule(ALPtr<ALModule> pModule)
    Adds a module to the local process (automatically called when the library is loaded).

int unregisterModule(const std::string& pModuleName)
    Removes the module from the local process (cannot remove it while the module is running).

int removeProxy(const std::string& pModuleName)
    Removes the module reference (automatically called when you cancel the module registration).

int getModuleList(TALModuleInfoVector& pModuleList)
    Retrieves the local module list.

int getBrokerList(TALModuleInfoVector& pBrokerList)
    Retrieves the connected local brokers.

int getGlobalModuleList(TALModuleInfoVector& pModuleList)
    Retrieves the list of all the modules (including remote modules).

int init(const std::string& pBrokerName, const std::string& pIP, int pPort, const std::string& pParentBrokerIP, int pParentBrokerPort, bool pKeepAlive = false)
    Initializes the port to receive commands from the set IP and port. Use the module generator or createBroker to create simple brokers.

int registerBroker(ALModuleInfo pBrokerToRegister)
    Adds an executable module to the current broker.

int getBrokerInfo(ALModuleInfo& pModuleInfo)
    Retrieves the current broker information.

int exploreToGetModuleByName(const std::string& pModuleName, bool pSearchUp, bool pSearchDown, const std::string& pDontLookIntoBrokerName, ALModuleInfo& pModInfo)
    Generic function for finding a module within the process tree.

int getMethodList(const std::string& pModuleName, std::vector<std::string>& pMethodListName) const
    Retrieves the API of the pModuleName module.

int getMethodHelp(const std::string& pModuleName, const std::string& pMethodName, AL::ALValue& pMethodHelp) const
    Retrieves the documentation for pModuleName and pMethodName.

int getInfo(const std::string& pModuleName, ALModuleInfo& pModuleInfo) const
    Retrieves the pModuleName information.

int shutdown()
    Stops listening to the IP and port commands, removes all local modules, and releases the current broker's registration within other processes and brokers.

bool HasParent()
    Returns 'false' if it is the main broker (NAOqi).

std::string getName()
    Retrieves the executable/broker name.

std::string getIP()
    Retrieves the broker IP.

int getPort()
    Retrieves the broker port.

ALPtr<ALProxy> getProxy(const std::string& pProxyName)
    Retrieves a pointer to pProxyName (if it is a local module) or a remote connection to pProxyName (if it is a remote module).

4.3 Using NAOqi

The following programs are needed in order to use NAOqi to create programs:

- Visual Studio: VS2005 & service pack1, VS2008


- CMake: CMake Version 2.6 or higher
- Python 2.6
- libusb (DCM link)

4.3.1 Setting the Environment for Using Python


In order to use NAOqi and Aldebaran SDK in Python, a bit of environment variable setting is needed. You can
get SDK from NAO’s software installation CD or from the Aldebaran-robotics homepage. (Version 1.8.16 SDK
File Name: aldebaran-cpp-sdk-1.8.16-win32-vc90.zip)
The setup process may differ depending on the user’s OS. The following shows an example for Windows,
but install paths for Python and SDK may differ for each user as well.

• In Windows, naoqi.exe (as well as additional user-generated codes) requires several DLL files
so a path must be set (%PATH%).

Image 4.3 - Property settings

A - Click the right mouse button on Desktop and select «Properties» from the menu (Image 4.3).

Image 4.4 - Advanced System Settings and Environment Variables

B - If you click «Advanced System Settings» on the left side of the window, the System Properties
window will appear (Image 4.4). Then, click «Environment Variables» in the System Properties window.

Image 4.5 - Environment Variable setup window and editing Path

C - If you click ‘Path’ and then ‘Edit’ from the Environment Variable setup window, Edit System Variable
window will appear as shown in Image 4.5. You must add Python and SDK paths to ‘Path’ under System
Variable. Each path is separated with a semicolon (;).

- Python: C:/Python26
- Python Script: C:/Python26/Tools/Scripts
- SDK bin folder: C:/Program Files/Aldebaran/aldebaran-cpp-sdk-1.8.16-win32-vc90/bin
- SDK lib folder: C:/Program Files/Aldebaran/aldebaran-cpp-sdk-1.8.16-win32-vc90/lib

Image 4.6 - PYTHONPATH Settings

D - To use the SDK library in Python, create PYTHONPATH as shown in Image 4.6 and register the path
as shown below:

- SDK lib folder: C:/Program Files/Aldebaran/aldebaran-cpp-sdk-1.8.16-win32-vc90/lib

All environment variables must be set within the script, which can be found in the SDK root. The user must
always use this script to execute NAOqi. (The naoqi-bin executable file does not directly support execution.) Also,
in Windows, user-generated executable files must be placed inside the bin/ directory.

4.3.2 Project Setup for NAOqi C++ Programming
CMake is necessary to create C++ projects for the NAO SDK. As shown in Graph 4.1, CMake is a program that
generates the various folders, folder installations, and library references at once. First, verify that
NAOqi's broker and module packages are active before installing the NAO SDK, and then execute the CMake
graphical user interface (Image 4.7).

Image 4.7 - Executing CMake

The following describes the setup process in CMake. (Image 4.8).

Image 4.8 - Using CMake for project setup

A - For the «Where is the source code» field, select the example folder from the NAOqi SDK install folder
(ex: «/path/to/aldebaran-sdk/modules/src/examples/helloworld»)

B - For the "Where to build the binaries" field, choose a temporary build folder under the example
sub-folder. If the folder does not exist, create a new one. (ex: «/path/to/aldebaran-sdk/modules/src/examples/
helloworld/build»)

C - Click the «Configure» button. Depending on the operating system and IDE being used, select the generator
to be used. (Select «Visual Studio 8 2005» or «Visual Studio 9 2008» for Windows and «UNIX Makefiles»
for Linux or Mac operating systems.)

D - Select "Specify toolchain file for cross-compiling" and input "/path/to/aldebaran-sdk/toolchain-pc.cmake".

E - Click the «Configure» button one more time if any configuration field turns red. The configuration
was done correctly if the background of all the fields turns gray.

F - Click the “Generate” button.

G - For Windows, a .sln file will be created inside your own build directory, and you can open it through the IDE.

H - For Linux or Mac, compile the sample project. Input «make» inside the Build directory.

I - For the SDK folder address, you can select the folder where you installed the SDK.

4.3.3 Simple Example Using NAOqi


This chapter introduces how to use NAOqi from Python.
The following code creates loggerProxy, connects it to NAO, and delivers a text string:

#ALProxy reference
from naoqi import ALProxy
#Set IP and port
IP = "nao.local"
PORT = 9559
#loggerProxy generation
loggerProxy = ALProxy("ALLogger", IP, PORT)
#Deliver command
loggerProxy.info("Python", "it works")

The next code makes the variable named "myValueName", with "0" as its value, shared with other modules.
insertData is a function that records a data name and value in ALMemory. The reference (NAOqi API) explains
the functions that can be used in NAOqi.

from naoqi import ALProxy
memProxy = ALProxy("ALMemory", robot_IP, 9559) # robot_IP = "192.168.123.150"; IP assigned to NAO
memProxy.insertData("myValueName", 0) # record 0 in the myValueName variable

The following shows how you can create modules in C++:

#include "almemoryproxy.h"
void myModule::init(void)
{
    getParentBroker()->getMemoryProxy()->insertData("myValueName", 0);
}

A Python script can be used from your PC to call the robot's functions; alternatively, you can use the robot's
embedded interpreter for quicker activation. If the user doesn't want the call to block while it is being
executed, parallel calls can be used. The following code uses a parallel call to speak the given text through NAO:

from naoqi import ALProxy

audioProxy = ALProxy("ALTextToSpeech", robot_IP, 9559)
audioProxy.post.say("my first parallel call") # .post in front of say() executes the parallel call

Use CMake (a makefile on Linux or a .sln file in Microsoft Visual Studio) when creating and compiling a project.
To load the user library in NAOqi, execute naoqi with the --load option: when naoqi --load myModule is executed,
myModule.so is loaded and its initialization method is automatically called.
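In the shell-snippet convention used earlier, and assuming the default install layout of the examples above, the call looks like this:

#shell command
./bin/naoqi --load myModule # loads myModule.so; the initialization method is called automatically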

4.3.4 NAOqi Options

Graph 4.9 shows the options that can be used when executing NAOqi.

Graph 4.9 - NAOqi Option

Options Actions

-h [ --help ] Help message

--version Version output

-v [ --verbose ] Outputs the record to the console

-d [ --daemon ] Independent execution within the memory

--pid arg File name for recording PID

-n [ --broker-name ] arg (=mainBroker) Broker name, mainBroker is the default value

-b [ --broker-ip ] arg (=0.0.0.0) Broker IP, default value is 0.0.0.0

-p [ --broker-port ] arg (=9559) Broker port, default value is 9559

4.3.5 Remote Options
A remote program accepts the same options as NAOqi, plus the additional options shown in Graph 4.10.

Graph 4.10 - NAOqi Remote Option

Remote program options Actions

--pip arg (=127.0.0.1) Server IP, default value is 127.0.0.1

--pport arg (=9559) Server port, default value is 9559

4.3.6 NAOqi in NAO

NAOqi runs automatically only when NAO is connected to the network (Ethernet or WiFi).

• NAOqi start command: /etc/init.d/naoqi start
• NAOqi restart command: /etc/init.d/naoqi restart
• NAOqi stop command: /etc/init.d/naoqi stop

4.4 Cross Compiling for Loading Modules (Using C++, Linux)

To directly load a new module into NAO, you can either cross-compile the module using C++ or create
the module in Python. When using C++ to create modules, because the NAO robot is based on embedded Linux,
you must compile the module in Linux to fit the Geode CPU. An error may occur when different CPUs and
operating systems are used, since machine language and commands are interpreted in different ways;
the NAO robot provides a separate cross compiler to resolve this issue.

Python is interpreted, and these errors do not occur because a Python interpreter is already loaded
inside NAO. Although it is easier to use Python to create modules, C++ may be more suitable for high-speed
processing like image processing.

This chapter uses a simple example to explain the tools needed for loading modules into NAO, how to
execute the cross compiler in Linux using C++, and how to load the actual modules. The Linux used in this book
is Ubuntu 10.04 LTS Lucid Lynx, and some commands may not be compatible if a different version is used.

4.4.1 Preparing the Cross-compile Tools


First, you need the NAOqi library for Linux. Download NAOqi SDK v1.8.16 for Linux either from the Aldebaran
Robotics homepage or from the CD provided with the NAO robot. It is named aldebaran-cpp-sdk-1.8.16-linux-i386.
tar.gz (for Version 1.8.16). Save this in /home/useraccountname. (Here, useraccountname refers to the account name
of the user.) Then, use the terminal to decompress it with the following command.

Image 4.9 - Terminal location in Ubuntu Linux

tar xvzf aldebaran-cpp-sdk-1.8.16-linux-i386.tar.gz

If you decompress it, you will see that the aldebaran-cpp-sdk-1.8.16-linux-i386 directory has been created, and you
are now ready to program in Linux using C++. To cross-compile, you also need a tool called the Cross Compilation
Tool-Chain (CTC) in addition to the Linux NAOqi library. Download Cross Compilation Toolchain v1.8.16
the same way as the NAOqi library and save it in /home/useraccountname/.

At the time this book was written, it was being distributed as ctc-1.8.16.tar.bz2, based on Version 1.8.16.
The corresponding file is decompressed in the /home/useraccountname/ directory. You can use the following
command to decompress it in the terminal. Image 4.9 shows the location of the terminal in Ubuntu Linux.

tar xvjf ctc-1.8.16.tar.bz2

When decompressed, the tools needed for cross-compiling are saved in the /home/useraccountname/
Downloads/ctc-1.8.16-linux32 folder, and you are now ready to cross-compile. Check the SDK and cross-
compiler through the ls –al command in the terminal, and go on to the next step if both of them are there
(Image 4.10).

Image 4.10 - Check SDK and cross-compiler directory using the ls -al command

4.4.2 Cross-Compiling Process Using CMake and Make


Cross-compiling is a very simple process, identical to the one used to compile existing C++ projects;
the only difference is that the cross-compiler has to be chosen as the compiler. The following shows this
process through the HelloWorld example.

A - Moving to the HelloWorld Directory

In the Linux terminal, move to the /home/useraccountname/aldebaran-cpp-sdk-1.8.16-linux-i386/modules/
src/examples/helloworld folder. Use the 'ls' command to observe the files inside the folder (Image 4.11).

Image 4.11 - Configuring the HelloWorld example file

The CMakeLists.txt file has the list that CMake uses to automatically execute the compilation, and the
bootstrap.cmake file has the records of the exceptions that occur while using CMake. The source code files
that actually get compiled are alhelloworld.h, alhelloworld.cpp, and helloworldmain.cpp.

B - Build Environment Setup Using CMake-GUI

If you execute CMake-GUI in the terminal window, a screen will appear as shown in Image 4.12.

Image 4.12 - CMake-GUI screen in Linux

If CMake-GUI does not execute here, it is because the files related to CMake are not installed. If
Ubuntu Linux is used and is connected to the internet, you can use the following command for the install:

sudo apt-get install cmake

The 'apt-get' command is Ubuntu Linux's equivalent of Windows' Add/Remove Programs; it is responsible
for adding and removing programs in Ubuntu Linux. The 'sudo' command is used to obtain temporary authorization
to add/remove programs. It will ask for a password if you run the command above, and CMake will be installed
once the correct password is entered (Image 4.13). If you get an error that CMake cannot be found, you can still
install it using Ubuntu's System/Administration/Synaptic Package Manager menu.

Image 4.13 - Installing CMake using the 'apt-get' command

You can use CMake-GUI if you install it using the command shown below:

sudo apt-get install cmake-gui

Once the installation is complete, perform the same task used to set up the C++ project using CMake in
Windows. First, add a new directory named 'build' in the helloworld directory.
You can create it using the 'mkdir build' command, and then execute CMake-GUI after moving into the
corresponding directory:

/home/useraccountname/aldebaran-cpp-sdk-1.8.16-linux-i386/modules/src/examples/helloworld/build


image 4.14 - CMake-GUI setup

156 4 - naoqi & DCm


4.4 cross compiling For loAding modules (using c++, linux)
Identical to the previous C++ setup in Windows:

- Where is the source code:
/home/useraccountname/aldebaran-cpp-sdk-1.8.16-linux-i386/modules/src/examples/helloworld/

- Where to build the binaries:
/home/useraccountname/aldebaran-cpp-sdk-1.8.16-linux-i386/modules/src/examples/helloworld/build

After the input, press the Configure button (Image 4.14). Select 'Unix Makefiles' for 'Specify the
generator for this project', select 'Specify toolchain file for cross-compiling' from the options, and press
Next (Image 4.15). Then, for 'Specify the Toolchain file', enter the toolchain-geode.cmake file previously
decompressed into the ctc-1.8.16-linux32 folder (Image 4.16).

Toolchain file - /home/shhyun/ctc-1.8.16-linux32/toolchain-geode.cmake

image 4.15 - Configuration setup image 4.16 - Toolchain setup

Image 4.17 appears when you finish the process above. If there is a checkmark next to HELLOWORLD_IS_REMOTE,
remove the check and then click the Configure button. If HELLOWORLD_IS_REMOTE is checked,
the module is configured to be loaded in NAO only for remote execution.

If there are no other issues, the Generate button (located to the right of the Configure button) will be
activated. If you click it, the 'Generating done' message will appear as shown in Image 4.18 to let you know that
the project file has been created correctly, and then you can exit CMake.

image 4.17 - Screen after configuration is complete image 4.18 - Result after pressing the Generate button

C - Using the ‘make’ Command for the Compiling Task

If the process above has been completed successfully, new files will be created in the helloworld/build
directory (image 4.19). These files contain information regarding project compilation and executable file
generation, and execution is very simple.

The 'make' command has to be executed in the terminal. However, you first have to move into
the previously specified directory (/home/useraccountname/aldebaran-cpp-sdk-1.8.16-linux-i386/modules/src/
examples/helloworld/build). Then, execute the 'make' command in this directory.

image 4.19 - File added to the ‘build’ folder

If you execute the ‘make’ command, you will get the result shown in Image 4.20.
This compiles the alhelloworld.cpp and helloworldmain.cpp files to create object files with a .o
extension and turns them into a library file titled libhelloworld.so. If there are no error messages and you get
the result shown in Image 4.20, it has been compiled properly.

Image 4.20 - Results from executing ‘make’

Once you have gone through the process thus far, the cross-compiling process for creating the module for NAO
using C++ is complete. The file created will be saved as /home/useraccountname/aldebaran-cpp-sdk-1.8.16-
linux-i386/modules/src/examples/helloworld/build/sdk/lib/naoqi/libhelloworld.so. This file has to be
uploaded into NAO later on, so you must remember its location.

4.4.3 NAO Robot Setup


The process thus far has explored how to compile the modules that will be added to NAO.
We will now explore how to set things up so that the modules can be executed in NAO. There are two steps:
uploading the file through Choregraphe and setting it up so that the module is loaded automatically
through NAOqi. Since all processes excluding cross-compiling can be executed in Windows, we will base
the explanation on Windows.

A - Execute the NAOqi Module by Changing the autoload.ini File



In order to upload the module we have created into NAO, you have to set it so that the corresponding module
is loaded into NAOqi when NAO is booting. All you have to do is edit the autoload.ini file, but this file does not
exist in a user-accessible NAO folder. You will have to edit and use the autoload.ini file provided as a default by
Choregraphe or the NAOqi SDK. First, find the autoload.ini file either in the folder where Choregraphe is
installed or in the 'preferences' folder where the NAOqi SDK is installed.

This file has to be uploaded into NAO, but you must change some of the information first. The autoload.ini file
contents are as follows. (Some entries may vary depending on the version.)

[core]
albase
launcher
albonjour

[extra]
leds
motion
motionrecorder
framemanager
pythonbridge
videoinput
behaviormanager
helloworld
#urbistarter

You have to add the 'helloworld' entry shown above: the other modules are loaded first, and the 'helloworld'
module is loaded into memory at the very end.
The basic task is over once you upload the revised autoload.ini file and the libhelloworld.so file (the one
we previously created through cross-compiling) into NAO's memory.

Image 4.17 in 4.4.2 had a section where you prevent HELLOWORLD_IS_REMOTE from being checked.
If it is checked, you have to add an additional [remote] line right above 'helloworld' for it to operate
normally. If registered as a [remote] module, the corresponding module can be executed remotely
from then on.


[remote]
helloworld

B - File Upload Using Choregraphe

Now, upload the new module into NAO. After connecting to NAO using Choregraphe's Connect,
click File Transfer from the Connection menu to transfer the file (Image 4.21). nao/nao is the default login
ID and password. There are several folders, but the naoqi folder is the main target for setup.

Image 4.21 - File Transfer Menu

The following are the locations where you have to upload the autoload.ini and libhelloworld.so files.

autoload.ini - /naoqi/preferences/
libhelloworld.so - /naoqi/lib/naoqi/

The two files must be uploaded to the corresponding locations, but the /naoqi/lib/naoqi/ folder, where you
have to upload the libhelloworld.so file, is not created by default. You must manually create the
corresponding folder and upload each of the two files to their respective locations.

C - Check Module Loading

To check whether the module was loaded properly after going through this process, you just have
to restart NAOqi inside NAO. You can do this by connecting to NAO's IP address. If NAO's IP address
is 192.168.123.150, use an internet browser to connect to 192.168.123.150 and then enter the same ID
and password you used when the file was previously uploaded.

If you click NAOqi in the Advanced menu, it shows NAOqi's current status and provides a menu
to turn it off and back on (Image 4.22). After using the Restart button to restart, click the Log menu in
the Advanced menu to check whether the ALHelloWorld module was registered properly in memory.
If it was registered normally without an error, you will see the message shown in Image 4.23, and you are
now ready to execute the corresponding module.

image 4.22 - Menu for rebooting NAOqi

image 4.23 - Using the Log menu to check whether the module was registered

4.4.4 Module Execution
Since you have now verified that the module has been registered properly, we will check whether
the module operates properly. The HelloWorld example is the most basic module and doesn't execute
any movements. However, an error will occur if the attempt at executing the module fails, and you can use
this to check whether the module is operating properly.
You can use it the same way an existing NAOqi module is used.

from naoqi import ALProxy

IP = "192.168.123.150"
PORT = 9559
ALHelloWorld_Proxy = ALProxy("ALHelloWorld", IP, PORT)
ALHelloWorld_Proxy.helloWorld()

If you execute the commands above sequentially in Python, the helloWorld method is called.
An error will occur if there is an issue, but it will execute normally if there are no issues.

To actually use this process, you would first use NAOqi to implement functions like speaking
or moving the upper body in the helloWorld module, then cross-compile it, upload it into NAO's memory,
and use it by calling the module.

4.5 DCM Introduction

DCM (Device Communication Manager) is a part of the NAOqi system. It is a NAO software module
that manages the communication with all the devices (boards, sensors, actuators, etc.) excluding the cameras
and sound (Image 4.24).

As previously introduced, DCM is an important library that is automatically installed when NAOqi is installed.

An important function of DCM is to connect the «upper level» and «lower level»: DCM is connected
to the head devices through the I2C interface and to the ChestBoard through the USB interface.

4.6 Upper Level Architecture

4.6.1 Structural Overview

Image 4.24 - DCM Structure

The following two methods can be used for other modules to access the robot sensors and actuators
(Image 4.24).

First, in order to access the sensors, you must find the value inside ALMemory that has the name
of the auxiliary device. DCM automatically updates the sensor values inside ALMemory. Modules will only use
the newly updated sensor values.

Second, the following describes how to access the actuators. Modules send the values to be applied by DCM
through a «Timed Command». A module cannot directly change the actuator values inside ALMemory with
this method; instead, the request is received by DCM, and the actuator value inside ALMemory is changed
by DCM itself.

4.6.2 Timed Command

A timed command specifies when a command value will be applied to the actuator of an auxiliary device
(Image 4.25).

The user can send one or more timed commands in order to deliver commands to one or more
actuators. Time is measured in ms as a 4-byte (signed) integer. A module running on the robot's motherboard
can request the current time from DCM or read it directly.

image 4.25 - Timed Command example

DCM saves all timed commands for each actuator. At each DCM cycle it analyzes the next command based
on the current time and uses linear interpolation to calculate the appropriate command. The previous
command is deleted right after it is used.

If the next command has not arrived yet, the last command is maintained and will be sent again in the next
DCM cycle.

4.6.3 Timed Command Linear Interpolation

Linear interpolation is explained in this chapter. Linear interpolation calculates the value between
two points when only the end-point values are given.

You can obtain this value by using the equation below, which approximates the function F(x) between
the points x0 and x1.
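The equation itself was printed as a figure in the original; reconstructed in the standard two-point form (and consistent with the worked examples below), it is:

$$F(x) \approx f(x_0) + \frac{f(x_1) - f(x_0)}{x_1 - x_0}\,(x - x_0), \qquad x_0 \le x \le x_1$$

In the DCM examples that follow, this appears as Command = Previous + (Target − Previous) × DiffTime1 / DiffTime2, where DiffTime1 is the time elapsed since the last cycle value and DiffTime2 is the time from the last cycle value to the target command.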

This simple interpolation formula is referred to as either the 'proportional part' or 'linear interpolation.'
We will look at two examples that calculate the appropriate command by using linear interpolation.

A - First Example

Let's first assume that the DCM cycle is 10ms (Image 4.26). Linear interpolation is used to calculate the
appropriate command at each cycle, starting at t = 30ms, after DCM has received Command 1 (10, 10) and
Command 2 (80, 40); each pair is (time in ms, value).

Image 4.26 - Linear Interpolation for two commands

• t = 30ms
DiffTime1 = 30 – 20 = 10 , DiffTime2 = 80 – 20 = 60
Command = (( 40 – 10 ) * 10) / 60 + 10 = 15

• t = 40ms
DiffTime1 = 40 – 30 = 10 , DiffTime2 = 80 – 30 = 50
Command = (( 40 – 15 ) * 10) / 50 + 15 = 20

• t = 50ms
DiffTime1 = 50 – 40 = 10 , DiffTime2 = 80 – 40 = 40
Command = (( 40 – 20 ) * 10) / 40 + 20 = 25

• t = 60ms
DiffTime1 = 60 – 50 = 10 , DiffTime2 = 80 – 50 = 30
Command = (( 40 – 25 ) * 10) / 30 + 25 = 30

• t = 70ms
DiffTime1 = 70 – 60 = 10 , DiffTime2 = 80 – 60 = 20
Command = (( 40 – 30 ) * 10) / 20 + 30 = 35

• t = 80ms
DiffTime1 = 80 – 70 = 10 , DiffTime2 = 80 – 70 = 10
Command = (( 40 – 35 ) * 10) / 10 + 35 = 40

B - Second Example

We will assume the DCM cycle is 10ms here as well (Image 4.27). Starting at t = 10ms, linear interpolation
is used to calculate the appropriate command after DCM receives the four commands (15, 10), (25, 30),
(45, 20), and (65, 0) shown in Image 4.27.

Image 4.27 - Linear interpolation for four commands

• t = 10ms
DiffTime1 = 10 – 0 = 10 , DiffTime2 = 15 – 0 = 15
Command = (( 10 – 0 ) * 10) / 15 + 0 = 6.66

• t = 20ms
DiffTime1 = 20 – 15 = 5 , DiffTime2 = 25 – 15 = 10
Command = (( 30 – 10 ) * 5) / 10 + 10 = 20

• t = 30ms
DiffTime1 = 30 – 25 = 5 , DiffTime2 = 45 – 25 = 20
Command = (( 20 – 30 ) * 5) / 20 + 30 = 27.5

• t = 40ms
DiffTime1 = 40 – 30 = 10 , DiffTime2 = 45 – 30 = 15
Command = (( 20 – 27.5 ) * 10) / 15 + 27.5 = 22.5

• t = 50ms
DiffTime1 = 50 – 45 = 5 , DiffTime2 = 65 – 45 = 20
Command = (( 0 – 20 ) * 5) / 20 + 20 = 15

• t = 60ms
DiffTime1 = 60 – 50 = 10 , DiffTime2 = 65 – 50 = 15
Command = (( 0 – 15 ) * 10) / 15 + 15 = 5

• t = 70ms
Command = 0

4.7 Low Level Architecture

4.7.1 Device Overview


In order to use DCM, it is necessary to explore the overview of all the devices that belong to NAO (Image 4.28).

Image 4.28 - Overview of NAO’s devices

4.7.2 Devices and Definitions of the Auxiliary Devices


The devices are controls for the auxiliary devices. DCM communicates with electronic boards and micro
controllers of the internal devices. Each device is defined by the bus type and specific address values
and has a proper name and its own distinct type.

Auxiliary devices are mostly actuators and sensors controlled by the devices. An auxiliary device is defined
by the device it belongs to and by its type and number. Each auxiliary device has its own distinct name,
which is used to communicate with the upper level.

A - LED Example

"Face/Led/Red/Right/0Deg/Actuator": This is one of the LED actuator names; it refers to the red LED
near the right eye that corresponds to 0 degrees. These LEDs have an important key called "Value",
which is a float from 0.0 (LED off) to 1.0 (LED fully on).

You have to add "Device/SubDeviceList/" if you want to use this as an auxiliary device. Meaning,
the complete key name is "Device/SubDeviceList/Face/Led/Red/Right/0Deg/Actuator/Value." This key name
is saved inside ALMemory and can be used to obtain the current LED value.
Also, you must use this name to send the timed-command value for this actuator to DCM.

B - Joint Sensor Example

"LShoulderPitch/Position/Sensor": this is the name for one of the joints (left shoulder pitch).
This joint has an important key called "Value", which is a float-type radian value. If you would like
to use this as an auxiliary device, you have to add "Device/SubDeviceList/".

Meaning, the complete key name is "Device/SubDeviceList/LShoulderPitch/Position/Sensor/Value."

Just like for the actuator, the key name used to obtain the current joint position value is saved in ALMemory.
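A minimal Python sketch of reading both keys through ALMemory; robot_IP is assumed to be set as in the earlier examples:

from naoqi import ALProxy

memProxy = ALProxy("ALMemory", robot_IP, 9559)
# LED actuator value: 0.0 (off) to 1.0 (fully on)
led = memProxy.getData("Device/SubDeviceList/Face/Led/Red/Right/0Deg/Actuator/Value")
# Joint position sensor value, in radians
angle = memProxy.getData("Device/SubDeviceList/LShoulderPitch/Position/Sensor/Value")
print led, angle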

Image 4.29 - Flow of devices and auxiliary devices

4.7.3 List of Communication Buses
The following is a simple introduction to the communication buses that the devices use.

- MotherBoard: virtual bus for the devices inside the motherboard
- MotherBoardI2C: I2C bus connected to the robot's head
- Chest: virtual bus for the devices inside the chest board
- ChestI2C: I2C bus connected to the robot's body
- RS485Down: RS485 bus for all the boards connected to the robot's legs
- RS485Up: RS485 bus for all the boards connected to the robot's arms and knees
- LeftHandI2C: not used
- RightHandI2C: not used

4.7.4 Device Type and List

The following is a simple introduction to the available devices.

- MotherBoard: The main CPU board; it is located in the head and has a Geode processor.
- ChestBoard: The chest board has an ARM processor.
- MotorBoard: The motor boards inside the robot control all the joints excluding the legs and hands.
- MotorBoardHand: The motor board for the robot's hands. It currently plays the same role
as the MotorBoard.
- MotorBoardFoot: The motor board for the robot's legs. Since the announcement
of NAO's first version, this board no longer controls a motor.
- TouchBoard: This board has a capacitive sensor near the uppermost part of the head.
- FaceBoard: A board around the robot's eyes with LEDs and IR sensors.
- USBoard: A board with ultrasonic sensors.
- InertialSensor: A board with accelerometer and gyrometer sensors.
- EarLeds: This board controls the LEDs of the ears.
- Battery: A board inside the battery.

Image 4.30 lists the name/type/bus number of all the devices in the NAO robot.

Image 4.30 - Device name/type/busNumber

4.7.5 Auxiliary Device Type and List

The following introduces the available auxiliary devices.

A - Actuators

- Joint: An actuator that lets you adjust one joint angle of the robot.
- JointHardness: An actuator that adjusts the voltage sent to the motor to control the joint stiffness.
- Led: A single-color LED whose value can be adjusted from 0 to 100%.
- Power: Not used.
- Charge: Not used.
- UsSend: An actuator that sends the ultrasonic sensor value.

B - Sensors

- JointPosition: Sensor value for the angular position of a robot joint
- Current: Current value of one specific joint motor
- FSR: FSR (force-sensitive resistor) sensor value
- Touch: Status of the capacitive proximity switch (Pressed = 1.0, Not Pressed = 0.0)
- USReceived: Return value of the ultrasonic sensor
- Accelerometer: Return value of the acceleration sensor
- Gyrometer: Return value of the gyro sensor
- Angle: Angle of the entire robot (receives the return value from the inertial board)
- Temperature: Temperature value of the motor or battery
- Switch: Status of the button on the chest or the bumpers on the feet (Pressed = 1.0, Not Pressed = 0.0)
- Battery: Battery status sensor

Please refer to Appendix X for a more detailed list of auxiliary devices.

4.8 Preference Files and Sub Preference Files

4.8.1 Introduction
DCM has two configuration files. One is the Device.xml file, which describes the hardware characteristics of the
robot itself, and the other is the DCM.xml file, which sets the specific parameters for DCM. These two files are
applied equally to all robots, have default values, and are located in the naoqi/preferences folder.

The NAO robot is separated into two parts: body and head. Each part has a «subPref» file, Device_Head.xml
and Device_Body.xml respectively. The Device_Head.xml file is saved in the flash of the internal Geode board, and
the Device_Body.xml file is saved in the ChestBoard flash. When reading the aforementioned .xml files through
DCM, the .xml files are copied and saved at the same location where the Device.xml and DCM.xml files are saved.

When the aforementioned subpref files are read while booting the system, they will both change to have
the same key values as those inside Device.xml and DCM.xml.

Image 4.31 - Structure of the ‘preference’ and ‘sub preference’ files

4.8.2 Structural Overview


The Device.xml and DCM.xml files are read from the naoqi/preferences directory. DCM keeps a copy of the file
in RAM and transfers the contents to ALMemory. DCM reads the subpref files afterward.

The Device_Head.xml file is also read from the same directory and saved inside the memory; the key/value pairs
inside the file are sent to ALMemory. DCM reads the Device_Chest.xml file from the chest board's flash memory
and sends its key/value pairs to ALMemory after creating a copy in the naoqi/preferences directory.

4.9 DCM Bound Methods

4.9.1 GetTime

The getTime command returns DCM's current time information, used for timed commands.
The time information returned is a signed integer with 1ms precision.

Graph 4.11 - GetTime command

int DCM::getTime(int pTime)

Input: optional offset in ms to add to (or subtract from) the current time

Output: DCM time as an integer

Image 4.32 shows an output of the current DCM time information.

Image 4.32 - Example of DCM time information (1)

Image 4.33 shows an example of 10 seconds after the current DCM time.

Image 4.33 - Example of DCM time information (2)
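A minimal Python sketch of both calls, using the DCM proxy the same way as the ALProxy examples in 4.3.3 (robot_IP is assumed):

from naoqi import ALProxy

dcm = ALProxy("DCM", robot_IP, 9559)
now = dcm.getTime(0)        # current DCM time in ms (Image 4.32)
later = dcm.getTime(10000)  # DCM time 10 seconds from now (Image 4.33)
print now, later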

4.9.2 Set
The set command uses one or more timed commands to deliver control signals (Graph 4.12).

Graph 4.12 - Set command

void DCM::set(ALValue& pCommands)

Input: ALValue array data for device control

pCommands[0]: device name used for control
pCommands[1]: update type
"Merge", "ClearAll", "ClearAfter", "ClearBefore"
pCommands[2][x]: the x-th timed command in the list
pCommands[2][x][0]: float command
pCommands[2][x][1]: DCM time at which the command is applied
pCommands[2][x][2]: priority (optional)

Output: None

When a new command is sent through DCM to control the same device, the command will operate
according to one of the following four update methods.

[«ClearAll»]
The simplest method of the four: the previous commands are deleted and the new commands
are used. (Image 4.34)

Image 4.34 - Update Type: ClearAll

[«Merge»]
Also a very simple method: the new commands are combined with the previous commands.
(Image 4.35)

Image 4.35 - Update Type: Merge

[«ClearAfter»]
Removes everything after the first point of the new command (Image 4.36).

Image 4.36 - Update Type: ClearAfter

[«ClearBefore»]
Removes the earlier part of the new command (Image 4.37).

Image 4.37 - Update Type: ClearBefore

Image 4.38 lights the chest LEDs in red over 10 seconds. DCM interpolates, then uses, the values
it received according to time.

Image 4.38 - Example of LED Lighting (1)
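A minimal Python sketch of such a call; the chest-LED actuator key is an assumption following the naming scheme in 4.7.2, and dcm is the DCM proxy created in 4.9.1:

# Drive the chest LED to full red in 10 s; DCM interpolates the intermediate values
dcm.set(["ChestBoard/Led/Red/Actuator/Value",  # device name (assumed)
         "ClearAll",                           # update type
         [[1.0, dcm.getTime(10000)]]])         # one timed command: [value, DCM time]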

The next command sends four commands to control the red LEDs (Image 4.39). The LEDs gradually
get brighter for two seconds, gradually dim until four seconds, and then repeat this between six
and eight seconds.

Image 4.39 - Example of LED Lighting (2)

The next example activates the ultrasonic sensor on the chest every 100ms for three seconds (Image 4.40).
Each of the sensor values will be saved under the following ALMemory keys:

Device/SubDeviceList/US/Left/Sensor/Value, Device/SubDeviceList/US/Right/Sensor/Value,
Device/SubDeviceList/US/Left/Sensor/Value1, Device/SubDeviceList/US/Right/Sensor/Value1,
Device/SubDeviceList/US/Left/Sensor/Value2, Device/SubDeviceList/US/Right/Sensor/Value2, ...

Image 4.40 - Example of how to use the ultrasonic sensor

4.9.3 CreateAlias
The alias mechanism uses the "createAlias" function to quickly deliver many control signals.

An alias holds a list of control devices; the "setAlias" function can then be used to request an update of all
the control devices with different commands.

The createAlias function returns ALValue data; since device names that could not be found are removed from
the returned data, you can use the return value to detect errors (Graph 4.13).

Graph 4.13 - CreateAlias command

ALValue DCM::createAlias(ALValue& pParams)

Input: ALValue array data for device control

pParams[0]: alias name used for control
pParams[1][0]: name of the first device
pParams[1][1]: name of the second device
pParams[1][2]: …

Output: Returns the ALValue array data used for input; devices that could not be found
are deleted from the returned data.

In Image 4.41, an alias named "ChestLeds" is defined for three control devices (3 LEDs).

image 4.41 - Example of LED Lighting (3)
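A minimal Python sketch of such a definition; the three chest-LED device names are assumptions following the naming scheme in 4.7.2:

dcm.createAlias(["ChestLeds",                           # alias name
                 ["ChestBoard/Led/Red/Actuator/Value",  # device names (assumed)
                  "ChestBoard/Led/Green/Actuator/Value",
                  "ChestBoard/Led/Blue/Actuator/Value"]])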

The “setAlias” function is the most useful for sending a command that has been defined as an alias.
However, you can still send the same control command by using the “set” function.

4.9.4 SetAlias
"setAlias" is a function that sends different commands to control multiple devices (Graph 4.14).

Graph 4.14 - SetAlias command

void DCM::setAlias(ALValue& pCommands)

Input: ALValue array data for device control

pCommands[0]: alias name used for control
pCommands[1]: update type
"Merge", "ClearAll", "ClearAfter", "ClearBefore"
pCommands[2]: command type for the transfer
"Time-mixed", "Time-separate"

When using "Time-mixed":

pCommands[3][x]: command list for device x of the alias
pCommands[3][x][y]: the y-th command for that device
pCommands[3][x][y][0]: float command
pCommands[3][x][y][1]: DCM time at which the command is applied
pCommands[3][x][y][2]: priority (optional)

When using "Time-separate":

pCommands[4]: list of T times shared by all devices
pCommands[4][0]: first time, pCommands[4][1]: second time, … pCommands[4][T-1]: T-th time

pCommands[5][x]: command list for device x
pCommands[5][x][0]: device control command for the first time
pCommands[5][x][1]: device control command for the second time
… pCommands[5][x][T-1]: device control command for the T-th time

Output: None

Two methods are provided for sending commands.

[Time-mixed]
For each controlled device, a command list in a format that combines both command and time is sent.
The following is one example of Time-mixed, which sends two commands for the red LEDs, one command
for the blue LEDs, and one command for the green LEDs at the same time.

The LEDs change to red over four seconds, while the blue and green lights are turned on at the same time
over two seconds and three seconds respectively. After six seconds, the red LEDs go off and only the green
and blue LEDs remain.

178 4 - naoqi & dcm


4.9 DCM Bound Methods
image 4.42 - Example of LED Lighting (4)
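A minimal Python sketch of this Time-mixed call, using the "ChestLeds" alias defined above; the times follow the description, and the exact casing of the command-type string may vary by NAOqi version:

t = dcm.getTime(0)
dcm.setAlias(["ChestLeds", "ClearAll", "time-mixed",
              [[[1.0, t + 4000], [0.0, t + 6000]],  # red: full in 4 s, off at 6 s
               [[1.0, t + 3000]],                   # green: on over 3 s
               [[1.0, t + 2000]]]])                 # blue: on over 2 s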

[Time-separate]
For the control of each device, the time list and the commands for each specific time are sent separately.
This method is more efficient when controlling many devices at once. The following is a Time-separate example
where the time list and the command list (for each specific time) are assigned independently (Image 4.43).

Image 4.43 - Example of LED Lighting (5)
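The same idea in Time-separate form, as a sketch; Graph 4.14 does not detail the field at index 3 (an importance level in some NAOqi versions), so it is assumed to be 0 here:

t = dcm.getTime(0)
dcm.setAlias(["ChestLeds", "ClearAll", "time-separate",
              0,                     # importance level (assumed)
              [t + 2000, t + 4000],  # shared time list
              [[0.5, 1.0],           # red values at each time
               [0.5, 1.0],           # green
               [0.5, 1.0]]])         # blue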

4.9.5 Special
The "special" function provides special features of DCM; this function takes a string parameter
(Graph 4.15).

Graph 4.15 - Special command

void DCM::special(std::string pName)

Input: string parameter for special features

"Reset", "Chain", "ResetMB", "Config"

Output: None

- "Reset": Requests a ChestBoard reset
- "Chain": Requests a new chain process within the ChestBoard
- "Config": Requests a new configuration for all the motor boards
- "ResetMB": Stops all the motor boards and reactivates them after waiting 3 seconds.
After 15 seconds, the DCM cycle activates again.

Image 4.44 is an example of turning off all motors to initialize them.

Image 4.44 - Example of motor reset
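A one-line Python sketch of this reset, reusing the DCM proxy from 4.9.1:

dcm.special("ResetMB")  # stop all motor boards; they restart after ~3 s and the DCM cycle resumes after 15 s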

4.10 DCM Synchronization Methods

DCM provides a synchronization method through callback functions for real-time threads.
The callbacks are the "Preprocess" and "Postprocess" methods.

The "onPreProcess" callback is called just before the commands are sent to the chest board, so there is only
a short delay before a command added there is sent. The "onPostProcess" callback is called after all the values
in ALMemory have been updated, and you can use it to obtain the new values from all the sensors.

To keep the functions called from DCM threads real-time and to prevent DCM cycle delays,
you must comply with the following conditions:

- You must avoid all types of memory allocation.
- You must avoid all types of printf, cout, and file input/output.
- The callback function must return within a few ms.

Functions designed by the user should take a consistent amount of time within the system cycle.
If the return time of a function varies, for example 1ms, 10ms, then 1ms, it makes the entire control loop
inefficient.

5 NAO Kinematics

LEARNING
Chapter 5 explains NAO's joint structure and provides information for each joint. The Denavit-Hartenberg (DH)
method is used to explain the calculation of forward kinematics, and Python will be used to create an actual
forward kinematics calculation program. This chapter will also describe inverse kinematics calculations and
use Python to implement the inverse kinematics calculation program for NAO's right arm.

PREREQUISITE
You will need quite a bit of mathematical and robotics knowledge to understand the contents of Chapter 5.
Content

5.1 Overview
5.2 Transformation Matrix
5.3 NAO Structure
  5.3.1 Link Information
  5.3.2 Joint Information
  5.3.3 Head Joint
  5.3.4 Arm Joints
  5.3.5 Pelvic Joints
  5.3.6 Leg Joints
5.4 Kinematics
  5.4.1 Overview
  5.4.2 Calculating the Forward Kinematics of the Right Hand
  5.4.3 Forward Kinematics Calculation Using Python and NAOqi
5.5 Inverse Kinematics
  5.5.1 Overview
  5.5.2 Using Python to Calculate the Inverse Kinematics of the Right Arm
  5.5.3 Using Inverse Kinematics to Control Movements
5.1 Overview

Robot kinematics is the study of the end-effector movement of robots with multiple degrees of freedom.
Kinematics allows you to identify positions by calculating the connection between the reference position
and each of the parts, and by calculating the joint angle values.

It is broadly classified into velocity and position problems and divided into forward kinematics and inverse
kinematics. Forward kinematics uses matrix operations on the joint rotations and link lengths to identify
the end-effector position. Inverse kinematics sets the robot's end point and calculates the joint rotation
values necessary to plan movements.

5.2 Transformation Matrix

We use 4x4 transformation matrices, which are multiplied with other matrices to calculate translation
and rotation. The matrix is composed as shown below, and each element has a role in every position change.

The nine numerical values from A to I represent the rotation coordinates, and L, M, and N
represent the translation part. Through the multiplication of transformation matrices, several position changes
can be represented with just one transformation matrix.
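The matrix itself was printed as a figure in the original; the standard homogeneous form consistent with the description above (bottom row fixed at 0 0 0 1) is:

$$T = \begin{pmatrix} A & B & C & L \\ D & E & F & M \\ G & H & I & N \\ 0 & 0 & 0 & 1 \end{pmatrix}$$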

A three-dimensional translation during a position change can be calculated using the transformation matrix
shown below. If you would like to calculate the coordinates after moving a point (x, y, z) in the Cartesian
coordinate system by L along the X-axis, M along the Y-axis, and N along the Z-axis, you can obtain the desired
point (x', y', z') by substituting into the following equation.
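The equation was printed as a figure; its standard reconstruction is:

$$\begin{pmatrix} x' \\ y' \\ z' \\ 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 & L \\ 0 & 1 & 0 & M \\ 0 & 0 & 1 & N \\ 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix} = \begin{pmatrix} x + L \\ y + M \\ z + N \\ 1 \end{pmatrix}$$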

A three-dimensional rotation has a different transformation matrix depending on the axis of rotation.
The following shows the matrices for rotational transforms about the x, y, and z axes.
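The rotation matrices were also printed as figures; the standard right-handed rotations by an angle θ, in homogeneous form, are:

$$R_x(\theta) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta & 0 \\ 0 & \sin\theta & \cos\theta & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix},\quad R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\theta & 0 & \cos\theta & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix},\quad R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 & 0 \\ \sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$$

A quick numerical check in Python (numpy is not part of the NAOqi SDK; it is used here only to verify the matrices):

import numpy as np

theta = np.radians(90)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0, 0],
               [np.sin(theta),  np.cos(theta), 0, 0],
               [0, 0, 1, 0],
               [0, 0, 0, 1]])
T = np.array([[1, 0, 0, 2.0],   # translate L = 2 along X
              [0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])
p = np.array([1.0, 0.0, 0.0, 1.0])
print T.dot(Rz).dot(p)  # rotate 90 deg about Z, then translate: [2. 1. 0. 1.]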

5.3 NAO Structure

5.3.1 Link Information


The following shows the lengths of NAO’s hardware.

Image 5.1 - NAO’s hardware lengths

NAO is 573.2mm tall with an arm length of 29mm.


The measurement between the shoulders is 273.3mm.

The following gives you more detailed information regarding the length of each part.

• There are currently two versions of NAO (3.2 and 3.3). Since there are some differences between
the two models, you will need to edit the kinematics calculations made based on Version 3.2 in order to apply
it to Version 3.3.

Information regarding NAO’s specific weight and center of gravity for each part can be found in the following
web page:
http://www.aldebaran-robotics.com/documentation/family/nao_h25/masses_h25.html

Image 5.2 - Lengths of NAO’s Parts

It is 126.5mm from the center of gravity to the neck, 100mm to the shoulder joint, 85mm to the hip,
and 100mm to the thigh. It is 102.75mm from the knee to the ankle, the foot height is 45.11mm,
the length between the neck and shoulder joint is 98mm, and the hand width is 15.9mm.

Image 5.3 - NAO’s arm lengths

For the arm, it is 90mm from shoulder to the elbow, 50.55mm from elbow to wrist, and the length
of the hand is 58mm.

Graph 5.1 - Lengths of NAO’s links

Name Length (mm)

NeckOffsetZ 126.50

ShoulderOffsetY 98.00

UpperArmLength 90.00

LowerArmLength 50.55

ShoulderOffsetZ 100.00

HandOffsetX 58.00

HipOffsetZ 85.00

HipOffsetY 50.00

ThighLength 100.00

TibiaLength 102.74

FootHeight 45.11

HandOffsetZ 15.90

Graph 5.1 above shows the information regarding the lengths of all the joints.
These values can be used for the kinematic calculation of each joint.

5.3.2 Joint Information

Graph 5.2 - NAO's joint information

Name Motion (axis) Range (deg) Range (rad)

HeadYaw Head joint twist (Z) -119.5 to 119.5 -2.0857 to 2.0857

HeadPitch Head joint front and back (Y) -38.5 to 29.5 -0.6720 to 0.5149

RShoulderPitch Right shoulder joint front and back (Y) -119.5 to 119.5 -2.0857 to 2.0857

RShoulderRoll Right shoulder joint right and left (Z) -94.5 to -0.5 -1.6494 to -0.0087

RElbowYaw Right shoulder joint twist (X) -119.5 to 119.5 -2.0857 to 2.0857

RElbowRoll Right elbow joint (Z) 0.5 to 89.5 0.0087 to 1.5621

RWristYaw Right wrist joint (X) -104.5 to 104.5 -1.8238 to 1.8238

RHand Right hand Open And Close

LShoulderPitch Left shoulder joint front and back (Y) -119.5 to 119.5 -2.0857 to 2.0857

LShoulderRoll Left shoulder joint right and left (Z) 0.5 to 94.5 0.0087 to 1.6494

LElbowYaw Left shoulder joint twist (X) -119.5 to 119.5 -2.0857 to 2.0857

LElbowRoll Left elbow joint (Z) -89.5 to -0.5 -1.5621 to -0.0087

LWristYaw Left wrist joint (X) -104.5 to 104.5 -1.8238 to 1.8238

LHand Left hand Open And Close

LHipYawPitch Left hip joint twist (Y-Z 45°) -65.62 to 42.44 -1.1453 to 0.7408

RHipYawPitch Right hip joint twist (Y-Z 45°) -65.62 to 42.44 -1.1453 to 0.7408

LHipRoll Left hip joint right and left (X) -21.74 to 45.29 -0.3794 to 0.7904

LHipPitch Left hip joint front and back (Y) -101.63 to 27.73 -1.7739 to 0.4840

LKneePitch Left knee joint (Y) -5.29 to 121.04 -0.0923 to 2.1125

LAnklePitch Left ankle joint front and back (Y) -68.15 to 52.86 -1.1895 to 0.9227

LAnkleRoll Left ankle joint right and left (X) -44.06 to 22.79 -0.7690 to 0.3978

RHipRoll Right hip joint right and left (X) -42.30 to 23.76 -0.7383 to 0.4147

RHipPitch Right hip joint front and back (Y) -101.54 to 27.82 -1.7723 to 0.4856

RKneePitch Right knee joint (Y) -5.90 to 121.47 -0.1030 to 2.1201

RAnklePitch Right ankle joint front and back (Y) -67.97 to 53.40 -1.1864 to 0.9320

RAnkleRoll Right ankle joint right and left (X) -22.27 to 45.03 -0.3886 to 0.7858

NAO consists of 25 total joints: 2 head joints, 5 joints for each arm (10 total), 5 joints for each leg (10 total),
1 in the pelvis, and 2 that execute the opening and closing movement of the hands. Independent control of
each joint is possible, except that the 2 pelvic joints must be controlled at the same time. Each joint has a limited
angle of movement, and the limitations are shown in the graph above under 'Range.' For the legs, collisions
with the robot's cover have been taken into consideration for the joint limitations. Graph 5.2 (Motion) gives you
the axis of rotation for each joint.

You can use key names in NAO's ALMemory to access the current joint command or sensor values.
The following shows the key format; use the 'Name' of the desired joint from Graph 5.2 above in place of
"Device Name". The joint value returned is in radians.

Command (radian): Device/SubDeviceList/"Device Name"/Position/Actuator/Value
Sensor (radian): Device/SubDeviceList/"Device Name"/Position/Sensor/Value

5.3.3 Head Joint

Image 5.4 - NAO’s head joints

The head consists of Pitch and Yaw joints. Pitch is the joint that moves the head back and forth rotating
around the Y-axis. Yaw is the joint that moves the head side to side rotating around the Z-axis.
Image 5.4 shows the angular limit of each joint.

5.3.4 Arm Joints

Image 5.5 - NAO’s right arm joints

Shoulder, elbow, and wrist joints make up the arm. The shoulder has the Pitch joint moving back and forth
around the Y-axis and the Roll joint that lifts the arm based on the Z-axis. The elbow consists of the Roll joint
that rotates left and right based on the Z-axis and the Yaw joint that rotates based on the X-axis. The wrist
consists of the Yaw joint, which rotates around the X-axis. Image 5.5 shows the possible angular movements
for each joint.

5.3.5 Pelvic Joints

Image 5.6 - NAO’s pelvic joints

The pelvic joints, RHipYawPitch and LHipYawPitch, rotate about an axis lying halfway between the Y-axis
and the Z-axis. Independent control of these joints is impossible because they are physically driven
by just one motor.

5.3.6 Leg Joints

Image 5.7 - NAO’s leg joints

The leg consists of 5 joints: HipRoll of the pelvic area, which rotates left to right based on the X-axis;
HipPitch, which moves the leg back and forth based on the Y-axis; KneePitch of the knee joint, rotating
around the Y-axis (similar to HipPitch); AnklePitch of the ankle joint, which moves the foot back and forth
around the Y-axis; and AnkleRoll, which rotates the foot left to right around the X-axis. For the joints that
make up the leg, in addition to the limits shown in Image 5.7, the pitch and roll joint values are also limited
in order to prevent collisions with NAO's surface cover. Image 5.8 and Graphs 5.3 and 5.4 show more detailed
information on these limitations for both legs.

LAnklePitch(°)   LAnkleRoll+(°)   LAnkleRoll-(°)
-68.15           2.86             -4.29
-48.12           10.31            -9.74
-40.10           22.79            -12.60
-25.78           22.79            -44.06
5.72             22.79            -44.06
20.05            22.79            -31.54
52.86            0.00             -2.86

Image 5.8 - NAO's left leg joint limitations    Graph 5.3 - Limitations of NAO's left leg joints

RAnklePitch(°)   RAnkleRoll+(°)   RAnkleRoll-(°)
-67.97           4.29             -2.86
-48.13           9.74             -10.31
-40.10           12.60            -22.27
-25.78           45.03            -22.27
5.73             45.03            -22.27
20.05            31.54            -22.27
53.40            2.87             0.00

Image 5.9 - NAO's right leg joint limitations    Graph 5.4 - Limitations of NAO's right leg joints

5.4 Kinematics

5.4.1 Overview
A robot generally consists of a chain of joints and links, so the movement of one joint influences the joints connected to it. Connected joints can be separated by a link of any length (including 0) and can rotate around any axis. A reference coordinate frame is assigned to each joint to describe its connection and position, and this defines a general procedure for transforming from one joint frame to the next. Starting from the reference point to the first joint, from the first joint to the second, third, and so on until the last joint, once the entire chain of transformations is complete you obtain a transformation matrix that gives the position of the final joint relative to the reference point.
In this book, we will use the Denavit-Hartenberg (DH) representation for kinematic calculations.
The DH method describes the robot's kinematics and represents its motions, and it can be used
regardless of the shape of the robot. As shown in the image, the DH method describes the relationship
between connected joints using four variables.

Image 5.10 - Relationship of joints shown using the Denavit-Hartenberg (DH) method

• Link twist α: the angle from the Zi-1 axis to the Zi axis, measured about the Xi-1 axis.
• Link length a: the offset distance between the Zi-1 axis and the Zi axis, measured along the Xi-1 axis.
• Joint angle θ: the angle between the Xi-1 axis and the Xi axis, measured about the Zi-1 axis.
• Link offset d: the distance along the Zi-1 axis between the Xi-1 axis of the base frame and the Xi axis.

In order to express the robot's joint relationships using the DH method, first set a reference coordinate system for each joint. Assign Z and X axes to each joint as shown; the Y axis, mutually perpendicular to both Z and X, can be derived at any time, so the DH method does not use it. A total of four elementary transformations are needed to convert to the coordinate system of the next joint.

a - Rotate by θi about the Zi-1 axis. This process makes Xi-1 and Xi parallel.

b - Translate by di along the Zi-1 axis. This process places Xi-1 and Xi in the same position.

c - Translate by ai along the Xi-1 axis. This process places the origins of the two coordinate systems in the same position.

d - Lastly, rotate by αi about the Xi axis. After this process is over, the two coordinate systems become identical.

Matrix A is obtained by multiplying the four matrices that represent these elementary transformations, in order. The following is the equation of this relationship, where n and n+1 denote the current joint and the next connected joint, and M denotes a translation (Move) and R a rotation (Rotate):

$${}^{n+1}_{n}A = R(z,\theta)\,M(0,0,d)\,M(a,0,0)\,R(x,\alpha)$$

When the product is calculated and organized, the transformation matrix generalizes to:

$$A_i = \begin{bmatrix} \cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\ \sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
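As an illustrative helper (not part of the original listings, but written in the same NumPy style as the code in Section 5.4.3), the generalized matrix can be produced by a single Python function:

import math
import numpy as np

# Build the generalized DH transformation matrix A_i from the four DH
# parameters; angles are in radians, lengths in the same unit as d and a.
def dh_transform(theta, d, a, alpha):
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return np.mat([[ct, -st*ca,  st*sa, a*ct],
                   [st,  ct*ca, -ct*sa, a*st],
                   [ 0,     sa,     ca,    d],
                   [ 0,      0,      0,    1]])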

Graph 5.5 shows the DH variable values for each of the joint connections that make up NAO, along with the information about each joint's axis of rotation.

graph 5.5 - NAO’s connection variables

Joint name        α (radians)    a (meters)    θ (radians)    d (meters)

HeadYaw 0.0 0.0 0.0 0.0

HeadPitch -PI/2 0.0 -PI/2 0.0

LShoulderPitch -PI/2 0.0 0.0 0.0

LShoulderRoll PI/2 0.0 PI/2 0.0

LElbowYaw PI/2 0.0 0.0 UpperArmLength

LElbowRoll -PI/2 0.0 0.0 0.0

LWristYaw PI/2 0.0 0.0 LowerArmLength

LHipYawPitch -3/4 * PI 0.0 -PI/2 0.0

LHipRoll -PI/2 0.0 PI/4 0.0

LHipPitch PI/2 0.0 0.0 0.0

LKneePitch 0.0 -ThighLength 0.0 0.0

LAnklePitch 0.0 -TibiaLength 0.0 0.0

LAnkleRoll -PI/2 0.0 0.0 0.0

RHipYawPitch -PI/4 0.0 -PI/2 0.0

RHipRoll -PI/2 0.0 -PI/4 0.0

RHipPitch PI/2 0.0 0.0 0.0

RKneePitch 0.0 -ThighLength 0.0 0.0

RAnklePitch 0.0 -TibiaLength 0.0 0.0

RAnkleRoll -PI/2 0.0 0.0 0.0

RShoulderPitch -PI/2 0.0 0.0 0.0

RShoulderRoll PI/2 0.0 PI/2 0.0

RElbowYaw PI/2 0.0 0.0 UpperArmLength

RElbowRoll -PI/2 0.0 0.0 0.0

RWristYaw PI/2 0.0 0.0 LowerArmLength
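As an illustration of reading this table (assuming the dh_transform helper sketched in Section 5.4.1, and treating the θ column as a fixed offset added to the measured joint angle, as the code in Section 5.4.3 does with Theta3 + REY), the two head joints compose as follows:

import math

# Example measured joint angles in radians (assumed values)
headYaw, headPitch = 0.3, 0.1

# Rows of Graph 5.5 give (alpha, a, theta offset, d) per joint.
A_yaw   = dh_transform(theta=headYaw + 0.0,             d=0.0, a=0.0, alpha=0.0)           # HeadYaw
A_pitch = dh_transform(theta=headPitch - math.pi/2,     d=0.0, a=0.0, alpha=-math.pi/2)    # HeadPitch
A_head  = A_yaw * A_pitch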

The image below shows you each axis of rotation.

Image 5.11 - Link lengths of NAO’s joints and axis of rotation

The arrows in Image 5.11 represent the X axis (green), Y axis (red), and Z axis (blue) of each joint's Cartesian frame; these are used in the kinematic calculations.

5.4.2 Calculating the Forward Kinematics of the Right Hand

Image 5.12 - Joint name and limitations of NAO’s right arm

This example calculates the hand position from NAO's arm joint values and the given link information. NAO's center point is the base position, taken as (0, 0, 0). The position change is expressed by the following equation:

$$\mathrm{Pos}_h = {}^{h}_{o}T \cdot \mathrm{Pos}_o$$

Pos represents a position in the Cartesian coordinate system. In the transformation matrix T, the superscript at the upper left denotes the target of the transformation and the subscript at the lower left denotes the current position; 'h' is the hand and 'o' is the center point. The position change T in the equation above can be represented through coordinate translations and rotations.

The right arm consists of five joints. Starting from the uppermost joint, there are the shoulder joints (RShoulderPitch, RShoulderRoll), the elbow joints (RElbowRoll, RElbowYaw), and the wrist joint (RWristYaw). You can obtain the hand position by applying the position changes in that order, starting from the center point: center point - shoulder - elbow - wrist - hand.

Here s = shoulder, e = elbow, and w = wrist. RShoulderPitch, RShoulderRoll, RElbowRoll, RElbowYaw, and RWristYaw are the five NAO joints used in the kinematic calculation with the DH method.

The length from the wrist to the hand and the translation from the center to the shoulder are not included. The transformation matrix T for the five joints above is divided into the individual matrices A1 through A5. When the relationships of NAO's arm joints are analyzed with the DH method, all twist angles are α = ±90°, so cos α = 0, and all link lengths are a = 0; each Ai can therefore be simplified as shown below.
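Reconstructing the simplified form from the generalized DH matrix (with $\cos\alpha_i = 0$, $\sin\alpha_i = \pm 1$, and $a_i = 0$), each arm matrix reduces to:

$$A_i = \begin{bmatrix} \cos\theta_i & 0 & \pm\sin\theta_i & 0 \\ \sin\theta_i & 0 & \mp\cos\theta_i & 0 \\ 0 & \pm 1 & 0 & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad \text{with the signs following } \sin\alpha_i = \pm 1$$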

If you convert the transformation matrix T into the product of the Ai matrices using the DH method, thereby turning it into a position change, it can be written as the equation below.

The position change between the non-rotating center point and the shoulder is a pure translation, and this position change ( oTs ) is expressed as shown.

Here Sy represents the translation along the Y axis and Sz the translation along the Z axis; the overall change is the sum of the two translations. The Cartesian position change from the center point to the right shoulder is -98 mm along Y and 100 mm along Z. The following shows the transformation matrix.
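Reconstructed as a pure translation (following the text's values $S_y = -98$ mm and $S_z = 100$ mm; note the code listing in Section 5.4.3 writes the Y entry as +98), the matrix is:

$${}^{s}_{o}T = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & S_y \\ 0 & 0 & 1 & S_z \\ 0 & 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & -98 \\ 0 & 0 & 1 & 100 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$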

Rotation transformations are included in the shoulder-to-wrist position changes, and each transformation matrix is determined by d, a, θ, and α according to the DH method. The position change of the shoulder is represented by the product of its two joint matrices, as in the following equation: θ1 represents the Pitch rotation around the Y axis and θ2 the Roll rotation around the X axis.

Here, the matrix for RShoulderPitch (the first shoulder joint) obtained with the DH method is written as A1; since α1 = -90° and d1 = 0, its transformation matrix follows. The matrix for RShoulderRoll (the second shoulder joint) is written as A2; since α2 = 90° and d2 = 0, its transformation matrix also follows.
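Substituting into the simplified arm matrix above (a reconstruction; the original figures are not reproduced here), α1 = -90°, d1 = 0 and α2 = 90°, d2 = 0 give:

$$A_1 = \begin{bmatrix} \cos\theta_1 & 0 & -\sin\theta_1 & 0 \\ \sin\theta_1 & 0 & \cos\theta_1 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad A_2 = \begin{bmatrix} \cos\theta_2 & 0 & \sin\theta_2 & 0 \\ \sin\theta_2 & 0 & -\cos\theta_2 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$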

The position change of the elbow joint is accomplished by the shoulder-to-elbow translation and the rotations of the two elbow joints: θ3 represents the Roll rotation around the X axis and θ4 the Yaw rotation around the Y axis.

The translation from the shoulder to the elbow for RElbowRoll (the first elbow joint) is 90 mm along the X axis, represented by 'd'. The transformation matrix containing this translation is written as A3; since θ3 = 90° (a fixed offset added to the measured joint angle), α3 = 90°, and d3 = 90 in A3, the transformation matrix is as shown below.

The position change of RElbowYaw (the second elbow joint) represents the Yaw rotation around the Y axis and is written as A4; since α4 = -90° and d4 = 0 in A4, the transformation matrix is as shown below.

The position change of RWristYaw (the wrist joint) occurs through one translation and one rotation: the translation to the wrist is 50.55 mm along the X axis, and the joint represents the Yaw rotation around the Y axis. Its transformation matrix A5 is as shown below, since α5 = 90° and d5 = 50.55 in A5.

The position change (including the rotation transformations) from shoulder to wrist is represented by A, and the following is the equation for the calculation
(where A = A1·A2·A3·A4·A5).

The translation from wrist to hand is 58 mm along the X axis and -15.90 mm along the Z axis, and the following shows the corresponding transformation matrix.

The following shows the total transformation matrix, Trans = Trans_wh · A · Trans_os (as computed in the code below).

The hand position in the Cartesian coordinate system relative to the center point can then be obtained through the matrix calculation Pos_h2 = Trans · Pos_o.

5.4.3 Forward Kinematics Calculation Using Python and NAOqi
a - Libraries to install

For kinematic analysis using Python, you need the math library for calculating trigonometric functions (sine and cosine, for example) and the NumPy (Numerical Python) library for matrix operations.

The math library can be selected when installing Python.

NumPy can be downloaded from http://numpy.scipy.org/

The following shows how to use the libraries. First, import the libraries you installed:

import math
import numpy as np

When calling functions, include the library name, for example:

np.ones((3, 3))
x = math.sin(90)

Since 4x4 matrices are used for the coordinate transformations in the kinematic analysis, a transformation matrix is created with its elements initialized to 0.

Trans = np.mat([[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0]])

B - Calculating the position of the right hand using Python and NAOqi

The following Python code calculates the position of the right hand using Python's matrix operations, with NAOqi supplying the joint values. The forward kinematic calculation follows the process described above and outputs NAO's current right hand position in the center-point-based Cartesian coordinate system.

### Proxy connection to access NAO


# call library
import naoqi
from naoqi import ALProxy

# Set the IP and PORT currently assigned to NAO


IP = "192.168.123.150"
PORT = 9559

# Connect Proxy to ALMemory


memProxy = ALProxy("ALMemory",IP,PORT)

# Import NAO’s current joint angle values


RSP = memProxy.getData("Device/SubDeviceList/RShoulderPitch/Position/Actuator/Value")
RSR = memProxy.getData("Device/SubDeviceList/RShoulderRoll/Position/Actuator/Value")
REY = memProxy.getData("Device/SubDeviceList/RElbowYaw/Position/Actuator/Value")
RER = memProxy.getData("Device/SubDeviceList/RElbowRoll/Position/Actuator/Value")
RWY = memProxy.getData("Device/SubDeviceList/RWristYaw/Position/Actuator/Value")

### Kinematic Calculation


# Call library and function definition
import math
import numpy as np

# Define center position, define and calculate transformation matrices, calculate hand position
Pos_o = np.mat([0,0,0,1]).T
Pos_s = Pos_o + np.mat([0,-98,100,0]).T

# Define variable value used for DH method (unit of rotation: Rad, unit of move: mm)
Theta1 = 0
Theta2 = 0
Theta3 = 1.57
Theta4 = 0
Theta5 = 0

D1 = 0
D2 = 0
D3 = 90
D4 = 0
D5 = 50.55

202 5 - nao kinematics


5.4 kinematics
Alpha1 = -1.57
Alpha2 = 1.57
Alpha3 = 1.57
Alpha4 = -1.57
Alpha5 = -1.57

## Definition of transformation matrix


# Translation from the center point to the shoulder (Transform - origin to shoulder)
Trans_os = np.mat([[1,0,0,0],[0,1,0,98],[0,0,1,100],[0,0,0,1]])

#Transformation matrix for NAO’s right shoulder Pitch joint (RShoulderPitch)


A1 = np.mat([[math.cos(RSP), -math.sin(RSP)*math.cos(Alpha1), math.sin(RSP)*math.sin(Alpha1), 0],
             [math.sin(RSP), math.cos(RSP)*math.cos(Alpha1), -math.cos(RSP)*math.sin(Alpha1), 0],
             [0, math.sin(Alpha1), math.cos(Alpha1), D1],
             [0, 0, 0, 1]])

#Transformation matrix for NAO’s right shoulder Roll joint (RShoulderRoll)


A2 = np.mat([[math.cos(RSR), -math.sin(RSR)*math.cos(Alpha2), math.sin(RSR)*math.sin(Alpha2), 0],
             [math.sin(RSR), math.cos(RSR)*math.cos(Alpha2), -math.cos(RSR)*math.sin(Alpha2), 0],
             [0, math.sin(Alpha2), math.cos(Alpha2), D2],
             [0, 0, 0, 1]])

#Transformation matrix for NAO’s right elbow Roll joint (RElbowRoll)


A3 = np.mat([[math.cos(Theta3+REY), -math.sin(Theta3+REY)*math.cos(Alpha3), math.sin(Theta3+REY)*math.sin(Alpha3), 0],
             [math.sin(Theta3+REY), math.cos(Theta3+REY)*math.cos(Alpha3), -math.cos(Theta3+REY)*math.sin(Alpha3), 0],
             [0, math.sin(Alpha3), math.cos(Alpha3), D3],
             [0, 0, 0, 1]])

#Transformation matrix for NAO’s right elbow Yaw joint (RElbowYaw)


A4 = np.mat([[math.cos(RER), -math.sin(RER)*math.cos(Alpha4), math.sin(RER)*math.sin(Alpha4), 0],
             [math.sin(RER), math.cos(RER)*math.cos(Alpha4), -math.cos(RER)*math.sin(Alpha4), 0],
             [0, math.sin(Alpha4), math.cos(Alpha4), D4],
             [0, 0, 0, 1]])

#Transformation matrix for NAO’s right wrist Yaw joint (RWristYaw)


A5 = np.mat([[math.cos(RWY), -math.sin(RWY)*math.cos(Alpha5), math.sin(RWY)*math.sin(Alpha5), 0],
             [math.sin(RWY), math.cos(RWY)*math.cos(Alpha5), -math.cos(RWY)*math.sin(Alpha5), 0],
             [0, math.sin(Alpha5), math.cos(Alpha5), D5],
             [0, 0, 0, 1]])

# Transformation matrix A calculation which includes all rotation transformations of the robot
A = A1*A2*A3*A4*A5

# Position move from wrist to hand - Transform - wrist to hand


Trans_wh = np.mat([[1,0,0,58],[0,1,0,0],[0,0,1,-15.90],[0,0,0,1]])

# Calculate the hand position using the move through the hand at A (change from shoulder to wrist)
Pos_w = A*Pos_s
Pos_h = Pos_w + np.mat([58,0,-15.90,0]).T

# Calculations of integrated transformation matrix ‘Trans’ and hand position


Trans = Trans_wh*A*Trans_os
Pos_h2 = Trans*Pos_o

# Output of the hand position from the integrated transformation matrix and from the separate translation and rotation calculation
Pos_h
Pos_h2

Results when RSP = 1.03395, RSR = -0.02305, REY = 1.61679, RER = 0.89129, and RWY = 1.17653:

# Integrated transformation matrix ‘Trans’


Trans
matrix([[ 0.69694106, -0.35896755, 0.62081838, 172.3705713 ],
[ -0.65414965, -0.67298179, 0.34522999, 132.72582755],
[ 0.2938731 , -0.64671308, -0.70384714, 99.73226304],
[ 0. , 0. , 0. , 1. ]])

Pos_h
matrix([[ 172.3705713 ],
[ 132.72582755],
[ 99.73226304],
[ 1. ]])

Pos_h2
matrix([[ 172.3705713 ],
[ 132.72582755],
[ 99.73226304],
[ 1. ]])

5.5 Inverse Kinematics

5.5.1 Overview
Inverse kinematics is the study of how much each joint must rotate to reach a given point when the position and orientation of the robot's hands or feet are specified. Beyond solving a single posture, it can also be used to control movements by calculating the joint angle values that move the hand position continuously.

There isn’t just one combination used for the robot’s hand to reach a certain position, so it is impossible
to find the matrix element to obtain the trigonometric function value that will calculate all the joint angle
values. Therefore, in order to separate the equation used to obtain the joint angle value, you must multiply
the inverse of the transformation matrix ‘An’ to the left side of the relation equation to obtain the elements
needed to calculate the angle values.

First, recall the transformation matrix A for NAO's right arm obtained in Section 5.4.2.

Although the wrist Yaw joint among NAO's arm joints is useful for determining the orientation of the hand, it is meaningless for obtaining the position, so it is unnecessary when calculating the hand position. For A, you therefore only need to obtain θ1, θ2, θ3, θ4 (the joint values of A1, A2, A3, A4) to determine the hand position, and the link offset of the wrist Yaw joint is moved out of the equation. The modified equation is as follows.

The following equation multiplies A1⁻¹, the inverse matrix of A1, onto each left-hand side to isolate the first shoulder joint.

Looking at element (3,4) of the equation above yields a relation between px and py from which θ1 is obtained, and elements (1,4) and (2,4) then yield θ2 (the resulting closed-form expressions are collected at the end of this section).

Joints 2 and 3 are parallel to each other, so multiplying the inverse matrices of A2 and A3 on the left is not suitable for obtaining their joint values.

Next, to obtain θ3, multiply the inverses of the preceding transformation matrices onto the left-hand side of the equation above, as demonstrated in the equation below.

Looking at element (4,3) of the equation above, and substituting, θ3 is obtained; then, from the expressions obtained from elements (2,1) and (2,3), θ4 follows.

Through the process above you can obtain the angle values (θ1, θ2, θ3, θ4) of each joint needed to position the robot's hand at (Px, Py, Pz) in the Cartesian coordinate system.

5.5.2 Using Python to Calculate the Inverse Kinematics of the Right Arm

The code below obtains the angle value of each joint as derived in Section 5.5.1. Joint values are obtained in order starting from NAO's shoulder joint, and exceptions are handled to prevent division by 0.
### Joint value calculation for moving to the desired coordinate
# Call library and definition of function

import math
import time

# Unit for position input: mm

# Accept the x, y, z coordinate values for the desired position.
px = input('Input desired x position :')
py = input('Input desired y position :')
pz = input('Input desired z position :')

### Calculate the joint angle value


# Joint1 (shoulder Pitch)
# Prevent dividing by 0
if px==0 :
    theta1 = 0
else :
    theta1 = math.atan(py/px)

# Joint2 (shoulder Roll)


if pz==0 :
    theta2 = 0
else :
    theta2 = math.atan((math.cos(theta1)*px + math.sin(theta1)*py)/pz)

# Joint3 (elbow Yaw)


if (math.sin(theta1)*px-math.cos(theta1)*py)==0 :
    theta3 = 0
else :
    theta3 = math.atan((math.cos(theta1)*math.cos(theta2)*px + math.sin(theta1)*math.cos(theta2)*py - math.sin(theta2)*pz)/(math.sin(theta1)*px-math.cos(theta1)*py))

# Joint4 (elbow Roll)


if pow(math.cos(theta3),2) - pow(math.sin(theta3),2)==0 :
    theta4 = 0
else :
    theta4 = 1/(pow(math.cos(theta3),2) - pow(math.sin(theta3),2))

# Output the calculated joint angle values
print theta1, theta2, theta3, theta4

# Wait to allow verifying the joint angle values
time.sleep(2)

5.5.3 Using Inverse Kinematics to Control Movements
The method explained in Section 5.5.2 can be used to obtain the joint angle value of the motion that positions
the hand, and these actions can be executed in order to control the movements.

A hand location is specified in the Cartesian coordinate system, the required joint angle values are entered into the joint movement plan in order, and the commands are then delivered to make NAO move. In the next code, where the joint angle values have already been determined, these values are used to control the movements.

### Proxy connection to access NAO


# library call
import naoqi
from naoqi import ALProxy

# Set the IP and PORT assigned to the current NAO


IP = "192.168.123.150"
PORT = 9559

# Connect proxy to ALMotion


proxy = ALProxy("ALMotion",IP,PORT)

### Calculate inverse kinematics


# Call library and definition of function
import math

### Call standby-related time library


import time

# Set joint movement speed


MaxSpeed = 0.2

# Set joint stiffness (the joint will not move if the stiffness is not set)
names = 'Body'
stiffness = 1.0
proxy.stiffnessInterpolation(names, stiffness, 1.0)

### Set arm joint name


# Right arm joint
RSP="RShoulderPitch"
RSR="RShoulderRoll"
REY="RElbowYaw"
RER="RElbowRoll"
RWY="RWristYaw"

# Joint name list for controlling the entire right arm
names_RArm = ["RShoulderPitch","RShoulderRoll","RElbowYaw","RElbowRoll","RWristYaw"]

# Left arm joints
LSP="LShoulderPitch"
LSR="LShoulderRoll"
LEY="LElbowYaw"
LER="LElbowRoll"
LWY="LWristYaw"

# Joint name list for controlling the entire left arm
names_LArm = ["LShoulderPitch","LShoulderRoll","LElbowYaw","LElbowRoll","LWristYaw"]

### Joint control command


# Default command form for joint control
# Names = joint name, angles = joint angle value, MaxSpeed = joint rotation speed
# proxy.setAngles(names,angles,MaxSpeed)

# Example of arm waving


proxy.setAngles("RShoulderPitch",-1.3,0.2)
proxy.setAngles("RShoulderRoll",-0.9,0.2)
proxy.setAngles("RElbowYaw",0.2,0.2)
proxy.setAngles("RElbowRoll",0.9,0.2)
proxy.setAngles("RWristYaw",-0.5,0.2)

proxy.setAngles("RShoulderRoll",-0.8,0.3)
time.sleep(1)

proxy.setAngles("RShoulderRoll",-0.3,0.3)
time.sleep(1)

proxy.setAngles("RShoulderRoll",-0.8,0.3)
time.sleep(1)

proxy.setAngles("RShoulderRoll",-0.3,0.3)
time.sleep(1)

proxy.setAngles("RShoulderRoll",-0.8,0.3)
time.sleep(1)

proxy.setAngles("RShoulderRoll",-0.3,0.3)
time.sleep(1)

proxy.setAngles("RShoulderRoll",-0.8,0.3)

# Set joint value


# Array of sequential joint values: 1st = joint value of the first position, 9th = joint value of the ninth position

# [[Joint1-1st, … Joint1-9th],[…],[…],[…],[Joint5-1st, … Joint5-9th]]
angleLists = [[0,-1.3,-1.3,-1.3,-1.3,-1.3,-1.3,-1.3,0],[0,-0.9,-0.8,-0.3,-0.8,-0.3,-0.8,-0.3,0],[0,0.2,0.2,0.2,0.2,0.2,0.2,0.2,0],[0,0.9,0.9,0.9,0.9,0.9,0.9,0.9,0],[0,-0.5,-0.5,-0.5,-0.5,-0.5,-0.5,-0.5,0]]
anglesinit = [[0],[0],[0],[0],[0]]

# Set time intervals for action


# Set when it will move according to the joint value, unit : seconds
# Moves according to the joint value corresponding to each time, ex) 1sec - 90°, 3sec - 35°
times = [[1,2,3,4,5,6,7,8,9],[1,2,3,4,5,6,7,8,9],[1,2,3,4,5,6,7,8,9],[1,2,3,4,5,6,7,8,9],[1,2,3,4,5,6,7,8,9]]
timesinit = [[1],[1],[1],[1],[1]]

# Option that determines whether relative joint values (offsets from the current joint value) or absolute joint values will be used
isAbsolute = True

# Initialize joint position


# Set joint values of both arms to 0.
proxy.angleInterpolation(names_RArm, anglesinit, timesinit, isAbsolute)
proxy.angleInterpolation(names_LArm, anglesinit, timesinit, isAbsolute)
time.sleep(5)

# Call actions according to the fixed list of joint values


proxy.angleInterpolation(names_RArm, angleLists, times, isAbsolute)

proxy.angleInterpolation(names_RArm, anglesinit, timesinit, isAbsolute)


time.sleep(5)

proxy.angleInterpolation(names_LArm, angleLists, times, isAbsolute)

proxy.angleInterpolation(names_LArm, anglesinit, timesinit, isAbsolute)

time.sleep(5)

# End action and output result


print proxy.getSummary()

Image 5.13 is a simulation screen showing how Choregraphe moves the NAO robot as the following code executes. Nine target positions are received and the joint angle values are calculated; the robot then moves by delivering these values to each joint. (Use Python to create the executable file from the code shown below.)

### Sequential movement control using inverse kinematics (pick up)


# Call library
import math
import time
import naoqi
from naoqi import ALProxy

Image 5.13 - Using inverse kinematics for NAO's sequential movement control (picking up)

# Set the IP and PORT currently assigned to NAO
IP = "192.168.123.145"
PORT = 9559
proxy = ALProxy("ALMotion",IP,PORT)

# Set the joint movement speed


MaxSpeed = 0.2

# The joint will not move if stiffness is unset, so set it as follows
names = 'Body'
stiffness = 1.0
proxy.stiffnessInterpolation(names, stiffness, 1.0)

point = [[1,-1,-1],[1,-0.8,-1],[1,-0.5,-1],[1,-0.3,-1],[1,0.01,-0.8],[1,0.3,-1],[1,0.5,-1.2],
[1,0.55,-1.4],[1,0.7,-1.57]]

for i in range(0,9) :
    px = point[i][0]
    py = point[i][1]
    pz = point[i][2]
    print px,py,pz

    # Calculate the joint angles
    if px==0 :
        theta1 = 0
    else :
        theta1 = math.atan(py/px)

    if pz==0 :
        theta2 = 0
    else :
        theta2 = math.atan((math.cos(theta1)*px + math.sin(theta1)*py)/pz)

    if (math.sin(theta1)*px-math.cos(theta1)*py)==0 :
        theta3 = 0
    else :
        theta3 = math.atan((math.cos(theta1)*math.cos(theta2)*px + math.sin(theta1)*math.cos(theta2)*py - math.sin(theta2)*pz)/(math.sin(theta1)*px - math.cos(theta1)*py))

    if pow(math.cos(theta3),2) - pow(math.sin(theta3),2)==0 :
        theta4 = 0
    else :
        theta4 = 1/(pow(math.cos(theta3),2) - pow(math.sin(theta3),2))
    print theta1,theta2,theta3,theta4

    ### Deliver commands to the arm joints
    # Operate the right arm joints
    proxy.setAngles("RShoulderPitch",theta1,0.2)
    proxy.setAngles("RShoulderRoll",theta2,0.2)
    proxy.setAngles("RElbowYaw",theta3,0.2)
    proxy.setAngles("RElbowRoll",theta4,0.2)
    # Operate the left arm joints
    proxy.setAngles("LShoulderPitch",theta1,0.2)
    proxy.setAngles("LShoulderRoll",-theta2,0.2)
    proxy.setAngles("LElbowYaw",theta3,0.2)
    proxy.setAngles("LElbowRoll",-theta4,0.2)

    # Standby time before the next movement
    time.sleep(1)

proxy.setAngles("RShoulderRoll",(-0.2),0.2)
proxy.setAngles("LShoulderRoll",-(-0.2),0.2)

6 Comprehensive Examples

The comprehensive exercises in Chapter 6 use the information covered so far to look at different methods and examples for implementing NAO applications. Advanced Choregraphe features and expansion methods are used here, and you will practice using the Timeline Editor.

In addition, landmark recognition will be used to create a path-finding program, and the multiplication example will teach some techniques for Python and the NAOqi API. Last but not least, image recognition will be used to classify objects, and inverse kinematics and NAOqi usage will be explained.

Content

6.1 Choregraphe Application 218
6.1.1 Program Configuration 218
6.1.2 NAOqi API 219
6.1.3 Keyframe 226
6.1.4 Timeline Editor 231

6.2 Motion Control – Timeline Editor 235
6.2.1 Saving NAO's Actual Movements 235
6.2.2 Adjusting NAO's Movements 238
6.2.3 Controlling Joint Movements 242

6.3 Getting Directions Using Landmarks – Using Choregraphe 246
6.3.1 Studying and Recognizing Landmarks 247
6.3.2 Programming 248

6.4 Memorizing the Multiplication Table – Python and NAOqi Application 251
6.4.1 Singular Number (Multiplicand) Calculator 252
6.4.2 Adjusting the Multiply Box 254
6.4.3 "Say" Box Expansion 255
6.4.4 Box Placement and Connection 258

6.5 Combining Recognition and Movement – Using Images for Object Recognition and Grabbing Motion 261
6.5.1 Object Recognition 262
6.5.2 Grabbing the Object Using Inverse Kinematics Analysis 263
6.5.3 Grabbing the Object Using Inverse Kinematics Analysis 264
6.5.4 Combining Recognition and Grabbing Motion 267
6.1 Choregraphe Application

Chapter 2 introduced the interface, basic programming method, and boxes provided by Choregraphe.

Chapter 3 explored how to use Python to create new boxes and edit box scripts.

Chapter 4 looked into the NAOqi framework that makes up NAO's system and the DCM, which is in charge of the communication between NAO's devices. In this chapter, we will use the advanced features of Choregraphe and Python to edit scripts and use the NAOqi framework and DCM features to implement actual programs.

6.1.1 Program Configuration

Image 6.1 - Program configuration

Most of the boxes provided by Choregraphe contain diagrams; they are combinations of boxes built from a diagram's internal scripts or boxes composed of a Timeline. Like the Say box, a script box is made of Python (or Urbi) and the NAOqi API. Like the Arms Example box, a Timeline box is made up of Keyframes and Frames. Programming NAO's movements in Choregraphe means placing and connecting these boxes.

Timeline keyframe has an internal diagram, and its special feature is being able to start at a specific
frame. For example, the Hello box provided by Choregraphe is a Timeline box composed of defined frames
and the FaceLeds keyframe. The diagram for the FaceLeds keyframe is composed of the Light_
AskForAttentionEyes box in script form. (Image 6.2)

Meaning, boxes that use various devices and joints (like the Hello box) are composed of scripts
that call the methods of all the devices and Timeline with predefined movements.

Image 6.2 - Hello box structure

6.1.2 NAOqi API

As introduced earlier, a box script consists of Python and the NAOqi API. The NAOqi API can control NAO's system and most of its devices. Therefore, in order to create or edit boxes, you have to use both Python and the NAOqi API. Since Chapter 3 explained how to create and edit script boxes, that explanation is omitted here. Much like the Eyes LEDs box script, most boxes use the NAOqi API (Image 6.3).

Image 6.3 - onInput_onStart method for the Eyes LEDs box script

NAOqi is largely divided into 35 modules, each explained in detail in its own reference (NAOqi API). Graph 6.1 lists the NAOqi API modules. Like the getParameter method, some methods in use are not registered in a NAOqi API module. This chapter will mainly use the NAOqi API, introducing whatever method is necessary for the example being presented.

Graph 6.1 - NAOqi API module name

ALAudioDevice ALAudioPlayer ALAudioSourceLocalization

ALBehaviorManager ALBluetooth ALBonjour

ALFaceDetection ALFaceTracker ALFileManager

ALFrameManager ALFsr ALInfrared

ALLandMarkDetection ALLaser ALLauncher

ALLeds ALLogger ALMemory

ALMotion ALMotionRecorder ALPreferences

ALPythonBridge ALRedBallDetection ALRedBallTracker

ALRobotPose ALSensors ALSentinel

ALSonar ALSoundDetection ALSpeechRecognition

ALTextToSpeech ALVideoDevice ALVisionRecognition

ALVisionToolbox DCM

■ Example 6.1 Turning off all LEDs for a certain period of time (Example File: Ex6.1_AllLEDs.crg)

NAO not only has LEDs for eyes, ears, and feet controlled by the LED box but also LEDs on the chest
and around the contact sensors. There are ten small blue LEDs on one ear configured at 36 degree intervals.
One eye has eight red, green, and blue LEDs configured at 45 degree intervals. The chest (power button)
and feet LEDs have one red, one green, and one blue LED. Each LED has a group name: AllLeds, ChestLeds,
EarLeds, FaceLeds, and FeetLeds are the most commonly used. The contact sensor LEDs can be turned on and off through AllLeds much like the other LEDs, but they cannot be controlled independently. Other group names and LED names are specified in the reference (Advanced/Robot Lights).

The LED library boxes introduced in Chapter 2 target the LEDs of specific parts (such as the ears and eyes), and their control is fixed. To control the LEDs more flexibly, the user must edit the box. The ALLeds module of the NAOqi API provides the methods to control the LEDs.

This example will use Python programming to turn off the LEDs for the eyes and ears and to turn them
back on after a certain period of time. RGB color representation will be used to control the LEDs of the chest,
eyes, and feet. You can easily turn the LEDs on or off by using the ‘off’ and ‘on’ methods.

The ‘fade’ method adjusts the intensity of one LED while the ‘fadeRGB’ method adjusts RGB type LED colors
(ex: eye LEDs). Both ‘off’ and ‘on’ methods immediately turn the LEDs on or off as soon as they are called.
If you set the duration of fade/fadeRGB method to 0, you will get the same effect as the off/on method.

This example will control the LEDs by using the ‘fade’ and ‘fadeRGB’ methods. The following is
an explanation of the off/on, fade, and fadeRGB methods:

- void off(on)(const string& name)
Role: turns a specific LED or a group of LEDs on or off.
name: the name of an LED or a group of LEDs.

- void fade(const string& name, const float& intensity, const float& duration)
Role: changes the intensity of a specific LED or a group of LEDs over a set period of time.
name: the name of an LED or a group of LEDs.
intensity: the LED intensity, a value from 0.0 to 1.0.
duration: the time taken to change the LED brightness, in seconds.

- void fadeRGB(const string& name, const int& rgb, const float& duration)
Role: changes the color of a specific LED or a group of LEDs over a set period of time.
name: the name of an LED or a group of LEDs.
rgb: the RGB value, an integer; represented in hexadecimal it is 0x00RRGGBB.
duration: the time taken to change the LED color, in seconds.
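Outside a Choregraphe box (where the ALLeds object is provided automatically), the same methods can be called through a proxy. A minimal sketch, assuming an example IP address:

# Minimal sketch: controlling LEDs through an ALLeds proxy.
from naoqi import ALProxy

# Assumed example address; replace with your robot's IP and port.
leds = ALProxy("ALLeds", "192.168.123.150", 9559)

leds.fade("AllLeds", 0.0, 2.0)             # fade all LEDs off over 2 seconds
leds.fadeRGB("FaceLeds", 0x00FF0000, 1.0)  # fade the eye LEDs to red over 1 second
leds.on("AllLeds")                         # turn everything back on immediately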

Image 6.4 - AllLeds box and the Parameters screen

The AllLeds box we will create in this example consists of one input, one output, and three parameters.
The following explains the input and parameters (Image 6.4).

- Input (RGB value): executes the box and delivers the RGB value (a numeric array of size 3).

- Duration_keep: float parameter; the time (in seconds) the LEDs are kept turned off. Values range from 0.0 to 30.0; the default is 10.0 seconds.

- Duration_off(on): float parameter; the time (in seconds) spent turning the LEDs completely off (or on). Values range from 0.0 to 10.0; the default is 2.0 seconds.

Image 6.5 - AllLeds box and the Parameters screen

A - Create a box named AllLeds. Image 6.5 shows an explanation of each input/output.

B - The following is the script for turning all the LEDs on or off.

1. def onInput_RGBvalue(self, p):
2.     ALLeds.fade("AllLeds", 0.0, self.getParameter("Duration_off"))
3.     time.sleep(self.getParameter("Duration_keep"))
4.     ALLeds.fade("AllLeds", 1.0, self.getParameter("Duration_on"))
5.     pass

The onInput_RGBvalue(self, p) method becomes active when a signal arrives at the input of the AllLeds box. As previously stated, the input is a numeric array with three elements. The next section explains how the RGB value is used.

The LEDs are controlled using the fade() method of the ALLeds module (Line 2). "AllLeds", the first parameter, is the group containing all of NAO's LEDs. 0.0, the second parameter, minimizes the LED intensity. self.getParameter("Duration_off"), the third parameter, retrieves the value of the AllLeds box's Duration_off parameter; this value is the time spent until the LEDs are completely turned off. time.sleep() in Line 3 is the time module's sleep method provided by Python; it delays execution for a certain amount of time. The sleep parameter is numeric and measured in seconds. The script turns off all the LEDs and uses the sleep method in order to turn them back on after a certain amount of time; the Duration_keep parameter of the AllLeds box is passed as the sleep argument.

ALLeds.fade() in Line 4 is used differently than in Line 2: here it turns all the LEDs on. The second parameter, 1.0, maximizes the LED intensity.

If you execute the box with just the part of the script above inserted into the default script, you will see that NAO's ear LEDs and contact sensors turn blue, while the eye LEDs, power button, and feet turn white. The white comes from the RGB color representation (255, 255, 255), which is why the color is white when the second parameter (LED intensity) of the 'fade' method is 1.0. The RGB color value of black is (0, 0, 0).

[3] Controlling the eye, chest, and feet LEDs through RGB color representation works the same way as in the Eyes LEDs box. The following script controls the LEDs through the input signal of the AllLeds box and RGB color representation.

1. def getRGB(self, r, g, b):
2.     return 256*256*r + 256*g + b
3.     pass
4. def clampColor(self, p):
5.     if(p < 0):
6.         p = 0
7.     if(p > 255):
8.         p = 255
9.     return p
10. def onInput_RGBvalue(self, p):
11.     r = self.clampColor(p[0])
12.     g = self.clampColor(p[1])
13.     b = self.clampColor(p[2])
14.     rgb = self.getRGB(r,g,b)
15.     #… LED on and off …
16.     ALLeds.fadeRGB("ChestLeds", rgb, self.getParameter("Duration_on"))
17.     ALLeds.fadeRGB("FaceLeds", rgb, self.getParameter("Duration_on"))
18.     ALLeds.fadeRGB("FeetLeds", rgb, self.getParameter("Duration_on"))
19.     pass
19. pass

In order to use RGB color representation, you have to convert the three color values into one; the getRGB method in Lines 1-3 handles this.

The parameters of the getRGB method are r (red), g (green), and b (blue); the converted RGB value is 256*256*r + 256*g + b. The clampColor method guarantees the validity of each color value (Lines 4-9). As introduced earlier, each RGB component has a value between 0 and 255; if a value exceeds this range, a completely different color would be produced. The clampColor method prevents this.

After receiving the array of R, G, and B values from the Color Edit box, the onInput_RGBvalue method validates the color values and converts them to one number. Parameter 'p' holds the R, G, and B color values as elements of a numeric array.

Lines 11-13 call the clampColor method to validate the three color values. Line 14 calls the getRGB method to convert them to one number. Line 15 marks where the LEDs are turned off and on; further explanation is omitted since it was covered above. Lines 16-18 set the chest, eye, and feet LEDs to the RGB color: the ChestLeds, FaceLeds, and FeetLeds groups are passed as strings in the first parameter, and the second parameter 'rgb' is the number produced by the getRGB method.
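As a quick worked example of the conversion: pure red (r, g, b) = (255, 0, 0) becomes 256·256·255 = 16,711,680 (0x00FF0000 in hexadecimal), the same format accepted by fadeRGB.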

The following is the complete AllLeds box script.

1. class MyClass(GeneratedClass):
2.     def __init__(self):
3.         GeneratedClass.__init__(self)
4.     def onLoad(self):
5.         pass
6.     def onUnload(self):
7.         pass
8.     def getRGB(self, r, g, b):
9.         return 256*256*r + 256*g + b
10.        pass
11.    def clampColor(self, p):
12.        if(p < 0):
13.            p = 0
14.        if(p > 255):
15.            p = 255
16.        return p
17.    def onInput_RGBvalue(self, p):
18.        r = self.clampColor(p[0])
19.        g = self.clampColor(p[1])
20.        b = self.clampColor(p[2])
21.        rgb = self.getRGB(r,g,b)
22.        ALLeds.fade("AllLeds", 0.0, self.getParameter("Duration_off"))
23.        time.sleep(self.getParameter("Duration_keep"))
24.        ALLeds.fade("AllLeds", 1.0, self.getParameter("Duration_on"))
25.        ALLeds.fadeRGB("ChestLeds", rgb, self.getParameter("Duration_on"))
26.        ALLeds.fadeRGB("FaceLeds", rgb, self.getParameter("Duration_on"))
27.        ALLeds.fadeRGB("FeetLeds", rgb, self.getParameter("Duration_on"))
28.        pass

Image 6.6 shows an example of how to use the AllLeds box.

Although you can edit the box to use the RGB color values for the parameters, it would be easier to use
the Color Edit box to set the desired colors.

Image 6.6 - How to use the AllLeds box

6.1.3 Keyframe
A Timeline box is made of Frames, which define movements over time, and Keyframe diagrams, which start at specific points in time. Chapter 2 explained how to define movements in Frames; the example below shows how Keyframes are used.

■ Example 6.2 Exercises and Shouting (Example File: Ex6.2_Aerobic.crg)

Frame 1 - Aerobic
Frame 10 - One
Frame 20 - Two
Frame 30 - Three
Frame 40 - Four
Frame 50 - Five
Frame 60 - Six
Frame 70 - Seven
Frame 80 - Eight
Frame 90 - Finished

Image 6.7 - Movements and shouting

Example 6.2 demonstrates simple exercise movements with shouting matched to each movement (Image 6.7). You can use Keyframes to implement the desired behavior at specific times. How to create the movements is omitted since it was explained previously; the Aerobic box where NAO's movements are defined is saved in the example file (Ex6.2_Aerobic.crg). The following shows how to use Keyframes.


Image 6.8 - Creating Behavior layers and Keyframe

A - Open the example file with predefined movements (Ex6.2_Aerobic.crg).


Double-click the box and open the Timeline window (Image 6.8).

B - A layer is created when you press the + button in 'Behavior layers.' A layer can hold several sequential Keyframes. A diagram opens if you select keyframe1, which is generated initially; this diagram starts at Frame 1. The start frame can be changed by moving the Keyframe.

You can move a Keyframe by dragging it with the mouse or by right-clicking on the Keyframe and selecting the Edit Keyframe menu.

The diagram's onLoad input ((1) in Image 6.8) activates when the current frame passes the Keyframe's start frame.

The onStopped output ((2) in Image 6.8) marks the end of the Aerobic box.

Image 6.9 - Adding and selecting Keyframe

C - To add or delete a Keyframe, right-click on top of the Keyframe and use Insert Keyframe/Delete Keyframe. As you can see in Image 6.9, the Keyframe is created at the mouse position. The selected Keyframe turns bright blue while unselected ones turn light purple.

Image 6.10 - Box placement for each Keyframe diagram

D - Add Keyframes at Frames 1, 10, 20, ... 90, and place a Text Edit box and a Say Text box in each Keyframe diagram. Fill in the Text Edit box for each Keyframe as shown in Image 6.10. Be careful not to connect the end output of the Say Text box and the end output of the diagram to each other.

■ Example 6.3 Stand Up Box Analysis

Chapter 2 omitted detailed information about Keyframes, so we did not explore how to use the Play, Stop, Goto And Play, and Goto And Stop boxes of the Tool library. Keyframe manipulation will be explained here by analyzing the Stand Up box (from the Motion library).

Image 6.11 - Keyframe structure of the Stand Up box

Image 6.11 shows the Keyframe structure of the Stand Up box. The solid red lines show the transitions between Keyframes; the solid blue lines show connections within the same Keyframe. Keyframe movements are implemented using the Goto And Stop box. Unlike the Goto And Stop box introduced in Chapter 2, which sets a frame number, the Goto And Stop box used by the Stand Up box sets the name of the frame.

Image 6.12 - DetectRobotPose Keyframe configuration

The Keyframe flow mainly moves twice. The first move occurs in the DetectRobotPose Keyframe as shown in Image 6.12. Depending on whether NAO's current pose obtained from the 'Get Robot Pose' box is "Sit", "Crouch", "Knee", "Frog", "Belly", "Back", "Left", or "Right", the flow moves to the Keyframe defined for standing up from that pose. Here, the 'Goto And Stop' box must be used to prevent interference between these defined Keyframes.

Image 6.13 - FromSit Keyframe configuration

The second move occurs inside the selected Keyframe (Image 6.13). For example, when NAO is sitting down, the flow moves from the DetectRobotPose Keyframe to the FromSit Keyframe. The StandFromSitted box (moving from the sitting position to standing) is inside the FromSit Keyframe. Once the StandFromSitted box completes, the number of tries increases by 1 through the Increase Count box. The flow then moves back to the DetectRobotPose Keyframe to determine whether standing was successful and to retry if it was not.

6.1.4 Timeline Editor
In Chapter 2 we used time-based programming to define NAO's movements in simple frames, and we saw that the movement between two frames (with defined joint rotations) was automatically interpolated by set rules.

The Timeline Editor can be used to manipulate these automatically interpolated movements, either with the preset forms or manually. You can also record NAO's actual movements into Timeline frames at regular intervals by using the motion recording function.


Image 6.14 - Example of Timeline Editor (curves mode) and joint information

Image 6.14 shows a Timeline Editor screen of a simple movement. The Timeline Editor is divided into worksheet mode, curves mode, and record mode. Worksheet mode shows whether each joint has been defined (Image 6.15). Curves mode shows the changes of the joints (Image 6.14). Record mode provides the function for saving NAO's movements into Choregraphe frames; a later example explains how to use it.
Image 6.15 - Worksheet mode

The Timeline Editor interface is largely divided into four parts. Actuators ((1) in Image 6.14) shows the name and color of NAO's joints. (2) in Image 6.14 consists of buttons that manipulate the frames. (3) in Image 6.14 shows the joint movements; the points here indicate that the movements (joint rotation angles) in the corresponding frames have been defined. Worksheet mode shows whether each joint is being used. The Record menu saves NAO's actual movements into Choregraphe's frames ((4) in Image 6.14).

Graph 6.2 - Curves mode buttons

Constant: Executes joint movements only at defined frames. The joint movements seen thus far have been smooth, but movements edited with Constant are stiff.

Linear: The movement is linearly interpolated up to the next frame where a movement has been defined.

Bezier: Interpolation using a Bezier curve. It moves rather slowly at the beginning and end compared to the middle. Two tangents are created at each point, and the user can adjust the slope of a tangent to shape the Bezier curve.

Automatic Bezier: Interpolation using a Bezier curve, with Choregraphe setting the tangent slopes at both ends, using smoothness of motion as the criterion. This is the curve normally produced when movements are defined in Timeline frames.

Bezier Smooth: Interpolation using a Bezier curve, similar to the two methods above. The only difference from the Bezier method is that the two tangents stay in a straight line: if you change the slope of the left tangent, the slope of the right tangent changes with it.

Bezier Symmetrical: Similar to the Bezier Smooth method, except that the lengths of the two tangents determine the shape of the Bezier curve. The longer the two tangents, the more linear the Bezier curve becomes.

Simplify: Removes movement definitions from frames whose joint values do not change. For example, if NAO's movements are defined in four frames and the head joint is not used in the entire section, this button deletes the head joint definition in the two middle frames.

Show tangents: Shows the tangent limited to the joints and movement-defined frame in Curves mode.

View all: If the output of Curves mode was edited by the user, this button can be used to initialize it.

Frame manipulation in the Timeline Editor uses the Curves buttons shown in (2) of Image 6.14 and the joint information curves in (3) of Image 6.14. When manipulating Curves mode manually, you work with the joint points defined at each frame. Graph 6.2 explains the buttons used in Curves mode.

Image 6.16 - Record mode and operational buttons for Timeline Editor

If you use Timeline Editor’s Record mode, you can save NAO’s actual movements in Choregraphe frames.
As introduced earlier, Record mode can be executed by using ‘Switch recording mode’ in the Record menu.
If you place a checkmark for ‘View record toolbar’ in the Record menu, Record mode buttons will be added
to the Timeline Editor interface (Image 6.16).You can see that buttons for recording and playback have been
added for each joint in the Actuators window of Timeline Editor (Image 6.16). Image 6.16 shows the Record
mode buttons on the upper right corner: (starting from the left) switch recording mode button, start button,
and settings button.

Image 6.17 - Setup window for Record mode

As shown in Image 6.17, a setup screen opens when you click the 'settings' button in Record mode. 'Mode' selects how NAO's actual joint information is saved in Choregraphe, and 'Advanced' sets when it is saved. 'Periodic' (in 'Mode') saves NAO's actual movements to the frames at set intervals, while 'Interactive using bumpers' uses NAO's bumper sensors to trigger saving: after physically posing NAO, the user saves the pose to the Choregraphe frame by pressing NAO's left bumper, and the right bumper toggles the joint locks. 'Time step' (in 'Advanced') is the interval at which NAO's movements are saved, and 'Allow Timeline extension' sets whether Timeline's end frame can be extended.

We have explored NAOqi API, Timeline’s Timeline Editor, Keyframe, and Record mode.
The following sections will show examples of how to create exercise movements using Timeline Editor
and Record mode, finding paths using landmarks, and memorizing the multiplication table using Python
and NAOqi API.

6.2 Motion Control – Timeline Editor

Interpolating the joint values between two frames where NAO's joint values have already been set is both important and useful. The Curve control buttons introduced in Graph 6.2 each correspond to an interpolation method. Linear interpolation produces a constant rate of change between the two frames. NAO's interpolation instead focuses on how suitable the result is for the movement: Choregraphe uses the Automatic Bezier method, which emphasizes smoothness of motion. With the Curve control buttons you can easily implement quick, sweeping joint motions as well as slow rotations.
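As a reference formula (standard linear interpolation, not spelled out in the original text): for two frames at times $t_0$ and $t_1$ with joint values $v_0$ and $v_1$, the interpolated value at an intermediate time $t$ is

$$v(t) = v_0 + (v_1 - v_0)\,\frac{t - t_0}{t_1 - t_0}$$

so the joint value changes at a constant rate between the two frames.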

The Record mode saves NAO’s actual movements. It is not an easy task for the user to select the frame from
the Timeline window to save the joint values in the appropriate frame. Choregraphe provides the ability to record.
The Record mode saves NAO’s movements in the frame by using specific time intervals and specific frame intervals
through signals provided by the user. This feature helps the user program a more ideal movement.

In this section, Timeline Editor will be used to record NAO’s actual movements, adjust joint movements,
and control movement formations between frames using the Curve control button. This exercise is implemented
in ch6.2.timeline editor.crg.

6.2.1 Saving NAO's Actual Movements


As introduced above, the Timeline Editor can be used to save NAO's actual joint positions in Timeline frames. In this section, NAO's bumper sensors are pressed, using the 'Interactive using bumpers' method, to save movements into frames.

Image 6.18 - Setting the Record mode (left) and Record Start message (right)

Set the Record mode to 'Interactive using bumpers,' set the time step to 0.5 seconds, and activate 'Allow Timeline extension' (left side of Image 6.18). The message on the right side of Image 6.18 appears when you press the start button in Record mode; the movement starts saving once you press the OK button.

The order for saving movements is as follows:

Image 6.19 - Select the joints to be used in Record mode

a - In order to save NAO's actual movements, you must select the joints that will be used (Image 6.19). This exercise uses the head and arm joints, so click the Record button for both Head and Arms; the buttons turn red. If a joint is not selected, it will neither lock nor unlock even when NAO's right bumper is pressed.

Image 6.20 - When the right bumper is pressed

b - Click the 'stand' position in the Pose Library to set the initial pose, and then press NAO's left bumper. When the left bumper is pressed, a red line appears at Frame 7 as shown in Image 6.19. This is the frame where NAO's movement is saved; it is determined by both the FPS value of the Timeline and the 'time step' value of Record mode. In this exercise the default FPS (15) and a 'time step' of 0.5 seconds were used: FPS (15) × time step (0.5 s) = 7.5 frames determines the interval at which the actual movements are saved.


Image 6.21 - Movements of actual NAO and 3D NAO

C - After setting the initial pose, press the right bumper to unlock the joints. After moving the unlocked joints into the desired position, press the right bumper again to lock them. Once the desired pose is set, press the left bumper to save the position to the Choregraphe frame. Image 6.21 shows the screen after NAO's arm has been stretched forward and the pose saved to the frame. A red line is shown at Frame 15 in the Timeline, meaning poses are saved at 7.5-frame intervals; since frames are whole numbers, the pose in step b was saved at frame 7 while this one was saved at frame 15.

Image 6.22 - Actual NAO and 3D NAO movements according to frames

D - The tasks from this point on repeat steps b and C, so detailed descriptions are omitted here. The movements used in this exercise are shown in Image 6.22.

Image 6.23 - Curve movements saved in the frames

e - When all movements have been saved, click the Stop button in Record mode to stop recording. When the recording stops, the movement curves saved in the frames appear on the screen as shown in Image 6.23. You can see the frame with the initial pose we previously selected and the 5 frames from Image 6.22; the movement starts at Frame 7 and ends at Frame 45.

6.2.2 Adjusting NAO's Movements


The previous section demonstrated how to save NAO's actual movements into Timeline frames using Record mode. Although Record mode can be more convenient than the 3D NAO joint control window, it has the disadvantage that the Mirroring function cannot be used. In this section, we adjust the movements saved with Record mode in the Timeline Editor; through this process the user can control NAO's joints with more accuracy. Joints can be controlled as follows:

Image 6.24 - Joint curves of the desired area

a - Select the joints you want to observe in Actuators in the Timeline Editor window. Image 6.24 shows the joint curves for both the left and right arms. If all of NAO's joints were used, 25 curves would be shown; with this feature, only the desired joints are displayed.

Image 6.25 - Rotation information of the LShoulderPitch and RShoulderPitch joints (Frame 7)

b - When the cursor is placed on a point inside a frame where movement has been defined, the frame number and joint rotation information appear as shown in Image 6.25, here for the LShoulderPitch and RShoulderPitch joints.

Image 6.26 - Expansion of the joint curve (left) and curve key edit menu (right)

C - If you would like to set two joints to the same value, drag the point of the corresponding joint to move it. If more precise control is desired, the expansion button in the Timeline Editor or the mouse wheel can be used to zoom the screen (left side of Image 6.26). You can edit with more ease by using the Curve key edit button (the pencil-shaped button, right side of Image 6.26).

Image 6.27 - HeadYaw and HeadPitch joint information

D - This exercise uses the ShoulderPitch, ShoulderRoll, ElbowYaw, and ElbowRoll joints. However, other joints may move in the process of locking and unlocking the actual NAO joints; the previously introduced 'Simplify' feature can remove this kind of joint movement.

Image 6.28 - Removal of joint movement using the 'Simplify' feature

Image 6.27 shows the HeadYaw and HeadPitch joint information. The head joints were not moved deliberately, but they shifted because of the external disturbance from handling NAO.

To remove this slight joint movement, select the area to clean up ((1) in Image 6.28) and then click the 'Simplify' button.

If the movement variation in the selected area is below the error margin, the movement is removed ((2) in Image 6.28). The movement is removed across the whole selection, but the joint information at the start and end points is preserved ((3) in Image 6.28).

image 6.29 - Result of controlling NAO’s movements

E - In Image 6.29, the 'simplify' feature has been used to remove the movements of the hand and neck
joints, and the joints in use move symmetrically. Compared to Image 6.23, there are fewer curves;
this is because the curves of joints with the same rotation direction overlap.

6.2.3 Controlling Joint Movements


One of the important features of the Timeline Editor is being able to set how the joints move between frames.

As previously introduced, joints move using Automatic Bezier interpolation by default. In this section, we will
use the adjusted movement from 6.2.2 and observe how NAO moves as the form of the joint movements is modified.

image 6.30 - Movements according to the Automatic Bezier formation

A - image 6.30 shows NAO's movements under Automatic Bezier interpolation. The orange frames have been
previously defined, so they do not change even if the interpolation changes. The blue frames represent
the movement between two frames with previously defined movements; NAO's movements will change
if the form of this interpolation changes.


image 6.31 - Movement change to linear form

B - In order to change the joint movement information, you must select the points on the joint curve,
as shown in (1) of image 6.31. You can select the points by dragging the mouse or by holding the 'Ctrl' key
and clicking the left mouse button.

image 6.32 - Movement of the Linear formation

After selecting the points, click the Linear button ((2) in image 6.31). When you click the Linear button,
the movement curve changes into a straight line, as shown in (3) of image 6.31.
image 6.32 shows the movement in Linear form; you can see that it is not very different
from the Automatic Bezier form.

C - You must set the Constant form to get rigid movements. The setup is the same as in Step [2].
image 6.33 shows the movement curves of the Automatic Bezier form and the Constant form;
you can see that the joint moves exactly at the points where it was defined.

image 6.33 - Movement from Automatic Bezier formation and Constant formation

image 6.34 shows the Constant movements, and you can see that they differ from the movements of the
two other forms mentioned earlier. The Constant form moves the joint only at the points where it was
defined, so it is easy to see that nothing moves until Frame 14.

However, NAO is still in its initial pose even at Frame 15. This happens because it takes time
to deliver the actual commands; in other words, NAO's movement actually completes at the frame
right after the defined frame.

Another important point is that the movement at Frame 45, the completion frame, is different.
The completion frame no longer needs to deliver a command, so Choregraphe does not send any further
signals to the robot for it.

image 6.34 - Movements from using the Constant form

You must be careful when executing large rotations with the Constant form. If you try to bend the arm
backward too quickly while using the Constant form, NAO may stumble backward.
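
To make the difference between the interpolation forms concrete, the following sketch (plain Python for illustration only, not Choregraphe's internal code; the keyframes and angles are hypothetical) computes a joint angle between two keyframes in Linear and Constant form:

import bisect

def angle_at(frame, keys, mode="linear"):
    # keys: list of (frame, angle) pairs sorted by frame number
    frames = [f for f, a in keys]
    if frame <= frames[0]:
        return keys[0][1]
    if frame >= frames[-1]:
        return keys[-1][1]
    i = bisect.bisect_right(frames, frame) - 1
    (f0, a0), (f1, a1) = keys[i], keys[i + 1]
    if mode == "constant":
        # Constant: hold the previous key's angle until the next key frame
        return a0
    # Linear: interpolate proportionally between the two keys
    return a0 + (a1 - a0) * float(frame - f0) / (f1 - f0)

keys = [(7, 0.0), (15, 1.0), (45, 0.0)]   # hypothetical keyframes
print angle_at(11, keys, "linear")        # 0.5 - already halfway to the next key
print angle_at(11, keys, "constant")      # 0.0 - still holding the old pose

This matches the behavior observed above: in Constant form the joint holds its pose until the defined frame, while in Linear (and, with smoothing, Automatic Bezier) form it is already moving between keys.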

6.3 Getting Directions Using Landmarks – Using Choregraphe

Processing image data within the robot can be very useful for many diverse operations. For example,
the robot can identify objects that appear in an image or calculate the location of the object
in question; this is something that cannot be done with infrared and laser sensors or microphones.

Despite this advantage, processing image data is an extremely difficult task, so it is not used very
often in the early stages. It requires a lot of diverse knowledge about the characteristics and patterns
of the image data, and the user must configure the detection through mathematical analysis.

This problem can easily be solved by using the default image recognition modules. NAO provides,
as modules, algorithms that detect marks, faces, and images of objects selected by the user.
Chapter 2 on Choregraphe showed how to use the boxes built from these modules; this section
will use the 'Vision Reco' box to control NAO's walking.

image 6.35 - Program configuration (flow: Image Acquisition → Landmark Detection → Database Verification → Determine Behavior)

image 6.35 shows the configuration of the program we will create in this section. NAO's bottom camera
is used to gather continuous images of the floor. If a gathered image matches a landmark image recorded
in NAO's database, NAO executes the behavior that corresponds to that landmark.
Here, the landmarks are programmed to mean go, stop, turn left, and turn right.

6.3.1 Studying and Recognizing Landmarks

image 6.36 - Landmark images (File: landmark.pdf)

The landmarks used in this section are shown in image 6.36. Each landmark is assigned to one command:
go, stop, turn left, or turn right. A landmark must fit entirely within the area acquired by the camera,
and the landmarks must be clearly separable in the acquired image. The landmarks above are saved
in landmark.pdf. The image learning process (using Telepathe) was covered in Exercise 2.8,
so the detailed explanation is omitted here. The settings for each landmark are shown in image 6.37.

image 6.37 - Landmark image study

After the landmark images have been learned, Monitor must be used to verify them. image 6.38 shows
the landmark images being verified using Monitor.

image 6.38 - Landmark image verification using Monitor

6.3.2 Programming

image 6.39 - Initial program setup

image 6.39 shows the initial program setup. Activate NAO's bottom camera with the 'Select Camera' box, and set
the speech volume with the 'Set Volume' box. Use the 'Wait for Signals' box to stand by until both the camera
and the volume have been set up. When setup is complete, use the 'Stand Up' box to make NAO stand up and the
'Say' box to announce the start of the program. The 'Init' box is provided by the Pose library; it is the
default pose for the 'Demo Omni' box.

Image 6.40 - Init box adjustment

If you double-click the 'Init' box, you will see that the default pose is set at Frame 20. Set the HeadPitch
joint to 2.0 at Frame 20. By bowing NAO's head, you let the bottom camera look at the area near the feet.
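
For reference, the same head bow can also be performed from Python through the ALMotion module, exactly as the scripts in Section 6.5 do; a minimal sketch (the IP address is the one assumed in those later examples):

from naoqi import ALProxy

# Assumed robot address, as used in the Section 6.5 scripts
proxy = ALProxy("ALMotion", "192.168.123.145", 9559)
proxy.stiffnessInterpolation("Head", 1.0, 1.0)   # stiffen the head joints first
proxy.setAngles("HeadPitch", 1.57/5, 0.2)        # positive pitch (radians) bows the head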

Image 6.41 - Connection between ‘Vision Reco’ box and walking

Image 6.41 shows a program that uses the 'Vision Reco' box to control walking. As shown earlier, the 'Vision
Reco' box outputs two strings, so the 'Dispatcher' box list is added as a string array.

Graph 6.3 - Demo Omni box setup

Category         Go Straight   Turn Right   Turn Left
X                0.3           0.2          0.2
Y                0.0           0.0          0.0
Theta            0.0           -1.0         1.0
Step Frequency   0.4           0.4          0.4

The 'Demo Omni' box is used for moving straight, turning left, and turning right; Graph 6.3 shows
the parameters used. If another landmark is detected while NAO is moving straight with the 'Demo Omni'
box, the 'go straight' box must be terminated.

To stop, a 'Demo Omni' box whose X, Y, Theta, and Step Frequency are all set to 0 is used as the 'stop'
box. If this 'stop' box is not used, or if the link from the 'Demo Omni' box to the 'stop' box is not
configured, the program will not function properly.
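
For reference, the parameters in Graph 6.3 map directly onto ALMotion's walk-velocity call in Python; a minimal sketch of the same settings (assuming the NAOqi 1.x setWalkTargetVelocity interface and the robot address used in Section 6.5):

from naoqi import ALProxy
import time

motion = ALProxy("ALMotion", "192.168.123.145", 9559)   # assumed robot address

# X, Y, Theta, Step Frequency - the same parameters as the 'Demo Omni' box
motion.setWalkTargetVelocity(0.3, 0.0, 0.0, 0.4)    # go straight (Graph 6.3)
time.sleep(3)
motion.setWalkTargetVelocity(0.2, 0.0, -1.0, 0.4)   # turn right
time.sleep(3)
motion.setWalkTargetVelocity(0.0, 0.0, 0.0, 0.4)    # stop: all velocities set to 0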

Image 6.42 - Incorrect use of the ‘Demo Omni’ box

6.4 Memorizing the Multiplication Table – Python and NAOqi Application

One concept you must be careful about when programming with Choregraphe is variables.
The default Number Edit and Text Edit boxes are used as constants; that is, their values do not change
once the program starts. To overcome this limitation, Python must be used to add variables
within the box script.

Image 6.43 - Multiplication program configuration

This section implements the multiplication program, a Choregraphe program that is a bit more
complicated and that also uses the NAOqi API. Image 6.43 shows the multiplication program configuration.
You can use the head's tactile sensor to increase and decrease the singular number and then execute
the multiplication. To increase or decrease the singular number, a variable must be used; as previously
mentioned, Python must be used to implement the variable. Python is also used to implement
the box that reads the numbers aloud. The multiplication program is implemented in ch6.4timetable.crg.

6.4.1 Singular Number (Multiplicand) Calculator
Create a multiplicand box that can be used to increase or decrease the singular number (multiplicand).
The multiplicand box receives input from the touch sensor to increase or decrease the singular number,
and outputs the current singular number when a trigger signal comes in. The following process shows how to create the multiplicand box.

A - Create the multiplicand box, and set the input/output parameters as shown below.

Graph 6.4 - Setting the input/output of the multiplicand box

Category   Name      Type      Nature
Input      Up        "Bang"    onEvent
           Trigger   "Bang"    onEvent
           Down      "Bang"    onEvent
Output     Num       Number    Punctual

Parameters        Type      Default Value   Minimum Value   Maximum Value
Default operand   Integer   2               1               9
Max number        Integer   9               2               20
Min number        Integer   1               1               20

Image 6.44 - Setup window for the multiplicand box and the Parameters screen

B - Image 6.44 shows the window where you can set the multiplicand box and parameters.
Adjust the script to implement the box functions as shown below:

1.  class MyClass(GeneratedClass):
2.      def __init__(self):
3.          GeneratedClass.__init__(self)
4.          self.x = 2
5.          self.maxnumber = 9
6.          self.minnumber = 1
7.      def onLoad(self):
8.          self.x = self.getParameter("default operand")
9.          self.maxnumber = self.getParameter("max number")
10.         self.minnumber = self.getParameter("min number")
11.         if self.minnumber > self.maxnumber:
12.             self.minnumber = 1
13.             self.maxnumber = 9
14.     def onInput_Up(self):
15.         self.x = self.x + 1
16.         if self.x > self.maxnumber:
17.             self.x = self.minnumber
18.     def onInput_Down(self):
19.         self.x = self.x - 1
20.         if self.x < self.minnumber:
21.             self.x = self.maxnumber
22.     def onInput_Trigger(self):
23.         self.process()
24.     def process(self):
25.         result = self.x
26.         self.Num(result)

During the initialization of the box (Line 2-6), the current singular number (self.x) is set to 2,
the minimum singular number (self.minnumber) to 1, and the maximum singular number (self.maxnumber) to 9.

When the box is loaded (Line 7-13), the default operand, max number, and min number parameters are read
to change the multiplication settings. If, by user mistake, the minimum singular number is greater than
the maximum singular number, the minimum is automatically reset to 1 and the maximum to 9.

If a signal enters the 'Up' input of the box (onInput_Up), the current singular number (self.x) is
increased by 1; if the increased value exceeds the maximum, the current singular number is set to the
minimum (self.minnumber). If a signal enters the 'Down' input (onInput_Down), the current singular
number is decreased by 1, and when it falls below the minimum it is set to the maximum
(self.maxnumber), mirroring the 'Up' behavior.

If a signal enters the 'Trigger' input, the current singular number (self.x) is output to the 'Num' output.
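
The wrap-around behavior of the 'Up' and 'Down' inputs can be checked outside Choregraphe with a small standalone sketch (hypothetical plain Python, without the GeneratedClass plumbing):

class Multiplicand(object):
    def __init__(self, default=2, minnumber=1, maxnumber=9):
        self.x = default
        self.minnumber = minnumber
        self.maxnumber = maxnumber
    def up(self):
        # Increase, wrapping around to the minimum above the maximum
        self.x = self.minnumber if self.x + 1 > self.maxnumber else self.x + 1
    def down(self):
        # Decrease, wrapping around to the maximum below the minimum
        self.x = self.maxnumber if self.x - 1 < self.minnumber else self.x - 1

m = Multiplicand()
m.down()        # 2 -> 1
m.down()        # 1 -> wraps to 9
print m.x       # 9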

6.4.2 Adjusting the Multiply Box

Image 6.45 - Multiply box

The 'Multiply' box used in this section has two number inputs and outputs the multiplication result of the two
numbers (Image 6.45). Adjust the current 'Multiply' box to have three outputs, named answer,
num1, and num2, in that order. The adjusted 'Multiply' box script is as follows:

1.  class MyClass(GeneratedClass):
2.      def __init__(self):
3.          GeneratedClass.__init__(self)
4.          self.rMultiplier = 2.0
5.          self.bMultiplicand = False
6.          self.bMultiplier = False
7.          self.rMultiplicand = 2.0
8.      def onUnload(self):
9.          pass
10.     def onInput_Multiplicand(self, rVal):
11.         self.rMultiplicand = float(rVal)
12.         self.bMultiplicand = True
13.         if self.bMultiplicand and self.bMultiplier:
14.             self.process()
15.     def onInput_Multiplier(self, rVal):
16.         self.rMultiplier = float(rVal)
17.         self.bMultiplier = True
18.         if self.bMultiplicand and self.bMultiplier:
19.             self.process()
20.     def process(self):
21.         rRes = self.rMultiplicand * self.rMultiplier
22.         X = self.rMultiplicand
23.         Y = self.rMultiplier
24.         self.bMultiplicand = False
25.         self.bMultiplier = False
26.         self.num1(int(X))
27.         self.num2(int(Y))
28.         self.answer(int(rRes))

During the initialization process (Line 1-7), the multiplier (self.rMultiplier) and the multiplicand
(self.rMultiplicand) are set to 2.0. The flags self.bMultiplier and self.bMultiplicand, which indicate
whether a multiplier and multiplicand have been input, are initialized to False.

When the onInput_Multiplicand method in Line 10-14 receives a multiplicand input, the value (rVal)
is saved in self.rMultiplicand and self.bMultiplicand is set to True.

It then verifies (Line 13) whether the multiplier input has also arrived; if self.bMultiplier is True,
because the multiplier input was successful, the process method is called to execute the multiplication.

For the multiplier, the onInput_Multiplier method in Line 15-19 performs the same function as the
onInput_Multiplicand method. Once both multiplier and multiplicand have been input, the process method
is called. The process method in Line 20-28 executes the multiplication (Line 21-23) and resets
the multiplier and multiplicand input flags to False (Line 24-25).

Line 26-28 output the multiplication result. The conversion operator int() is included in the
parameters of the output methods, so even if the computed value is a float, NAO will later read
a whole number.
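
As a quick illustration of that conversion (plain Python, unrelated to NAOqi):

print int(6.0), int(6.9), int(4 * 1.5)   # 6 6 6 - int() truncates a float to a whole number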

6.4.3 ‘Say’ Box Expansion


Memorizing the multiplication table means that NAO must read numbers aloud. The 'Say' and 'Say Text' boxes
only speak strings that are input directly. In this section, we will create a box that reads out the numbers it receives as input.

A - Adjust the ‘Say’ box as shown in Graph 6.5.

Graph 6.5 - Inputs/outputs of the 'Say' box

Category   Name           Type      Nature
Inputs     onStart        "Bang"    onStart
           onStop         "Bang"    onStop
           answer         Number    onEvent
           multiplicand   Number    onEvent
           multiplier     Number    onEvent
Outputs    onStopped      "Bang"    onStopped

Graph 6.5 (continued) - Parameters of the 'Say' box

Parameters      Type      Default Value   Minimum Value   Maximum Value
Voice Shaping   Integer   100             50              150
Speed           Integer   100             50              200

Parameters    Type     Default Value
Middle text   String   times
End text      String   is equal to

Image 6.46 - Speaking box (left) and Parameters screen (right)

B - Image 6.46 shows the box interface. The script for reading the three numbers is as follows:

1.  class MyClass(GeneratedClass):
2.      def __init__(self):
3.          GeneratedClass.__init__(self)
4.          self.tts = ALProxy('ALTextToSpeech')
5.          self.ttsStop = ALProxy('ALTextToSpeech', True)
6.          self.strnum1 = "2"
7.          self.strnum2 = "1"
8.          self.strans = "2"
9.
10.
11.     def onLoad(self):
12.         self.ids = []
13.
14.     def onUnload(self):
15.         for id in self.ids:
16.             try:
17.                 self.ttsStop.stop(id)
18.             except:
19.                 pass
20.
21.     def onInput_onStart(self):
22.         sentence = " /RSPD=" + str(self.getParameter("Speed")) + " /"
23.         sentence += " /VCT=" + str(self.getParameter("Voice Shaping")) + " /"
24.         sentence += self.strnum1
25.         sentence += " /Pau=200 /"
26.         sentence += self.getParameter("middle text")
27.         sentence += " /Pau=200 /"
28.         sentence += self.strnum2
29.         sentence += " /Pau=200 /"
30.         sentence += self.getParameter("end text")
31.         sentence += " /Pau=200 /"
32.         sentence += self.strans
33.         sentence += " /Pau=200 /"
34.         sentence += " /RST /"
35.         id = self.tts.post.say(str(sentence))
36.         self.ids.append(id)
37.         self.tts.wait(id, 0)
38.         self.ids.remove(id)
39.         self.onStopped()
40.
41.     def onInput_onStop(self):
42.         self.onUnload()
43.
44.     def onInput_answer(self, C):
45.         self.strans = str(C)
46.         pass
47.
48.     def onInput_multiplicand(self, A):
49.         self.strnum1 = str(A)
50.         pass
51.
52.     def onInput_multiplier(self, B):
53.         self.strnum2 = str(B)
54.         pass

Line 6-9 declare the variables used for reading the numbers: self.strnum1 and self.strnum2 hold
the multiplicand and the multiplier as strings, and self.strans holds their multiplication result
as a string.

Line 21-39 is the code that makes NAO speak; 'sentence' accumulates the strings to be spoken
along with the speech commands. A detailed explanation of the commands that can be used in a sentence
is in the reference (Advanced/Audio system).

Line 24-33 assemble the text that is actually spoken; sentence += " /Pau=200 /" is the command
that creates a 200 ms pause. The 'Speaking' box speaks in the following order: the multiplicand,
"times", the multiplier, "is equal to", and the result of the multiplication.
The onInput_answer, onInput_multiplicand, and onInput_multiplier methods each convert the incoming
number to a string when it arrives at the corresponding input.
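
The same kind of command string can also be sent straight to ALTextToSpeech from a PC as a quick standalone check; a sketch assuming the robot address from Section 6.5 (the /RSPD/, /VCT/, /Pau/ and /RST/ tags are the ones used in the box script above):

from naoqi import ALProxy

tts = ALProxy("ALTextToSpeech", "192.168.123.145", 9559)   # assumed robot address

sentence = " /RSPD=100 /"                             # speech speed
sentence += " /VCT=100 /"                             # voice shaping
sentence += "2 /Pau=200 / times /Pau=200 / 3"         # "2 times 3", with 200 ms pauses
sentence += " /Pau=200 / is equal to /Pau=200 / 6"
sentence += " /RST /"                                 # reset the voice parameters
tts.say(sentence)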

6.4.4 Box Placement and Connection


The multiplication program uses other boxes, such as the 'Wait', 'Loop', 'Dispatcher', and 'Tactil
Touch' boxes, along with the boxes already mentioned. The process for creating the multiplication
program is as follows:

Image 6.47 - Multiplication table calculation program

A - Image 6.47 shows the part of the program that calculates the multiplication table. Manipulate the singular
number by connecting the 'Tactil Touch' box to the 'multiplicand' box. Set the loop max of the 'Loop_multiplier'
box to 9 to execute the multiplication. Connect the 'multiplicand' box and 'Loop_multiplier' box to the
inputs of the 'Multiply' box, and connect the results of the 'Multiply' box to the inputs of the 'Speaking'
box. Use the 'Dispatcher' box so that when the multiplier reaches 9, the 'Loop_Multiplier' box is initialized and then stopped.

Image 6.48 - Multiplication control program

B - Image 6.48 essentially adds program control to Image 6.47. You can change the multiplier
of the multiplication table by connecting the end output of the 'Speaking' box to both the trigger input
of the 'multiplicand' box and the start input of the 'Loop_multiplier' box. Two 'Wait' boxes are used
in the program that controls the multiplication table.

First, the 'Wait' box between the 'Tactil Touch' box and the 'multiplicand' and 'Loop_Multiplier'
boxes delays the starting point of the multiplication table; it is used here to let
the 'Loop_Multiplier' box initialize first. If, as shown in Image 6.49, the output of the center sensor
of the 'Tactil Touch' box is connected directly to both the start and initialization inputs of the
'Loop_Multiplier' box, the index initialization may occur after the iteration has already begun.
This is why the start signal is delivered through the 'Wait' box, after the initialization signal.

Image 6.49 - Connection between the Tactil Touch box and Loop_Multiplier

The 'Wait' box connected to the start input of the 'Speaking' box is used for the same reason: if the end
output of the 'Multiply' box were connected directly to the start input of the 'Speaking' box, the 'Speaking'
box might start before all the values needed for the multiplication table have been input.

Image 6.50 - Multiplication table program

C - Use the 'Set Volume' box or 'Say' box to implement the initial setup for the multiplication
program (Image 6.50).

In this exercise, Python and the NAOqi API were used to implement functions that are not provided
by the default Choregraphe boxes. It is easier and more convenient to implement the desired programs
when text-based programming (Python) and graphics-based programming are used together.

6.5 Combining Recognition and Movement – Using Images for Object Recognition and Grabbing Motion

Continuously combining NAO's recognition and movements can be useful in diverse areas. For example,
NAO can detect a ball getting closer during robot soccer and execute actions like kicking or stopping
it; it can also detect a specific object, then grab it and bring it back. The most basic combination of
detection and movement, object detection through an image followed by a grabbing motion, will be
implemented in this section.

In Section 6.3 we looked at how NAO can navigate its way using landmark recognition. Landmarks may be
used this way to provide a type of guideline for NAO. In this section, we will look at how landmarks may be
used to detect objects.

Image 6.51 - Using landmark recognition to grab an object

In Chapter 5, inverse kinematics analysis was used to control NAO's movements, and we configured NAO
to grab an object at a specific location. In this section, the grabbing motion from the prior sections
is combined with object detection, and the object is set down on either the right or the left depending
on which object it is (Image 6.51).
Python will be used to call the modules and handle the algorithm.

6.5.1 Flow of the Entire Algorithm


The flow of the entire algorithm is shown in Image 6.52. After first determining what the object is,
either the left or the right arm lifts the object and then sets it down on the left or the right side.
The ALLandMarkDetection module is used to identify the object.

The left-hand and right-hand grabbing motions each use their own program calculated from inverse
kinematics analysis.

Image 6.52 - Algorithm flow of recognition (Start → Determine Object; Object No. 109: grab with the left hand and place it on the left side; Object No. 114: grab with the right hand and place it on the right side; End)

6.5.2 Object Recognition
Object detection is handled by the landmark detection module. The NAO robot comes with various landmarks
defined by default; here, the landmarks introduced in [5] NAO Mark of Section 2.5.9, Chapter 2 (Choregraphe),
will be used. First, the Python code executed for landmark detection is as follows:

1.  import os
2.  import sys
3.  import time
4.
5.  from naoqi import ALProxy
6.
7.  IP = "192.168.123.145"
8.  PORT = 9559
9.
10. # Set up the connection to the ALLandMarkDetection module
11. landMarkProxy = ALProxy("ALLandMarkDetection", IP, PORT)
12.
13. # Subscribe to the ALLandMarkDetection proxy for continuous reading
14. # Read every 500 ms
15. period = 500
16. landMarkProxy.subscribe("Test_LandMark", period, 0.0)
17.
18. # The ALLandMarkDetection module saves its output to ALMemory
19. memValue = "LandmarkDetected"
20.
21. # Set the proxy for ALMemory
22. memoryProxy = ALProxy("ALMemory", IP, PORT)
23.
24. # After waiting 0.5 seconds, get the data corresponding to memValue
25. time.sleep(0.5)
26. val = memoryProxy.getData(memValue)
27.
28. # Verify whether a valid value has been received
29. if (val and isinstance(val, list) and len(val) >= 2):
30.
31.     # The first value is the timestamp
32.     timeStamp = val[0]
33.
34.     # markInfoArray[0][1] holds the ID of the detected landmark
35.     markInfoArray = val[1]
36.     print markInfoArray[0][1]
37.
38. else:
39.     print "No landmark detected"
40.
41.
42. # Cancel the module subscription
43. landMarkProxy.unsubscribe("Test_LandMark")

First, set up the proxy to connect to the ALLandMarkDetection module (Line 1-11). Since the
ALLandMarkDetection module continuously produces values, the subscribe method is used to read it
every 500 ms (Line 16). When the connection is made normally the data is saved in ALMemory, so a proxy
to ALMemory is also created, as shown in Line 22. To verify whether the value has been delivered
properly, check that it is a list instance and that its length is at least 2. Then read the [0][1]
index of markInfoArray to check which landmark was detected, as in the process shown above.

This is how landmark recognition is performed from Python. The purpose of the code shown here is
to verify that landmark recognition is working properly; later, the lower part of the code will be
modified and combined with the part that actually grabs the object.
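
To make the indexing above easier to follow, here is an illustrative mock-up of the nested value read from ALMemory (only the fields the code actually touches are shown; the shape-information entry is indicative only):

# Illustrative only: the layout of 'LandmarkDetected' as used by the code above
val = [123456,                                # val[0]: timestamp of the detection
       [[["shape info (position, size)"],     # val[1][0][0]: where the mark is in the image
         [114]]]]                             # val[1][0][1]: the mark ID, wrapped in a list

markInfoArray = val[1]
markid = markInfoArray[0][1]                  # -> [114]
print markid == [114]                         # True: this is why Section 6.5.4 compares lists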

6.5.3 Grabbing the Object Using Inverse Kinematics Analysis


Chapter 5 provides a detailed explanation of inverse kinematics analysis. We will use it here to make NAO
pick up (and then put down) an object with the right arm. The following Python code implements the movements
that use the right arm to lift an object and put it down again.

1.  import math
2.  import time
3.
4.  import naoqi
5.  from naoqi import ALProxy
6.
7.  IP = "192.168.123.145"
8.  PORT = 9559
9.
10. proxy = ALProxy("ALMotion", IP, PORT)
11.
12. MaxSpeed = 0.2
13.
14. names = 'Body'
15. stiffness = 1.0
16. proxy.stiffnessInterpolation(names, stiffness, 1.0)
17.
18. # Bow the head slightly
19. proxy.setAngles("HeadPitch", 1.57/5, 0.2)
20. # Initial rotation value of the hand for grabbing
21. proxy.setAngles("LWristYaw", -0.5, 0.2)
22. proxy.setAngles("RWristYaw", 0.5, 0.2)
23.
24. # Open the hand to grab the object
25. proxy.setAngles("RHand", 0, 0.2)
26.
27. # Waypoints of the movement until the arm comes down
28. point = [[1,-1,-1], [1,-0.5,-1], [1,0.01,-0.8], [1,0.5,-1.2], [1,0.7,-1.57]]
29.
30. # Calculate the inverse kinematics for each waypoint of the movement
31. for i in range(0, 5):
32.     px = point[i][0]
33.     py = point[i][1]
34.     pz = point[i][2]
35.
36.     print px, py, pz
37.
38.     if px == 0:
39.         theta1 = 0
40.     else:
41.         theta1 = math.atan(py/px)
42.
43.     if pz == 0:
44.         theta2 = 0
45.     else:
46.         theta2 = math.atan((math.cos(theta1)*px + math.sin(theta1)*py)/pz)
47.
48.     if (math.sin(theta1)*px - math.cos(theta1)*py) == 0:
49.         theta3 = 0
50.     else:
51.         theta3 = math.atan((math.cos(theta1)*math.cos(theta2)*px + math.sin(theta1)*math.cos(theta2)*py - math.sin(theta2)*pz)/(math.sin(theta1)*px - math.cos(theta1)*py))
52.
53.     if pow(math.cos(theta3),2) - pow(math.sin(theta3),2) == 0:
54.         theta4 = 0
55.     else:
56.         theta4 = 1/(pow(math.cos(theta3),2) - pow(math.sin(theta3),2))
57.
58.     print theta1, theta2, theta3, theta4
59.
60.     proxy.setAngles("RShoulderPitch", theta1, 0.2)
61.     proxy.setAngles("RShoulderRoll", theta2, 0.2)
62.     proxy.setAngles("RElbowYaw", theta3, 0.2)
63.     proxy.setAngles("RElbowRoll", theta4, 0.2)
64.
65.     time.sleep(0.5)
66. # for loop end
67.
68. # Open the hand
69. time.sleep(1)
70. proxy.setAngles("RHand", 1, 0.2)
71.
72. # Get closer to grab the object (adjust some of the inverse kinematics errors)
73. time.sleep(2)
74.
75. proxy.setAngles("RShoulderRoll", -0.2, 0.2)
76.
77. # Grab the object
78. time.sleep(1)
79. proxy.setAngles("RHand", 0, 0.2)
80.
81. # Lifting and lowering the arm
82. time.sleep(1)
83. point2 = [[1,-1,1], [-1,1,1]]
84.
85. for i in range(0, 2):
86.     px = point2[i][0]
87.     py = point2[i][1]
88.     pz = point2[i][2]
89.
90.     print px, py, pz
91.
92.     if px == 0:
93.         theta1 = 0
94.     else:
95.         theta1 = math.atan(py/px)
96.
97.     if pz == 0:
98.         theta2 = 0
99.     else:
100.        theta2 = math.atan((math.cos(theta1)*px + math.sin(theta1)*py)/pz)
101.
102.    if (math.sin(theta1)*px - math.cos(theta1)*py) == 0:
103.        theta3 = 0
104.    else:
105.        theta3 = math.atan((math.cos(theta1)*math.cos(theta2)*px + math.sin(theta1)*math.cos(theta2)*py - math.sin(theta2)*pz)/(math.sin(theta1)*px - math.cos(theta1)*py))
106.
107.    if pow(math.cos(theta3),2) - pow(math.sin(theta3),2) == 0:
108.        theta4 = 0
109.    else:
110.        theta4 = 1/(pow(math.cos(theta3),2) - pow(math.sin(theta3),2))
111.
112.    print theta1, theta2, theta3, theta4
113.
114.    proxy.setAngles("RShoulderPitch", theta1, 0.2)
115.    proxy.setAngles("RShoulderRoll", theta2, 0.2)
116.    proxy.setAngles("RElbowYaw", theta3, 0.2)
117.    proxy.setAngles("RElbowRoll", theta4, 0.2)
118.    time.sleep(0.5)
119.
120. # for loop end
121.
122. # Open the hand to put the object down
123. time.sleep(2)
124. proxy.setAngles("RHand", 1, 0.2)

The code is largely divided into two sections: Line 30-79 grabs the object, and Line 80-121 lifts
the object and puts it down. The waypoint coordinates used in both processes were determined through
several prior trials; by performing these movements in sequence, NAO carries out the grabbing motion.

Furthermore, Line 71-75 contains a section where some of the angle values are substituted directly
in order to get close enough to the object. Some positions cannot be reached because of the angle
limits in the inverse kinematics analysis, and this substitution resolves the issue.
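
Since the four-angle computation is repeated verbatim for every waypoint (and again in Section 6.5.4), it can be factored into a helper; the sketch below uses exactly the same formulas as the code above:

import math

def arm_angles(px, py, pz):
    # Same inverse kinematics formulas as the loop above (derived in Chapter 5)
    theta1 = 0 if px == 0 else math.atan(float(py) / px)
    theta2 = 0 if pz == 0 else math.atan(
        (math.cos(theta1) * px + math.sin(theta1) * py) / pz)
    denom = math.sin(theta1) * px - math.cos(theta1) * py
    theta3 = 0 if denom == 0 else math.atan(
        (math.cos(theta1) * math.cos(theta2) * px
         + math.sin(theta1) * math.cos(theta2) * py
         - math.sin(theta2) * pz) / denom)
    d4 = math.cos(theta3) ** 2 - math.sin(theta3) ** 2
    theta4 = 0 if d4 == 0 else 1 / d4
    return theta1, theta2, theta3, theta4

print arm_angles(1, -1, -1)   # the angles for the first waypoint above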

6.5.4 Combining Recognition and Grabbing Motion


1.   import math
2.   import time
3.   import os
4.   import sys
5.
6.   import naoqi
7.   from naoqi import ALProxy
8.
9.   IP = "192.168.123.145"
10.  PORT = 9559
11.
12.  proxy = ALProxy("ALMotion", IP, PORT)
13.
14.  MaxSpeed = 0.2
15.
16.  names = 'Body'
17.  stiffness = 1.0
18.  proxy.stiffnessInterpolation(names, stiffness, 1.0)
19.
20.  proxy.setAngles("HeadPitch", 1.57/5, 0.2)
21.  # Initial rotation value of the hand for the grabbing motion
22.  proxy.setAngles("LWristYaw", -0.5, 0.2)
23.  proxy.setAngles("RWristYaw", 0.5, 0.2)
24.
25.
26.  landMarkProxy = ALProxy("ALLandMarkDetection", IP, PORT)
27.
28.  landMarkProxy.subscribe("Test_LandMark", 500, 0.0)
29.
30.  memValue = "LandmarkDetected"
31.
32.  memoryProxy = ALProxy("ALMemory", IP, PORT)
33.
34.  time.sleep(0.5)
35.  val = memoryProxy.getData(memValue)
36.
37.  if (val and isinstance(val, list) and len(val) >= 2):
38.
39.      timeStamp = val[0]
40.
41.      markInfoArray = val[1]
42.      print markInfoArray[0][1]
43.      # Keep the ID of the currently detected landmark for differentiation purposes
44.      markid = markInfoArray[0][1]
45.
46.  else:
47.      print "No landmark detected"
48.
49.  time.sleep(3)
50.
51.  # Select the names of the arm joints according to the landmark ID
52.  # right arm
53.  if markid == [114]:
54.      SP = "RShoulderPitch"
55.      SR = "RShoulderRoll"
56.      ER = "RElbowRoll"
57.      EY = "RElbowYaw"
58.      H = "RHand"
59.
60.  # left arm
61.  elif markid == [109]:
62.      SP = "LShoulderPitch"
63.      SR = "LShoulderRoll"
64.      ER = "LElbowRoll"
65.      EY = "LElbowYaw"
66.      H = "LHand"
67.
68.  point = [[1,-1,-1], [1,-0.5,-1], [1,0.01,-0.8], [1,0.5,-1.2], [1,0.7,-1.57]]
69.  for i in range(0, 5):
70.      px = point[i][0]
71.      py = point[i][1]
72.      pz = point[i][2]
73.
74.      proxy.setAngles(H, 0, 0.2)
75.
76.      print px, py, pz
77.
78.      if px == 0:
79.          theta1 = 0
80.      else:
81.          theta1 = math.atan(py/px)
82.
83.      if pz == 0:
84.
85.          theta2 = 0
86.      else:
87.          theta2 = math.atan((math.cos(theta1)*px + math.sin(theta1)*py)/pz)
88.
89.      if (math.sin(theta1)*px - math.cos(theta1)*py) == 0:
90.          theta3 = 0
91.      else:
92.          theta3 = math.atan((math.cos(theta1)*math.cos(theta2)*px + math.sin(theta1)*math.cos(theta2)*py - math.sin(theta2)*pz)/(math.sin(theta1)*px - math.cos(theta1)*py))
93.
94.      if pow(math.cos(theta3),2) - pow(math.sin(theta3),2) == 0:
95.          theta4 = 0
96.      else:
97.          theta4 = 1/(pow(math.cos(theta3),2) - pow(math.sin(theta3),2))
98.
99.      # Adjust the rotation angle according to the direction of joint rotation
100.     if markid == [109]:
101.         theta2 = -theta2
102.         theta4 = -theta4
103.
104.     print theta1, theta2, theta3, theta4
105.
106.     # Execute the arm joint movement
107.     proxy.setAngles(SP, theta1, 0.2)
108.     proxy.setAngles(SR, theta2, 0.2)
109.     proxy.setAngles(EY, theta3, 0.2)
110.     proxy.setAngles(ER, theta4, 0.2)
111.
112.     time.sleep(0.5)
113. # for loop end
114.
115. # Execute the hand movement
116. time.sleep(1)
117. proxy.setAngles(H, 1, 0.2)
118.
119. time.sleep(2)
120.
121. # Get close enough to grab the object (adjust the inverse kinematics error of each arm)
122. if markid == [114]:
123.     proxy.setAngles(SR, -0.2, 0.2)
124. elif markid == [109]:
125.     proxy.setAngles(SR, 0.2, 0.2)
126.
127. time.sleep(1)
128.
129. proxy.setAngles(H, 0, 0.2)
130.
131. time.sleep(1)
132.
133. point2 = [[1,-1,1], [-1,1,1]]
134.
135. for i in range(0, 2):
136.     px = point2[i][0]
137.     py = point2[i][1]
138.     pz = point2[i][2]
139.
140.     print px, py, pz
141.
142.     if px == 0:
143.         theta1 = 0
144.     else:
145.         theta1 = math.atan(py/px)
146.
147.     if pz == 0:
148.         theta2 = 0
149.     else:
150.         theta2 = math.atan((math.cos(theta1)*px + math.sin(theta1)*py)/pz)
151.
152.     if (math.sin(theta1)*px - math.cos(theta1)*py) == 0:
153.         theta3 = 0
154.     else:
155.         theta3 = math.atan((math.cos(theta1)*math.cos(theta2)*px + math.sin(theta1)*math.cos(theta2)*py - math.sin(theta2)*pz)/(math.sin(theta1)*px - math.cos(theta1)*py))
156.
157.     if pow(math.cos(theta3),2) - pow(math.sin(theta3),2) == 0:
158.         theta4 = 0
159.     else:
160.         theta4 = 1/(pow(math.cos(theta3),2) - pow(math.sin(theta3),2))
161.
162.     # Adjust the rotation angle according to the direction of joint rotation
163.     if markid == [109]:
164.         theta2 = -theta2
165.         theta4 = -theta4
166.
167.     print theta1, theta2, theta3, theta4
168.
169.     proxy.setAngles(SP, theta1, 0.2)
170.     proxy.setAngles(SR, theta2, 0.2)
171.     proxy.setAngles(EY, theta3, 0.2)
172.     proxy.setAngles(ER, theta4, 0.2)
173.
174.     time.sleep(0.5)
175. # for loop end
176.
177. time.sleep(2)
178. proxy.setAngles(H, 1, 0.2)
179.
180. landMarkProxy.unsubscribe("Test_LandMark")

The code shown above combines two programs: landmark recognition and the grabbing motion of the arm.
All the libraries from the two earlier programs are used, and the proxies required by each part are created.

To execute a different movement for each landmark, the camera is used to read the landmark ID (Line 44),
and the movement matching the if-conditional is then executed.

The names of the joints to be moved are set in Line 53-66: if the landmark ID is 109, the left arm
executes the movement, and if the landmark ID is 114, the right arm executes it.
Line 100-102 adjusts the rotation angles of the left arm according to the direction of joint rotation.

The code in Line 107-110 moves the selected arm according to the conditional statement,
and Line 117 moves the corresponding hand in the same way.

Images 6.53 and 6.54 show NAO's movements when the code created in this section is executed.
In both cases, the movements proceed in the same order.

Of the landmarks shown at the end of each image, 109 is the landmark that moves the left arm
and 114 is the landmark that moves the right arm. Starting from NAO's default standing position,
NAO lifts and moves its arm according to the calculated angles, grabs the box, lifts the arm
to move the box aside, and then puts it down again.

Image 6.53 - NAO’s left arm movements

Image 6.54 - NAO’s right arm movements

About the author

Kisung Seo

1986          Yonsei University School of Electrical Engineering, BS.

1988          Yonsei University School of Electrical Engineering, Master of Engineering.

1993          Yonsei University School of Electrical Engineering, Ph.D.

1993-1998     Seokyeong University, Department of Industrial Engineering and Department of Electronic Engineering, Assistant Professor.

1999-2003     Michigan State University, Genetic Algorithms Research and Applications Group, Research Associate.

2002-2003     Michigan State University, Electrical & Computer Engineering, Visiting Assistant Professor.

2004-Present  Seokyeong University, Department of Electronic Engineering, Associate Professor.

Areas of interest include Intelligent Robot, Evolutionary Computation, Genetic Programming, Evolutionary Neural Networks, and Evolutionary Design.

www.aldebaran-robotics.com
AMERICAS - americas@aldebaran-robotics.com
EUROPE MIDDLE EAST AFRICA - emea@aldebaran-robotics.com
ASIA PACIFIC - asia-pacific@aldebaran-robotics.com

©2013 ALDEBARAN Robotics - www.aldebaran-robotics.com

Published by Aldebaran Robotics.


Designed by Romain Belotti for Aldebaran Robotics.
Printed in France by Icones.

ALDEBARAN Robotics, the ALDEBARAN Robotics logo, and NAO are trademarks of
ALDEBARAN Robotics. Other trademarks, trade names and logos used in this document
refer either to the entities claiming the marks and names, or to their products. ALDEBARAN
Robotics disclaims proprietary interest in the marks and names of others. Choregraphe®
& NAO® are registered trademarks of ALDEBARAN Robotics. The design of NAO®
is the property of ALDEBARAN Robotics. All the photos featured in this document are
noncontractual and are the property of ALDEBARAN Robotics. Aldm400029__p A00
