Using NAO: Introduction to Interactive Humanoid Robots
Prof. Kisung Seo, Aldebaran Robotics & NT Research, Inc.
Words From the Author
Robots are both replacing and assisting people in fields such as manufacturing, hazardous work, and the service sector, and their range of application will only widen in the near future. Humanoid robots attract the most attention among robots because 1) they look like people, so they seem friendlier and are seen as a better fit for helping (or replacing) humans in certain tasks, 2) like humans, they can walk on two legs and perform work with both hands, and 3) they mimic the most evolutionarily refined form and function: the human's.
The humanoid NAO has 25 joints that make walking and general motion control possible. Diverse interactions are possible through wired/wireless networking, cameras, infrared sensors, microphones, speakers, and LEDs. The software is based on open-source embedded Linux and supports programming languages such as C, C++, URBI, Python, and the .NET Framework. A graphical programming tool called Choregraphe is also provided.
This book will try to focus on using Aldebaran’s humanoid NAO robot to explain the
environment and tools, programming techniques, and basic theory and applications
for educational and research purposes of vocational high schools, universities, and
the general public.
This book is largely divided into two parts: Chapters 1-3 for beginners and
Chapters 4-6 for advanced users. Chapters 1-3 introduce Choregraphe and
Python necessary for basic NAO robot usage. Chapters 4-6 handle information
for professional use. Readers new to the NAO robot, or unfamiliar with C and Python, should first work through Chapters 1-3. Chapters 2 and 4-6 are recommended for anyone with robot-programming experience or anyone who wants to implement specialized algorithms and control commands.
Chapter 1 introduces the NAO robot and the Monitor program that can be used
to verify NAO’s internal memory and image processing. It will also explain how to
do the initial setup for the system. Because this chapter discusses NAO’s special
features, it would be good for readers who are not quite familiar with NAO.
Chapter 3 will have a short introduction to Choregraphe scripts and Python for
NAOqi. There is a basic description of Python syntax and a discussion about
creating and editing Choregraphe script boxes. This would be a good chapter if you
are already familiar with Python.
Chapter 4 explains the NAOqi framework, which forms the foundation of the NAO robot, and the DCM used for controlling all the devices. It covers the NAOqi framework's structure, file layout, and the Broker, and how the framework is used to control NAO. It also explores how to load modules onto NAO using Linux, C++, and cross-compiling, and how the DCM behaves when several timed commands are received. There is also an introduction to the structure of DCM-controlled devices and how to synchronize them using the DCM's synchronization method.
It was considerably difficult to write this book because it deals with such a specific model of humanoid robot.
There wasn’t much material about it, and the ones that were available were quite
disorganized. I was also conflicted about how to handle the variety of readership
because of the content and general difficulty of the subject matter. I am sincerely
hoping that this book will serve as a good introduction to humanoid robots.
I would like to express my sincere gratitude to the people at NT Research Inc. who
gave me both material and emotional support.
April 2011
Ki-sung Suh
How to Use This Curriculum
Aldebaran Robotics does not warrant the accuracy of the provided content
which shall be used at your own risk and under your control. Aldebaran Robotics
disclaims all liability related to the use as well as the content. All rights not
specifically granted herein are reserved to Aldebaran Robotics. Aldebaran
Robotics and/or its licensor shall retain all rights, title and interest and
ownership in and to the book and its content.
This curriculum was produced with version 1.8.16 of Choregraphe, our programming software. However, most of the features are compatible with newer versions. The software screenshots included in this curriculum may differ depending on the version of Choregraphe you have.
Table of Contents
>> Words from the Author
>> How to use this curriculum
1 Introduction
LEARNING
Chapter 1 introduces the NAO robot and the Monitor program that can be used to verify NAO's internal memory. It will also explain how to do the initial setup for the system.
Content
1.1 NAO is…
1.2 Preparation
1.1 NAO is…
The humanoid robot NAO from Aldebaran Robotics is a medium-sized, open-architecture robot (Image 1.1). NAOs are used all over the world for educational and research purposes in over 480 universities. Aldebaran Robotics designed NAO to be used in both secondary and higher education programs. Because of the user-friendly programming environment, anyone can use NAO regardless of programming skill level, and high-level functions can be implemented through the open architecture.
Project NAO started in 2005, and in August 2007 NAO was chosen to replace Sony's Aibo (a robot dog) as the official platform of RoboCup (the Robot Soccer World Cup). It has been used at the competition since 2008, beginning in Suzhou, China. In 2009, 24 teams from all over the world entered the RoboCup competition using a total of 100 NAOs.
NAO can communicate with the PC via both a cable and wireless networks. Multiple NAOs can interact
with each other using infrared sensors, wireless network, camera, microphone and speaker. User input
can be made through the contact sensor, camera, and microphone. Status output can be delivered
to the user through multiple LEDs and speakers.
The NAO software is based on Gentoo Linux and supports multiple programming languages: C, C++, URBI, Python, and the .NET Framework. Choregraphe can be used for graphics-based programming.
1.1.1 Common Features
NAO is 57.3 cm tall, 27.3 cm wide, and weighs less than 4.3 kg. The body is made from a special plastic material and carries a 21.6 V, 2 Ah lithium-ion battery that allows up to 90 minutes of use. There are diverse sensors, such as the 2-axis gyro sensor and ultrasonic sensors, and multimedia functions are provided through cameras, microphones, and speakers.
The open architecture is fully reflected in NAO's development environment: software and development tools are provided for the Windows, Mac OS X, and Linux operating systems.
1.1.2 Configuration
Image 1.3 shows NAO's components. There are 25 joints in total, divided among the head (2), arms (12), waist (1), and legs (10); the arm and leg joints are left/right symmetrical. There are Hall-effect sensors (32), contact sensors (3), infrared sensors (2), ultrasonic sensors (2), a 2-axis gyro sensor (1), 3-axis acceleration sensors (2), pressure sensors (8), and bumpers (2). There are also cameras (2), microphones (4), and speakers (2) for image and voice processing.
Image 1.4 shows NAO's device structure. NAO's head contains an embedded system for overall control, and there is an ARM microcontroller in the chest to control the motors and power. The embedded system runs embedded Linux (32-bit x86 ELF), and its hardware comprises an x86 AMD Geode 500 MHz CPU, 256 MB SDRAM, and flash memory. Ethernet (wired) and Wi-Fi (wireless, IEEE 802.11g) are supported, and the 2011 model is Bluetooth enabled.
1.1.3 Joint configuration
Image 1.5 uses the roll-pitch-yaw convention to show NAO's joints. HipYawPitch, one of the waist joints, has only one degree of freedom since the left and right joints are driven by a single actuator. The actuators that operate NAO's joints fall into four types depending on motor performance and reduction ratio.
Detailed information on each motor can be found in the Advanced-Hardware section. The Hall-effect sensor that measures motor rotation has 12-bit precision, meaning the angular resolution is about 0.1 degree. Chapter 6 will discuss NAO's kinematics.
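The quoted 0.1 degree figure follows directly from the 12-bit sensor; a minimal sketch of the arithmetic (the function name is ours, not part of any NAO SDK):

```python
# A 12-bit Hall-effect sensor distinguishes 2**12 = 4096 positions
# over one full 360-degree rotation.
def angular_resolution(bits: int, full_range_deg: float = 360.0) -> float:
    """Smallest angle change the sensor can resolve, in degrees."""
    return full_range_deg / (2 ** bits)

print(round(angular_resolution(12), 4))  # 0.0879, i.e. roughly 0.1 degree
```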
1.1.4 Vision System
Image 1.6 shows NAO's camera locations and angles. There are two cameras attached to NAO: the top camera faces forward, while the bottom camera points toward the feet. The vision system can be used to implement mark recognition, face recognition, object recognition, image recording, and so on. If the vision system is developed in C++, the OpenCV library can be used; the current version 1.8 supports OpenCV version 2.0. With Choregraphe, the camera can be used for face and landmark recognition. With the Monitor program, you can verify the images and videos processed by NAO's cameras and do a basic setup; Section 1.4 will discuss the Monitor program.
The following lists the features of the vision system:
1.1.5 Audio
Image 1.7 shows the configuration of NAO's microphones. There are four microphones: one at each ear, one on the front, and one on the back of the head. There are two speakers, one at each ear. The speakers can be used to play music and to read out text entered by the user. The four microphones can be used for simple voice recordings, but they also provide a function for locating the source of a sound. Choregraphe supports recording and playback, sound detection, text reading, and so on.
1.1.6 Software
NAO provides Choregraphe, NAOqi, and Monitor as development software. Choregraphe is a graphics-based development tool that even novice users can easily handle. Chapter 2 will provide more details about Choregraphe.
NAOqi is the programming framework used to program NAO, designed to satisfy the requirements of robotics. Its main features include parallelization, resource management, synchronization, and event handling, and it allows communication among modules such as motion, audio, and video. Chapter 5 will provide more details regarding NAOqi.
Monitor is a program that receives feedback from NAO so you can easily verify joint or sensor values. The feedback includes image-processing data, so while an image-processing algorithm is running you can verify the resulting images on the PC. You can also set the resolution and color. There is a memory viewer for inspecting joint and sensor values and a camera viewer for configuring the camera and saving images. A laser sensor can also be monitored if one is attached as an external device.
1.2 Preparation
- NAO Robot
- Battery (Lithium-ion)
- Battery charger (100-240V)
- 4 Power adapter
- Installation CD
1.2.2 Requirements
The following lists the minimum requirements for NAO's development environment. This book uses the 32-bit Windows XP operating system as its reference. Due to the nature of the robot, a wired connection may constrain its movements, so a wireless environment is highly recommended.
[Tables: Hardware and Software requirements]
When installing Choregraphe in the Windows operating system, both NAOqi and Monitor
are installed together. This book uses Choregraphe Version 1.8.16.
For text-based programming, you need a program appropriate for each programming language.
In order to program using C and C++, you need Visual Studio 2005 (or 2008) or GCC 4.4
(or a more recent version). You also need CMake.
When using Python, Version 2.6 is recommended. Installation method for using the Software
Development Kit is explained in detail in the reference section (Advanced-SDK).
1.3 Connecting NAO
As introduced above, NAO can communicate with a PC through either a wired connection using an Ethernet cable or a wireless connection using Wi-Fi. To use the wireless connection, you must set it up first.
After connecting a wired/wireless router (which must support DHCP) to the NAO robot and pressing the power button, NAO announces its own status ("Hello, I'm NAO. My internet address is xxx.xxx.xxx.xxx, my battery is fully charged."). The wired connection takes priority over the wireless one for the internet address.
NAO also supports FTP. The user can use FTP to exchange files with NAO. A login is needed when using the web service or FTP; the initial ID and password are nao/nao.
A - Enter NAO's IP address into the web browser. Image 1.9 shows the normal default screen.
C - When the Network tab is selected, Connections and Available Network come up. Connections shows the IP address of the wired/wireless connection and the MAC address. Available Network shows the list of networks currently available for use.
E - If Connect is selected, the wireless IP address is set and the network status changes to "ready".
Image 1.10 shows processes (3)-(5) from above. Once the wireless network configuration is complete, NAO can use the wireless IP address to communicate with other systems without a cabled connection.
1.3.3 Using web service for default settings
As shown in the wireless network configuration process, web service can be used to set and verify
the basic parts of NAO.
The menu in the web browser is largely divided into About, Network, Settings, and Advanced. Advanced
is divided into NAOqi, Memory, Process, Hardware, Remote Controls, Bluetooth, and Log. The following
explains each menu option:
• About:
Shows basic information about NAOqi and the network. The NAOqi information shows the version, status, language, modules, and behaviors. The network information shows the IP addresses of the Ethernet and Wi-Fi interfaces.
• Network:
Shows NAO’s network connection information and can set the connection.
This function can be used in network configuration.
• Settings:
NAO’s basic settings can be done here. ID and password can be set and NAO’s icon can be changed.
The changed icon shown here is the one shown in Choregraphe’s connection list.
You can also set the language, time zone and the volume. The current version supports Chinese,
English, French, German, Italian, Japanese, Korean, Portuguese and Spanish.
• Advanced:
You can set or verify advanced functions.
- NAOqi: You can see the status of NAOqi’s behavior and can control it using ‘start,’
‘pause,’ and ‘restart.’
- Memory: Search specific variables by name and see the current variable values.
- Process: Shows the list of processes NAO is currently executing. Process information includes the process number (PID), the terminal connected to the process (TTY), the time, and the process name.
- Hardware: Shows NAO's device and joint information, configuration values, and temperatures. Here, a device refers to a board that controls the battery, LEDs, sensors, or joints. Joint information includes the temperature, the joint rotation angle measured by the sensors, and the motor actuation value. The configuration value refers to NAO's hardware version, and temperature refers to the temperature of the silicon in the head and of the board.
- Bluetooth: Shows the Bluetooth devices connectable with NAO and provides the function
for setting the connection.
1.3.4 File Transfer using FTP
FTP can be used for NAO’s file transfers.
The file transfer function has been added to Choregraphe Version 1.8.16. FTP programs like WinSCP
can be used with earlier versions. When WinSCP is used, you can access the robot using NAO’s IP address,
ID, and password.
NAO runs Gentoo Linux, and its directory structure follows the standard Linux layout. The files related to NAO are stored in "/home/nao." Image 1.11 shows access to the NAO folder; the video shown there was recorded using Monitor's camera viewer. Chapter 2 will show how to use FTP from within Choregraphe.
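The WinSCP workflow above can also be scripted with Python's standard ftplib; a minimal sketch, assuming the robot is reachable at a hypothetical address and still uses the default nao/nao credentials and /home/nao folder described earlier:

```python
from ftplib import FTP

NAO_USER = "nao"        # default ID described above
NAO_PASS = "nao"        # default password described above
NAO_HOME = "/home/nao"  # where NAO-related files live

def download_from_nao(ip: str, remote_name: str, local_name: str) -> None:
    """Log in with the default credentials and fetch one file from /home/nao."""
    with FTP(ip) as ftp:
        ftp.login(NAO_USER, NAO_PASS)
        ftp.cwd(NAO_HOME)
        with open(local_name, "wb") as f:
            ftp.retrbinary("RETR " + remote_name, f.write)

# Usage (hypothetical robot address and file name):
# download_from_nao("192.168.0.10", "capture.avi", "capture.avi")
```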
1.4 Monitor
(former name: Telepathe)
As introduced above, Monitor is a program that shows real-time information about NAO’s camera
and memory. Image 1.12 is Monitor’s default screen.
In order to receive feedback from NAO using Monitor, your PC and NAO must be able to communicate with each other. Both Memory Viewer and Camera Viewer set the connection information using Browse Robots (Image 1.13); this is also how Choregraphe connects to NAO. The connection can be set using the port number or IP address. After entering NAO's wired/wireless IP address, press the "Connect to" button to connect NAO with Monitor.
NAO's camera can be configured using Monitor's Camera Viewer. Image 1.14 shows Camera Viewer's video configuration screen. You can set the frame rate, resolution, video conversion, mark and face detection, vision recognition, bottom-camera selection, color channel values, symmetry conversion, and so on. The video stream is received from NAO when Camera Viewer's playback button is pressed (the actual frame rate depends on the network).
Camera Viewer can be used to save images inside the robot and to transmit the saved files to your PC. Image 1.15 shows how the video recorder is used to save and transmit images. After selecting the "Video Recorder" tab, enter the name of the video file to be saved. When the record button is then pressed, NAO saves the video to its secondary storage, and after saving, the file is sent to your PC. The images are saved in "/home/nao/naoqi/share/naoqi/vision/" or, in the newest versions, "/home/nao/recordings/cameras/". Camera Viewer's video recorder does not itself have a configuration section. The images have the following characteristics:
Memory Viewer can be used to view the variable values used by NAO's system. Values can be displayed as a time graph, and multiple variables can be output simultaneously. Image 1.16 shows the screen for selecting the variables to observe in Memory Viewer. When using Memory Viewer for the first time, you can create a new configuration file, open an existing configuration file, or choose not to use one.
The graph is activated if you select "Watch" and "Graph" from the variable list and press "Start/Stop graph" below the graph output screen. Variable values are sampled in 100 ms units by default. If the Subscription Mode is set to "No subscription," "Refresh all" must be pressed to update the screen; "Every nb ms" updates the screen at the specified period, which must be greater than 100 ms.
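The refresh rules above amount to a simple polling loop with a 100 ms floor; a sketch against a stand-in memory source (the dict and function names are ours, not Monitor's API):

```python
import time

MIN_PERIOD_MS = 100  # Monitor rejects update periods below 100 ms

def poll_variable(read_value, period_ms: int, samples: int) -> list:
    """Read a variable once per period, clamping the period to the minimum."""
    period_ms = max(period_ms, MIN_PERIOD_MS)
    history = []
    for _ in range(samples):
        history.append(read_value())
        time.sleep(period_ms / 1000.0)
    return history

# Stand-in for a watched memory variable:
fake_memory = {"HeadYaw/Position": 0.25}
print(poll_variable(lambda: fake_memory["HeadYaw/Position"], 100, 3))  # [0.25, 0.25, 0.25]
```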
2 Choregraphe
LEARNING
Chapter 2 will teach you how to use Choregraphe, a graphics-based programming tool, to operate NAO. It explains how to program with Choregraphe's Diagram module and how to set NAO's movements on the Timeline.
Content
2.1 Introduction and Interface
2.1.1 Menu
2.1.2 Box Library
2.1.3 Diagramming Space
2.1.4 3D NAO
2.1.5 Predefined Position Library
2.1.6 Video Monitor
2.3 Box
2.3.1 Structure
2.3.2 Box generation
2.1 Introduction
and Interface
Choregraphe is a cross-platform application for implementing NAO's actions through graphics-based programming. Unlike text-based programming, graphics-based programming puts little emphasis on syntax, and programs are built mostly with the mouse rather than by typing code. Choregraphe runs on Windows, Linux, and Mac OS and provides some of the FTP and Monitor functions introduced in Chapter 1. Implementing NAO's actions in Choregraphe is a task of connecting elements of movement (boxes) into a group organized around time or events.
Image 2.1 shows the Choregraphe interface, and it is divided into four different categories
as shown below. Pose Library and Video Monitor functions are also provided.
2.1.1 Menu
In Menu, there is a drop-down menu with File, Edit, Connection, Behaviors, View, and Help, and the icon
menu shows New Diagram, Open, Save, Previous/Next, Connection, Run, Cancel, Debug View, Loading,
and Motor Lock. The following shows each of the functions:
File Menu
- New project: Create a new project.
- Open project: Open a project.
- Open recent project: Open the project that was most recently worked on.
- Save project: Save the project.
- Save project as: Save the project under another name.
- Import: Bring up the workspace or Version 1.6 of the project.
- Exit: Exit Choregraphe.
Edit Menu
- Undo: Cancel the latest action.
- Redo: Perform the canceled action again.
- Preferences: Configure the Choregraphe environment.
Connection Menu
- Connect to: Connect NAO to Choregraphe.
- Disconnect: Disconnect from NAO.
- Play: Send and execute the program connected to NAO.
- Stop: Stop a program that is currently being executed.
- Debug/Errors output: If there is an error while operating in debugging mode, error information will be presented.
- Connect to local NAOqi: Connect to NAO in simulation.
- Advanced: Used during an update, unusable in any version
older than 1.3
Behaviors Menu
- Manage the behaviors saved in the robot.
- Add behaviors saved in your PC to the robot.
- Add behaviors you are currently working on to the robot.
- Delete saved behaviors or save them to your PC.
- Stop all behaviors.
View Menu
- Robot View: Enable 3D NAO in Choregraphe's work environment and set the point of view for the screen.
- Enable the Box List, Pose library, Video monitor, Project content,
Script editor, Debug Window, and Undo Stack necessary
for Choregraphe operation.
- Reset Views: Return Choregraphe’s task screen to the initial state.
2.1.2 Box Library
The boxes available in Choregraphe are listed by function; here, a box refers to an icon with a function attached. The box library is divided into 13 categories by function and contains 70 or so boxes, which can be used to control NAO's various movements.
At the top of the Box Library you can create a new box library or import a saved one, and a search function is provided. In the lower part there is a short explanation of the selected box. The built-in box library is shown by default, and an imported box library is indicated with its own tab.
2.1.3 Diagramming Space
Image 2.2 shows a program that turns on the ear LEDs, using the Battery box and the Ear LED box, when the battery is low. Programming in the diagramming space can be done with just the mouse.
2.1.4 3D NAO
As its name implies, 3D NAO is a screen showing a three-dimensional model of NAO. It provides three main functions. First, it simulates the movements programmed by the user; this works only for joint movements, and elements such as sensors, LEDs, and cameras cannot be simulated. Second, the user can enter joint values that the robot can actually move to; the mirroring function is used when left/right joints are manipulated symmetrically. Third, the actual movements of the robot are shown on this screen.
You can set the point of view in Choregraphe’s View Menu or the user can use the mouse to control the point
of view. The screen can be moved left/right and up/down using the left mouse button, and the right mouse
button can be used for rotation. Also, the mouse wheel can be used to zoom in/out.
There are two methods for generating NAO’s posture. The first method is to control the joints in simulation
to generate the pose (Image 2.3), and the second method is to manually move the joints of the actual robot
to acquire the joint information.
2.1.5 Predefined Position Library
This menu can be enabled in Choregraphe's View Menu. It saves the joint values of a specific pose so the pose can be used like a box; if you register common poses such as standing and the default pose, programming becomes more convenient.
The pose library is organized similarly to the box library (Image 2.4). Choregraphe provides three default poses: Zero sets all joint values to 0, Init is a good pose for transitioning to the next movement, and Stand is a standing pose that uses the least battery. The pose library loads the stored joint values to generate NAO's movements. The user can add frequently used poses to the pose library through its File Menu.
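The idea of the pose library can be pictured as named snapshots of joint values; a sketch (the joint names follow NAO's naming, and all angle values other than Zero's are made up for illustration):

```python
# "Zero" is defined in the text as all joints at 0.
JOINTS = ["HeadYaw", "HeadPitch", "LShoulderPitch", "RShoulderPitch"]
pose_library = {"Zero": {joint: 0.0 for joint in JOINTS}}

def save_pose(name: str, angles: dict) -> None:
    """Register a snapshot of joint angles (radians) under a pose name."""
    pose_library[name] = dict(angles)

# Hypothetical user-captured pose:
save_pose("WaveStart", {"HeadYaw": 0.0, "HeadPitch": -0.2,
                        "LShoulderPitch": 1.4, "RShoulderPitch": 0.5})
print(sorted(pose_library))  # ['WaveStart', 'Zero']
```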
2.1.6 Video Monitor
Instructions on how to use the Video Monitor are given in Section 2.5.9 through the Video Library's practice exercise.
2.2 Choregraphe-NAO
Connection
The left side of Image 2.6 shows NAO's connection list. The picture of NAO shown in the list displays the condition of the robot. There are three possible conditions:
NAOqi is active and a wired or wireless connection is available. Right-clicking opens a menu with options for the LED test and the web page.
NAOqi is stopped and neither wired nor wireless connection is possible. However, a wired connection can still be made if you force the port (9559).
NAO is available for simulation. Because only simulation is possible, LEDs and sensors cannot be used and only motion control is available.
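Outside Choregraphe, the same robot (or a local simulated NAOqi) can be reached from a Python script through the SDK's ALProxy; a hedged sketch, assuming the NAOqi SDK is installed, with a hypothetical IP address. Port 9559 is the default mentioned above.

```python
DEFAULT_NAOQI_PORT = 9559  # the port Choregraphe forces for a wired connection

def say_hello(ip: str, port: int = DEFAULT_NAOQI_PORT) -> None:
    """Connect to the ALTextToSpeech module and speak one sentence."""
    # The naoqi package ships with the NAOqi SDK, not with stock Python,
    # so the import is kept inside the function.
    from naoqi import ALProxy
    tts = ALProxy("ALTextToSpeech", ip, port)
    tts.say("Hello")

# Usage (hypothetical robot address):
# say_hello("192.168.0.10")
```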
2.2.2 Enslaving
Enslaving is the locking and unlocking of joints. The robot may fall into overstrained poses when testing behaviors; enslaving can also be used when the user manually generates the robot's posture. The Enslaving function can be used in three ways. The first is the menu: select "Enslave all motors on/off" from the Connection Menu or the corresponding icon in the menu window. The second is to select the "Enslave chain on/off" button in the lower section of the joint configuration window after clicking a joint in 3D NAO. The third is to use the Stiffness box from Motion in the box library. Caution is necessary if a joint stays locked for a long time, because battery consumption is high and the temperature can rise rapidly.
Depending on the enslaving condition, the color of the icon in the menu window changes:
- Green: un-enslaved; the stiffness value is 0 and the motors do not move even when commanded.
- Yellow: a user-set stiffness value. Stiffness is explained in Section 2.5.6, Motion Library.
- Red: enslaved; the stiffness value is 1 and the motors move according to commands.
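The color rule above can be written down directly; a small sketch (the function is ours, for illustration only):

```python
def enslaving_color(stiffness: float) -> str:
    """Map a joint's stiffness value to the menu icon color described above."""
    if stiffness == 0.0:
        return "green"   # un-enslaved: motors ignore commands
    if stiffness == 1.0:
        return "red"     # enslaved: motors follow commands
    return "yellow"      # an intermediate, user-set stiffness

print(enslaving_color(0.0), enslaving_color(0.6), enslaving_color(1.0))  # green yellow red
```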
Image 2.7 - Authentication process (left) and folder list (right) for the FTP Service
2.3 Box
2.3.1 Structure
Choregraphe's best feature is that it provides graphics-based programming. Boxes are the key element of Choregraphe; they are placed in the diagram, and graphics-based programming is done by connecting them. Choregraphe itself provides 70 or so boxes, but the user can create more.
A box is configured with inputs, outputs, and a Parameter button (Image 2.8 - Box), though some boxes lack an output or the Parameter button. The interface of the box is as follows:
onStart input: An input for activating the functions of the box, located in the upper left corner.
It is usually connected to the signals produced by other boxes.
onStop input: An input for stopping the box functions, located second from the top in the upper left
corner.
onStopped output: An output that shows the value result when the box function stops, located
in the upper right corner. It is usually used as the input signal of other boxes.
onLoad input: This is used when connecting Timeline’s internal box. When the box behavior
is implemented, Timeline’s internal box behavior loads.
ALMemory input: connected to ALMemory; the value of a specific memory variable is delivered.
onEvent Input: This input brings the data from an external source to inside the box.
Punctual output: This output sends out the data inside the box.
Parameter: Box parameters can be set, located in the lower left corner of the box.
The box inputs/outputs can have different colors depending on the type of connection signal.
The form of signal and color used by Choregraphe is as follows:
- “Bang”: Black, a common input/output, it is a signal that doesn’t have any information.
- Number: Yellow, it is a signal that has the number information.
- String: Blue, it is a signal that has the text information.
- Dynamic: Gray, it can use all three aforementioned signals, the type of signal
is determined when the signal is applied.
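The compatibility rule for these signal types can be sketched as a small check (an illustration of the rule, not Choregraphe's own code):

```python
# The four Choregraphe signal types, as described above.
BANG, NUMBER, STRING, DYNAMIC = "bang", "number", "string", "dynamic"

def accepts(port_type: str, signal_type: str) -> bool:
    """A port accepts its own signal type; a dynamic port accepts any of the three."""
    if port_type == DYNAMIC:
        return signal_type in (BANG, NUMBER, STRING)
    return port_type == signal_type

print(accepts(DYNAMIC, NUMBER), accepts(STRING, NUMBER))  # True False
```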
Image 2.9 - Box structure
There are three types of content inside a box: Script, Timeline, and Diagram (Image 2.9). A Script uses Python to implement the box's functionality. A Timeline programs NAO's movements according to the passage of time. A Diagram programs NAO's movements according to the flow of events. The time-based Timeline can program NAO's movements frame by frame; the event-based Diagram can program NAO's movements box by box. Programs with joint movements must use Timeline and Diagram together. Some of the boxes provided by Choregraphe have all three types applied at the same time.
2.3.2 Box generation
A - If you press the right mouse button in the diagram space, the edit menu opens (Image 2.10). If 'Add a new box' is selected from this menu, a setup window for a new box opens.
B - You can configure the new box through the setup window (Image 2.11).
The following is the description of the image:
- Offspring: Determines the box type, as mentioned above. No Offspring is a script, Timeline is time-based, and Flow Diagram is a diagram.
- Plugin: A function that uses the library behind some of the boxes provided by Choregraphe.
- IO description: Sets the detailed information for the inputs, outputs, and parameters: the element's name, description, signal type, and input/output form. The signal types and input/output forms are the same as those introduced earlier. The number to the right of Type is the number of data elements; if it is greater than 1, the signal is transmitted as an array.
Image 2.12 - Box for Timeline (left) and Diagram (right)
C - If you double click the new box, a window like the one you see in Image 2.12 will open.
In this section, we looked at how you can generate the box interface and how to create a new box.
The next section explains how to use the box to program.
2.4 Event-based
and Time-based Programming
There are two ways to program NAO's movements in Choregraphe. The first is event-based programming, which deals primarily with box placement and connection. The second is time-based programming, which defines the robot's joint movements over time; dance moves, for example, can be created this way. A diagram can also be laid out along the flow of time, meaning time-based and event-based programming can be used together. This section provides a simple overview of how to use Choregraphe in both ways.
2.4.1 Event-based Programming
Example 2.1 uses the Say box to have NAO speak the strings entered by the user. We will use two Say boxes to explore how to connect boxes and the process executed by NAO.
A detailed description of the box is in Section 2.5 Box Library.
Images 2.13 and 2.14 show box placements, parameter settings, and box connections. Image 2.15 shows how to execute programs with NAO, and Image 2.16 shows the process by which NAO executes a program. The following is an example of how to program and execute NAO.
A - After selecting the Say box from the Box List, place it in the diagram.
Repeat this process one more time.
B - After pressing the Set button of the Say box, enter the string to be spoken by NAO in the Text entry.
Image 2.14 - Box Connection
C - Connect the box inputs/outputs as shown in Image 2.14. A signal is generated in the output
of the Say box when NAO finishes speaking. Descriptions for each connection are as follows:
- Connection ①: The start input of the root diagram and the start input of the box are connected. When NAO executes a program, the start input of the root diagram becomes active. If these two are not connected, the program will not run automatically; in that case, you can run it by double-clicking the start input of the first Say box.
- Connection ②: The output of the first Say box and the input of the second Say box are connected. The second Say box runs once the first Say box has finished.
- Connection ③: The output of the second Say box and the output of the root diagram are connected. A signal arriving at the output of the root diagram signals the end of the program.
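The way a signal travels along these connections can be sketched in plain Python. This is an illustrative model only, not the Choregraphe API; the Box class and its methods are hypothetical names invented for this sketch.

```python
# Hypothetical model of Choregraphe box wiring: a box runs, then fires
# its output signal to trigger the next connected box, as in Example 2.1.
class Box:
    def __init__(self, name, action):
        self.name = name
        self.action = action        # work done when the box runs
        self.on_done = None         # input of the next connected box

    def connect(self, next_box):
        self.on_done = next_box     # like dragging a wire in Choregraphe

    def start(self, log):
        self.action(log)            # the box does its work (e.g. speaking)
        if self.on_done:            # output signal travels along the wire
            self.on_done.start(log)

log = []
say1 = Box("Say 1", lambda out: out.append("Hello"))
say2 = Box("Say 2", lambda out: out.append("I am NAO"))
say1.connect(say2)   # connection 2: first Say output -> second Say input
say1.start(log)      # connection 1: root start -> first Say start
# log is now ["Hello", "I am NAO"]: the second box ran after the first.
```

The same chaining idea extends to connection ③: the last box's output would trigger the root diagram's output, ending the program.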
D - To send the program created in Choregraphe to NAO, Choregraphe must first be connected to NAO. As you can see in Image 2.15, clicking the antenna-shaped button opens the Browse Robot window. Since connecting to NAO through the Browse Robot window was explained in Section 2.2, we omit it here.
E - After connecting Choregraphe to NAO, the program must be sent to NAO and then executed. When the Play button is clicked, the blue bar on the right fills up; this indicates the generated program is being sent to NAO. The bar is full once the program has been sent completely, which can take a while if the program uses many boxes.
2.4.2 Time-based Programming
Time-based programming primarily consists of NAO movements that occur over time: each frame defines NAO's posture, and together the frames make up a program. Programming with the Timeline does not define the posture for every frame; instead, postures are defined for only some frames, and the joint movements between two frames with defined postures follow a consistent interpolated pattern.
An example will be shown to demonstrate how Timeline programs and manipulates joints.
Example 2.2 shows the creation of new boxes in Timeline and how to move the arms (Image 2.17).
Since Section 2.3 already explained how to generate new boxes, an explanation regarding parameters
of new boxes will be omitted here. The following is a description of the Timeline window.
A - When Timeline box is created and double-clicked, the Timeline window will open (Image 2.18).
The following explains the Timeline window:
① Motion: Composed of the Timeline editor and the playback button. The Timeline editor lets you define the movement pattern of the frames; Chapter 6 will explain how to use it. Motion playback occurs when the playback button is pressed.
② Behavior layers: Action layers with keyframes can be generated. A keyframe has a diagram in which boxes can be placed; this diagram is used to generate an event at a particular time or to implement a process for when an event occurs. The keyframe diagram's start signal is applied from the set frame (point). An edit menu opens when the right mouse button is pressed on a keyframe; it can be used to add and delete keyframes, change names, and set the start frame. You can also drag a keyframe with the mouse to set its start frame.
③ Edit: Sets the frames per second (FPS) for motion playback, the number of frames, and the playback mode. NAO will move faster if you increase the FPS. Depending on how resources are acquired, the playback mode can be Passive mode, Waiting mode, or Aggressive mode; here, resources refer to the elements necessary to run programs on NAO's system. The following describes each mode:
- Passive mode: The default mode; motion playback occurs even when the necessary resources are not ready.
- Waiting mode: Waits until the necessary resources are ready.
- Aggressive mode: Terminates the movements being executed in order to acquire the necessary resources.
In general, the default Passive mode is used.
④ Frame selection: Select the desired frame. Use the left mouse button to select one frame, or drag the mouse to select a range of frames. The selected frame's number is displayed next to the Edit button. The green line marks frame 0, the purple line the selected frame, and the red line the end frame.
B - Use the Edit button to set the Timeline; the following is the setup value:
- FPS: 10
- Size: 360
- Resource Acquisition: Passive Mode
Image 2.19 - Defined posture of Frame 5
C - To generate the ready position, select Frame 5 and NAO’s right arm. Place a check for mirroring
and manipulate the joints in the joint manipulation window (Image 2.19). Mirroring lets the left/right
joints move in identical form. When the joints are manipulated, you will see a dark gray bar in Frame 5.
This means that the movements have been defined in Frame 5.
When the joints are manipulated, you will see the circle next to the joint value become red.
This means that the corresponding joint is locked. Sometimes, the user must select whether
a particular joint will lock. For example, if a left arm movement is defined in Frame 10 and a right arm movement is defined in Frame 20, the right arm will actually move from Frame 1 to Frame 20. This happens because the joints are automatically interpolated for the undefined frames. To make sure a joint will not move until a specific time, you must lock it.
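The automatic interpolation described above can be illustrated with a short sketch. The linear formula here is an assumption for illustration; Choregraphe's actual interpolation between keyframes may use a smoother curve.

```python
# Sketch (not Aldebaran's actual algorithm) of filling in undefined
# frames between two keyframes with defined joint angles.
def interpolate(frame, f0, angle0, f1, angle1):
    """Joint angle at `frame`, given keyframes f0 and f1 (f0 < f1)."""
    if frame <= f0:
        return angle0
    if frame >= f1:
        return angle1
    t = (frame - f0) / float(f1 - f0)   # progress between the keyframes
    return angle0 + t * (angle1 - angle0)

# Right arm defined only at Frame 1 (0 deg) and Frame 20 (40 deg):
# the undefined frames in between are filled in automatically, which is
# why the arm starts moving right away unless the joint is locked.
print(interpolate(10, 1, 0.0, 20, 40.0))
```

Locking a joint, in this picture, amounts to holding its previous keyframe value instead of interpolating toward the next one.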
A more convenient function is to save the joint information by each segment. If you press the right
mouse button inside the frame where the movement is defined, a sub-menu called “store joints
in keyframe” will appear. In this sub-menu, there are Whole body (where all joint information is saved),
Head (where the head joint information is saved), Arms (where the arm joint information is saved),
Legs (where the leg joint information is saved), and Forearms (where the wrist joint information is saved).
D - Define the posture for Frame 15, 25, 35, and 45 using the same method (Image 2.20).
Frame 15 and 35 have the same posture while Frame 5 and Frame 45 have the same posture.
To define identical postures, copy and paste the existing frame (where posture has already
been defined) rather than manipulating the joints again.
Image 2.20 - Defined posture for Frame 15 (above) and Frame 25 (below)
E - You can observe movements by clicking the Motion playback button in the Timeline window.
So far we looked at how to program NAO’s actions through a brief introduction to Choregraphe,
box structure, and a programming practice. Section 2.5 Box Library will explore the box library
that will be used for actual programming.
2.5 Box
Library
In general programming, a function refers to an independent program that executes a particular function.
A box is like a function, and the box library provided by Choregraphe contains a total of 70 boxes
that are mainly divided into 11 types. The box library configuration is as follows:
A - Ear LED
Image 2.21 shows the Ear LED box (left) and the Parameters screen (right). The Ear LED box controls the ear LEDs by adjusting the location (left or right ear), the duration, the intensity of the blue light, and the angle. The following describes the parameters:
B - Eyes LEDs
Image 2.22 shows the Eyes LEDs box (left) and the Parameters screen (right). The Eyes LEDs box controls the LEDs of both eyes; it turns on NAO's eye LEDs for a set period of time. The following shows the box parameters:
Image 2.23 appears when you double-click the Eyes LEDs box. There are Color Edit and Eyes LEDs boxes inside. The color selection window opens when you double-click a color from Color Edit; this window can be used to choose the desired color. A number array is produced from the Color Edit box; Section 2.5.4 Tool Library has detailed information on it. The Eyes LEDs box in Image 2.22 and the Eyes LEDs box in Image 2.23 are different boxes: the Script Editor opens if you double-click the Eyes LEDs box in Image 2.23, and the script can be edited in Python. Chapter 3 Python deals with script editing methods.
C - RandomEyes
D - Switch LEDs
Image 2.25 shows the Switch LEDs box and the Parameters screen. The LEDs of both ears
and legs can be controlled by using the Switch LEDs box. The following shows the parameters:
E - Water Clock
Image 2.26 shows the Water Clock box and the Parameters screen. The Water Clock box turns on the LEDs of NAO's ears and eyes in blue for a set period of time. The intensity of the LEDs weakens as the set time approaches. The following shows the parameter:
2.5.2 Sensors Library
There are numerous sensors on the NAO robot including infrared, bumper, and contact sensors.
The sensors can be used to identify the presence of obstacles or to determine user contact. Just like
the LEDs, functions that use sensors can only be verified using the actual robot.
A - Battery
Image 2.27 shows the Battery box and internal configuration. The isLow box inside has two inputs: one for the Battery box input and the other for connecting to the memory's BatteryLowDetected. The second output of the Battery box activates when the battery power drops below a certain level. There is a high risk of the robot breaking if excessive movements are carried out when the battery is low. The Endless Walk box in the Walk library shows an example of how to use the Battery box.
B - Bumper
Image 2.28 shows the Bumper box and internal configuration. Bumpers are installed in the front of both feet,
and they detect whether or not the bumpers are pressed to recognize obstacles in front of the feet. The two
outputs in the Bumper box produce information regarding whether or not the left and right bumpers are being
pressed. The inside of the Bumper box is configured with left and right boxes, and each box is connected to the LeftBumperPressed and RightBumperPressed variables. The output value is true if the variable's value is greater than 0, and false if the value is 0.
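The threshold logic described for the Bumper box can be modeled in a few lines. This is an illustrative sketch, not Choregraphe code; the `memory` dictionary stands in for the robot's memory (on the real robot these values live under the LeftBumperPressed/RightBumperPressed keys).

```python
# Model of the Bumper box's internal left/right boxes: output True when
# the memory value is greater than 0, False when it is 0.
def bumper_output(value):
    return value > 0

# Hypothetical snapshot of the robot's memory (illustrative only):
memory = {"LeftBumperPressed": 1.0, "RightBumperPressed": 0.0}
left = bumper_output(memory["LeftBumperPressed"])    # True: pressed
right = bumper_output(memory["RightBumperPressed"])  # False: not pressed
```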
C - Foot Contact
Image 2.29 shows the Foot Contact box and internal configuration. The pressure sensors (FSRs:
Force Sensitive Resistors) attached to NAO’s soles are used to determine whether or not the soles
are in contact with the floor (Image 2.30). The Foot Contact box determines the output signal according
to the footContactChanged variable value.
A signal will occur from the second output if it is in contact with the floor, and if there is no contact, a signal
will occur from the third output. You can determine when NAO is falling by using the pressure sensor
attached to the sole. Here, the protective function is activated to improve system safety. Detailed information
regarding the pressure sensor is in the “Hardware” section of the reference.
D - Robot Pose
Image 2.31 shows the Robot Pose box and internal configuration. Robot Pose box produces the strings
for NAO’s current posture. The blue inside the output indicates a string signal. There are no other boxes
inside and the robotPoseChanged variable and the output are directly connected. Strings provided as output
values include “Unknown,” “Stand,” “Sit,” “Crouch,” “Knee,” “Frog,” “Back,” “Belly,” “Left,” “Right,” and
“Headback.”
E - Sonar
Image 2.32 shows the Sonar box and internal configuration. The Sonar box uses the ultrasonic sensors
located in NAO's chest to detect whether or not there are obstacles in front. Of the three punctual outputs, the upper two fire when no obstacle is detected by the left and right ultrasonic sensors.
When an obstacle is detected by the ultrasonic sensor, the blue punctual output produces a string output
with information regarding the direction of the obstacle.
The punctual output of the other Sonar box (located inside the Sonar box) is connected to ultrasonic
sensor related variables. They are each connected to the SonarLeftNothingDetected, SonarLeftDetected,
SonarRightDetected, and SonarRightNothingDetected variables.
F - TactilTouch
Image 2.33 shows the TactilTouch box and internal configuration. This touch sensor is attached to NAO’s crown
and is divided into three parts. It becomes active with user contact and produces information regarding whether
or not each part has been activated. The following three outputs each show whether or not the front, center,
and back sensors are in contact. The inside is configured with the if 0 box consisting of Python code. This box receives the value of a memory variable, and the output is activated when this value is greater than 0. Starting
from the top in order, it is connected to the FrontTactilTouched, MiddleTactilTouched, and RearTactilTouched
variables. The user can send signals to the robot using this sensor. For example, to manipulate the forward, stop,
and reverse motions, the user can create behaviors depending on whether or not the sensors of the three parts
are active.
G - Tactile L(R).Hand
Image 2.34 shows the Tactile L(R).Hand box and internal configuration. Unlike the TactilTouch box, the contact sensors here are attached to the hands of both arms. The output and functions are identical to the TactilTouch box.
H - Fall Detector
Image 2.35 shows the Fall Detector box and internal configuration. This box detects whether the robot has fallen and also implements internal system protection. In the internal diagram of the Fall Detector box, the Wait box and the robotHasFallen variable are connected, and the output is activated by the Wait box after a period of time. This time delay ensures the completion of the safety process; the default delay for the Fall Detector box is 0.5 seconds.
A - Dispatcher
The Dispatcher box has one input and several outputs (Image 2.36) and plays a similar role as the switch
statement in C. If the signal coming in from the input matches the box element, the element will be
sent to the corresponding output. The first output becomes active when there is no matching element.
Input/Outputs are gray and both numbers and texts can be used.
Text must be entered with double quotation marks (“”).
The Dispatcher box can identify the different signal types. You can add and delete list entries through the menu (Insert row, Remove row) that appears when you right-click on top of the box. Also, when you finish filling in a row, an empty row is automatically added.
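The switch-like matching the Dispatcher performs can be modeled in plain Python. This is an illustrative sketch of the behavior described above, not the box's actual script; output 0 stands for the first ("no match") output.

```python
# Model of the Dispatcher box: match the input signal against the box's
# element list; the first output (index 0 here) fires when nothing matches.
def dispatch(signal, elements):
    """Return the index of the matched output; 0 is the default output."""
    for i, element in enumerate(elements, start=1):
        if signal == element:
            return i
    return 0    # first output: no matching element

elements = [1, 2, "apple"]       # numbers and quoted text can both be used
print(dispatch(2, elements))     # matches the second element -> output 2
print(dispatch(7, elements))     # no match -> default output 0
```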
■ Example 2.3 (Example File: Ex2.3_dispatcher.crg)
Image 2.37 shows a program that uses Random Int and Dispatcher boxes to control LEDs.
Random Int box creates random integers. The following example shows how LEDs are controlled.
- When the random number is 1: Turn off Ear LEDs, Turn on Eye LEDs.
- When the random number is 2: Turn off Ear LEDs, Turn off Eye LEDs.
- When the random number is neither 1 nor 2: Turn on Ear LEDs, Turn off Eye LEDs.
B - Choice
The Choice box detects the user’s voice, and if the detected voice matches a word on the list, a string
is produced for the corresponding output (Image 2.38). Meaning, answers can be classified depending
on the question. The first input executes the Choice box. If the input receives a string, the box is executed
after the string is read. This string must be a question that matches the answer NAO will hear from the user.
The second input receives a list of words from an external source. This signal must be received before the
box starts. The list of words can be generated above the box. The first output of the box has a string output
regarding the status of the Choice box. Choice box status is as follows:
- “timeout”: When there is no response from the user for a set period of time.
- “notUnderstood”: When there are no words that match the user’s voice.
- “onStop”: When a signal is applied to the stop input.
- “wordRecognised”: When there is a word that matches the user’s voice.
- “onTactileSensor”: When the user has touched the contact sensor.
The user can issue a command using the head’s contact sensor in addition to speech recognition.
The front part of the contact sensor increases the index of the words while the back decreases it.
When the index changes, the word from the corresponding index is read. The center part delivers
the chosen word to the command. The following explains the Choice box parameters:
- Activate head: Selects whether or not the head joint is activated while Choice box is operating.
- Activate arms: Selects whether or not the arm is activated while the Choice box is operating.
- Activate legs: Selects whether or not the leg is activated while Choice box is operating.
- Minimum threshold to understand: Sets the minimum threshold value for speech recognition (0.0-1.0).
- Minimum threshold to be sure: Sets the threshold value to obtain the answer to the user’s question
(0.0-1.0). If the value is lower than this threshold value, NAO will ask the question again.
- Speech recognition timeout when confirmation: Sets the point of time for determining the success
of speech recognition. Once speech recognition is completed during this time without a reply from
the user, NAO will decide that the speech recognition has been successful.
- Speech recognition timeout: Determines when to stop speech recognition.
- Maximum number of repetition when no reply: Sets the number of times the question is repeated
if the user does not respond (1-20).
- Fun animations activated: Determines whether or not the chosen movements are still implemented
if speech recognition fails.
- Repeat validated choice: Determines the repeated output of the chosen word. If this parameter
is selected, the box will terminate and there will be another output of the chosen word.
- Activate ears light: Determines whether or not the Ear LEDs will be activated.
- Activate eyes light: Determines whether or not the Eye LEDs will be activated.
- Activate brain light: Determines whether or not the contact sensor LEDs will be activated.
- Tactile sensor menu timeout: Determines the time limit of the contact sensor.
- Maximum number of repetition when failure: Determines how many times the question is repeated
when speech recognition fails.
- Activate help when failure: Determines whether or not help will be activated
when speech recognition fails.
- Activate help command: Determines whether voice prompted help will be activated.
NAO will explain the help when the user says “help.”
- Activate repeat command: Determines whether or not to repeat the voice prompted question.
NAO will speak the question again when the user says “repeat.”
- Activate exit command: Determines whether or not to close the voice prompted box.
NAO will terminate the Choice box when the user says “exit.”
Image 2.39 - Example of how to use the Choice box
Image 2.39 is an example of a program that uses the Choice box. The question is entered by using
the Text Edit box. The expanded menu will open if you click the + button of the Choice box.
You can hear the pronunciation of a word if you click the playback button next to it.
C - Loop
The Loop box has three inputs and two outputs (Image 2.40) and has a similar function to the ‘for’ statement in C. The index increases by one when a signal enters the start input, and the index output is activated. The Loop max parameter sets the number of iterations and can have a value between 0 and 500. The Loop box index initializes at 0 and is set to increase by one.
This is set in the script that appears when the Loop box is double-clicked. You only need basic knowledge
of Python to make this change.
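The counting behavior described above can be modeled with a small Python class. This is a simplified sketch (the real box has three inputs and two outputs; the class name and the `None` "finished" value are illustrative conventions, not Choregraphe's).

```python
# Simplified model of the Loop box: the index starts at 0 and increases
# by one each time the start input fires, up to the Loop max parameter.
class LoopBox:
    def __init__(self, loop_max):
        self.loop_max = loop_max    # 0..500 in Choregraphe
        self.index = 0

    def start(self):
        """Fire the start input once; return the index output,
        or None once the loop has finished."""
        if self.index >= self.loop_max:
            return None             # models the loop's "done" output
        value = self.index
        self.index += 1             # default increment is +1
        return value

loop = LoopBox(3)
outputs = [loop.start() for _ in range(4)]   # [0, 1, 2, None]
```

Changing the initial value or the increment corresponds to editing the box's Python script, as the text notes.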
D - Multiplexer
The Dispatcher box compares one input with the box list to activate just one output.
The Multiplexer box is the opposite of the Dispatcher box and has several inputs but just one output
(Image 2.41).
When a signal enters the first input, it activates the “apple” string, and when a signal enters the second input, it activates the “orange” string. When signals enter both inputs, both “apple” and “orange” strings are activated.
In the example shown in Image 2.42, the contact sensor of the head and the Multiplexer box are used
to speak where the user has touched the robot. As explained earlier, because each contact sensor consists
of three parts, each box has three outputs.
These outputs can be connected to the Multiplexer box inputs. As shown in Image 2.42, a signal enters the first input of the Multiplexer box when the front sensor is touched, and then “front” will be output.
E - Wait for Signals
Wait for Signals box consists of two inputs and one output (Image 2.43). The output is activated only when
the signal enters both inputs of the Wait for Signals box. The output is not activated if only one of the inputs
receives the signal. However, the two signals don’t have to come in at the same time, and they are initialized
when the output is activated.
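The AND-gate behavior described above can be modeled with a small class. This is an illustrative sketch of the described semantics (fire only when both inputs have arrived, in any order, then reset), not the box's actual implementation.

```python
# Model of the Wait for Signals box: the output fires only after BOTH
# inputs have received a signal; the inputs are initialized (reset)
# when the output activates.
class WaitForSignals:
    def __init__(self):
        self.first = False
        self.second = False

    def signal(self, which):
        """Receive a signal on input 1 or 2; return True when the
        output fires."""
        if which == 1:
            self.first = True
        elif which == 2:
            self.second = True
        if self.first and self.second:
            self.first = self.second = False   # reset on activation
            return True
        return False

gate = WaitForSignals()
fired = [gate.signal(1), gate.signal(1), gate.signal(2)]
# Only the third call fires: [False, False, True]
```

In Example 2.5 terms, inputs 1 and 2 would be wired to the left and right bumper outputs, and the firing output to the Say box.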
Image 2.44 - Example of how to use the Wait for Signals box
Image 2.44 is an example of how to use the Wait for Signals box to determine whether or not both left/right
bumpers are being pressed. When both bumpers are pressed, the robot will say “Two Bumper.” The following
shows the parameter setting.
F - Timer
The Timer box has two inputs and two outputs (Image 2.45). Period is the timer cycle with seconds as
the unit of measure and 0.0 to 5000.0 as the range. After the timer is activated and the set period of time
passes, a signal will occur in the second output. Please note that a signal will occur in the second output
when a start signal is applied to the initial timer box.
Image 2.46 shows an example of how to use the Timer box to say “Ten Second” in 10 second intervals.
After “Ten Second” is initially spoken, it will happen every 10 seconds thereafter.
The following shows the parameter setting:
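The Timer behavior described above — one signal immediately when the start signal arrives, then one every period — can be modeled as a sequence of firing times. A sketch for illustration, not the box's implementation:

```python
# Model of the Timer box's output schedule: the first signal fires at
# t = 0 (on start), then one signal every `period` seconds.
def timer_fire_times(period, count):
    """Times (seconds after start) of the first `count` output signals."""
    return [i * period for i in range(count)]

# "Ten Second" is spoken at start, then every 10 s thereafter:
print(timer_fire_times(10.0, 4))   # [0.0, 10.0, 20.0, 30.0]
```

This immediate first firing is exactly the feature Example 2.7 removes by inserting a Wait box in front of the Say box.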
G - Wait
The Wait box has two inputs and one output (Image 2.47). It waits a set period of time and then activates its output; the input/output is black, so no information (like strings or numbers) can be sent. Timeout is the time delay in seconds, with a range of 0.0 to 5000.0.
■ Example 2.7 (Example File: Ex2.7_Wait.crg)
As explained in Timer box, the Timer box immediately activates the output as soon as the start signal
is received. Example 2.7 uses the Wait box and Timer box to get rid of this feature (Image 2.48).
Unlike Example 2.6, even if the start signal is applied, NAO will not immediately speak.
Image 2.49 - Goto And Play and Goto And Stop boxes and the Parameters Screen
Another function provided is the constant. In a text-based programming language, the user defines constants in order to use specific information. The constants provided by Choregraphe hold information such as angles, RGB colors, numbers, and strings.
A box is provided for each constant. If a constant box isn't used, you must edit the Python script instead.
A - Goto And Play, Goto And Stop
The Goto And Play and Goto And Stop boxes have only one input and identical parameters (Image 2.49). As previously introduced, time-based motion is generated from frames. Frame number tells the box which frame to move to and can be between 0 and 10000.
These boxes are not used in the root diagram; they are used inside Timeline-generated boxes (like the boxes in the Motion library). The Goto And Play box moves to the specified frame and starts playback from there. These two boxes let event-based elements be reflected in time-based movements.
B - Play, Stop
The Play box and Stop box have no parameters and only one input (Image 2.50). The Play box is similar to the Goto And Play box, but it starts playback from the current frame without moving frames; likewise, the Stop box stops at the current frame without moving frames.
C - Angle Edit
The Angle Edit box consists of two inputs and one output (Image 2.51). The user can enter the angle value
by selecting the degree or radian. The Angle Edit box first converts the angle value entered by the user into
a radian value, and then this radian value is used for output.
D - Color Edit
Color Edit box consists of one input and one output (Image 2.52). The color picker opens when you click a
color from Color Edit. Once a desired color is selected in the color picker, the color information will be sent
out as R, G, and B. You can see an example of how to use Color Edit
in the Eyes LEDs box.
The Eyes LEDs box consists of the Color Edit box and the Eyes LEDs box (Image 2.53). Information for the selected color is transmitted in the order R, G, B. The input inside the Eyes LEDs box receives the three signals as a number array; Image 2.53 shows that the Color Edit output [61, 74, 255] is a number array.
E - Number Edit, Text Edit
The Number Edit box and Text Edit box consist of one input and one output (Image 2.54).
Number Edit box produces numeric signals while Text Edit box produces text signals.
A - Divide
The Divide box has two inputs and one output (Image 2.55).
The colors of the inputs/outputs show that all three send and receive numeric signals. The Divide box divides the two numbers that come in through the inputs and produces the result. The first input is the dividend and the second input is the divisor.
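The Divide box's behavior, including the divide-by-zero error the next example demonstrates, can be modeled as follows. An illustrative sketch with an explicit guard; the function name is hypothetical.

```python
# Model of the Divide box: first input is the dividend, second the
# divisor; dividing by 0 produces an error, as Image 2.56 shows.
def divide_box(dividend, divisor):
    if divisor == 0.0:
        raise ZeroDivisionError("Divide box error: divisor is 0")
    return dividend / divisor

print(divide_box(0.0, 1.0))   # 0.0 -- fine: zero dividend is allowed
```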
Image 2.56 uses the Divide box to generate “1.0 ÷ 0.0.” There is an error when the Divisor is 0.
Image 2.56 - How to use the Divide box Example: 0.0 ÷ 1.0 (left) and 1.0 ÷ 0.0 (right)
B - Multiply
The Multiply box, like the Divide box, has two inputs
and one output (Image 2.57). It uses the two numbers
from the input to perform the multiplication
and then produces the result.
Image 2.57 - Multiply Box
C - RandomInt
RandomInt box has one input and one output (Image 2.58). The RandomInt box generates random integers
where the minimum value is 0 and the maximum value is determined by the parameter. Max is the maximum
value of the random number with a range of 0 to 1×10⁹. Shuffle heightens the complexity of the random
number. The RandomInt box is activated only once, so the Timer box or Loop box must be used in order to
continuously generate random numbers.
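The RandomInt box can be modeled with Python's random module. This is a sketch of the described behavior; modeling Shuffle as a reseed of the generator is an assumption for illustration, not the box's documented mechanism.

```python
import random

# Model of the RandomInt box: an integer between 0 and Max inclusive
# each time the input fires.
def random_int_box(max_value, shuffle=False):
    if shuffle:
        random.seed()    # assumption: Shuffle modeled as reseeding
    return random.randint(0, max_value)

# As the text notes, the box fires once per input signal, so a Timer or
# Loop box is needed to generate numbers continuously; here we simply
# call it in a loop.
values = [random_int_box(2) for _ in range(20)]
# Every output stays within the 0..Max range.
```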
D - RandomFloat
RandomFloat box, like the RandomInt box, has one input and one output (Image 2.59).
The difference is that it produces a floating-point number instead of an integer, and there is no Shuffle parameter. The maximum value can be set by adjusting Max and has a range of 0.0 to 9.9×10¹³.
A - Arms Example
B - Hello
The Hello box is a Timeline and Script box (Image 2.62). Internally, the FaceLeds layer is applied starting from Frame 0; the FaceLeds layer has an _AskForAttentionEyes script box. Frame 115 is the end frame, at 25 FPS, which is faster than the default value.
Hello box shows the motion for waving the arms to greet, and the colors of the eye LEDs change
(Image 2.63). There are intervals defined by two frames for a more natural motion.
C - Empty Timeline
D - Sit Down
The Sit Down box executes the sitting motion, and Maximum of tries is the number of times you can attempt
to sit (Image 2.65). You can make a maximum of 10 attempts. There are three outputs; the first output
becomes active when sitting is successful and the second output becomes active when it fails to sit.
If it is impossible to execute the sitting motion, the third output becomes active.
The Sit Down box consists of one layer, and the sitDownBehavior layer has five keyframes (Image 2.66). The keyframe starting in Frame 1 is the DetectRobotPose keyframe, and it decides whether to move to another keyframe. The DetectRobotPose keyframe handles counting the sitting attempts, acquiring the pose, and playing back the motion for the acquired pose. The Get Robot Pose box is used to acquire the robot's pose, and the acquired pose can be routed through the Dispatcher box to move to another keyframe.
The following shows the key frame for the acquired position.
When Standing
The keyframes for standing, and for lying on the back or front, are configured with the Stiffness box, Timeline box, Increase Count box, and DetectRobotPose box. When lying on its side, the robot detects which side it is facing and executes the corresponding movement. A standing pose is already defined in the FromStand, FromBelly, FromBack, and RotateSide boxes.
E - Stand up
Stand Up box provides the standing motion and has the same input/output and parameters
as the Sit Down box (Image 2.67).
While the Sit Down box has categorized the robot’s pose into four different types, the Stand Up box
categorizes into five different types. Image 2.68 defines the crouching pose. The configuration
of the box is omitted here since it is similar to the Sit Down box.
F - Stiffness
Stiffness box provides the Enslaving and Un-enslaving functions. Enslaving and Un-enslaving
were previously introduced as the locking and unlocking of the motor. Stiffness refers to the force of the
lock. Meaning, this is how much force will be applied to execute the command given to a motor.
When power is applied to NAO, Stiffness of all joints will be 0.
When Stiffness is 0, the joint will not move even if there is a command. When it is 1, all available force
will be used to execute the command. The greater the Stiffness, the greater the battery used, and the risk
of malfunction due to external shocks also increases.
Stiffness box consists of two inputs and one output (Image 2.69). The start sets the Stiffness to Max,
and the stop sets it to Min. The parameters of the Stiffness box are as follows:
- Min stiffness: Stiffness value when signal has been applied to the stop. (0.0 - 1.0)
- Max stiffness: Stiffness value when signal has been applied to the start. (0.0 - 1.0)
- Duration: The time taken to change from the previous stiffness value to the new one (0.0 - 1.0).
- Head, Left arm, Right arm, Left leg, Right leg: Where Stiffness will be applied.
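The parameters above can be illustrated with a small sketch of stiffness over time. The linear ramp from the previous value to the target over Duration is an assumption for illustration, not Aldebaran's documented behavior; the clamping to [0, 1] follows the parameter ranges given above.

```python
# Sketch of the Stiffness box semantics: on start, stiffness moves from
# its previous value toward Max stiffness over Duration seconds (linear
# ramp assumed); values are clamped to the 0.0 - 1.0 range.
def stiffness_at(t, previous, target, duration):
    """Stiffness value t seconds after the command."""
    if duration <= 0 or t >= duration:
        value = target
    else:
        value = previous + (target - previous) * (t / duration)
    return min(1.0, max(0.0, value))

# Ramp from 0.0 (joints unpowered, as when NAO is switched on)
# to Max stiffness 1.0 over 1.0 s:
half = stiffness_at(0.5, 0.0, 1.0, 1.0)   # 0.5: halfway through the ramp
done = stiffness_at(2.0, 0.0, 1.0, 1.0)   # 1.0: ramp complete
```

A stop signal would be modeled the same way, ramping toward Min stiffness instead.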
G - Tai Chi Chuan
Image 2.70 - Tai Chi Chuan box and the Parameters screen
Tai Chi Chuan box defines the motions that enable NAO to execute Tai Chi Chuan (Image 2.70).
Use legs enables you to select whether or not the legs will be used. The internal configuration of Tai Chi
Chuan is shown in Image 2.71, and the movements will change depending on whether or not the legs
are used.
If the legs are used, Tai Chi Chuan is executed only in the “Stand” and “Crouch” positions; if the legs
are not used, it is executed in the “Sit,” “Unknown,” “Stand,” “Crouch,” and “Knee” positions.
H - LeftHand, RightHand
The LeftHand box and RightHand box shown in Image 2.72 are boxes that move NAO’s hands.
The hand opens when a signal enters the start and closes when a signal enters the stop. Since it is a script
box, you must edit the script in order to execute a more precise manipulation of the hands.
A - OmniWalk
The OmniWalk box consists of four inputs and one output (Image 2.73). It is a script box where
four numeric signals are used to make NAO walk. The following explains the input:
- X: Determines the direction and speed of the forward and reverse movements. The value
can be from -1.0 to 1.0; a negative number means reverse and a positive number means forward.
- Y: Determines the direction and speed for left and right. The value can be from -1.0 to 1.0;
a negative number represents the right side and a positive number represents the left side.
- Theta: Determines the direction and speed of the rotation. The value can be from -1.0 to 1.0;
a negative number represents a right turn and a positive number represents a left turn.
- Step Frequency: Determines the frequency of the walk. The value can be from 0.0 to 1.0; 0
is the stationary state and 1 is the maximum speed.
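The four input ranges above can be made concrete with a small, hypothetical Python helper that clamps caller-supplied values to the documented ranges before they are wired into the box. This is an illustrative sketch, not part of Choregraphe; the function names are ours.

```python
def clamp(value, low, high):
    """Limit a value to the range [low, high]."""
    return max(low, min(high, value))

def omniwalk_inputs(x, y, theta, step_frequency):
    """Clamp OmniWalk-style inputs to their documented ranges (hypothetical helper)."""
    return (clamp(x, -1.0, 1.0),               # forward/backward
            clamp(y, -1.0, 1.0),               # left/right
            clamp(theta, -1.0, 1.0),           # rotation
            clamp(step_frequency, 0.0, 1.0))   # step frequency

print(omniwalk_inputs(2.0, -0.5, 0.0, 1.5))  # (1.0, -0.5, 0.0, 1.0)
```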
B - DemoOmni
We have just explored the OmniWalk box. The DemoOmni box is an example program built around it.
The four OmniWalk box input signals become the parameters of the DemoOmni box (Image 2.74).
The DemoOmni box combines regular OmniWalk walking with walking while waving the arms.
It also activates a safety feature to prepare NAO for dangerous situations that may occur while
walking: when the Force Sensing Resistors (FSR) detect that both feet are off the ground, NAO
stops walking.
- Left arm enabled: determines whether or not the left arm will move
- Right arm enabled: determines whether or not the right arm will move
- Stop walk when foot contact is lost: determines whether
or not the safety feature will be activated
Image 2.75 - Internal configuration of the Demo Omni box
The Demo Omni box consists of OmniWalk, Joystick, Protection, and EnableArms boxes (Image 2.75).
Joystick box uses the output to send the parameters of the Demo Omni box.
The Protection box activates “ENABLE_FOOT_CONTACT_PROTECTION” of the motion module.
The safety feature is explained in the reference (NAOqi Guide/Motion/Safety).
C - Endless Walk
The Endless Walk box, as the name suggests, makes NAO walk without stopping. The OmniWalk box is used
for walking, and ultrasonic, bumper, and battery sensors are used for sensing. It is also
possible to use the camera to implement the face tracking function. Walking stops when the touch sensor
is used, and when NAO falls, the program ends after it gets back up. The SpeedX, SpeedY, and SpeedRotation
parameters are same as the DemoOmni box parameters, and faceTracking shows whether or not the face
detection and tracking have been activated.
Image 2.77 - Internal configuration of the Endless Walk box
Image 2.77 shows the internal configuration of the Endless Walk box. A simple explanation regarding
walking, sensors, and face tracking is provided in the TextEdit box. Endless Walk box uses a lot of boxes
because it has a lot of diverse functions. They are divided into four areas based on each function,
and the areas are not connected with each other.
The program operates normally even though each area is not connected because the Endless Walk box
memory (neverEndingWalk) is used. This type of programming is efficient when movements are controlled
by ultrasonic sensors (Areas 2 and 4). The following provides an explanation for each area.
- Area 1: This is where face recognition and tracking are executed. The faceTracking parameter
of the Endless Walk box is verified through the ifFaceTracking box. Face recognition is executed when
faceTracking is selected. Face Coord box uses FaceDetected, a memory variable, to acquire and produce
the angle of the face. The structure of the FaceDetected variable is explained in the reference
(NAOqi API/ALFaceDetection).
When the Face Coord box produces the angle of the face, the Tracker box uses this information to manipulate
the head. The Random Int box generates a random number when this signal enters. If the number is 1,
NAO will say “Hello,” and if it is 2, NAO will execute a greeting motion.
- Area 2: This is where an ultrasonic sensor is used to detect obstacles; it consists of the Sonar box
and SonarDemo box. The output from the Sonar and SonarDemo boxes does not connect with other boxes.
The Sonar box used here is different from the Sonar box used in the Sensors library.
The Sonar box processes the value of the ultrasonic sensor to save the distance between NAO and the object
in the memory. SonarDemo box determines NAO’s actions when there is an object. The action here is not a
function of avoidance, but rather a movement that rotates the head depending on the direction of the object.
- Area 3: This is NAO’s safety feature for when the battery is low. Because Endless Walk executes
a constant walking motion, the robot could fall down while walking if the battery is low.
If the battery power is insufficient, the DemoOmni box in Area 4 will stop first. Even if a stop signal
is applied to the DemoOmni box, it will take a bit of time for it to completely stop. To guarantee
this time, a delay of three seconds is generated through the Sleep box. The Movement box makes
NAO sit down, and the Stiffness box sets NAO’s joints to the Un-enslaved state.
- Area 4: This is where walking is executed and the robot is manipulated using the DemoOmni box.
Before starting to walk, the Init box sets the language, loudness of the sound, and Stiffness and stands
up. Init pose box is used to adjust the posture. The poolCommand box gets the information
(X, Y, and Theta) necessary for OmniWalk from the memory (neverEndingWalk) and sends this value
to the DemoOmni box. DemoOmni box is the box that actually commands the walk.
The MarcheInfinie box determines how the robot will be controlled. This is where you can
generate the information that will be used by the poolCommand box. It determines the actions
of the ultrasonic, contact, bumper, and FSR sensors.
D - WalkTo
WalkTo box determines where to walk to (Image 2.78). The parameter includes X and Y (distance values) and
Theta (angle of rotation) regarding the destination, and other parameters are the same as the Demo Omni
box. The following shows the explanation regarding Parameter X, Y and Theta:
- X: Forward/backward distance in meters. Range is from -2.0 to 2.0, and the default value is 0.2.
- Y: Sideways (left/right) distance in meters. Range is from -2.0 to 2.0, and the default value is 0.
- Theta: Angle of rotation in radians. Range is from -3.14 to 3.14, and the default value is 0.
2.5.8 Audio Library
NAO has four microphones and two speakers. Audio is an important medium for communication between
NAO and people, and the Audio library provides diverse functions to facilitate smooth communication.
A - setVolume
The setVolume box sets the audio volume (Image 2.79). The parameter is volume percentage,
and 0 means it has been muted.
B - setLanguage
The setLanguage box can set the language used by NAO (Image 2.80). As introduced in Chapter 1, Chinese,
English, French, German, Italian, Japanese, Korean, Portuguese and Spanish can currently be used.
D - Music
The Music box plays audio files (Image 2.81). Supported audio formats include .wav, .mp3 and .ogg.
The Music box consists of a Music File box that determines the location of the file you wish to play and a PlayMusic
box that actually plays the file (Image 2.82). When you click the folder button of the Music File box, a “Select a file”
window opens where you can choose a file. The files listed in this window are registered in Choregraphe
and are sent together with the program. If the purpose is to send a program rather than to play music, it is
a good idea to delete the music file from this list.
E - Say
The Say box lets NAO read the Text (Image 2.83).
This is used to let NAO read specific texts or sentences. The parameter is as follows:
- Text: String of text to be read by NAO; double quotation marks (“”) are not used.
- Voice Shaping: A value for the depth of the voice; range is from 50 to 150 with 100 as the default
value. Lower number means a deeper voice; 75 is recommended for a male voice.
- Speed: A percentage value of the speaking rate; range is from 50 to 200 with 100 as the default
value. The lower the value, the slower the speaking rate.
F - Say Text
The Say Text box has a similar function to the Say box, but it receives the Text parameter
from its input. Since this parameter is the same as in the Say box, it is omitted here.
Image 2.85 - Example of Say box (top) and Text box (bottom)
Image 2.85 is an example of how to use the Say and Say Text boxes. When there is an obstacle, the last
output of the box uses a string to let you know the location of the obstacle. In order to use the Say box, the
Dispatcher box has to be used to determine whether it’s the left or the right, and the two Say boxes have to be
used to select the Text for each instance. However, when the Say Text box is used, NAO can directly read the
string of text received from the Sonar box.
The SpeechReco box determines whether or not the voice acquired from the microphone matches the set
word (Image 2.86). The first output is activated when speech recognition starts. The second output sends the
matching word when speech recognition is successful. The third output is activated when there is no matching
word. The following shows the parameters of the SpeechReco box:
- Word list: Words used for speech recognition; semicolon is used to separate the words.
- Language: Sets the language of the voice. Currently Chinese, English, French, German,
Italian, Japanese, Korean, Portuguese and Spanish are supported.
- Threshold: Numerical value that represents the precision of the speech recognition;
if the recognition of the word is lower than the Threshold value, it is assumed that the word
was not recognized. Range is from 0 to 1, and the default value is 0.4.
- Visual expression: Determines whether or not LEDs are used during speech recognition.
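The semicolon-separated Word list format described above can be built programmatically. This is a small illustrative sketch; the words themselves are example values, not from the book.

```python
# Build a Word-list string in the semicolon-separated format
# that the SpeechReco box's "Word list" parameter expects.
words = ["hello", "goodbye", "yes", "no"]
word_list = ";".join(words)
print(word_list)  # hello;goodbye;yes;no
```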
F - Pronounce
The Pronounce box reads the word chosen by the user. Unlike the Say box, it reads all the registered
pronunciations when there are several different pronunciations available. The Pronounce box can be used to
improve the accuracy of the Speech Reco box. A good example is the French word “bonjour” where four different
pronunciations are provided. The Parameters include Word and Language.
G - Record
The Record box records the sound from the microphone (Image 2.88). There are four inputs;
the first starts the recording and the second stops it. The third input plays the recorded file
and the fourth stops it. The following explains the parameters.
- Duration: Time to be recorded in seconds. Range is from 0 to 60, and the default value is 5 seconds.
- Filename: File name used when saving the recorded content as an audio file.
Audio files are saved in the “/home/nao” folder.
- Number of channels: Sets the number of microphones. 1 and 4 can be selected;
1 only uses the front microphone and is saved in .ogg format. 4 uses all four microphones
and is saved in .wav format.
H - Sound Location
The Sound Location box detects nearby sounds and produces the angle of the sound source (Image 2.89).
The second output (soundLocation) produces two radian values for the angle of rotation for the sound source.
The first radian value is the azimuth and the second radian value is the elevation angle. The third output
(headPosition) contains information regarding the head and produces six values. The first three values
indicate the location of the head and the remaining three indicate the head’s angle of rotation. Trust threshold
is the confidence value that determines whether a detected sound is reported; the range is from 0.5 to 1.0, and the
default value is 0.5. Enable move determines whether to move the head toward the sound source.
Sound Location box acquires the information regarding the sound source from the soundLocated variable
(Image 2.90). The Head Track box moves the robot based on the information acquired from Sound Location.
2.5.9 Video Library
NAO has a high definition camera that looks straight ahead and a camera that looks downward.
NAO’s vision system is configured to obtain the necessary information from the video gathered through
the cameras. The Video Library provided by Choregraphe consists of boxes that use this vision system.
A - Select Cam
Select Cam box chooses the camera to activate (Image 2.91). The first input activates the front-facing
camera, and the second input activates the downward-facing camera.
B - Face Detection
Face Detection box uses the cameras to detect faces and produces the numbers of people present
(Image 2.92). The second output produces the number of detected faces, and the third output is activated
when no faces are detected.
Internally, a detection box reads the FaceDetected variable from memory, and the number of faces
is calculated from the size of that variable.
C - Add/Del Faces
Add/Del Faces box builds the database necessary for face recognition (Image 2.93). The first input receives
the string of names that can distinguish between the faces. If the string signal enters this, the faces
and names are added. The second input initializes the database. The following explains how to add the faces
to this database.
When a string signal enters the first input, there is a five-second delay because of the Wait box. During these
five seconds, the face is detected and added to the database. Facial data is added when the
LearForgetFace box receives the facial data from the Delay Msg box. The eye LEDs turn green if the facial
data is properly added to the database; if it fails, the LEDs turn red.
D - Face Reco
The Face Reco box uses the cameras and the faces from the database to recognize faces (Image 2.94).
The Face Reco box identifies who the faces belong to (if any). When face recognition is successful,
the second output produces the name of the face. When several faces have been recognized at once,
the outputs of names are done in order.
E - NAOMark
NAO Mark box recognizes predefined marks (Image 2.95). If the mark is recognized, the mark’s number
is output into the second output, but if it is not recognized, the third output is activated.
Choregraphe provides 10 marks; the mark images and numbers can be found under “media/NAOmark.pdf”
in the installation CD (Image 2.96).
F - Vision Recognition
Vision Recognition box compares the images acquired from the cameras with the images saved
in the database to determine the existence of an object (Image 2.97). This database is different from the one
used by the Face Recognition box. The database used in Vision Recognition can be generated by using the
video monitor we previously introduced. The second output produces the object’s name if there is an object
in the image. The third output is activated if the object does not exist.
Example 2.8 (Example File: Ex2.8_vision reco.crg)
Image 2.98 shows an example of how the Video monitor is used for database generation, how
the Vision box is used for detecting the cell phone, and how to determine whether the cell phone
is open or closed. The following explains how the Video monitor is used to generate this database.
Image 2.99 shows how the Video monitor is used to extract and register the object within an image.
First, Select Camera should be used to choose the downward-facing camera.
- The image acquired from NAO’s camera is shown when you click the Video monitor’s playback button.
- A five-second delay occurs when the Video monitor’s study button is pressed.
This time is for acquiring a stable image.
- The study area is set when the object’s outline becomes a closed curve.
- There are three categories you can register to: Book, Object, and Location. It has been registered
as Object in this example. Here, the registration is saved in the Choregraphe database.
When the send button is clicked, Choregraphe’s database is sent to NAO.
Image 2.100 - Results from the example (left) and an image for object detection using Monitor (right)
Image 2.100 shows the result of running Example 2.8. You will see that [“closed,” “cellphone”] is output
at the second output of the Vision Reco box when object detection is successful.
The Monitor program introduced in Chapter 1 can verify the result of NAO’s image processing. The right side
of Image 2.100 shows how Monitor was used to detect an object. You can tell that object detection was successful
because “closed,” “cellphone” is displayed on the cell phone.
2.5.10 Tracker Library
In order for NAO to follow a specific object, it needs information about the object, recognition, and accurate gait.
Choregraphe provides the function for using the camera to track the red ball and faces.
A - WB Tracker
WB (Whole Body) Tracker box tracks the object while maintaining the initial pose (Image 2.101).
If tracking is successful, the second output is activated. If tracking is unsuccessful (if an object in the image
does not exist), the third output is activated. Target choice refers to the object you wish to track, and it can
only track human faces or red balls with a radius greater than 6cm. Time before lost is time spent finding the
object. In other words, if the object isn’t found within this time, the tracking is deemed unsuccessful.
The measure is done in seconds and the range is from 0.0 to 5.0 with 1.0 as the default value.
B - Walk Tracker
The output is the same as the WB Tracker. Target choice and Time before lost are also identical to the WB
Tracker. Threshold for walk forward/backward is NAO’s threshold value for how far the tracked object
is from NAO; it is measured in meters and the range is from 0.0 to 1.0.
These two parameters determine NAO’s direction of movement. NAO will move forward if the distance
to the tracked object is greater than the Threshold for walk forward value, and it will move backward
if the value is less than the Threshold for walk backward. If the value is in between the two threshold values,
NAO will remain in its place.
The Walk Tracker box has the Tracker box and WalkToTarget box inside (Image 2.103). Tracker box recognizes
the object and calculates the direction. WalktoTarget box uses the information regarding the angle and direction
to move NAO.
A - Send E-mail
Send E-Mail box is used to send e-mail (Image 2.104). The output is activated if the e-mail is sent successfully.
The following shows the parameters:
Image 2.105 shows the e-mail sent by NAO, as captured from the webmail page.
B - Fetch E-mail
Fetch E-mail box is used to receive e-mail (Image 2.106). These e-mails are saved in various formats.
The most common are text (.txt), webpage (.html), image (.jpg), and audio (.wav) files.
The parameters are:
- POP address: enter the POP address of the incoming mail server
- E-mail address: enter the e-mail address of the recipient
- Password: enter the e-mail account password
- SSL port: enter the SSL port number
If you place the mouse on the output after an e-mail is received, a message will appear (Image 2.107).
Information regarding the e-mail appears at the bottom. The e-mail is saved in “/var/volatile/tmp/”.
If you use an FTP program like WinSCP, you can read the e-mails saved on NAO from your PC (image 2.105).
3 Learning Python
Chapter 3 will have a short introduction
to Choregraphe scripts and Python for NAOqi.
There is a basic description of Python syntax
and a discussion about creating and editing
Choregraphe script boxes.
3.1 Before Getting Started
Most functions provided by the NAO humanoid robot can be used through Choregraphe. Many diverse
tasks, like turning the LED lights on, repeating set movements, and making sounds, are supported by
Choregraphe boxes, which can be used to perform various types of robotic tasks. However, it is difficult
to execute tasks that aren’t supported by a box, and it is incredibly tough to use boxes to construct
complicated algorithms like image processing.
This is why you must be able to edit the parameters or box algorithms used by the previous box and know
how to create new boxes. This can be done using C/C++ and Python, which will be discussed in this chapter.
As an object-oriented, interpreter-based language, Python has the advantage of being able to quickly view
the content being tested.
NAO’s Choregraphe environment uses Python to edit various things like the parameter settings of the box
and default flow (Image 3.1). You will be able to use Python in the future to directly control NAO’s hardware
through linking NAOqi or DCM, thereby making it possible to make desired changes to existing functions. If
you know how to work with Python, you can take advantage of NAO’s advanced functions.
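As a taste of this, the sketch below shows how a Python script might call a NAOqi module directly. This is a minimal sketch under stated assumptions: it requires the NAOqi Python SDK to be installed and a robot reachable over the network, and the default address is a placeholder.

```python
def say_text(text, robot_ip="nao.local", port=9559):
    """Minimal sketch: speak a string through NAOqi's ALTextToSpeech module.

    Assumes the NAOqi Python SDK is installed and a robot is reachable at
    robot_ip (the address here is a placeholder, not a real robot).
    """
    from naoqi import ALProxy  # deferred import: only needed when actually called
    tts = ALProxy("ALTextToSpeech", robot_ip, port)
    tts.say(text)
```

Calling `say_text("Hello")` on a machine with the SDK and a connected robot would make NAO speak; the same proxy pattern applies to other modules such as ALMotion.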
This chapter will not provide a detailed explanation of Python. This chapter will introduce Python’s basic
functions and explain some of the functions needed to operate NAO. The main focus will be on becoming
familiar with the functions shown in the Choregraphe box examples.
3.2 Overview
Python is a simple language in which you can quickly test and verify code, unlike existing languages
where you have to compile, execute, and debug; this greatly reduces the time spent on testing.
Also, since data types are determined dynamically, you can easily write code without worrying
about them.
Image 3.2 above shows the code for the variables produced when ‘hello’ and 1234 are each placed in variable
‘a,’ and you will see that the data type changes automatically.
If you were using C, a statement like ‘int a’ would be necessary to specify the variable’s type, and
assigning ‘hello’ to a variable declared as ‘int a’ would generate an error.
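The dynamic typing described above can be reproduced at the prompt (shown here with Python 3 syntax; the book's screenshots use the older Python 2 interpreter):

```python
a = 'hello'
print(type(a))  # the variable currently holds a string
a = 1234
print(type(a))  # rebinding to an integer changes the type automatically
```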
The most typical advantage in dynamic data determination is the creation of a generalized code.
Let’s say that you are creating a function called ‘add’ where two variables will be added, and assuming
that the integer and floating point value will be entered, the overloading function in C++ will be used to create
the code below:
float add(float a, int b)
{
return a + (float)b;
}
However, in Python, the short definition below will suffice.
def add(a, b):
    return a + b
Generalized code can be created thanks to dynamic typing, which increases productivity
compared to C and C++.
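The generality of the single `add` function can be checked directly: the same definition works for integers, floats, and even strings (Python 3 syntax):

```python
def add(a, b):
    return a + b

print(add(1, 2))       # 3   (integers)
print(add(1.5, 2))     # 3.5 (mixed float/int)
print(add('Na', 'O'))  # NaO (strings are concatenated)
```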
The default data structures provided by Python include strings, lists, tuples, and dictionaries. Lists can obtain
the element values from the corresponding location using the [ ] operator, which has the same form as the
existing array. However, unlike an array, other data can be inserted in the middle, so you can create internal
element values in completely different forms (like lists, strings, etc.). For example, the list structure shown
below contains other lists along with strings and numbers (Image 3.3).
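A mixed list of the kind shown in Image 3.3 can be sketched as follows:

```python
# One list holding a string, a number, and another list.
mixed = ['Monday', 1234, [1, 2, 3]]
print(mixed[0])     # Monday
print(mixed[2])     # [1, 2, 3] -- an element can itself be a list
print(mixed[2][1])  # 2 -- indexing into the nested list
```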
3.2.5 Automatic Memory Management
Since Python uses Garbage Collection like Java does, the user does not have to worry about dynamic memory
allocation and deallocation. If necessary, Python automatically allocates memory and automatically deallocates
when you are done using it. It can also automatically increase or decrease the amount of memory depending
on the need.
3.2.7 Scalability
Python is very compatible with other languages. Other languages and Python can call each other’s modules.
Even code without source, exposed only through a library interface, can be used by employing a simple
interface function.
For NAO, Python is installed when Choregraphe is installed, and most of the examples and source codes used
here can be tested using IDLE, a Python development environment (Image 3.4).
You can also edit and test some Choregraphe modules
3.3 Data Types
and Operators
An error like the one shown in Image 3.5 will occur if one of the reserved words is used as a variable name.
Image 3.5 - Error that occurs when using a reserved word as a variable name
In order to enter numbers in other bases, prefix the value with ‘0b’ for binary, ‘0o’ for octal, or ‘0x’ for
hexadecimal. Conversely, the bin, oct, and hex functions convert a value back to those representations.
Image 3.7 is an example of this.
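The prefixes and conversion functions can be tried directly (Python 3 syntax):

```python
print(0b1010, 0o12, 0xA)          # 10 10 10 -- three notations, one value
print(bin(10), oct(10), hex(10))  # 0b1010 0o12 0xa
```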
In Python, the float type processes real values. Real numbers can be entered like 3.14 and 2.71, or in
exponent form like 314e-2 and 271e-2 (Image 3.8). Image 3.8 also shows a small error appearing below the
decimal point in the result. This is a common rounding error in floating-point representation, so it will
not be explained in detail here.
Python also supports complex numbers. The imaginary part is indicated with ‘j,’ and you can enter
and process complex numbers as shown in Image 3.9 below.
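A short complex-number session along the lines of Image 3.9:

```python
z = 3 + 4j
print(z.real, z.imag)       # 3.0 4.0 -- real and imaginary parts
print(abs(z))               # 5.0 -- magnitude
print((1 + 2j) * (1 - 2j))  # (5+0j) -- product with the conjugate
```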
Operators like the basic arithmetic operations (+, -, *, /), remainder (%), power (**), and integer division (//)
are provided for numerical values. The power operator has a higher precedence than the basic arithmetic
operations, and integer division keeps only the integer part of the quotient (Image 3.10).
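A quick check of the precedence and integer-division rules described above (using Python 3, where / is true division and // is floor division):

```python
print(2 ** 3 * 4)  # 32: the power operator binds tighter than *
print(7 / 2)       # 3.5: true division in Python 3
print(7 // 2)      # 3: floor division keeps only the integer part
print(-7 // 2)     # -4: the result is floored, not truncated
print(7 % 2)       # 1: remainder
```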
Python strings can be written using either single quotation marks (‘) or double quotation marks (“).
They can be used as shown in Image 3.11 below, but the two styles cannot be mixed within one string.
There are also triple quotation marks (\"\"\") for writing long, multi-line strings, and special characters
like newline and tab are supported (Graph 3.2).
\t — Tab
\r — Carriage return
\0 — NULL
\\ — Backslash (\)
\' — Single quotation mark (')
\" — Double quotation mark (")
100 3 - python
3. 3 data types and operators
The addition operator (+) is used to merge strings, and the multiplication operator (*)
is used to repeat the strings. The [ ] operator can also be used to call each of the string elements,
and the first text starts from 0.
A slice operator is provided to extract multiple characters, and it is used in [start:end] form. The [ ] operator
can be used to read elements, but they cannot be modified. Negative indices are also supported:
if the positive indices run from 0 to 4, the same elements can be accessed with indices -5 to -1 (image 3.12).
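Indexing, slicing, and negative indices can be demonstrated on a short string:

```python
s = 'Python'
print(s[0])    # P   -- the first element is index 0
print(s[1:4])  # yth -- slice from index 1 up to (not including) 4
print(s[-1])   # n   -- negative indices count from the end
print(s[-6])   # P   -- the same element as s[0]
```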
Image 3.12 - An error that occurs when string elements are changed (ex09.py)
Lists have a data structure similar to an array, but there is an advantage of being able to insert and delete
with great flexibility. Python does not support arrays but rather supports lists with a basic data structure,
and you can save data that are different from one another within the list. Lists can be created using the [ ]
operator, and the main methods include the append method for insertion, insert method, remove method
for deletion, and index, count, sort methods for searching and sorting.
Image 3.13 below shows how to generate a list. One list is formed from strings that correspond to
each day of the week, and each element can be accessed using the [ ] operator.
The following adds an element to the list using the append method. If the append method is used,
the content is added as the very last element of the list (Image 3.14).
The following adds an element using the insert method. The insert method first selects where
the element will be inserted; Image 3.15 below verifies that ‘Sunday’ has been inserted at index 0.
The remove method removes the corresponding element, as can be verified in Image 3.16 below.
The index method returns the position of the corresponding element, and the count method tells you how
many times the element appears in the list. Also, the strings are sorted in ascending alphabetical
order if the sort method is used, and the order is reversed (giving descending alphabetical order after
a sort) if the reverse method is used (Image 3.17).
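The list methods just described can be combined in one short session:

```python
days = ['Monday', 'Tuesday', 'Wednesday']
days.append('Thursday')      # added at the very end
days.insert(0, 'Sunday')     # inserted at index 0
days.remove('Tuesday')       # deleted by value
print(days)                  # ['Sunday', 'Monday', 'Wednesday', 'Thursday']
print(days.index('Sunday'))  # 0
print(days.count('Monday'))  # 1
days.sort()                  # ascending alphabetical order
days.reverse()               # now descending
print(days)                  # ['Wednesday', 'Thursday', 'Sunday', 'Monday']
```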
B - Tuple
A tuple is similar to a list, but it is a read-only data structure. While a list is generated
and processed with the [ ] operator, a tuple is generated with the ( ) operator; the [ ] operator is used
only to read the data (image 3.18). Since a tuple is a read-only data structure that cannot be edited,
only the count and index methods are supported. Otherwise it plays the same role as a list.
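The read-only behavior of tuples can be seen in a few lines:

```python
week = ('Saturday', 'Sunday', 'Saturday')
print(week[1])                 # Sunday -- [ ] reads an element
print(week.count('Saturday'))  # 2
print(week.index('Sunday'))    # 1
# week[0] = 'Monday'  # would raise TypeError: tuples cannot be modified
```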
C - Dictionary
A dictionary is a data structure that consists of keys and values. You can get a value by using its key
(image 3.19); numeric indexing is not supported, and an error will occur if you use a key that does not exist
(image 3.20). You can assign additional keys and values to add new entries, and the items, keys, and values
methods are supported for obtaining the contents.
Image 3.20 - The error that occurs when you use a key that does not exist (ex12.py)
The items method returns all the dictionary’s key-value pairs wrapped as tuples, while the keys
method returns the keys and the values method returns the values (Image 3.21).
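The dictionary operations above can be sketched as follows. Note that in Python 3 the items, keys, and values methods return view objects rather than the lists of the book's Python-2 era, so list() is used here to display them:

```python
robot = {'name': 'NAO', 'joints': 25}
print(robot['joints'])       # 25 -- look up a value by its key
robot['cameras'] = 2         # assigning to a new key adds an entry
print(robot.get('wheels'))   # None -- robot['wheels'] would raise KeyError
print(list(robot.items()))   # [('name', 'NAO'), ('joints', 25), ('cameras', 2)]
print(list(robot.keys()))    # ['name', 'joints', 'cameras']
print(list(robot.values()))  # ['NAO', 25, 2]
```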
3.4 Control Statements
Python statements are executed one by one in sequential order. Control statements such as conditions
and loops change this sequential flow. Loops are used to execute the same or similar tasks several
times, while conditions determine whether or not to execute a task depending on a condition.
The most common loops are the ‘for’ and ‘while’ statements, and the condition statement is ‘if.’
Image 3.22 - Example of a control statement using the ‘if’ statement (ex13.py)
If the condition is true, Processing Syntax 1 is executed; otherwise, Processing Syntax 2 is executed.
Unlike other languages, Python does not wrap the processing syntax in a block. This is why the lines
corresponding to Processing Syntax 1 must all have the same amount of indentation; if this is not the case,
an error may occur or the code may not behave the way you want it to (Image 3.23). Even though the
indentation of the ‘if’ and ‘else’ statements looks different in Image 3.22, if you measure from the start
of each line, you can see that they actually have the same amount of indentation.
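An if/else statement along the lines of Image 3.22, with the indentation that Python requires (Python 3 print syntax):

```python
score = 85
if score >= 80:
    print("Good job.")                      # the indented block runs when the condition is True
else:
    print("You might have to try harder.")  # this block runs otherwise
```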
Additionally, it is possible to sequentially test several different conditions using ‘elif’. ‘if’ and ‘elif’ can be
combined to expand the program above to output a grade for each score range, and the code that makes this
happen can be generated as shown below in Image 3.24. With C/C++, a part of this code had to be expressed
using 90 <= score && score <= 100, but in Python it can be expressed using 90 <= score <= 100.
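The grading idea above can be sketched with chained comparisons (the score value and grade boundaries here are illustrative):

```python
# An if/elif grading sketch using Python's chained comparisons; in C/C++
# the first test would have to be written as 90 <= score && score <= 100.
score = 85  # illustrative value

if 90 <= score <= 100:
    grade = "A"
elif 80 <= score < 90:
    grade = "B"
elif 70 <= score < 80:
    grade = "C"
else:
    grade = "F"

print(grade)  # B
```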
Image 3.25 - A code that obtains the total for 1-5 using the ‘while’ statement (ex15.py)
3.4.3 The ‘for’ Statement
The ‘for’ statement, like the ‘while’ statement, is a typical loop statement, but it is used differently than in C/C++.
The following shows the structure of Python’s ‘for’ statement.
Here, object S has a sequential form: lists, strings, tuples, and dictionaries can all be used.
Please refer to another book for using the ‘iter’ method to create iterators.
Image 3.26 below shows an example of the ‘for’ statement. Each element of the number list is assigned to
the ‘i’ variable in order, and each element is produced sequentially through the ‘print i’ command.
Image 3.27 below is the result of using the ‘range’ function to generate the list of numeric values.
As shown in image 3.28 below, when the ‘range’ function and ‘for’ statement are combined,
it can create the same program the ‘while statement’ did that can add 1-5.
image 3.28 - Example of using ‘for’ and ‘range’ for adding (ex18.py)
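The combination described above can be sketched as:

```python
# Adding 1 through 5 with 'for' and 'range'; range(1, 6) produces the
# numbers 1..5, stopping before 6.
total = 0
for i in range(1, 6):
    total = total + i

print(total)  # 15
```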
3.5 Functions
Functions are used to wrap several statements into one unit for processing. ‘type’, ‘range’, and ‘print’ are some
of the typical functions we have explored thus far. These functions have already been defined in Python to
play those roles; in this chapter we will explore how the user can define and use functions. Python functions
actually contain a lot more complicated and diverse information than what will be explained in this chapter,
so we will describe only the most basic information here. For more in-depth information, please refer
to a specialized book about Python.
3.5.1 Definition
Python declares functions in a slightly different manner than other existing languages. A function
is declared starting with ‘def’ and ends with a colon (:), and the beginning and end are denoted with
indentation. Please keep this in mind because it is different from C/C++ or BASIC where the beginnings
and endings are generally explicitly denoted.
‘def’ is a syntax that declares a function, and ‘function name’ is the name that will be used when calling the
corresponding function. ‘Argument 1, Argument 2, ... Argument N’ inside the parentheses is used to record
the necessary external delivery variable when processing the function, and colon (:) is used to end the
declaration.
The processing syntax processes the corresponding function; the syntax can be used freely and another
function can be called by the processing syntax. ‘return’ is used to return the resulting value of the processed
function; you can end the function even without ‘return’, in which case the ‘None’ value will be returned.
Image 3.29 below shows a function that adds two arguments (a and b) and returns the value.
Image 3.29 - Example of how to declare and use the ‘add’ function (ex19.py)
The result is returned after the a and b arguments are added. Since Python has nowhere to declare
a type, the argument type is determined at the point the argument is passed. Hence all types that support
the + operation can be passed, and addition is possible for both numbers and strings.
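A minimal sketch of this behavior:

```python
# Because the argument type is determined only when the function is called,
# the same 'add' function works for any type supporting the + operator.
def add(a, b):
    return a + b

print(add(3, 4))          # 7
print(add("NAO", "qi"))   # NAOqi
print(add([1, 2], [3]))   # [1, 2, 3]
```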
3.5.2 Return Value
As shown in image 3.29 above, the return value is possible because of the ‘return’ statement. When you
encounter a ‘return’ statement while executing a function, the corresponding function closes and returns
to where it was called from.
The function will end if ‘return’ is not used or if ‘return’ is the only thing written; in this case, object ‘None’
will be delivered as the return value (image 3.30).
‘return’ can only return one object. However, it can effectively return several values, because Python
turns multiple return values into a single tuple.
The next example uses the ‘calc’ function to return the entire result from the arithmetic
(image 3.31).
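A sketch of a ‘calc’-style function; the book's exact version in ex21.py may differ, but the idea is that the four arithmetic results come back packed in one tuple:

```python
# Returning several values at once: Python packs them into a single tuple.
def calc(a, b):
    return a + b, a - b, a * b, a / b

result = calc(6, 3)
print(result)        # (9, 3, 18, 2.0)

# The returned tuple can also be unpacked into separate variables.
s, d, p, q = calc(6, 3)
print(s, d, p, q)    # 9 3 18 2.0
```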
3.5.3 Parameter
Python passes parameters to functions by reference. This is a bit different from C/C++. In Python,
the behavior depends on whether or not the parameter can be modified (that is, whether its type is mutable).
For general number values, even if the data is modified within the function, this is not reflected outside
the function. However, for parameters formed by lists, if you change the data within the function, you will see
that the modified content is reflected outside the function (image 3.32).
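The difference described above can be sketched as:

```python
# Rebinding an immutable value inside a function is invisible outside it,
# while mutating a list in place is visible to the caller.
def set_to_zero(n):
    n = 0             # rebinds the local name only

def clear_first(items):
    items[0] = 0      # modifies the caller's list in place

x = 10
set_to_zero(x)
print(x)              # 10 (unchanged)

data = [10, 20]
clear_first(data)
print(data)           # [0, 20] (modified)
```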
image 3.32 - Difference in processing functions between modifiable and non-modifiable types (ex22.py)
3.5.4 Pass
The ‘pass’ syntax is used to create code that does not execute any action. Image 3.33 below shows
code that does not do anything; even if it is executed, it shows no results.
‘pass’ is used fairly often when generating codes. For example, when creating a temporary function, module,
or class during a project, you can assign a name but not create any content for it.
This is when ‘pass’ is used. As explained in image 3.30 above, although you can use ‘return’ instead of ‘pass’,
you have to use ‘pass’ for a class because a class has no return value.
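The placeholder usage described above can be sketched as (the names are illustrative):

```python
# 'pass' as a placeholder: the names exist, but no behavior is defined yet.
def not_written_yet():
    pass

class PlaceholderBox:
    pass

print(not_written_yet() is None)  # True: falling off the end returns None
```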
3.6 Class
Object-oriented programming is possible in Python through classes. As in C++ or Java, data and functions
can be implemented together at the same level, and classes are used for abstract programming.
This chapter will explore the method necessary for object-oriented implementation and class declaration.
However, we will not go into details about the concept behind object orientation, inheritance, polymorphism,
and information hiding.
3.6.1 Declaration
A class declaration defines the data and methods together. You can define a simple class
without any content using ‘pass’. In the code below, a ‘classobj’ object is created when the class is declared (they
happen at the same time). You can create an instance by assigning a constructor call to any name. We will
not go into details about constructors here; for now, they are simply the method called first when creating instances.
image 3.35 demonstrates how general classes are used. ‘import math’ is the module called to load
the ‘sqrt’ function which obtains the square root. This will be explained in detail in the next chapter.
In general, a module refers to a collection of functions with specific purposes. The Point class declaration
is indicated by ‘class Point:’, and the x and y variables are initialized and assigned inside it. The ‘def
distance(self)’ method is declared here, and ‘self’ plays the same role as ‘this’ in C++ and Java.
A class method in Python must have ‘self’ as its first argument by default.
It points to the instance object itself and accesses the x and y variables within the class through ‘self.x’
and ‘self.y’. ‘p1 = Point()’ creates an instance of the ‘Point’ class, and internal variables
and methods are called using the dot operator, as in ‘p1.x’ and ‘p1.y’.
image 3.35 - General class (defining variables and methods and instance call) (ex25.py)
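A sketch modeled on the Point class described above (the default coordinates 3.0/4.0 are illustrative):

```python
# 'self' points to the instance, so self.x and self.y access that
# instance's own variables; distance() returns the distance from the origin.
import math

class Point:
    def __init__(self, x=3.0, y=4.0):  # illustrative default coordinates
        self.x = x
        self.y = y

    def distance(self):
        return math.sqrt(self.x * self.x + self.y * self.y)

p1 = Point()
print(p1.x, p1.y)     # 3.0 4.0
print(p1.distance())  # 5.0
```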
3.6.2 Relationship Between Class and Instance
Class instance refers to the memory space used for storing and saving the actual class content.
In image 3.36 below, class structure is declared by ‘class …,’ and ‘x1 = secondClass()’ and ‘x2 =
secondClass()’ create each of the instances of a class.
Both ‘x1.name’ and ‘x2.name’ store the ‘hi’ string because no changes were made to the data initially,
but if ‘x2.name = “hello”’ is used to change the data, you will see that the content of ‘x2.name’ changes.
Even though ‘x1.name’ belongs to the same class, its value does not change from ‘hi’. This is because ‘x1’
and ‘x2’ each have their own independent instance memory.
Another unique feature of Python is how you can dynamically and independently add instance and class
variables. Let’s say that you use the same instance of class for both ‘x1’ and ‘x2’ and add a new variable
named ‘age’ in ‘x1.’ This will now be a variable unique to ‘x1’ and will not affect ‘x2’ in any way. Although
this is one of Python’s unique features, it would be best not to use it, because if you don’t maintain
consistency within a class, it will be very difficult to figure out where an error has occurred when
the program gets bigger.
The ‘isinstance’ method can be used to learn about the relationship between class and instance.
An instance is placed for the first argument, and the name of a class will be placed
in the second argument. You can use this to determine whether an instance was created
from the corresponding class.
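A minimal sketch of the check described above:

```python
# 'isinstance(instance, class)' tells whether the instance was created
# from the given class.
class Robot:
    pass

r = Robot()
print(isinstance(r, Robot))   # True
print(isinstance(25, Robot))  # False
print(isinstance(25, int))    # True: built-in types work too
```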
Image 3.38 - The ‘isinstance’ method used to learn about the relationship between class and instance (ex28.py)
In Python, the constructor is defined as ‘__init__()’ and the destructor as ‘__del__()’. If ‘__’
is attached to the front and back of a variable or function name in Python, it means that the name is predefined
for a special purpose. For the constructor method, you can deliver the member variables
for initialization when creating an instance, much like how you would deliver an argument
when a function is called.
Constructor and destructor are methods inside a class, so the first argument must point to
their own instance. The corresponding argument being used as ‘self’ can have a different name,
but it is recommended to keep it as ‘self’ since it is generally used as such.
Image 3.39 - Simple example using the constructor and destructor (ex29.py)
Image 3.39 shows a simple example of both constructor and destructor. When the constructor is called in
this example, the “Constructor Called” message is printed, and when the destructor is called, “Destructor Called”
is printed.
The constructor is called when you create an instance with ‘c1 = conClass()’, and you can verify that the destructor
is called when ‘c1 = 0’ rebinds the name so that the object is no longer referenced.
The destructor is also called when you erase the object from memory using the ‘del(c1)’ command.
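A sketch modeled on the conClass example (the class name spelling here is an assumption; CPython runs `__del__` immediately when the last reference disappears):

```python
# __init__ runs when the instance is created; __del__ runs when the last
# reference to the instance disappears (rebinding or del).
class ConClass:
    def __init__(self):
        print("Constructor Called")

    def __del__(self):
        print("Destructor Called")

c1 = ConClass()  # prints "Constructor Called"
c1 = 0           # the instance loses its last reference: "Destructor Called"

c2 = ConClass()  # prints "Constructor Called"
del c2           # explicit deletion also triggers __del__
```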
Generally when constructors and destructors have to define the object’s initialization process
(for example, when exchanging emails using the NAO robot), they are used to deliver an email address
or IP address to the object.
Also, ‘staticmethod’ is used to explicitly declare that ‘print_hello’ in the ‘static_print’ class is a static method.
When calling a static method, you can use ‘class name.static method’, and you can call it without having
to create an instance.
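A sketch of the static-method declaration (the class and method names follow the text; the returned string is illustrative):

```python
# 'print_hello' is declared a static method, so it can be called as
# 'class name.static method' without creating an instance.
class static_print:
    @staticmethod
    def print_hello():
        return "hello"  # illustrative return value

print(static_print.print_hello())  # hello - no instance was created
```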
Operator overloading refers to assigning new roles for the operators (+, -, *, /) to the classes you define.
In a class for numbers, + can be the operator for addition, but in a class for strings, + can be defined
as concatenating text. Operator overloading is used here, and Python provides predefined methods for it (Graph 3.3).
Graph 3.3 - Predefined methods

Method             Operator   Example
__pos__(self)      +          +A
__neg__(self)      -          -A
__invert__(self)   ~          ~A
A lot of other methods are also supported, but the graph above displays the most commonly used methods.
To see more, please refer to the Python reference.
The most typical example of operator overloading is the addition of strings (Image 3.41).
TString is used to define a string-processing class so the + operator can concatenate strings.
image 3.41 - String additions for operator overloading (ex31.py)
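A TString-style sketch (the book's ex31.py may differ in detail):

```python
# Defining __add__ gives the + operator a new role: concatenating two
# TString objects.
class TString:
    def __init__(self, text):
        self.text = text

    def __add__(self, other):      # called for: self + other
        return TString(self.text + other.text)

a = TString("NAO")
b = TString(" robot")
c = a + b                          # equivalent to a.__add__(b)
print(c.text)                      # NAO robot
```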
3.6.6 Inheritance
Inheritance is one of the most important techniques in object orientation. Inheritance refers to
passing all the parent class properties down to the child class. Using class inheritance
prevents writing identical code in each class, and you can increase the consistency of the code by
letting child classes inherit the properties common to them from the parent class. Easy maintenance is
another advantage. Additionally, if you call a child class through the interface defined by the parent class,
you can access the specialized functions of each class by using the common interface.
image 3.42 below is an example of the inheritance relationship. It is first divided by the number of legs
a robot has; it can largely be divided into ‘quadruped walking’ and ‘biped walking.’ The Aibo and Bioloid
are quadruped robots while Hubo and NAO are biped robots. Although Aibo and Bioloid can be referred
to as quadruped robots, they can just be considered as robots.
This is because classes in the sub-layer subsume the characteristics of those in the upper layer.
More specific differences are recorded in the sub-layer, while common features, rather than specific
differences, exist as variables in the upper layer. Object-oriented programming implements
the parent-child inheritance relationship by using these hierarchical elements, starting from
the most abstract part.
image 3.42 - Inheritance hierarchy: Robot is divided into Quadruped Walking (Aibo, Bioloid) and Biped Walking (Hubo, NAO)
image 3.43 shows you how you can represent the inheritance relationship as a code.
The robot has ‘move’ and ‘operating’ methods, and each sub-layer of the biped walking robot has a ‘leg’
variable for each of their legs.
Here, the inheritance relationship is implemented by defining ‘class ChildClass(ParentClass):’.
If the inheritance relationship is used this way, instances created from Biped_Robot have both
the ‘move’ and ‘operating’ methods of the parent class. Although the parent’s variables are not automatically
initialized, you can resolve this by using Biped_Robot.__init__(self, leg) to call the constructor of Biped_Robot.
The ‘issubclass’ method can be used to verify the relationship between the child class and parent class
(image 3.44). It will return ‘True’ only when there is a parent-child relationship. If not, ‘False’ will be returned.
image 3.44 - The ‘issubclass’ method for verifying the relationship between child class and parent class (ex32.py)
To add the Hubo robot instead of NAO, create the code shown in image 3.45.
>>> class Hubo(Biped_Robot):
...     def __init__(self, leg):
...         self.company = "Kaist"
...         self.os = "nothing, 16bit microprocessor"
...         Biped_Robot.__init__(self, leg)
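The hierarchy and the issubclass check can be sketched together as follows (the class names follow the text; the method return strings and the leg count are illustrative):

```python
# Robot defines the common interface; Biped_Robot adds the 'leg' variable;
# NAO specializes further and calls the parent constructor explicitly.
class Robot:
    def move(self):
        return "moving"        # illustrative behavior

    def operating(self):
        return "operating"

class Biped_Robot(Robot):
    def __init__(self, leg):
        self.leg = leg

class NAO(Biped_Robot):
    def __init__(self, leg):
        self.company = "Aldebaran"
        Biped_Robot.__init__(self, leg)

nao = NAO(2)
print(nao.move(), nao.leg)           # inherited method, own variable
print(issubclass(NAO, Biped_Robot))  # True
print(issubclass(NAO, Robot))        # True: inheritance is transitive
print(issubclass(Robot, NAO))        # False: not a parent-child relation
```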
Much like NAO, the ‘move’ or ‘operating’ methods can be called on Hubo. Here, if you would like NAO
and Hubo to have different movements, you can use a technique called method overriding.
Method overriding behaves like operator overloading in that you can redefine some parts
of the parent class so they execute a different function in the child class.
To define new ‘move’ and ‘operating’ methods by overriding them in Hubo and NAO, employ
the same syntax used for defining the existing methods. So, if method overriding and inheritance are used
together, the parent class can define the interface, and this enables programming through the subclasses.
There are other functions used in object-oriented programming, but please refer to another book for more
detailed information.
3.7 Module
A module is a collection of codes with specific functions and is reused in many different places. We previously
used ‘import math’ and ‘math.sqrt’ in 3.5.1 to provide a brief introduction of modules while explaining classes.
Here, ‘math’ is the module used, and it contains all the functions used for various tasks related to mathematics.
The ‘sqrt’ function used in 3.5.1 obtains the square root, but the module contains many other functions,
including log, sin, and cos. To verify what functions exist, you can use the ‘dir’ function (image 3.47).
By default, Python provides approximately 200 modules, and these can be combined to easily create code
with the functions you want.
image 3.47 - Using the ‘Math’ module and ‘dir’ function to verify the function within the module (ex33.py)
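A minimal sketch of the inspection described above:

```python
# Using the 'math' module and the 'dir' function to see what it contains.
import math

print(math.sqrt(16))   # 4.0
names = dir(math)      # the names defined inside the module
print("sin" in names and "cos" in names and "log" in names)  # True
```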
In addition to functions, you can also use constants defined under specific names. The most typical example of
a constant is ‘pi’. If ‘math.pi’ or ‘math.e’ is executed, you can use the predefined value of pi and the natural
constant e (image 3.48).
image 3.48 - Math module’s pi value and natural constant value (ex34.py)
In addition, there are many other useful modules, like ‘random’ for creating random numbers
and ‘time’ and ‘datetime’ for calculating or managing dates and times (image 3.49).
When you call a certain function of a module, you can use just the function name rather than the
module name.function name form. You can make the declaration as shown below:
You can call the random function of a random module using the method shown in image 3.50 below.
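This form can be sketched as follows (‘uniform’ here stands in for whichever function you need):

```python
# The 'from ... import ...' form lets you call a module's function
# without the 'module name.' prefix.
from random import uniform

value = uniform(0.0, 1.0)   # no 'random.' prefix required
print(0.0 <= value <= 1.0)  # True
```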
First, generate a calculate.py file that contains the following content by using Notepad, a text editor like
Wordpad, or IDLE's file editing functions (File -> New Window in IDLE). The function names add, sub, mul,
and div here are illustrative:

def add(a, b):
    return a+b

def sub(a, b):
    return a-b

def mul(a, b):
    return a*b

def div(a, b):
    return a/b
Save this inside the Lib folder where Python was installed. Then, read and call the module from Python
as shown in image 3.51.
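The import step can be sketched as follows, assuming calculate.py defines functions named add and sub (illustrative names); here the file is written from Python only so the example is self-contained, whereas normally you would create it with a text editor as described above:

```python
# Write an illustrative calculate.py, then import it and call its functions.
import os
import sys

with open("calculate.py", "w") as f:
    f.write("def add(a, b):\n    return a + b\n")
    f.write("def sub(a, b):\n    return a - b\n")

sys.path.insert(0, os.getcwd())  # make sure the current folder is searched

import calculate
print(calculate.add(3, 4))  # 7
print(calculate.sub(9, 4))  # 5
```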
If you want to save it using a different path instead of saving it inside Python’s Lib folder, you must adjust
the variable value of the system environment. First, for Windows, you can either go to Control Panel
- System and Security - System or click the right mouse button on My Computer to access Advanced
System Settings. (image 3.52).
The System Properties window will appear if you click Advanced System Settings; you can click
the environment variable here (image 3.53).
If you edit the system variables and register the folder where the module is located, Python will search
that folder, in addition to the default folders, when the module is imported to verify whether the module
actually exists.
As shown in image 3.54, click Create New from the Environment Variables window to set the variable name
to PYTHONPATH and set a folder with a module as the variable value (for example, c:\modules).
Image 3.53 - System Properties window
Image 3.54 - Setting the user module folder
For Linux, you have to add the syntax shown below to the shell file (bash_profile for the commonly used bash
shell).
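A sketch of the line to add (the folder /home/user/modules is an illustrative path):

```shell
# Append the module folder to PYTHONPATH in ~/.bash_profile
# (/home/user/modules is an illustrative path).
export PYTHONPATH=$PYTHONPATH:/home/user/modules
```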
3.8 Comprehensive Practice Through Choregraphe Script Modification
Choregraphe box is composed mostly of Python scripts, so you can edit box movements with just a little bit of
knowledge regarding Python. Also, if you operate the NAOqi framework using Python, you can use Python to
program instead of using the Choregraphe box. Chapter 4 will have more information on this. Here, we will
discuss how to employ Python scripts used by the Choregraphe box and observe some different examples.
Random Eyes box continuously changes the eye colors into random colors. If you execute the corresponding
box, you will see that the LED colors of NAO’s eyes are constantly changing. This box uses the Random
module provided by Python, and the user can edit this to change the eye colors.
First, when you click the right button to select the Edit Box Script after dragging the Random Eyes box
to Choregraphe’s task window, a script editor window will appear as shown in Image 3.55.
It is the same kind of source code we have seen thus far in IDLE, and you will see that the corresponding box script
is defined as MyClass, which inherits GeneratedClass. ‘import random’ is declared to use the random
module. You can see that the random module is used in rRandTime = random.uniform(0.0, 2.0) in the
onInput_onStart(self) method (Image 3.56).
The ‘onInput_onStart’ method shown in Image 3.56 is the method that runs when the Random Eyes box
is executed, and the block of code right after ‘while True:’ is the code that makes the random LED color
changes.
If you want to create code that changes the eye color at a specific interval, change the random.uniform(0.0, 3.0)
in time.sleep(random.uniform(0.0, 3.0)) to a fixed number.
The random.uniform method produces a random value with uniform distribution, and (0.0, 3.0) means
a real number between 0.0 and 3.0 is produced. Therefore, if you change it to time.sleep(1.0) and execute
the program, you will see that the color of the eyes changes every second.
To edit the code so the eyes light with a particular color, you have to edit the value
256*random.randint(0,255) + 256*256*random.randint(0,255) + random.randint(0,255) in ALLeds.
fadeRGB("FaceLeds", 256*random.randint(0,255) + 256*256*random.randint(0,255) + random.randint(0,255),
rRandTime).
The fadeRGB method processes the RGB value in the 256*256*R + 256*G + B form, so you can light
the LEDs with a color of your choice when you input the RGB value (please refer to the reference for more
details).
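The packing can be sketched as follows (assuming the 0x00RRGGBB layout, with red in the highest byte; `pack_rgb` is a helper name introduced here for illustration):

```python
# Packs an (R, G, B) triple into the single integer fadeRGB expects:
# red in the highest byte, then green, then blue.
def pack_rgb(r, g, b):
    return 256 * 256 * r + 256 * g + b

print(hex(pack_rgb(255, 0, 0)))  # 0xff0000 -> pure red
print(hex(pack_rgb(0, 255, 0)))  # 0xff00   -> pure green
print(hex(pack_rgb(0, 0, 255)))  # 0xff     -> pure blue
```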
3.8.2 Using Python to Create New Choregraphe Boxes
Until now, default boxes were used to program through Choregraphe. Most functions executed using NAO
are provided as Choregraphe boxes. For functions that aren’t provided and functions you would like to edit,
you can edit the existing box as previously explained. However, if you edit an existing box for all the functions,
you will run into problems if the tasks are used repeatedly in several different places.
Here, we will explain how Choregraphe functions and Python are used to create boxes. For our example, Python
will be used to create an adder and Choregraphe will be used to register a new box.
First, choose to add a new box as shown in Image 3.57. When you press the right mouse button inside
the Choregraphe task window, ‘Add a new box’ will appear. As shown in Image 3.58, when you select the
corresponding menu, a box will appear to let you add a new box.
Image 3.57 - Adding a new box
Image 3.58 - ‘Create a new box’ window
The ‘Name’ above creates a name for the box the user is trying to create. Here, we will enter ‘Adder’ as the
name. ‘Tooltip’ shows a simple explanation regarding the corresponding box.
We will simply enter “the two numbers received will be added.”
Then, press ‘-’ to delete the values for onStart, onStop, and onStopped which were registered with default
values for Inputs and Outputs. Image 3.59 shows how this will look. Ignore and move on when you get
a message telling you that a default image will be used because there is no bitmap image available.
Image 3.59 - Adder box settings and the Adder box in Choregraphe
Add the Adder box we are creating to the library. However, if you try to add it to the default library (default),
you will get a message stating that it cannot be added because it is a read-only library. This is why you
should create a new library titled MyLibrary. You can add this as shown in Image 3.60 below.
As shown in Image 3.61 you can add the Adder box to MyLibrary.
Although it now has an appearance, the new Adder box does not have any internal functions yet. The Adder
box will receive two external inputs, add them, and send the result out as an output. Let's now set the input
variables in order to receive the two inputs. First, select the Edit Box menu as shown in Image 3.62 to edit
the Adder box. Once this is chosen, a menu identical to the one shown in Image 3.59 will appear.
Image 3.62 - Edit Adder box
Image 3.63 - Button for adding input variables
Press the ‘+’ button to open the window to edit the input variables as shown in Image 3.63.
As shown in Image 3.64 below, enter ‘A’ as the Name. Adjust ‘Type’ using ‘Dynamic’ and use ‘onEvent’
for ‘Nature.’ There are four kinds of ‘Type’ including ‘Bang,’ ‘Number,’ ‘String,’ and ‘Dynamic,’ and four types
of ‘Nature’ including ‘onEvent,’ ‘onStart,’ ‘onStop,’ and ‘ALMemory.’
Each function is shown in Graph 3.4. Both ‘onEvent’ and ‘onStart’ play substantially similar roles.
Bang: Does not deliver anything other than the start signal for the input.
When you have finished adding variable A, add variable B with the same parameters as variable A.
A and B are two variables for input (Image 3.65).
Now, add variable R in Outputs as shown below. Variable R is used to output the result of variables A and B
added. Although the output variable window looks identical to the input window above, unlike the ‘Nature’
input, there are two elements including ‘punctual’ and ‘onStopped.’ Regardless of time, ‘punctual’ immediately
sends the output as soon as the corresponding box is processed. Although ‘onStopped’ behaves similarly
as ‘punctual’ and immediately sends the output as soon as the box is processed, the difference is that it sends
the result only after processing the lower level boxes.
Processes 1) and 2) above were used to add input variables A and B and output variable R in the Adder box.
The Python code will be edited here to add input variables A and B to output the result as variable R.
First, select ‘Edit box script’ to open the script editor.
The script editor will then appear as shown in Image 3.66. The structure is similar
to the Random Eyes box. First, there is the __init__(self) constructor and additional onLoad, onUnload,
onInput_A, and onInput_B methods. We will edit __init__, onInput_A, and onInput_B methods as well as
adding an extra method for calculating additions.
First, edit the code as shown below. When entering the value of variable R in the process method (Line 17),
unlike with A and B, be cautious of the part that calls R like a constructor.
1. class MyClass(GeneratedClass):
2.     def __init__(self):
3.         GeneratedClass.__init__(self)
4.         self.bA = False
5.         self.bB = False
6.     def onInput_A(self, p):
7.         # self.R() would activate the output of the box
8.         self.bA = True
9.         self.A = p
10.        if self.bA and self.bB:
11.            self.process()
12.    def onInput_B(self, p):
13.        self.bB = True
14.        self.B = p
15.        if self.bA and self.bB:
16.            self.process()
17.    def process(self):
18.        self.R(self.A + self.B)
19.        self.bA = False
20.        self.bB = False
To explain the variables: the class uses the A, B, bA, bB, and R instance variables.
Here, A and B are temporary storage spaces that each receive data from the A and B inputs. The bA
and bB variables are used to record whether the A and B inputs have arrived. R is the
output value.
Constructor (Line 2-5) explicitly calls the previous parent class constructor and initializes variables bA
and bB as False. Variables bA and bB are used to save whether or not the corresponding values have been
properly delivered when A and B of the Adder box receives an input.
When input is delivered to A or B, the onInput_A (Line 6-11) or onInput_B (Line 12-16) method is
executed. The ‘self’ in onInput_A(self, p): refers to the instance itself, and ‘p’ is the input signal delivered
from the outside. If input is delivered to A, the onInput_A method is called, so the input delivered to A is
available in the variable ‘p’.
The onInput_A method has a section that saves the value ‘p’ in its internal A variable (Line 9)
and a section, bA = True, that records that the value has been received (Line 8). Additionally, if the check
‘if self.bA and self.bB:’ finds that both values have arrived, the process for adding A and B is called.
Both methods contain this check because onInput_A and onInput_B can each be executed before
the other.
This part exists for both onInput_A and onInput_B methods because, if Choregraphe is executed,
all the boxes can be executed in parallel and input/output can also be simultaneously executed.
The ‘process’ method (Line 17-20) takes the A and B values, adds them, and delivers the result to output R.
It also resets the bA and bB variables to False so the addition is not executed again until new inputs
arrive. Here, R is called like a constructor to create the value instead of using assignment. When Choregraphe
sends the next output of data, you have to create a new instance to make a proper delivery of the changed
status of the value.
3.9 Reference
4 NAOqi & DCM
LEARNING
Chapter 4 explains the NAOqi framework
which forms the foundation of the NAO robot
and the DCM used for controlling all the devices.
It covers special characteristics including the NAOqi framework structure, the file structure,
and the Broker, as well as how the NAOqi framework is used to control NAO.
content
4.1 NAOqi Overview 136
4.7 Low Level Architecture 168
4.1 NAOqi Overview
OROCOS (Open Robot Control Software), YARP (Yet Another Robot Platform), and Urbi (Universal Real-Time
Behavior Interface) are some of the currently widely known frameworks. Since the release of Aibo, many
research institutes have developed their own frameworks.
NAOqi is a framework developed by Aldebaran Robotics specifically for the NAO robot, and it
includes elements like parallel processing, resource management, synchronization, and event processing
that are generally required in robotics. Although NAOqi is configured with general layers similar to other
frameworks, these layers are created and processed on NAO itself, and this method is well suited to controlling
the robot. NAOqi also enables information sharing and programming through ALMemory, and communication
between modules such as motion, audio, and video that serve different roles.
NAOqi is an SDK created in C++. It includes functions like simulation execution, calling Python, Urbi, or C++
from Choregraphe, calling C++ functions from Python, and programming, simulation, and control functions.
Broker: A broker is a program that receives and executes commands from specific IP addresses and ports. NAOqi ($AL_DIR/bin) is called the “main broker.” Audioout (TextToSpeech) is an independent broker connected to NAOqi.
Module (a specialized class for ALModule): A module is a class that includes functions for robot motions (including motion, TextToSpeech, LEDs, etc.). A library called from $AL_DIR/modules/lib/autoload.ini is also called a module. When a library is called from NAOqi, objects of the module are systematically instantiated. Modules are always linked to brokers.
Proxy: A proxy is used to access a module. In order to call a method from a module, you must create a proxy for the module.
CMake: CMake creates the appropriate project for the desired OS (OSX, Linux, Win32) and IDE (Visual Studio, Eclipse, etc.). NAOqi requires CMake version 2.6 or higher.
Extractor: Converts NAO's sensor values into data that can be used by NAO's memory.
ALMemory: NAO's memory can be accessed by all modules, remote modules, remote tools, and other NAOs.
Smart pointer: A pointer where memory removal and deletion occur automatically.
This chapter will explain the theory about the NAOqi structure. First, we will define the components
of distribution and the role of the module and explain some of the ways they interact with one another.
Image 4.1 below shows NAOqi’s framework structure.
NAOqi Framework
The NAOqi framework works by having Choregraphe, Monitor, the Motion module, and the Audio module pass
information to each other. NAOqi is executed by having the Broker deliver information and commands.
All the elements in Image 4.1 operate together to execute a variety of movements. The following explains
the different elements that make up the NAOqi framework.
• Module: Module is both a class and library that uses the function and API defined in ALModule
to obtain information or control regarding each module.
• Communication: Communication uses Local Procedure Call (LPC) or Remote Procedure Call
(RPC) to connect to NAO and exchange information.
• Introspection: Introspection is the framework's default capability for monitoring the robot's API,
memory usage, and motions. The robot knows every API function that is available, and unloading
a library automatically removes its associated API functions. Functions defined in a module can
be added to the API using BIND_METHOD, which is defined in almodule.h.
• Python interpreter: It is an interpreter used for interpreting and processing Python commands
in NAOqi.
• Python wrapper: Python wrapper allows you to use functions with the same name in both C++
and Python.
• Proxy: All Aldebaran modules have been modularized. Rather than directly referencing other
module files, you can ask the Proxy to find the corresponding module; if the module doesn't
exist, an exception is thrown. The user can call the corresponding function or module through
a Proxy from two independent brokers: mainBroker (local call) and myBroker (remote call).
• ALValue: To remain compatible, data exchanged with NAOqi modules and methods is stored
in a generic type, ALValue.
• Logger: ALLogger lets you inspect software information and logs, either over the network (e.g. via SSH) or on the robot itself.
• Exception: All NAOqi errors are reported as exceptions. User commands should be encapsulated
in try-catch blocks so that exceptions can be processed.
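A minimal sketch of this pattern in Python. Since the NAOqi SDK may not be installed on the machine running this, call_robot is a hypothetical stand-in for a real proxy call such as ALProxy("ALTextToSpeech", IP, PORT).say(...):

```python
# Sketch of the recommended pattern: every NAOqi call is wrapped in a
# try/except block, since all NAOqi errors surface as exceptions.
# call_robot is a stand-in for a real ALProxy method call.

def call_robot(command):
    # A real proxy call raises RuntimeError on failure, e.g. when the
    # module or method cannot be found.
    if command != "say":
        raise RuntimeError("ALProxy: method not found")
    return "ok"

def safe_call(command):
    try:
        return call_robot(command)
    except RuntimeError as err:
        # Log and recover instead of crashing the application.
        return "error: %s" % err

print(safe_call("say"))    # -> ok
print(safe_call("jump"))   # -> error: ALProxy: method not found
```

On the robot, the same try/except wrapper would surround the ALProxy construction and each method call.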
• Thread Pool: NAOqi and the NAOqi modules are written so that their threads do not interfere
with one another, but modules built with the module generator may be called from several
threads. Since user functions can be called in parallel, you must implement them so there is no
interference between threads; a critical section can be used to protect such a function.
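The idea of protecting a user function with a critical section can be sketched in plain Python; threading.Lock stands in for the pthread-based critical section that NAOqi's C++ headers provide:

```python
import threading

# Sketch: user functions may be called in parallel by the NAOqi thread
# pool, so shared state must be protected by a critical section.
# counter and _lock are illustrative, not part of the NAOqi API.

_lock = threading.Lock()
counter = 0

def on_event():
    global counter
    with _lock:          # critical section: one thread at a time
        counter += 1

threads = [threading.Thread(target=on_event) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # -> 8
```

Without the lock, concurrent increments could interleave and lose updates; the critical section serializes them.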
• Smart pointer: Smart pointer is a class that helps with dynamic memory management.
Even though the user does not have to use the smart pointer, all methods of the framework
return the value through the smart pointer. Class structure is not private, so the user can create
a proxy that uses general pointers. (You must include corresponding directives like ‘new’ and ‘delete.’
• Tinyxml - This is a library used to manage XML config files. For detailed information please refer to
http://www.grinninglizard.com/tinyxml/
• Libcore - A library with basic functions like type, smart pointer, and error.
• libfactory - This is a library that instantiates classes according to their names, so several
developers can create components concurrently. (Please refer to the factory design pattern.)
• libsoap - This is a library for web service provision and usage. (gsoap 2.7.12)
• rttools - This is a definition header for the real-time communication management tool between devices.
• libvision - This is a library with image processing functions and image and screen definitions.
alcriticaltrueiflocked.h - Encapsulation of a pthread critical section. NAOqi places no interference constraints between threads; if necessary, client applications must manage multi-threading themselves. The mutex does not block threads that were not created by the critical section.
alcriticalsection.h - Encapsulation of a pthread critical section. Only one process at a time can enter the critical section it creates.
altask.h - The thread pool executes tasks. A task is created via altask and can be added to the thread-pool queue.
• alcommon - This is a header that defines general NAOqi elements like module, proxy, and broker.
albroker.h - All executable modules create at least one broker in main.cpp. The broker waits for HTTP requests or remote C++ requests from PC applications.
alproxy.h - Enables proxy creation within a module and calls the bound method. If the method is located in the same executable, the proxy chooses the quicker local call; if it is in another executable, it chooses the slower remote call.
alsharedlibrary.h - Manages dynamically loaded libraries (libraries that load at a time other than program start).
alsingleton.h - Singleton design pattern; ensures there is only one instance of a specified class.
altaskmonitor.h - Task monitor used to see whether a task is being executed, has ended, or is waiting to be closed.
alvalue.h - Definition header for ALValue. ALValue is implemented using unions.
• iNAOqi - Enables C++ functions to be used from Python through the Python wrapper.
4.2.2 Broker
#shell example
./bin/naoqi -b 127.0.0.1 -p 9559 #listen on ip 127.0.0.1 and port 9559
The module generator creates the project for the user. It generates a library for connecting executable
files with the robot and for connecting with the main broker (the library must be added to the autoload.ini setup file).
The module generator manages connections and locations, allowing the user to focus on implementing the desired functions.
(Please refer to the reference [advanced/SDK] for more detailed information regarding module generators.)
MyModule is executed in a remote broker called myBroker which is communicated through the main broker
and IP 127.0.0.1:9559.
#shell command
# IP 127.0.0.1 and port 9559 by default
./bin/naoqi -b 0.0.0.0
A web browser can be used to see the API description of all brokers (http://127.0.0.1:9559
in Image 4.2). In Image 4.2, MyBroker is connected to the main broker. Use the -pip option to
connect to other brokers.
#shell command
./modules/bin/myBroker -pip 127.0.0.1 # connect to mainBroker 127.0.0.1
Modules are represented by green circles. They can be loaded at runtime by the module launcher
or from the setup file autoload.ini ($AL_DIR/modules/lib).
#autoload.ini sample
[core] # required files
albase
Broker is both an executable object and server. This server receives remote commands
and ALProxy allows the connection for sending commands to the broker.
// C++ sample
ALProxy p = ALProxy("module name", parentIP, parentPort);
p.info("display it on remote broker parentIP and parentPort");
Modules can access the local broker through the getParentBroker function.
// C++ sample
getParentBroker()->getIP(); // Returns the parent broker's IP address.
bool keepAlive; - Automatically closes the broker if the parent broker closes.
int getModuleList(TALModuleInfoVector &pModuleList); - Retrieves the local module list.
The following programs are needed in order to use NAOqi to create programs:
• In Windows, naoqi.exe (as well as additional user-generated codes) requires several DLL files
so a path must be set (%PATH%).
A - Right-click on the Desktop and select "Properties" from the menu (Image 4.3).
B - Click "Advanced System Settings" on the left side; the System Properties window will appear
(Image 4.4). Then click "Environment Variables" in the System Properties window.
C - If you select 'Path' and click 'Edit' in the Environment Variables window, the Edit System Variable
window will appear as shown in Image 4.5. You must add the Python and SDK paths to 'Path' under System
Variables. Each path is separated with a semicolon (;).
D - To use the SDK library in Python, create PYTHONPATH as shown in Image 4.6 and register the path
as shown below:
All environment variables must be set within the script, which can be found in the SDK root. The user must
always use this script to execute NAOqi (the naoqi-bin executable does not support direct execution). Also,
in Windows, user-generated executable files must be placed inside the bin/ directory.
B - For the "Where to build the binaries" field, choose a temporary build folder inside the example's
sub-folder. If the folder does not exist, create a new one (e.g. "/path/to/aldebaran-sdk/modules/src/examples/
helloworld/build").
C - Click the "Configure" button. Depending on the operating system and IDE being used, select the IDE
to be used (select "Visual Studio 8 2005" or "Visual Studio 9 2008" for Windows and "UNIX Makefiles"
for Linux or Mac operating systems).
E - Click the "Configure" button one more time if any configuration field turns red. The configuration
is correct when the background of all fields turns gray.
G - For Windows, a .sln file will be created inside your build directory; you can open it through the IDE.
H - For Linux or Mac, compile the sample project by entering "make" inside the build directory.
I - For the SDK folder address, select the folder where you installed it.
#ALProxy reference
from naoqi import ALProxy
#Set IP and port
IP = "nao.local"
PORT = 9559
#loggerProxy generation
loggerProxy = ALProxy("ALLogger", IP, PORT)
#Deliver command
loggerProxy.info("Python", "it works")
The following code makes the variable named "myValueName", with "0" as its value, available to other
modules. insertData is a function that records a data name and value in ALMemory. The reference
(NAOqi API) explains the functions that can be used in NAOqi.
#include "almemoryproxy.h"
void myModule::init(void)
{
    getParentBroker()->getMemoryProxy()->insertData("myValueName", 0);
}
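The same call can be made from Python. Because the example below must run anywhere, a toy class stands in for ALMemory; on the robot you would create ALProxy("ALMemory", IP, PORT) instead:

```python
# Toy stand-in for ALMemory to illustrate insertData/getData semantics;
# on the robot these calls go through ALProxy("ALMemory", IP, PORT).

class FakeALMemory(object):
    def __init__(self):
        self._store = {}

    def insertData(self, name, value):
        # Record a name/value pair; other modules can now read it.
        self._store[name] = value

    def getData(self, name):
        return self._store[name]

mem = FakeALMemory()
mem.insertData("myValueName", 0)
print(mem.getData("myValueName"))  # -> 0
```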
A Python script can be run on your PC to call the robot's functions; alternatively, you can use the robot's
embedded interpreter for quicker execution. If the user doesn't want a function call to block while being
executed, parallel calls can be used. The following code uses parallel calls to speak the given text through NAO:
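Since the parallel-call code appears only in the figure, here is a hedged sketch. In the NAOqi Python SDK a parallel call goes through the proxy's post object (e.g. tts.post.say("Hello")), which returns immediately with a task id; below, a background thread stands in for the robot-side execution so the sketch can run anywhere:

```python
# Sketch of a non-blocking ("parallel") call. On the robot this would be:
#   tts = ALProxy("ALTextToSpeech", IP, PORT)
#   task_id = tts.post.say("Hello")   # returns immediately
#   tts.wait(task_id, 0)              # optionally wait for completion
# Here a background thread stands in for the robot-side execution.

import threading
import time

results = []

def say(text):
    time.sleep(0.1)      # pretend the robot is speaking
    results.append(text)

task = threading.Thread(target=say, args=("Hello",))
task.start()             # returns immediately, like proxy.post.say(...)
# ... the caller is free to do other work here ...
task.join()              # like proxy.wait(task_id, 0)
print(results)  # -> ['Hello']
```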
Use CMake (a Linux makefile or a Microsoft Visual Studio .sln file) when creating and compiling a project.
To load the user library into NAOqi, execute NAOqi --load myModule: myModule.so is then loaded
and its initialization method is called automatically.
To directly load the new module to NAO, you can either cross compile the modules using C++ to generate
the module or use Python to create the modules. When using C++ to create the modules, because the NAO
Robot is configured based on embedded Linux, you must compile the modules to fit the Geode in Linux. This
is because an error may occur when different CPU and operating systems are used since machine language
and commands are interpreted using different methods. NAO robot provides a separate cross compiler
to resolve this issue.
Python is interpreted, and these errors do not occur because a Python interpreter
is already loaded inside NAO. Although it is easier to use Python to create modules, C++ may be more
suitable for high-speed processing like image processing.
This chapter uses a simple example to explain the tools needed for loading modules into NAO, how to use
C++ to execute the cross compiler in Linux, and how to load the actual modules. Linux used in this book is the
Ubuntu 10.04 LTS Lucid Lynx version, and some commands may not be compatible if a different version is used.
At the time this book was written, it was being distributed as ctc-1.8.16.tar.bz2 based on Version 1.8.16.
The corresponding file is decompressed into the /home/useraccountname/ directory. You can use the following
command in the terminal to decompress it. Image 4.9 shows the location of the terminal in Ubuntu Linux.

#shell command
tar xjf ctc-1.8.16.tar.bz2
When decompressed, the tools needed for cross-compiling are saved in the /home/useraccountname/
Downloads/ctc-1.8.16-linux32 folder, and you are now ready to cross-compile. Check the SDK and cross-
compiler through the ls –al command in the terminal, and go on to the next step if both of them are there
(Image 4.10).
Image 4.10 - Check SDK and cross-compiler directory using the ls -al command
The CMakeLists.txt file lists what CMake should compile automatically, and the bootstrap.cmake
file records the exceptions that occur while using CMake. The source code files that actually
get compiled are alhelloworld.h, alhelloworld.cpp, and helloworldmain.cpp.
If you execute CMake-GUI in the terminal window, a screen will appear as shown in Image 4.12.
If CMake-GUI does not get executed here, it is because the CMake-related files are not installed. If
Ubuntu Linux is used and is connected to the internet, you can use the following command for the install:

#shell command
sudo apt-get install cmake
The 'apt-get' command is responsible for adding and removing programs in Ubuntu Linux. The 'sudo'
command is used to obtain temporary authorization to add/remove programs. You will be asked for
a password when you run the command above, and CMake will be installed once the correct password
is entered (Image 4.13). If you get an error that CMake cannot be found, you can still install it
using Ubuntu's System/Administration/Synaptic Package Manager menu.
You can use CMake-GUI if you install it using the command shown below:
Once the installation is complete, perform the same task used to set up the C++ project using CMake in
Windows. First, add a new directory named 'build' in the helloworld directory.
You can create it with the 'mkdir build' command, then move into that directory
and execute CMake-GUI.
/home/useraccount/aldebaran-cpp-sdk-1.8.16-linux-i386/modules/src/examples/helloworld/build
After the input, press the Configure button (Image 4.14). Select Unix Makefiles for 'Specify the
generator for this project', choose 'Specify toolchain file for cross-compiling' from the options, and press
Next (Image 4.15). Then, for 'Specify the Toolchain file', enter the toolchain-geode.cmake file previously
decompressed into the ctc-1.8.16-linux32 folder (Image 4.16).
Image 4.17 appears when you finish the process above. If there is a checkmark next to HELLOWORLD_IS_
REMOTE, remove the check and then click the Configure button. If HELLOWORLD_IS_REMOTE is checked,
the module is configured to be loaded in NAO for remote execution only.
If there are no other issues, the Generate button (located to the right of the Configure button) will be
activated. Click it, and a 'Generating done' message will appear as shown in Image 4.18 to let you know that
the project file has been created correctly; you can then close CMake.
If the process above has been completed successfully, new files will be created in the helloworld/build
directory (Image 4.19). These files contain the information needed to compile the project and generate
the executable, and running the build is very simple.
The 'make' command has to be executed from the terminal. First move to the previously specified
directory (/home/useraccount/aldebaran-cpp-sdk-1.8.16-linux-i386/modules/src/examples/helloworld/build),
then execute the 'make' command there.
If you go through the process thus far, the cross-compiling process for creating the module for NAO using
C++ will be complete. The file created will be saved as /home/useraccount/aldebaran-cpp-sdk-1.8.16-
linux-i386/modules/src/examples/helloworld/build/sdk/lib/naoqi/libhelloworld.so. It is a file that has to be
uploaded into NAO later on, so you must remember this location.
This file has to be uploaded into NAO but you must change some of the information first. The autoload.ini file
information is as follows. (Some may vary depending on the version).
framemanager
pythonbridge
videoinput
behaviormanager
helloworld
#urbistarter
You have to add the 'helloworld' entry shown above. Meaning, the other modules are loaded first
and the 'helloworld' module is loaded into memory at the very end.
The basic task is over once you upload the revised autoload.ini file and the libhelloworld.so file
(the one we previously created through cross-compiling) to NAO.
Section 4.4.2 (Image 4.16) showed where to prevent HELLOWORLD_IS_REMOTE from being checked.
If it is checked, you have to add a [remote] line right above 'helloworld' for it to operate
normally. A module registered under [remote] can then be executed remotely.
…
[remote]
helloworld
…
Now, upload the new module inside NAO. After connecting with NAO using Choregraphe’s Connect,
click File Transfer from the Connection menu to transfer the file (Image 4.21). nao/nao is the default login
ID and password. There are several folders, but the naoqi folder is the main target for setup.
autoload.ini - /naoqi/preferences/
libhelloworld.so - /naoqi/lib/naoqi/
The two files must be uploaded to corresponding locations, but the /naoqi/lib/naoqi/ folder where you
have to upload the libhelloworld.so file is not created by default. This is why you must manually create the
corresponding folder and upload each of the two files to their respective locations.
To check whether the module was loaded properly after going through this process, you just have
to re-execute NAOqi inside NAO. You can execute this by connecting to NAO’s IP address. If NAO’s IP address
is 192.168.123.150, use the internet browser to connect to 192.168.123.150 and then enter the same ID
and password you used when the file was previously uploaded.
If you click NAOqi in the Advanced menu, it shows NAOqi's current status and provides a menu
to turn it off and back on (Image 4.22). After restarting with the Restart button, click the Log menu
in the Advanced menu to check whether the ALHelloWorld module was registered properly in memory.
If it was registered normally without an error, you will see the message shown in Image 4.23, and you are
now ready to execute the module.
Image 4.23 - Using the Log menu to check whether the module was registered
If you execute the commands above sequentially in Python, the helloWorld module is called;
an error will occur only if there is a problem.
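Since the commands themselves are shown only in the figure, here is a hedged Python sketch. The module name ALHelloWorld comes from the log check above, but the bound method name helloWorld is an assumption; use whatever name BIND_METHOD registered in your module:

```python
# Hedged sketch of calling the freshly loaded module from a PC.
# The method name helloWorld is an assumption, not confirmed by the text.

IP = "192.168.123.150"   # NAO's IP address from the example above
PORT = 9559

def call_hello_world(make_proxy):
    # make_proxy would be naoqi.ALProxy on a machine with the SDK:
    #   from naoqi import ALProxy
    #   call_hello_world(ALProxy)
    proxy = make_proxy("ALHelloWorld", IP, PORT)
    return proxy.helloWorld()
```

Passing the proxy factory in as an argument keeps the sketch testable without the SDK installed.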
To actually make use of this process, you must first implement functions, such as speaking
or moving the upper body, in the helloWorld module via NAOqi. Then cross-compile again, upload
the result to NAO, and use it by calling the module.
DCM (Device Communication Manager) is a part of the NAOqi system. It is a NAO software module
that manages the communication of all the devices (boards, sensors, actuators, etc.) excluding the cameras
and sound (Image 4.24).
The following two methods can be used for other modules to access the robot sensors and actuators
(Image 4.24).
First, in order to access the sensors, you must find the value inside ALMemory that has the name
of the auxiliary device. DCM automatically updates the sensor values inside ALMemory. Modules will only use
the newly updated sensor values.
Second, the following describes how to access the actuators. The modules send values to DCM
through "Timed Commands"; a module cannot directly change the actuator values inside ALMemory.
Instead, the request is handled by DCM, and the actuator value inside ALMemory is changed
by DCM itself.
The user can send one or more timed commands in order to deliver the same command to one or more
actuators. Time is measured in ms as a 4-byte integer. A module running on the robot's motherboard
can request the current time from DCM or read it directly.
DCM saves all Time Commands of each actuator. It then analyzes the DCM cycle of the next command based
on the current time and uses linear interpolation to calculate the appropriate command. Here, the previous
command is deleted right after it is used.
If no new command has arrived, the last command is maintained and will be sent again in the next
DCM cycle.
The interpolated command can be obtained using the equation below, which expresses the value
of F(x) between the points x0 and x1:

F(x) = F(x0) + (x - x0) * (F(x1) - F(x0)) / (x1 - x0)

This simple interpolation formula is referred to as either the 'proportional part' or 'linear interpolation.'
We will look at two examples that calculate the appropriate command using linear interpolation.
Let's first assume that the DCM cycle is 10ms (Image 4.26). Starting from Command 1, which takes effect
as the value 10 at t=20ms, linear interpolation is used to approach Command 2 at (80, 40), where each
pair is (time in ms, value). The calculation from t=30ms onward is:
• t = 30ms
DiffTime1 = 30 – 20 = 10 , DiffTime2 = 80 – 20 = 60
Command = (( 40 – 10 ) * 10) / 60 + 10 = 15
• t = 40ms
DiffTime1 = 40 – 30 = 10 , DiffTime2 = 80 – 30 = 50
Command = (( 40 – 15 ) * 10) / 50 + 15 = 20
• t = 50ms
DiffTime1 = 50 – 40 = 10 , DiffTime2 = 80 – 40 = 40
Command = (( 40 – 20 ) * 10) / 40 + 20 = 25
• t = 60ms
DiffTime1 = 60 – 50 = 10 , DiffTime2 = 80 – 50 = 30
Command = (( 40 – 25 ) * 10) / 30 + 25 = 30
• t = 70ms
DiffTime1 = 70 – 60 = 10 , DiffTime2 = 80 – 60 = 20
Command = (( 40 – 30 ) * 10) / 20 + 30 = 35
• t = 80ms
DiffTime1 = 80 – 70 = 10 , DiffTime2 = 80 – 70 = 10
Command = (( 40 – 35 ) * 10) / 10 + 35 = 40
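The cycle-by-cycle calculation above can be sketched in Python. The assumption, implied by the DiffTime values, is that each cycle interpolates from the previously computed value toward the next stored command:

```python
# Sketch of the per-cycle linear interpolation DCM applies between the
# previous value (prev_t, prev_v) and the next stored command (cmd_t, cmd_v):
#   Command = (cmd_v - prev_v) * (t - prev_t) / (cmd_t - prev_t) + prev_v

def dcm_step(t, prev_t, prev_v, cmd_t, cmd_v):
    diff1 = t - prev_t          # DiffTime1
    diff2 = cmd_t - prev_t      # DiffTime2
    return (cmd_v - prev_v) * diff1 / float(diff2) + prev_v

# Reproduce the first example: value 10 at t=20ms, target command (80, 40),
# 10 ms DCM cycle.
t, v = 20, 10.0
values = []
while t < 80:
    v = dcm_step(t + 10, t, v, 80, 40)
    t += 10
    values.append(v)

print(values)  # -> [15.0, 20.0, 25.0, 30.0, 35.0, 40.0]
```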
We will assume the DCM cycle is 10ms here as well (Image 4.27). Starting at t=10ms, linear interpolation
is used to calculate the appropriate command when DCM receives the four commands ((15, 10), (25, 30),
(45, 20), (65, 0)) shown in Image 4.27.
• t = 10ms
DiffTime1 = 10 – 0 = 10 , DiffTime2 = 15 – 0 = 15
Command = (( 10 – 0 ) * 10) / 15 + 0 = 6.66
• t = 20ms
DiffTime1 = 20 – 15 = 5 , DiffTime2 = 25 – 15 = 10
Command = (( 30 – 10 ) * 5) / 10 + 10 = 20
• t = 30ms
DiffTime1 = 30 – 25 = 5 , DiffTime2 = 45 – 25 = 20
Command = (( 20 – 30 ) * 5) / 20 + 30 = 27.5
• t = 40ms
DiffTime1 = 40 – 30 = 10 , DiffTime2 = 45 – 30 = 15
Command = (( 20 – 27.5 ) * 10) / 15 + 27.5 = 22.5
• t = 50ms
DiffTime1 = 50 – 45 = 5 , DiffTime2 = 65 – 45 = 20
Command = (( 0 – 20 ) * 5) / 20 + 20 = 15
• t = 60ms
DiffTime1 = 60 – 50 = 10 , DiffTime2 = 65 – 50 = 15
Command = (( 0 – 15 ) * 10) / 15 + 15 = 5
• t = 70ms
Command = 0
Auxiliary devices are mostly actuators and sensors controlled by the boards. An auxiliary device is defined
by its board, its device type, and its number. Each device has its own distinct name, which is
used to communicate with the upper level.
"Face/Led/Red/Right/0Deg/Actuator": This is one of the LED actuator names; it refers to the red LED
near the right eye at 0 degrees. Each LED has an important key called "Value",
a float from 0.0 (LED off) to 1.0 (LED fully on).
You have to prepend "Device/SubDeviceList/" to use this as an auxiliary device. Meaning,
the full key name is "Device/SubDeviceList/Face/Led/Red/Right/0Deg/Actuator/Value". The key name
that can be used to obtain the LED value is saved inside ALMemory, and you must use the same name
to send timed-command values for this actuator to DCM.
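The key construction described above can be sketched as a small helper; memory_key is an illustrative name, not part of the NAOqi API:

```python
# The full ALMemory key is built by prefixing the device name with
# "Device/SubDeviceList/" and appending the attribute ("Value").

def memory_key(device_name, attribute="Value"):
    return "Device/SubDeviceList/" + device_name + "/" + attribute

key = memory_key("Face/Led/Red/Right/0Deg/Actuator")
print(key)
# -> Device/SubDeviceList/Face/Led/Red/Right/0Deg/Actuator/Value

# On the robot, this key would be read through a memory proxy, e.g.:
#   mem = ALProxy("ALMemory", IP, PORT)
#   value = mem.getData(key)   # float in [0.0, 1.0]
```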
“LShoulderPitch/Position/Sensor”: this is the name for one of the joints (left shoulder pitch).
This joint has an important key value called “Value,” and it is a float-type radian value. If you would like
to use this as an auxiliary device, you have to add “Device/SubDeviceList/.”
- MotherBoard: The main CPU board is located in the head and has a Geode processor.
- ChestBoard: The chest board has an ARM processor.
- MotorBoard: All the motor boards inside the robot control all the joints excluding the legs and hands.
- MotorBoardHand: It is the motor board for the robot's hands. It currently plays the same role
as the MotorBoard.
- MotorBoardFoot: It is the motor board for the robot's feet. Since NAO's first version,
this board no longer controls a motor.
- TouchBoard: This board has a capacitive sensor near the uppermost part of the head.
- FaceBoard: It is a board that’s around the robot’s eyes and has LED and IR sensors.
- USBoard: It is a board with ultrasonic sensors.
- InertialSensor: It is a board with accelerometer and gyrometer sensors.
- EarLeds: This board controls the LEDs of the ears.
- Battery: It is a board inside the battery.
A - Actuators
- Joint: Joint is an actuator that lets you adjust one joint angle of the robot.
- JointHardness: It is an actuator that adjusts the voltage sent to the motor to control the joint.
- Led: A single-color LED whose value can be adjusted from 0 to 100%.
- Power: Not used.
- Charge: Not used.
- UsSend: An actuator that sends the ultrasonic sensor value.
B - Sensors
- JointPosition: Sensor value for the angle location of the robot’s joint
- Current: Current value of one specific joint motor
- FSR: FSR sensor value
- Touch: Status of the capacitor proximity switch (Pressed=1.0, Not Pressed=0.0)
- USReceived: Return value of the ultrasonic sensor
- Accelerometer: Return value of the acceleration sensor
- Gyrometer: Return value of the gyro sensor
- Angle: Angle of the entire robot (Receives the return value from the inertia board.)
- Temperature: Temperature value of the motor or battery
- Switch: Status of the button on the chest or the bumpers on the feet (Pressed=1.0, Not Pressed=0.0)
- Battery: Battery status sensor
4.8.1 Introduction
DCM has two configuration files. One is the Device.xml file which has the hardware characteristics of the
robot itself and the other is the DCM.xml file that sets the specific parameters for DCM. These two files are
applied equally to all the robots, have default values, and are located in the naoqi/preferences folder.
The NAO robot is separated into two parts: body and head. Each part has its own file, Device_Head.xml
or Device_Body.xml, called a "subPref." The Device_Head.xml file is saved in the flash memory of the
internal Geode board, and the Device_Body.xml file is saved in the ChestBoard flash. When DCM reads
these .xml files, copies are saved at the same location as the Device.xml and DCM.xml files.
When the subPref files are read while booting the system, both are updated to carry
the same key values as those inside Device.xml and DCM.xml.
The Device_Head.xml file is read from the same directory and stored in memory, and the key/value pairs
inside the file are sent to ALMemory. DCM reads the Device_Chest.xml file from the chestboard's flash
memory and sends its key/value pairs to ALMemory after creating a copy in the naoqi/preferences directory.
Image 4.33 shows an example of a command 10 seconds after the current DCM time.
4.9.2 Set
The Set command uses one or more timed commands to deliver control signals (Graph 4.12).
Output: None
When a new command is sent through DCM to control the same device, the command is applied
using one of the following four update methods.
["Merge"]
Also a very simple method: the new command is merged with the previously stored commands
(Image 4.35).
["ClearBefore"]
Removes all stored commands earlier than the new command (Image 4.37).
The next command sends four commands to control the red LEDs (Image 4.39). The LEDs gradually
brighten for two seconds, gradually dim until four seconds, and then repeat this pattern between six
and eight seconds.
The next example triggers the chest's ultrasonic sensors every 100ms for three seconds (Image 4.40).
Each of the sensor values will be saved in ALMemory under the following keys:
Device/SubDeviceList/US/Left/Sensor/Value, Device/SubDeviceList/US/Right/Sensor/Value,
Device/SubDeviceList/US/Left/Sensor/Value1, Device/SubDeviceList/US/Right/Sensor/Value1, Device/
SubDeviceList/US/Left/Sensor/Value2, Device/SubDeviceList/US/Right/Sensor/Value2,...
The Alias command holds a list of control devices and uses the "setAlias" function to request an update
of all the control devices with different commands.
The createAlias function returns ALValue data; since incorrectly assigned device names are removed
from it, you can use the return value to detect errors (Graph 4.13).
Output: Returns the ALValue array used as input; devices that cannot be found are deleted
from the returned data.
In Image 4.41, an alias named "ChestLeds" has been defined for three control devices (3 LEDs).
The "setAlias" function is the most useful way to send a command that has been defined as an alias,
though you can still send the same control command using the "set" function.
Output: None
[Time-mixed]
For each controlled device, a command is sent in a format that combines value and time. The following
Time-mixed example sends two commands for the red LEDs, one command for the blue LEDs,
and one command for the green LEDs at the same time.
The LEDs turn red over four seconds, then the blue and green lights are turned on over two
and three seconds respectively. After six seconds, the red LEDs go off and only the green
and blue LEDs remain.
[Time-separate]
For each controlled device, a time list and the commands for each specific time are sent separately.
This method is more efficient for large commands. The following is a Time-separate example where
the time list and the command list for each time are assigned independently (Image 4.43).
Output: None
DCM provides a synchronization method through callback functions for real-time threads.
The callbacks are the "Preprocess" and "Postprocess" methods.
The "onPreProcess" callback is called just before commands are sent to the chestboard, so commands
issued there are sent with only a short delay. The "onPostProcess" callback is called after all the values
in ALMemory have been updated, and you can use it to obtain the new values from all the sensors.
To operate the functions (called from DCM threads) in real-time and to prevent DCM cycle delays,
you must comply with the following conditions:
Most of the functions designed by the user should take a consistent, bounded amount of time within each
system cycle. If the designed function's return time varies between cycles, for example 1ms, then 10ms,
then 1ms, the whole control loop becomes inefficient.
5.1 Overview
Robot kinematics is the study of the movement of the end effectors of robots with multiple degrees
of freedom. Kinematics allows you to identify positions by calculating the relationship between
the default position and each of the parts, and by calculating the joint angle values.
It is broadly classified into velocity and position kinematics, and divided into forward kinematics
and inverse kinematics. Forward kinematics uses matrix operations on joint rotations and link lengths
to determine the positions of the end effectors. Inverse kinematics starts from a desired endpoint
of the robot and calculates the joint rotation values needed to plan the movement.
We use 4x4 transformation matrices, which are multiplied with one another to calculate translation
and rotation. A matrix is composed as shown below, and each element plays a role in the change
of position:

[ A B C L ]
[ D E F M ]
[ G H I N ]
[ 0 0 0 1 ]

The nine values A through I represent the rotation, and L, M, and N represent the translation.
Through multiplication of transformation matrices, several position changes can be represented
by a single transformation matrix.
A three-dimensional translation can be calculated using the transformation matrix shown below.
If you would like to calculate the coordinates after moving a point (x, y, z) in the Cartesian
coordinate system by L along the X axis, M along the Y axis, and N along the Z axis, you can obtain
the desired point (x', y', z') by substituting into the following equation.
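A minimal sketch of this translation in Python, multiplying the 4x4 matrix by the homogeneous point (x, y, z, 1):

```python
# Sketch of the homogeneous translation described above: moving a point
# (x, y, z) by L along X, M along Y, and N along Z via a 4x4 matrix.

def translate(point, L, M, N):
    x, y, z = point
    T = [[1, 0, 0, L],
         [0, 1, 0, M],
         [0, 0, 1, N],
         [0, 0, 0, 1]]
    p = [x, y, z, 1]                         # homogeneous coordinates
    return [sum(T[i][j] * p[j] for j in range(4)) for i in range(3)]

print(translate((1, 2, 3), 10, 20, 30))  # -> [11, 22, 33]
```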
Three-dimensional rotation has a different transformation matrix depending on the axis of rotation.
The following shows the matrices for rotational transforms about the x, y, and z axes.
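As a runnable sketch, the rotation about the Z axis can be written as follows; Rx and Ry are analogous, with the cos/sin block moved to the rows of the other axes:

```python
# Sketch of a rotation about the Z axis as a 4x4 transformation matrix,
# plus a helper that applies any 4x4 transform to a 3D point.

import math

def rot_z(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0],
            [s,  c, 0, 0],
            [0,  0, 1, 0],
            [0,  0, 0, 1]]

def apply(T, point):
    p = list(point) + [1]                    # homogeneous coordinates
    return [sum(T[i][j] * p[j] for j in range(4)) for i in range(3)]

# Rotating the point (1, 0, 0) by 90 degrees about Z gives (0, 1, 0).
result = apply(rot_z(math.pi / 2), (1.0, 0.0, 0.0))
print([round(v, 6) for v in result])  # -> [0.0, 1.0, 0.0]
```

Chaining rotations and translations is then just matrix multiplication of the corresponding 4x4 matrices.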
The following gives you more detailed information regarding the length of each part.
• There are currently two versions of NAO (3.2 and 3.3). Since there are some differences between
the two models, you will need to adjust kinematics calculations made for Version 3.2 in order to apply
them to Version 3.3.
Information regarding NAO’s specific weight and center of gravity for each part can be found in the following
web page:
http://www.aldebaran-robotics.com/documentation/family/nao_h25/masses_h25.html
It is 126.5mm from the torso center to the neck, 100mm to the shoulder joint, 85mm to the hip,
and 100mm along the thigh. It is 102.74mm from the knee to the ankle, the foot height is 45.11mm,
the length between the neck and shoulder joint is 98mm, and the hand offset is 15.9mm.
For the arm, it is 90mm from the shoulder to the elbow, 50.55mm from the elbow to the wrist,
and the length of the hand is 58mm.
Name            Length (mm)
NeckOffsetZ     126.50
ShoulderOffsetY 98.00
UpperArmLength  90.00
LowerArmLength  50.55
ShoulderOffsetZ 100.00
HandOffsetX     58.00
HipOffsetZ      85.00
HipOffsetY      50.00
ThighLength     100.00
TibiaLength     102.74
FootHeight      45.11
HandOffsetZ     15.90
Graph 5.1 above shows the information regarding the lengths of all the joints.
These values can be used for the kinematic calculation of each joint.
Joint Name — Motion (rotation axis) — Range (degrees) — Range (radians)
HeadPitch — Head joint front and back (Y) — -38.5 to 29.5 — -0.6720 to 0.5149
RShoulderPitch — Right shoulder joint front and back (Y) — -119.5 to 119.5 — -2.0857 to 2.0857
RShoulderRoll — Right shoulder joint right and left (Z) — -94.5 to -0.5 — -1.6494 to -0.0087
RElbowYaw — Right elbow joint twist (X) — -119.5 to 119.5 — -2.0857 to 2.0857
LShoulderRoll — Left shoulder joint right and left (Z) — 0.5 to 94.5 — 0.0087 to 1.6494
LElbowYaw — Left elbow joint twist (X) — -119.5 to 119.5 — -2.0857 to 2.0857
LHipYawPitch — Left hip joint twist (Y-Z 45°) — -65.62 to 42.44 — -1.1453 to 0.7408
RHipYawPitch — Right hip joint twist (Y-Z 45°) — -65.62 to 42.44 — -1.1453 to 0.7408
LHipRoll — Left hip joint right and left (X) — -21.74 to 45.29 — -0.3794 to 0.7904
LHipPitch — Left hip joint front and back (Y) — -101.63 to 27.73 — -1.7739 to 0.4840
LAnklePitch — Left ankle joint front and back (Y) — -68.15 to 52.86 — -1.1895 to 0.9227
LAnkleRoll — Left ankle joint right and left (X) — -44.06 to 22.79 — -0.7690 to 0.3978
RHipRoll — Right hip joint right and left (X) — -42.30 to 23.76 — -0.7383 to 0.4147
RHipPitch — Right hip joint front and back (Y) — -101.54 to 27.82 — -1.7723 to 0.4856
RKneePitch — Right knee joint front and back (Y) — -67.97 to 53.40 — -1.1864 to 0.9320
RAnkleRoll — Right ankle joint right and left (X) — -22.27 to 45.03 — -0.3886 to 0.7858
NAO consists of 25 joints in total: 2 head joints, 5 joints in each arm (10 total), 5 joints in each leg (10 total),
1 in the pelvis, and 2 that execute the opening and closing movement of the hands. Each joint can be controlled
independently, except that the 2 pelvic joints must be controlled at the same time. Each joint has a limited
range of movement, shown in the graph above under 'Range.' For the legs, the joint limits also take collisions
with the robot's cover into consideration. Graph 5.2 (Motion) gives the axis of rotation for each joint.
You can use the key names in NAO's ALMemory to access the current joint or sensor values.
The following shows the commands. Use the 'Joint Name' of the desired joint from Graph 5.2
above as the "Device Name" in the command. The joint value returned is in radians.
Command
(radian):
Device/SubDeviceList/”Device Name”/Position/Actuator/Value
Sensor
(radian):
Device/SubDeviceList/”Device Name”/Position/Sensor/Value
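As a sketch of how these keys are used from Python (the helper functions below are hypothetical; the actual `ALMemory.getData` call requires a running NAOqi instance, so it is shown commented out):

```python
# Build the ALMemory keys described above for a given joint name.

def actuator_key(device_name):
    """Key for the commanded joint angle (radians)."""
    return "Device/SubDeviceList/%s/Position/Actuator/Value" % device_name

def sensor_key(device_name):
    """Key for the measured joint angle (radians)."""
    return "Device/SubDeviceList/%s/Position/Sensor/Value" % device_name

key = sensor_key("HeadPitch")
print(key)

# On a real robot or simulator:
# from naoqi import ALProxy
# memory = ALProxy("ALMemory", "<robot-ip>", 9559)
# angle = memory.getData(key)  # current HeadPitch angle in radians
```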
The head consists of Pitch and Yaw joints. Pitch is the joint that moves the head back and forth rotating
around the Y-axis. Yaw is the joint that moves the head side to side rotating around the Z-axis.
Image 5.4 shows the angular limit of each joint.
Shoulder, elbow, and wrist joints make up the arm. The shoulder has the Pitch joint moving back and forth
around the Y-axis and the Roll joint that lifts the arm based on the Z-axis. The elbow consists of the Roll joint
that rotates left and right based on the Z-axis and the Yaw joint that rotates based on the X-axis. The wrist
consists of the Yaw joint, which rotates around the X-axis. Image 5.5 shows the possible angular movements
for each joint.
The pelvic joints, RHipYawPitch and LHipYawPitch, rotate about an axis lying halfway between
the Y-axis and Z-axis. Independent control of these joints is impossible because they are physically
driven by a single motor.
The leg consists of 5 joints: HipRoll of the pelvic joint, which rotates left and right about the X-axis;
HipPitch, which moves the leg back and forth about the Y-axis; KneePitch of the knee joint, rotating
about the Y-axis (similar to HipPitch); AnklePitch of the ankle joint, which moves the foot back and
forth about the Y-axis; and AnkleRoll, which rotates the foot left and right about the X-axis. For the joints
that make up the leg, in addition to the limits shown in Image 5.7, pitch and roll joint values are further
limited in order to prevent collisions with NAO's surface cover. Images 5.8-5.9 and Graphs 5.3-5.4 show
these limitations in more detail for both legs.
image 5.8 - NAO’s left leg joint limitations    graph 5.3 - Limitations of NAO’s left leg joints
image 5.9 - NAO’s right leg joint limitations    graph 5.4 - Limitations of NAO’s right leg joints
5.4.1 Overview
The robot generally consists of consecutive joints and links; the movement of one joint influences
the connected joints. Connected joints can have a link of any length, including 0, and can rotate around
any axis. A standard coordinate system is assigned to each joint to describe the connections and positions,
and it determines the general process of transforming from one joint to the next. Starting from the reference
point to the first joint, from the first joint to the second, third, and so on until the last joint, once the entire
transformation is complete you obtain the transformation matrix that gives the position of the final joint
relative to the reference point.
In this book, we will use the Denavit-Hartenberg (DH) representation for kinematic calculations.
The DH method is used to describe the robot’s kinematics and represent its motions, and it can be used
regardless of the shape of the robot. As shown in the image, the DH method describes the relationship
between connected joints using four variables.
image 5.10 - Relationship of joints shown using the Denavit-Hartenberg (DH) Method.
• Link twist α: Based on the Xi-1 axis, it is the angle from the Zi-1 axis to the Zi axis.
• Link length a: The offset distance, along the Xi-1 axis, between the Zi-1 axis
and the Zi axis.
• Joint angle θ: The angle between the Xi-1 axis and the Xi axis, measured about the Zi axis.
• Link offset d: The distance, along the Zi axis, between the Xi-1 axis of the base frame
and the Xi axis.
In order to show the robot’s joint relationships using the DH method, first set a standard coordinate system
for each joint. Assign Z and X axes for each joint as shown; the Y axis, mutually perpendicular to both Z
and X, can be derived at any time, so the DH method does not use it. You will need a total of four
standard operations in order to convert to the coordinate system of the next joint.
A - Rotate by θi about the Zi-1 axis. This process makes the Xi-1 axis parallel to the Xi axis.
B - Move Xi-1 by di along the Zi-1 axis. This process places Xi-1 and Xi on the same line.
C - Move by ai along the Xi-1 axis. This process places the origins of the two coordinate systems
in the same position.
D - Lastly, rotate by αi about the Xi axis. After this process is over, the two coordinate systems
become identical.
Matrix A can be obtained by multiplying the four matrices (one for each operation) in order.
The following equation shows this relationship; 'n' and 'n+1' represent the current joint
and the next joint to be connected.
The transformation matrix is generalized as shown below when the matrix is calculated and organized.
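The generalized matrix itself is missing from this excerpt; reconstructed in the standard form consistent with the four operations above, it is:

```latex
{}^{n}A_{n+1}
= \mathrm{Rot}_z(\theta)\,\mathrm{Trans}_z(d)\,\mathrm{Trans}_x(a)\,\mathrm{Rot}_x(\alpha)
=
\begin{bmatrix}
\cos\theta & -\sin\theta\cos\alpha & \sin\theta\sin\alpha & a\cos\theta \\
\sin\theta & \cos\theta\cos\alpha & -\cos\theta\sin\alpha & a\sin\theta \\
0 & \sin\alpha & \cos\alpha & d \\
0 & 0 & 0 & 1
\end{bmatrix}
```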
The arrows in Image 5.11 each represent the X axis of the Cartesian coordinate (green), Y axis (red),
and Z axis (blue), and these are used to calculate the kinematic rotation.
Pos represents a position in the Cartesian coordinate system. For T (the transformation matrix),
the superscript on the upper left represents the target frame and the subscript on the lower left
represents the current frame; 'h' is the hand and 'o' is the center point. The position change 'T' in the
equation above can be represented through coordinate translations and rotations.
The right arm consists of five joints. Starting from the uppermost joint, there are the shoulder
(RShoulderPitch, RShoulderRoll), elbow (RElbowRoll, RElbowYaw), and wrist (RWristYaw) joints. You can obtain
the hand position if the position changes occur in this order starting from the center point.
The following is the chain of position changes from the center point to the hand: center point - shoulder
- elbow - wrist - hand.
The length from the wrist to the hand and the move from the center to the shoulder are not included.
The transformation matrix T for the five joints above is divided as shown below, and T is represented
as a product of the Ai. In the DH method, all link twists α are ±90° when the relationships of NAO's arm
joints are analyzed, and since cos α = 0 and a = 0, each Ai can be simplified as shown below.
The position change between the non-rotating pivot point and the shoulder is expressed by translation
only, and this position change is expressed as shown. Here, Sy represents the translation along the Y axis
and Sz the translation along the Z axis, and the change appears as the sum of the two. The Cartesian
position change from the center point to the right shoulder is -98mm along Y and 100mm along Z.
The following shows the transformation matrix.
Rotation transformations are included in the shoulder-to-wrist position changes, and each transformation
matrix is determined by d, a, θ, and α according to the DH method. The position change of the shoulder,
which contains two pivot joints, is represented in the following equation: θ1 represents the Pitch rotation
about the Y axis and θ2 the Roll rotation about the X axis.
Here, the transform for RShoulderPitch (the first pivot joint) under the DH method is written as A1,
and since α1 = -90° and d1 = 0, the transformation matrix is as follows.
The transform for RShoulderRoll (the second pivot joint) is written as A2, with α2 = 90°
and d2 = 0. The transformation matrix is as follows.
The elbow's position change is likewise accomplished by a translation and two rotations: one angle
represents the Roll rotation about the X axis and the other the Yaw rotation about the Y axis.
The move from the shoulder to the elbow in RElbowRoll (the first elbow joint) is 90mm along the X axis,
and it is represented by 'd.' The transformation matrix containing this move is written as A3.
Since θ3 = 90°, α3 = 90°, and d3 = 90 in A3, the transformation matrix is as shown below.
, the position change of RElbowYaw (the pivot joint of the elbow), represents the Yaw rotation
about the Y axis and is written as A4. Since α4 = -90° and d4 = 0 in A4, the transformation matrix
is as shown below.
The position change of RWristYaw (the wrist joint) occurs through a translation and a rotation. The move
to the wrist is 50.55mm along the X axis, and the joint provides the Yaw rotation about the Y axis.
The move from the wrist to the hand is 58mm along the X axis and -15.90mm along the Z axis,
and the following shows the corresponding transformation matrix.
The hand position in the Cartesian coordinate system based on the center point can be obtained through
the following matrix calculation.
For kinematic analysis using Python, you need the math library for calculating trigonometric functions
(sine and cosine, for example) and the NumPy (Numerical Python) library for matrix operations.
The following shows how to use the libraries. First, import the installed libraries.
import math
import numpy as np
Since a 4x4 matrix is used for the transformation of each coordinate system during kinematic analysis,
a transformation matrix is created and the element is initialized with 0.
Trans = np.mat([[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0]])
The following shows the Python code that calculates the position of the right hand using
Python's matrix calculations, with NAOqi supplying the joint values. The forward kinematic calculation
follows the process described above and outputs NAO's current right-hand position in the
center-point-based Cartesian coordinate system.
# Define center position, define and calculate transformation matrices, calculate hand position
Pos_o = np.mat([0,0,0,1]).T
Pos_s = Pos_o + np.mat([0,-98,100,0]).T
# Define variable value used for DH method (unit of rotation: Rad, unit of move: mm)
Theta1 = 0
Theta2 = 0
Theta3 = 1.57
Theta4 = 0
Theta5 = 0
D1 = 0
D2 = 0
D3 = 90
D4 = 0
D5 = 50.55
# Transformation matrix A, combining all of the robot's rotation transformations
# (A1..A5 are the DH matrices described above; their definitions are omitted here)
A = A1*A2*A3*A4*A5
# Apply A (the shoulder-to-wrist transformation), then the wrist-to-hand offset
Pos_w = A*Pos_s
Pos_h = Pos_w + np.mat([58,0,-15.90,0]).T
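Since the listing above relies on the matrices A1–A5, whose definitions are not reproduced in this excerpt, the following Python 3 sketch shows how such a chain can be built with NumPy. The `dh_matrix` function follows the standard DH convention; the parameter values in `params` echo the Theta/D listing above, but the α signs are illustrative assumptions, not NAO's verified DH table.

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Single-link DH transform: Rz(theta) @ Tz(d) @ Tx(a) @ Rx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(params, start):
    """Chain per-joint DH transforms and apply them to a homogeneous 4-vector."""
    A = np.eye(4)
    for theta, d, a, alpha in params:
        A = A @ dh_matrix(theta, d, a, alpha)
    return A @ start

half_pi = np.pi / 2
# Illustrative parameters only (theta, d, a, alpha); alpha signs are assumed.
params = [
    (0.0,   0.0,   0.0, -half_pi),  # RShoulderPitch
    (0.0,   0.0,   0.0,  half_pi),  # RShoulderRoll
    (1.57, 90.0,   0.0,  half_pi),  # RElbowRoll
    (0.0,   0.0,   0.0, -half_pi),  # RElbowYaw
    (0.0,  50.55,  0.0,  0.0),      # RWristYaw
]
shoulder = np.array([0.0, -98.0, 100.0, 1.0])  # torso-center to shoulder (mm)
print(forward_kinematics(params, shoulder))
```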
The result when RSP = 1.03395, RSR = -0.02305, REY = 1.61679, RER = 0.89129,
and RWY = 1.17653:
Pos_h
matrix([[ 172.3705713 ],
[ 132.72582755],
[ 99.73226304],
[ 1. ]])
Pos_h2
matrix([[ 172.3705713 ],
[ 132.72582755],
[ 99.73226304],
[ 1. ]])
5.5.1 Overview
Inverse kinematics determines how much each joint must rotate to reach a certain point when the position
and orientation of the robot's hands or feet are given. Beyond solving for a single pose, it can also be used
to control movement by continuously calculating the joint angle values that move the position
of the hand along a path.
There is more than one joint combination that places the robot's hand at a given position, so the
trigonometric expressions for all the joint angle values cannot be read directly from the matrix elements.
Therefore, in order to separate out the equations for each joint angle, you must multiply
the left side of the relation by the inverse of the transformation matrix 'An' to isolate the elements
needed to calculate the angle values.
First, the following shows the transformation matrix A for NAO's right arm obtained in Section 5.4.2.
Although the wrist Yaw joint is useful for determining the direction of the hand, it does not affect
the hand position, so its calculation is unnecessary when calculating the hand position. Therefore, for A,
you only need to obtain θ1, θ2, θ3, θ4 (the joint values of A1-A4) in order to determine the hand position.
The link term of the wrist Yaw joint is moved to the other side of the equation above.
The modified equation is as follows.
Joints 2 and 3 are parallel to each other, so multiplying by the inverse matrices of A2
and A3 on the left is not a suitable way to obtain their joint values.
For the next step, obtaining θ3 and then the angle values of the first shoulder joints
of A2 and A3, multiply the inverse of the transformation matrix on the left-hand side
of the equation above. This is demonstrated in the equation below.
You can obtain the angle values (θ1, θ2, θ3, θ4) of each joint through the process above in order
to position the robot's hand at (Px, Py, Pz) in the Cartesian coordinate system.
The code below obtains the angle value of each joint as calculated in Section 5.5.1. Joint values are
obtained in order starting from NAO's shoulder joint, and the code handles exceptional cases to prevent
division by 0.
### Joint value calculation for moving to the desired coordinates
# Import libraries and define functions
import math
import time
# Output the calculated joint angle values
print theta1, theta2, theta3, theta4
# Wait so the joint angle values can be verified
time.sleep(2)
From the hand location in the Cartesian coordinate system, the required joint angle values are entered
into the joint movement plan in order, and these commands are delivered to make NAO move. In the next code,
where the joint angle values have already been determined, these values are used to control the movements.
# Set joint stiffness (the joints will not move if the stiffness is not set)
names = "Body"
stiffness = 1.0
proxy.stiffnessInterpolation(names, stiffness, 1.0)
proxy.setAngles("RShoulderRoll", -0.8, 0.3)
time.sleep(1)
proxy.setAngles("RShoulderRoll", -0.3, 0.3)
time.sleep(1)
proxy.setAngles("RShoulderRoll", -0.8, 0.3)
time.sleep(1)
proxy.setAngles("RShoulderRoll", -0.3, 0.3)
time.sleep(1)
proxy.setAngles("RShoulderRoll", -0.8, 0.3)
time.sleep(1)
proxy.setAngles("RShoulderRoll", -0.3, 0.3)
time.sleep(1)
proxy.setAngles("RShoulderRoll", -0.8, 0.3)
# Option that determines whether relative joint values (which include
# the current joint value) or absolute joint values will be used
isAbsolute = True
time.sleep(5)
Image 5.13 is a simulation screen showing how Choregraphe moves the NAO robot as the code below
executes. Nine target positions are received and the joint angle values are calculated; the robot
then moves as these values are delivered to each joint. (Use Python to create the executable file
from the code shown below.)
point = [[1,-1,-1],[1,-0.8,-1],[1,-0.5,-1],[1,-0.3,-1],[1,0.01,-0.8],[1,0.3,-1],[1,0.5,-1.2],
         [1,0.55,-1.4],[1,0.7,-1.57]]
# theta1 is assumed to have been computed earlier (its derivation is not shown here)
for i in range(0,9):
    px = point[i][0]
    py = point[i][1]
    pz = point[i][2]
    print px, py, pz
    if pz == 0:
        theta2 = 0
    else:
        theta2 = math.atan((math.cos(theta1)*px + math.sin(theta1)*py)/pz)
    if (math.sin(theta1)*px - math.cos(theta1)*py) == 0:
        theta3 = 0
    else:
        theta3 = math.atan((math.cos(theta1)*math.cos(theta2)*px +
            math.sin(theta1)*math.cos(theta2)*py - math.sin(theta2)*pz) /
            (math.sin(theta1)*px - math.cos(theta1)*py))
    if pow(math.cos(theta3),2) - pow(math.sin(theta3),2) == 0:
        theta4 = 0
    else:
        theta4 = 1/(pow(math.cos(theta3),2) - pow(math.sin(theta3),2))
    print theta1, theta2, theta3, theta4
    proxy.setAngles("RShoulderRoll", -0.2, 0.2)
    proxy.setAngles("LShoulderRoll", 0.2, 0.2)
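For reference, the angle computation in the loop above can be exercised on its own. The following Python 3 sketch wraps the same formulas in a function; θ1 is passed in as an input, since its derivation is not shown in this excerpt, and the zero checks mirror the original guards.

```python
import math

def solve_angles(theta1, px, py, pz):
    """Compute theta2..theta4 for a target point (px, py, pz), following
    the formulas in the listing above. theta1 is an input here because
    its derivation is not included in this excerpt."""
    if pz == 0:
        theta2 = 0.0
    else:
        theta2 = math.atan((math.cos(theta1) * px + math.sin(theta1) * py) / pz)
    denom3 = math.sin(theta1) * px - math.cos(theta1) * py
    if denom3 == 0:
        theta3 = 0.0
    else:
        theta3 = math.atan((math.cos(theta1) * math.cos(theta2) * px
                            + math.sin(theta1) * math.cos(theta2) * py
                            - math.sin(theta2) * pz) / denom3)
    denom4 = math.cos(theta3) ** 2 - math.sin(theta3) ** 2
    theta4 = 0.0 if denom4 == 0 else 1.0 / denom4
    return theta2, theta3, theta4

# First target point from the list above, with theta1 assumed 0
print(solve_angles(0.0, 1.0, -1.0, -1.0))  # approximately (-0.7854, 0.0, 1.0)
```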
Content

6.1 Choregraphe Application
6.1.1 Program Configuration
6.1.2 NAOqi API
6.1.3 Keyframe
6.1.4 Timeline Editor

6.2 Motion Control – Timeline Editor
6.2.1 Saving NAO’s Actual Movements
6.2.2 Adjusting NAO’s Movements
6.2.3 Controlling Joint Movements

6.5 Combining Recognition and Movement – Using Images for Object Recognition and Grabbing Motion
6.5.1 Object Recognition
6.5.2 Grabbing the Object Using Inverse Kinematics Analysis
6.5.3 Grabbing the Object Using Inverse Kinematics Analysis
6.5.4 Combining Recognition and Grabbing Motion
6.1 Choregraphe Application
Chapter 2 introduced the interface, basic programming method, and boxes provided by Choregraphe.
Chapter 3 explored how to use Python to create new boxes and edit box scripts.
Chapter 4 looked into the NAOqi framework that configures NAO's system and the DCM in charge
of the communication between NAO's devices. In this chapter, we will use the advanced features
of Choregraphe and Python to edit scripts, and use the NAOqi framework and DCM features to implement
an actual program.
Most of the boxes provided by Choregraphe have diagrams, and they are either boxes built from
a diagram's internal scripts or boxes composed of a Timeline. Much like the Say box,
a script box is made of Python (or Urbi) and the NAOqi API. Like the Arms Example box,
a Timeline box is made up of Keyframes and Frames. Programming NAO's movements with Choregraphe
means placing and connecting these boxes.
Timeline keyframe has an internal diagram, and its special feature is being able to start at a specific
frame. For example, the Hello box provided by Choregraphe is a Timeline box composed of defined frames
and the FaceLeds keyframe. The diagram for the FaceLeds keyframe is composed of the Light_
AskForAttentionEyes box in script form. (Image 6.2)
In other words, boxes that use various devices and joints (like the Hello box) are composed of scripts
that call the devices' methods and of Timelines with predefined movements.
image 6.3 - onInput_onStart method for the Eyes LEDs box script
■ Example 6.1 Turning off all LEDs for a certain period of time (Example File: Ex6.1_AllLEDs.crg)
NAO not only has LEDs for eyes, ears, and feet controlled by the LED box but also LEDs on the chest
and around the contact sensors. There are ten small blue LEDs on one ear configured at 36 degree intervals.
One eye has eight red, green, and blue LEDs configured at 45 degree intervals. The chest (power button)
and feet LEDs have one red, one green, and one blue LED. Each LED has a group name: AllLeds, ChestLeds,
EarLeds, FaceLeds, and FeetLeds are the most commonly used. Contact sensor LEDs can be turned on and off
through AllLeds much like all the other LEDs, but they cannot be controlled independently. Other group names
and LED names are specified in the reference (Advanced/Robot Lights).
The LED library boxes introduced in Chapter 2 target LEDs of specific parts (like the ears and eyes),
and their control is fixed. The user must edit the box for more flexible control of the LEDs.
The ALLeds module of the NAOqi API provides methods to control the LEDs.
This example will use Python programming to turn off the LEDs for the eyes and ears and to turn them
back on after a certain period of time. RGB color representation will be used to control the LEDs of the chest,
eyes, and feet. You can easily turn the LEDs on or off by using the ‘off’ and ‘on’ methods.
This example will control the LEDs by using the ‘fade’ and ‘fadeRGB’ methods. The following is
an explanation of the off/on, fade, and fadeRGB methods:
- void fade(const string& name, const float& intensity, const float& duration)
Role: fades a specific LED or a group of LEDs to a given intensity over a set period of time.
- void fadeRGB(const string& name, const int& rgb, const float& duration)
Role: fades a specific LED or a group of LEDs to a given RGB color over a certain amount of time.
Name: the name of an LED or a group of LEDs.
RGB: the RGB value as an integer; represented in hexadecimal, it is 0x00RRGGBB.
Duration: the time it takes to change the LED, in seconds.
The AllLeds box we will create in this example consists of one input, one output, and three parameters.
The following explains the input and parameters (Image 6.4).
- Duration_off(on): Float type parameter. Refers to the time (seconds) spent turning
the LEDs completely on or off. Values range from 0.0 to 10.0. The default value is 2.0 seconds.
B - The following is the script for turning all the LEDs on or off.
The onInput_RGBvalue(self, p) method becomes active when a signal arrives at the input of the AllLeds
box. As previously stated, the input is an array of three numbers. The next section explains how
the RGB value is used.
The LEDs are controlled using the fade() method (Line 2) of the ALLeds module. "AllLeds," the first
parameter, is the group containing all of NAO's LEDs. 0.0, the second parameter, minimizes the LED intensity.
self.getParameter("Duration_off"), the third parameter, retrieves the Duration_off box parameter
as a number: the time spent until the LEDs are completely turned off. time.sleep() in Line 3
is the time module's sleep method provided by Python; it delays the program for a certain amount of time.
The sleep parameter is numeric and measured in seconds. The script turns off all the LEDs and uses sleep
so that they turn back on after a certain amount of time; the Duration_keep parameter of the AllLeds box
is passed to sleep.
ALLeds.fade() in Line 4 is used differently than in Line 2: here it turns all the LEDs back on,
and the second parameter, 1.0, maximizes the LED intensity.
If you execute just this part of the script in place of the default script, you will see that NAO's
ear LEDs and contact sensor LEDs turn blue, while the eye, power-button, and feet LEDs turn white. The white
comes from the RGB representation (255, 255, 255), which is why the color is white when the second
parameter (LED intensity) of the fade method is 1.0. The RGB value of black is (0, 0, 0).
13. b = self.clampColor(p[2])
14. rgb = self.getRGB(r,g,b)
15. #… LED on and off …
16. ALLeds.fadeRGB("ChestLeds", rgb, self.getParameter("Duration_on"))
17. ALLeds.fadeRGB("FaceLeds", rgb, self.getParameter("Duration_on"))
18. ALLeds.fadeRGB("FeetLeds", rgb, self.getParameter("Duration_on"))
19. pass
In order to use RGB color representation, you have to convert three color values into one.
The getRGB method in Line 1-3 handles this.
The parameters of the getRGB method are r (red), g (green), and b (blue); the converted RGB value
is 256*256*r + 256*g + b. The clampColor method guarantees the validity of each color
value (Lines 4-9). As introduced earlier, each RGB component has a value between 0 and 255; a value
outside this range would be rendered as a completely different color, which clampColor prevents.
After receiving the array of values for R, G, and B from the Color Edit box, onInput_RGBvalue method
validates the color values and converts them to one number. Parameter ‘p’ has R, G, and B color values
as an element of the numeric array.
Lines 11-13 call the clampColor method to validate the three color values. Line 14 calls the getRGB
method to convert them to one number. Line 15 turns the LEDs on and off; further explanation is omitted
here since it was covered above. Lines 16-19 set the chest, eye, and feet LEDs to the RGB color.
The ChestLeds, FaceLeds, and FeetLeds groups are passed as strings
in the first parameter. The second, 'rgb,' is the number produced by the getRGB method.
1. class MyClass(GeneratedClass):
2. def __init__(self):
3. GeneratedClass.__init__(self)
4. def onLoad(self):
5. pass
6. def onUnload(self):
7. pass
8. def getRGB(self, r, g, b):
9. return 256*256*r + 256*g + b
10. pass
11. def clampColor(self, p):
12. if(p < 0):
13. p = 0
14. if(p > 255):
15. p = 255
16. return p
17. def onInput_RGBvalue(self, p):
18. r = self.clampColor(p[0])
19. g = self.clampColor(p[1])
20. b = self.clampColor(p[2])
21. rgb = self.getRGB(r,g,b)
22. ALLeds.fade("AllLeds", 0.0, self.getParameter("Duration_off"))
23. time.sleep(self.getParameter("Duration_keep"))
24. ALLeds.fade("AllLeds", 1.0, self.getParameter("Duration_on"))
25. ALLeds.fadeRGB("ChestLeds", rgb, self.getParameter("Duration_on"))
26. ALLeds.fadeRGB("FaceLeds", rgb, self.getParameter("Duration_on"))
27. ALLeds.fadeRGB("FeetLeds", rgb, self.getParameter("Duration_on"))
28. pass
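The conversion and clamping logic above can be exercised without a robot. The following Python 3 sketch (with hypothetical snake_case helper names, standing in for the box's getRGB and clampColor methods) shows the same arithmetic:

```python
def clamp_color(p):
    """Keep a color component in the valid 0-255 range, as clampColor does."""
    if p < 0:
        p = 0
    if p > 255:
        p = 255
    return p

def get_rgb(r, g, b):
    """Pack three 0-255 components into one 0x00RRGGBB integer, as getRGB does."""
    return 256 * 256 * r + 256 * g + b

# Out-of-range inputs are clamped before packing: 300 -> 255, -5 -> 0
print(hex(get_rgb(clamp_color(300), clamp_color(-5), clamp_color(128))))
# → 0xff0080
```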
Although you can edit the box to use the RGB color values for the parameters, it would be easier to use
the Color Edit box to set the desired colors.
Example 6.2 demonstrates simple exercise movements accompanied by shouting for each movement
(Image 6.7). You can use Keyframes to implement the desired movements at specific times. We will omit
how to create the movements, since it was previously explained. The Aerobic box where NAO's movements are
defined is saved in the example file (Ex6.2_Aerobic.crg). The following shows you how to use Keyframes.
B - A layer is created when you press the + button in 'Behavior layers.' A layer holds several sequential
Keyframes. A diagram opens if you select keyframe1, which was generated initially; this diagram starts
at Frame 1. The start frame can be set by moving the Keyframe.
You can move the Keyframe by dragging it with your mouse or selecting the Edit Keyframe
menu by right-clicking your mouse on top of the Keyframe.
The diagram's onLoad input (j in Image 6.8) activates when the current frame passes the Keyframe's
start frame.
The onStopped output (k in Image 6.8) marks the end of the Aerobic box.
C - To add or delete a Keyframe, right-click on top of the Keyframe and use Insert Keyframe/Delete
Keyframe. As you can see in Image 6.9, the Keyframe is created where the mouse was.
The chosen Keyframe turns bright blue while the unchosen ones turn bright purple.
D - Add Keyframes at Frames 1, 10, 20, … 90, and place a Text Edit box and a Say Text box in each Keyframe
diagram. Fill in the Text Edit box for each Keyframe as shown in Image 6.10. Be careful not to connect
the end output of the Say Text box to the end output of the diagram.
Chapter 2 omitted any detailed information regarding Keyframes, so we did not explore
how to use the Play, Stop, Goto And Play, and Goto And Stop boxes of the Tool library. Keyframe manipulation
will be explained by analyzing the Stand Up box (from the Motion library).
Image 6.11 shows the Keyframe structure of the Stand Up box. The solid red lines show the transitions
between Keyframes, and the solid blue lines show the connections within the same Keyframe. Keyframe
movements are implemented using the Goto And Stop box. Unlike the Goto And Stop box introduced
in Chapter 2, which sets the frame number, the Goto And Stop box used by the Stand Up box sets the name
of the frame.
Keyframe movement mainly occurs twice. The first move occurs in the DetectRobotPose keyframe, as shown
in Image 6.12. If NAO's current pose obtained from the Get Robot Pose box is "Sit", "Crouch", "Knee",
"Frog", "Belly", "Back", "Left", or "Right", control moves to the Keyframe defined for that pose's
standing motion. Here, you must use the Goto And Stop box to prevent interference between these defined
Keyframes.
The second move occurs inside the defined keyframe (Image 6.13). For example, when NAO is sitting down,
it moves from the DetectRobotPose keyframe to the FromSit keyframe. The StandFromSitted box (moving from
the sitting position to standing) is inside the FromSit keyframe. Once the StandFromSitted box completes,
the number of tries increases by 1 through the Increase Count box. It then moves back to the DetectRobotPose
keyframe to determine whether standing was successful and to retry if it was not.
Timeline Editor can be used to apply preset interpolation or to let the user shape automatically
interpolated movements. You can also capture NAO's actual movements into Timeline frames at regular
intervals by using the motion recording function.
Image 6.14 - Example of Timeline Editor (curves mode) and joint information
Image 6.14 shows a Timeline Editor screen of a simple movement. Timeline Editor is divided
into worksheet mode, curves mode, and record mode. Worksheet mode shows whether the joint
has been defined (Image 6.15). Curves mode shows the changes of the joints (Image 6.14).
Record mode provides the function that allows you to save NAO’s movements in the Choregraphe frame;
a later example will explain how to do this.
Timeline Editor interface is largely divided into four parts. Actuators (j in Image 6.14) show the name and
color of NAO’s joints. k in Image 6.14 consists of buttons that manipulate the frames. l in Image 6.14
shows joint movements. The points here show that the movements (rotation angle of joint) in corresponding
frames have been defined. Worksheet mode shows whether each joint is being used. The Record menu saves
NAO’s actual movements in Choregraphe’s frame (m in Image 6.14).
Constant: Executes joint movements only at the defined frames. The joint movements seen so far
have been smooth, but movements edited with Constant are abrupt.
Linear: Movement is linearly interpolated until the next frame (where the movement has been defined).
Bezier: Interpolation using the Bezier curve. It moves rather slowly at the beginning and end compared
to the middle. Two tangents are created for each point, and the user can adjust the tangent slopes
to manipulate the Bezier curve.
Automatic Bezier: Interpolation using the Bezier curve. Choregraphe sets the tangent slopes
at both ends, prioritizing smoothness of motion. This curve is normally used
when movements are defined in Timeline frames.
Bezier Smooth: Interpolation using the Bezier curve, similar to the two methods above.
The only difference from the Bezier method is that the two tangents stay collinear: if you change
the slope of the left tangent, the slope of the right tangent changes with it.
Simplify: Reduces the number of frames with defined movements by removing frames whose joint
movements are identical to their neighbors. For example, if NAO's movements are defined in four frames
and the head joint does not move throughout the section, this button can be used to delete the head joint
definition for the two middle frames.
Show tangents: In Curves mode, shows the tangents, limited to the joints and frames with defined movements.
View all: If the Curves mode display was changed by the user, this button resets the view.
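The difference between these interpolation modes can be sketched numerically. The snippet below is a plain-Python illustration, not Choregraphe code: `linear` matches the Linear mode, and `smooth` mimics the ease-in/ease-out of an Automatic-Bezier-like curve with flat tangents at both keyframes; the joint values are made up for illustration.

```python
def linear(a0, a1, t):
    # Linear mode: the joint angle changes at a constant rate
    # between two keyframes (t runs from 0 to 1).
    return a0 + (a1 - a0) * t

def smooth(a0, a1, t):
    # Automatic-Bezier-like ease: slow near both keyframes.
    # A cubic with zero slope at t=0 and t=1 ("smoothstep")
    # approximates flat tangents at both ends.
    s = t * t * (3.0 - 2.0 * t)
    return a0 + (a1 - a0) * s

# A joint rotating from 0.0 to 1.0 rad between two keyframes:
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print("t=%.2f  linear=%.3f  smooth=%.3f"
          % (t, linear(0.0, 1.0, t), smooth(0.0, 1.0, t)))
```

Near t = 0 and t = 1 the smooth curve changes more slowly than the linear one, which is exactly the visual difference between the Linear and Automatic Bezier buttons.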
Frame manipulation in the Timeline Editor uses the Curves buttons shown in (2) (Image 6.14) and the joint information curves in (4) (Image 6.14). If you manipulate Curves mode yourself, you work with the joint points defined in each frame. Graph 6.2 explains the buttons used in Curves mode.
Image 6.16 - Record mode and operational buttons for Timeline Editor
If you use the Timeline Editor's Record mode, you can save NAO's actual movements into Choregraphe frames. As introduced earlier, Record mode is started with ‘Switch recording mode' in the Record menu. If you check ‘View record toolbar' in the Record menu, Record mode buttons will be added to the Timeline Editor interface (Image 6.16). You can see that buttons for recording and playback have been added for each joint in the Actuators window of the Timeline Editor (Image 6.16). Image 6.16 shows the Record mode buttons in the upper right corner: (from the left) the switch recording mode button, start button, and settings button.
As shown in Image 6.17, a setup screen opens if you click the ‘settings' button in Record mode. ‘Mode' selects how NAO's actual joint information is saved in Choregraphe, and ‘Advanced' sets when it is saved. ‘Periodic' (in ‘Mode') saves NAO's actual movements to the frame at set intervals, while ‘Interactive using bumpers' uses NAO's bumper sensors: after physically positioning NAO, the user saves the pose to the Choregraphe frame by pressing NAO's left bumper, and the right bumper toggles the joint locks. ‘Time step' (in ‘Advanced') is the interval at which NAO's movements are saved, and ‘Allow Timeline extension' sets whether the Timeline's end frame can be extended.
We have explored NAOqi API, Timeline’s Timeline Editor, Keyframe, and Record mode.
The following sections will show examples of how to create exercise movements using Timeline Editor
and Record mode, finding paths using landmarks, and memorizing the multiplication table using Python
and NAOqi API.
It is important and useful to interpolate joint values between two frames where NAO's joint values have already been set. Each of the Curve control buttons introduced in Graph 6.2 refers to an interpolation method. Linear interpolation varies the joint value at a constant rate between two frames. The interpolation method chosen for NAO should suit the movement: Choregraphe uses Automatic Bezier interpolation by default, which favors smoothness of motion. With the Curve control buttons you can easily implement wide, sudden joint motions as well as slow rotations.
Record mode saves NAO's actual movements. It is not easy for the user to pick frames in the Timeline window and enter joint values into the appropriate frame by hand, so Choregraphe provides the ability to record: Record mode saves NAO's movements into frames at specific time intervals, or on signals provided by the user. This feature helps the user program the intended movement more easily.
In this section, the Timeline Editor will be used to record NAO's actual movements, adjust joint movements, and control the shape of the movement between frames using the Curve control buttons. This exercise is implemented in ch6.2.timeline editor.crg.
image 6.18 - Setting the Record mode (left) and Record Start message (right)
Set the Record mode to ‘Interactive using bumper,’ the time to 0.5 seconds, and activate the ‘Allow Timeline
extension.’ (Left side of image 6.18) The message on the right side of image 6.18 will be shown when you press
the start button in Record mode. The movement will start saving once you press the OK button.
a - In order to save NAO's actual movements, you must select the joints that will be used (Image 6.19). This exercise uses the head and arm joints, so click the Record button for both Head and Arms; the Record buttons will turn red. If a joint is not selected, it will neither lock nor unlock even if NAO's right bumper is pressed.
b - Click the ‘Stand' position in the Pose Library to set the initial pose, and then press NAO's left bumper. When the left bumper is pressed, a red line appears at Frame 7 as shown in Image 6.19. This is the frame where NAO's movement is saved; it is determined by the Timeline's FPS value and the Record mode's ‘time step' value. In this exercise, the default FPS (15) and a ‘time step' of 0.5 seconds were used, so FPS (15) × time step (0.5 s) = 7.5 frames between saved movements.
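The frame arithmetic above can be checked with a short calculation. This is a sketch, assuming the recorder rounds each 7.5-frame interval to the nearest whole frame and that the first pose lands on Frame 7, as in this exercise:

```python
FPS = 15          # Timeline frames per second (default)
TIME_STEP = 0.5   # Record mode 'time step' in seconds

interval = FPS * TIME_STEP          # 7.5 frames between saved poses
first = 7                           # frame of the first saved pose
frames = [int(first + k * interval + 0.5) for k in range(6)]
print(frames)                       # [7, 15, 22, 30, 37, 45]
```

Under these assumptions the six saved poses span Frames 7 through 45, matching the recording described in this exercise.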
C - After setting the initial pose, press the right bumper to unlock the joints. After setting the unlocked joints into a desired position, press the right bumper again to lock them. Once the pose is set, press the left bumper to save it to the Choregraphe frame. Image 6.21 shows the screen after NAO's arms were stretched forward and the pose was saved; a red line is shown at Frame 15 in the Timeline. Poses are saved every 7.5 frames, but frames are numbered with integers, so the pose in step b landed on Frame 7 while this one landed on Frame 15.
e - If all movements have been saved, click the Stop button in Record mode to stop the recording. When the recording stops, the curves of the movements saved in the frames appear on the screen as shown in Image 6.23. You can see the frame with the initial pose we previously selected plus the 5 frames from Image 6.22. The movement starts at Frame 7 and ends at Frame 45.
a - Select the desired joints to observe in Actuators in the Timeline Editor window. Image 6.24 shows the joint curves for both the left and right arms. If all of NAO's joints are used, 25 curves are shown; with this feature, only the desired joints are displayed.
b - When the cursor is placed on a point in a frame where movement has been defined, the frame number and joint rotation information come up, as shown in Image 6.25. Image 6.25 shows the rotation information of the LShoulderPitch and RShoulderPitch joints.
C - If you would like to set two joints to the same value, drag the point (of the corresponding joint) to move it. For more precise control, the Expansion button in the Timeline Editor or the mouse wheel can be used to zoom in (left side of Image 6.26), and the curve key edit button (the pencil-shaped button) makes fine adjustment easier (right side of Image 6.26).
D - In this exercise, the ShoulderPitch, ShoulderRoll, ElbowYaw, and ElbowRoll joints are used. However, other joints may move in the process of locking and unlocking NAO's actual joints. The previously introduced ‘Simplify' feature can remove this kind of stray joint movement.
Image 6.27 shows the HeadYaw and HeadPitch joint information. The head joint itself was not moved intentionally, but it shifted because of the external disturbance from handling NAO. To remove this slight joint movement, select the area to clean up ((1) in Image 6.28) and then click the ‘Simplify' button.
If the movement variation in the selected area is below the error margin, the movement is removed ((2) in Image 6.28). The movement is removed across the whole selection, but the joint information of the starting and ending points is preserved ((3) in Image 6.28).
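The idea behind ‘Simplify' can be sketched as a small function. This is only an illustration of the behavior described above, not Choregraphe's actual algorithm: within a selected frame range, if a joint's keyframe values vary less than an error margin, the interior keyframes are dropped and the endpoints kept.

```python
def simplify(keyframes, start, end, eps=0.01):
    # keyframes: list of (frame, angle) pairs for one joint.
    # If the joint barely moves inside [start, end], remove the
    # interior keyframes there, preserving the endpoints.
    inside = [(f, a) for f, a in keyframes if start <= f <= end]
    if len(inside) > 2:
        angles = [a for _, a in inside]
        if max(angles) - min(angles) < eps:
            keep = {inside[0][0], inside[-1][0]}
            return [(f, a) for f, a in keyframes
                    if not (start <= f <= end) or f in keep]
    return keyframes

# Stray HeadYaw wobble across four frames, all within 0.01 rad:
print(simplify([(7, 0.000), (15, 0.004), (22, 0.006), (30, 0.001)], 7, 30))
# -> [(7, 0.0), (30, 0.001)]
```

The endpoints at Frames 7 and 30 survive, while the interior wobble is removed, mirroring what Image 6.28 shows.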
As previously introduced, joints are set to move using Automatic Bezier by default. In this section, we will take the adjusted movement from 6.2.2 and observe how NAO moves as the joint interpolation is modified.
a - Image 6.30 shows NAO's movements with Automatic Bezier. The orange frames have previously defined movements, so they will not change even if the interpolation changes. The blue frames hold the interpolated movement between two defined frames; NAO's movement changes if the shape of the curve changes.
b - To change the joint movement information, select the points on the joint curve as shown in (1) in Image 6.31. You can select points by dragging the mouse, or by holding the ‘Ctrl' key and clicking the left mouse button.
After selecting the points, click the Linear button ((2) in Image 6.31); the movement curve changes into straight lines, as shown in (3) in Image 6.31. Image 6.32 shows the movement in Linear form; you can see that it is not very different from the Automatic Bezier form.
image 6.33 - Movement from Automatic Bezier formation and Constant formation
Image 6.34 shows the Constant movement, and you can see that it differs from the movements using the two other forms mentioned above. A Constant movement moves the joint at the frame where it was defined, so it is easy to see that NAO will not move until Frame 14. However, NAO is still in its initial pose even at Frame 15. This happens because it takes time to deliver the actual commands; in effect, NAO completes each movement in the frame right after the one where it was defined.
Another important point is that the movement at Frame 45, the final frame, behaves differently: the final frame no longer needs to deliver a command, so Choregraphe sends no further signals to the robot.
You must be careful when executing excessive rotation using the Constant form. If you try to bend the arm
backward too quickly while using Constant form, NAO will stumble backward.
Processing image data within the robot can be very useful for a lot of diverse operations. For example,
the robot can be used to identify objects that appear in an image or calculate the location of the object
in question; this is something that cannot be done using infrared and laser sensors or microphones.
- Despite this advantage, processing image data is an extremely difficult task, so it is not used very often in the early stages. You have to know a great deal about the characteristics and patterns of the image data, and you must configure the detection through mathematical analysis.
- This problem can be solved easily by using the default image recognition modules. NAO provides modules with algorithms that detect marks, faces, and images of objects selected by the user. Chapter 2 showed how to use the Choregraphe boxes built from these modules; this section uses the ‘Vision Reco' box to control NAO's walking.
Image 6.35 - Program flow: Image Acquisition → Landmark Detection → Verification against the Database → Determine Behavior
Image 6.35 shows the configuration of the program we will create in this chapter. NAO's bottom camera is used to gather continuous images of the floor. If a gathered image matches a landmark image recorded in NAO's database, NAO executes the program that corresponds to that landmark. Here, the landmarks are programmed to mean go, stop, turn left, and turn right.
The landmarks used in this section are shown in Image 6.36; one landmark each is assigned to go, stop, turn left, and turn right. A landmark covers the whole area acquired by the camera, so you must make sure the landmarks are clearly distinguishable in the acquired images. The landmarks above are saved in Landmark.pdf. The image learning process was covered in Exercise 2.8, so detailed explanation is omitted here; the settings for each landmark image are shown in Image 6.37. After the landmarks have been learned, use Monitor to verify them. Image 6.38 shows the result of verifying the landmark images with Monitor.
6.3.2 Programming
Image 6.39 shows the initial program setup. Activate NAO's bottom camera with the ‘Select Camera' box, and set the speech volume with the ‘Set Volume' box. Use the ‘Wait for Signals' box to wait until both the camera and volume setup are finished. When setup is complete, use the ‘Stand Up' box to make NAO stand and the ‘Say' box to announce the start of the program. The ‘Init' box is provided by the Pose library; it is the default pose of the ‘Demo Omni' box.
If you double-click the ‘Init' box, you will see that the default pose is set in Frame 20. Set the HeadPitch joint to 2.0 in Frame 20; by bowing NAO's head, you let the bottom camera look at the area near the feet.
Image 6.41 shows a program that uses the ‘Vision Reco' box to control walking. As shown earlier, the ‘Vision Reco' box outputs a string array, so the ‘Dispatcher' box list is added as a string array.
The ‘Demo Omni' box is used for moving straight, turning left, and turning right; Graph 6.3 shows its parameters. If another landmark is detected while moving straight with the ‘Demo Omni' box, the ‘go straight' box must be terminated: use a ‘stop' box that sets X, Y, Theta, and Step Frequency all to 0. If the link from the ‘Demo Omni' box to the ‘stop' box is not configured, the program will not function properly.
One thing you must keep in mind when programming with Choregraphe is the concept of a variable. The default Number Edit and Text Edit boxes are used as constants, meaning their values do not change once the program starts. To overcome this limitation and hold a variable within a box script, Python must be used.
This section implements the multiplication program, with Choregraphe programming that is somewhat more complicated than the NAOqi API examples. Image 6.43 shows the multiplication program configuration. You use the head's contact sensors to increase and decrease the singular number and then to execute the multiplication. To increase/decrease the singular number, a variable must be used; as previously mentioned, Python must be used to implement the variable. Python is also used to implement the box that reads the numbers out loud. The multiplication program is implemented in ch6.4timetable.crg.
A - Create the multiplicand box, and set the input/output parameters as shown below.
Input: ‘Up', type “Bang”, nature onEvent
Image 6.44 - Setup window for the multiplicand box and the Parameters screen
B - Image 6.44 shows the window where you can set the multiplicand box and parameters.
Adjust the script to implement the box functions as shown below:
During the initialization of the box (Lines 2-6), the current singular number (self.x) is set to 2, the minimum singular number (self.minnumber) to 1, and the maximum singular number (self.maxnumber) to 9.
When the box loads (Lines 7-13), the parameters for the default operand, max number, and min number are read to change the multiplication settings. If, by user mistake, the minimum singular number is greater than the maximum singular number, the minimum is automatically set to 1 and the maximum to 9.
If a signal enters the box's ‘Up' input (onInput_Up), the current singular number (self.x) is increased by 1; if the increased value exceeds the maximum singular number, the current singular number is set to the minimum (self.minnumber). If a signal enters the ‘Down' input (onInput_Down), the current singular number is decreased by 1; if it falls below the minimum singular number, it is set, as in ‘Up', to the maximum (self.maxnumber).
If a signal enters the ‘Trigger' input, the current singular number (self.x) is sent to the ‘Num' output.
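The behavior just described can be sketched outside Choregraphe as a plain Python class. This is an illustrative stand-in, not the book's actual box script: the GeneratedClass base, the parameter loading, and the ‘Num' output are replaced with ordinary attributes and a return value, so the names here are assumptions.

```python
class MultiplicandSketch(object):
    def __init__(self, min_number=1, max_number=9):
        # Guard against a user mistake: a minimum greater than
        # the maximum falls back to the defaults 1 and 9.
        if min_number > max_number:
            min_number, max_number = 1, 9
        self.minnumber = min_number
        self.maxnumber = max_number
        self.x = 2          # current singular number

    def onInput_Up(self):
        # Increase by 1, wrapping past the maximum to the minimum.
        self.x += 1
        if self.x > self.maxnumber:
            self.x = self.minnumber

    def onInput_Down(self):
        # Decrease by 1, wrapping past the minimum to the maximum.
        self.x -= 1
        if self.x < self.minnumber:
            self.x = self.maxnumber

    def onInput_Trigger(self):
        # In the real box this would fire the 'Num' output instead.
        return self.x

box = MultiplicandSketch()
box.onInput_Up()                 # 2 -> 3
box.onInput_Down()               # 3 -> 2
box.onInput_Down()               # 2 -> 1
box.onInput_Down()               # wraps: 1 -> 9
print(box.onInput_Trigger())     # 9
```

The wrap-around logic is the part that a constant Number Edit box cannot express, which is why the text insists on a Python script here.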
The ‘Multiply' box used in this section has two number inputs and outputs the multiplication result of the two numbers (Image 6.45). Adjust the current ‘Multiply' box to have three outputs, named answer, num1, and num2, in that order. The adjusted ‘Multiply' box script is as follows:
1. class MyClass(GeneratedClass):
2. def __init__(self):
3. GeneratedClass.__init__(self)
4. self.rMultiplier = 2.0
5. self.bMultiplicand = False
6. self.bMultiplier = False
7. self.rMultiplicand = 2.0
8. def onUnload(self):
9. pass
10. def onInput_Multiplicand(self, rVal):
11. self.rMultiplicand = float(rVal)
12. self.bMultiplicand = True
13. if self.bMultiplicand and self.bMultiplier:
14. self.process()
15. def onInput_Multiplier(self, rVal):
16. self.rMultiplier = float(rVal)
17. self.bMultiplier = True
18. if self.bMultiplicand and self.bMultiplier:
19. self.process()
20. def process(self):
21. rRes = self.rMultiplicand * self.rMultiplier
22. X = self.rMultiplicand
23. Y = self.rMultiplier
24. self.bMultiplicand = False
25. self.bMultiplier = False
26. self.num1(int(X))
27. self.num2(int(Y))
28. self.answer(int(rRes))
Afterward, Line 13 checks whether the multiplier input has also succeeded: if self.bMultiplier is ‘True', the process method is called to execute the multiplication.
The onInput_Multiplier method in Lines 15-19 performs the same function for the multiplier as the onInput_Multiplicand method does for the multiplicand: once both multiplier and multiplicand have been input, the process method is called.
The process method in Lines 20-28 executes the multiplication (Lines 21-23) and resets the multiplier and multiplicand input flags to ‘False' (Lines 24-25).
Lines 26-28 output the multiplication result. Conversion operators (int()) are included in the parameters of the output methods; without this conversion the outputs would be floats, and NAO would later read them as decimal numbers rather than whole numbers.
B - Image 6.46 is the box interface. The script for reading the three numbers is as follows:
1. class MyClass(GeneratedClass):
2. def __init__(self):
3. GeneratedClass.__init__(self)
4. self.tts = ALProxy("ALTextToSpeech")
5. self.ttsStop = ALProxy("ALTextToSpeech", True)
6. self.strnum1 = "2"
7. self.strnum2 = "1"
Lines 21-39 are the code that makes NAO speak; ‘sentence' accumulates the strings to be spoken and the embedded commands. A detailed explanation of the commands used in a sentence is in the reference (Advanced/Audio system).
Lines 24-33 are where the actual spoken text is assembled; sentence += " /Pau=200 /" is the command that creates a 200 ms pause. The ‘Speaking' box speaks in the following order: the multiplicand, “times,” the multiplier, “is equal to,” and the result of the multiplication.
The onInput_answer, onInput_multiplicand, and onInput_multiplier methods each convert the incoming number to a string.
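How the ‘Speaking' box assembles its sentence can be sketched as plain string building. The pause command is copied verbatim from the book's script; the function name and exact spacing here are illustrative assumptions, not the actual box code.

```python
PAUSE = " /Pau=200 /"   # 200 ms pause command, as used in the script above

def build_sentence(multiplicand, multiplier, answer):
    # Speak: multiplicand, "times", multiplier, "is equal to", answer,
    # with a short pause between each part.
    sentence = ""
    sentence += str(multiplicand) + PAUSE
    sentence += "times" + PAUSE
    sentence += str(multiplier) + PAUSE
    sentence += "is equal to" + PAUSE
    sentence += str(answer)
    return sentence

print(build_sentence(2, 3, 6))
# On the robot, the result would be passed to self.tts.say(sentence).
```

Keeping the pauses in the string (rather than making separate say() calls) lets one call speak the whole line with a natural rhythm.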
A - Image 6.47 is where the multiplication table is calculated. Manipulate the singular number by connecting the ‘Tactil Touch' box to the ‘multiplicand' box. Set the loop max to 9 for the ‘Loop_multiplier' box to execute the multiplication. Connect the ‘multiplicand' box and the ‘Loop_multiplier' box to the inputs of the ‘Multiply' box, and connect the outputs of the ‘Multiply' box to the inputs of the ‘Speaking' box. Using the ‘Dispatcher' box, when the multiplier reaches 9, initialize and then stop the ‘Loop_Multiplier' box.
B - Image 6.48 essentially added a program control feature to Image 6.47. You can change the multiplier
of the multiplication table by connecting the end of the ‘Speaking’ box to both the trigger input
of the ‘multiplicand’ box and the start input of the ‘Loop_multiplier’ box. Two ‘Wait’ boxes were used
for the program that controls the multiplication table.
First, the ‘Wait' box between the ‘Tactil Touch' box, the ‘multiplicand' box, and the ‘Loop_Multiplier' box delays the starting point of the multiplication table; it gives the ‘Loop_Multiplier' box time to initialize. If, as shown in Image 6.49, the output of the center sensor (of the ‘Tactil Touch' box) were connected directly to both the start and initialization inputs of the ‘Loop_Multiplier' box, the index initialization could occur after the iteration has already begun. This is why the start signal is delivered through the ‘Wait' box, after the initialization signal.
Image 6.49 - Connection between the Tactil Touch box and Loop_Multiplier
The ‘Wait’ box connected to the start input (of the ‘Speaking’ box) was used for the same reason. If the end
output (of the ‘Multiply’ box) and the start input (of the ‘Speaking box’) are directly connected, the ‘Speaking’
box may start before the input of the values needed for the multiplication table is finished.
C - Use the ‘Set Volume' box or ‘Say' box to implement the initial setup for the multiplication program (Image 6.50).
In this exercise, Python and the NAOqi API are used to implement functions that are not provided by the default Choregraphe boxes. It is easier and more convenient to implement the desired programs when text-based programming (Python) and graphics-based programming are used together.
Continuously combining NAO's perception and movement can be applied in diverse areas. For example, NAO can detect a ball getting closer during robot soccer and execute actions like kicking and stopping the ball; it can also detect a specific object, then grab it and bring it back. The most basic combination of detection and movement, detecting an object through an image and grabbing it, is implemented in this section.
In Section 6.3 we looked at how NAO can navigate its way using landmark recognition. Landmarks may be
used this way to provide a type of guideline for NAO. In this section, we will look at how landmarks may be
used to detect objects.
The left- and right-hand grabbing motions each use their own program, calculated from inverse kinematics analysis.
1. import os
2. import sys
3. import time
4.
5. from naoqi import ALProxy
6.
7. IP = "192.168.123.145"
8. PORT = 9559
9.
10. # Set up the connection to the ALLandMarkDetection module
11. landMarkProxy = ALProxy("ALLandMarkDetection", IP, PORT)
12.
13. # Subscribe to the ALLandMarkDetection proxy for continuous reading
14. # Read every 500 ms
15. period = 500
16. landMarkProxy.subscribe("Test_LandMark", period, 0.0)
17.
18. # The ALLandMarkDetection module saves its output to ALMemory
19. memValue = "LandmarkDetected"
20.
21. # Set up the proxy for ALMemory
22. memoryProxy = ALProxy("ALMemory", IP, PORT)
23.
24. # After waiting 0.5 seconds, get the data corresponding to memValue
25. time.sleep(0.5)
26. val = memoryProxy.getData(memValue)
27.
28. # Verify whether the right value has been received
29. if (val and isinstance(val, list) and len(val) >= 2):
30.
31. # The first value is the timestamp
32. timeStamp = val[0]
33.
34. # markInfoArray[0][1] holds the ID of the detected landmark
35. markInfoArray = val[1]
36. print markInfoArray[0][1]
37.
38. else:
39. print "No landmark detected"
40.
41.
First, set up the proxy (Lines 1-11) to connect to the ‘ALLandMarkDetection' module. Since the ‘ALLandMarkDetection' module continuously produces values, the ‘subscribe' method is used to read it every 500 ms (Line 16). The data is saved in ALMemory when the module runs normally, so a proxy to ALMemory is also created, as shown in Line 22. To verify that a value has been delivered properly, check that it is a list and that its length is at least 2. Then read the [0][1] index of markInfoArray to see which landmark was detected, following the process shown above.
This is how landmark recognition is performed from Python. The purpose of the code explained here is to verify that landmark recognition is working properly; afterward, the lower section will be modified and combined with the part that actually grabs the object.
1. import math
2. import time
3.
4. import naoqi
5. from naoqi import ALProxy
6.
7. IP = "192.168.123.145"
8. PORT = 9559
9.
10. proxy = ALProxy("ALMotion", IP, PORT)
11.
12. MaxSpeed = 0.2
13.
14. names = 'Body'
15. stiffness = 1.0
16. proxy.stiffnessInterpolation(names, stiffness, 1.0)
17.
18. # Bow the head slightly
19. proxy.setAngles("HeadPitch", 1.57/5, 0.2)
20. # Initial rotation value of the wrists for grabbing
21. proxy.setAngles("LWristYaw", -0.5, 0.2)
22. proxy.setAngles("RWristYaw", 0.5, 0.2)
23.
24. # Open the hand to grab the object
25. proxy.setAngles("RHand", 0, 0.2)
The code is divided into two main sections: Lines 30-79 grab the object, and Lines 80-121 lift the object and put it down. The coordinates in both sections were calculated from several previous attempts, and by performing these movements in sequence we enable NAO to carry out the grabbing motion.
Furthermore, Lines 71-75 substitute some of the angle values in order to get close enough to the object: some positions cannot be reached because of angle limits in the inverse kinematics analysis, and the substitution resolves this issue.
The code shown above combines two behaviors: landmark recognition and the grabbing motion of an arm. All the libraries of the two earlier programs are used, and several proxies are created as required by each part.
To execute a different movement for each landmark, the camera is used to gather the landmark ID (Line 44), and the movement is then chosen by an if-conditional.
Lines 55-66 set the names of the joints to be moved: if the landmark ID is 109, the left arm executes the movement, and if the landmark ID is 114, the right arm does.
Lines 100-102 adjust the rotation angle of the left arm according to the direction of joint rotation. Lines 107-110 let each arm move according to the conditional statement, and Line 117 does the same for each hand.
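The landmark-to-arm dispatch described above reduces to a small if-conditional. The sketch below uses the landmark IDs 109 and 114 from the text; the function name and the joint list are illustrative assumptions, not the book's actual code.

```python
def joints_for_landmark(mark_id):
    # Landmark 109 -> left arm, landmark 114 -> right arm,
    # any other ID -> no movement.
    if mark_id == 109:
        side = "L"
    elif mark_id == 114:
        side = "R"
    else:
        return []
    # NAO arm joint names follow the L/R prefix convention.
    return [side + name for name in
            ("ShoulderPitch", "ShoulderRoll", "ElbowYaw",
             "ElbowRoll", "WristYaw", "Hand")]

print(joints_for_landmark(109))   # left-arm joint names
print(joints_for_landmark(999))   # [] (unknown landmark)
```

On the robot, the returned names would be passed with target angles to ALMotion, so the same movement code serves both arms.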
Images 6.53 and 6.54 show NAO's movements executed using the code we created in Section 6.3. In both cases, the movements proceed in the same order. Of the landmarks shown at the end of each image, 109 is the landmark that moves the left arm and 114 is the landmark that moves the right arm. Starting from NAO's default standing position, NAO lifts and moves the arm according to the calculated angles, grabs the box, lifts the arm to pick it up, and then puts the box down again.
www.aldebaran-robotics.com
AMERICAS - americas@aldebaran-robotics.com
EUROPE MIDDLE EAST AFRICA - emea@aldebaran-robotics.com
ASIA PACIFIC - asia-pacific@aldebaran-robotics.com
ALDEBARAN Robotics, the ALDEBARAN Robotics logo, and NAO are trademarks of
ALDEBARAN Robotics. Other trademarks, trade names and logos used in this document
refer either to the entities claiming the marks and names, or to their products. ALDEBARAN
Robotics disclaims proprietary interest in the marks and names of others. Choregraphe®
& NAO® are registered trademarks of ALDEBARAN Robotics. The design of NAO®
is the property of ALDEBARAN Robotics. All the photos featured in this document are
noncontractual and are the property of ALDEBARAN Robotics. Aldm400029__p A00