
Behavior-based Intention Inference

for Intelligent Robots Cooperating with Human

Yasuhiro Inagaki, Hirosi Sugie, Hideyuki Aisu, Shuitiro Ono, Tatsuo Unemi

Laboratory for International Fuzzy Engineering Research


Siber Hegner Building 4 FL., 89-1 Yamashita-cho, Naka-ku, Yokohama 231, JAPAN

Abstract - A method for intention inference from the human's behavior is proposed for intelligent robots that carry out a simple task cooperatively with a human, without complex communication. The system is composed of three levels: perception, recognition and intention inference. At the perception level, robots get the physical information about the environment, including the human, by processing data from sensors. The system translates it into qualitative expressions that contain vague time scales, e.g., now, for a while, for all this while, by using fuzzy logic in the recognition level. Finally, the intention inference level has groups of fuzzy rules using the qualitative expressions for the specific cooperative task to infer the human's intention, and to reject the task that is reasoned in the intention inference level by fuzzy rules matching the qualitative expressions to the specific situations. Intelligent robots recognize the meaning of the human's behavior by the system, as a human does, and the system will be of use in constructing human-robot cooperative working systems for industry. We set up a simple human-robot cooperative task and ascertain the functions of the system for intention inference in that case.

I. INTRODUCTION

Recently, personal computers and other kinds of intelligent machines controlled by microcomputers have become useful in the general home. In the future, intelligent robots will be familiar at home, in hospitals and in offices. Since we pay attention to the relationship between humans and intelligent robots, we try to construct a human-robot symbiotic system. The merit of such a system is that the human works easily and cooperatively with robots without special consideration. Research on intention inference from the human's behavior is necessary to construct the human-robot symbiotic system, and it has scarcely been developed until now.

The human-robot collaboration system is very complex and difficult because of many problems: the total stability of the system, the communication between robots and humans, and the unknown characteristics of humans. Robotics technologies have been developed to solve them, for example teleoperation [1], human-robot interfaces [2][3] and behavior-based approaches [4][5]. In those technologies, the human's behaviors are used as just physical motions.

Now, it is often the case that humans can do a simple task cooperatively, without complex communication, for example carrying a load, cleaning a floor and so on. During cooperative work a human can infer his partner's intention not only from language but also from behavior. In this case, some of the human's behaviors have peculiar meanings, and we usually recognize them unconsciously. Intentions inferred from the human's behavior are vague and not necessarily correct, but they are one of the primary factors for intelligent robots to cooperate smoothly with a human. As a human usually infers the intention subjectively, the intention can't be expressed completely by using probability or statistics.

On the basis of this idea, we propose a method of behavior-based intention inference for intelligent robots using fuzzy logic [6]. By symbiotic in this paper, we mean that:

* The human and the robots have a common goal and work cooperatively.

The relationships between the human and the robots in this case are as follows:

* The robots assist the human, who has a task that is difficult to achieve alone.
* The robots are not quite slaves of the human, because they move autonomously.
* The human is sometimes a supervisor, and sometimes he works as the other robots do.

We set up an example cooperative task, the re-formation of desks by a human and a robot, for experimentation to ascertain the functions of the intention inference system. The human controls a robot by a wireless controller to simplify the experimental system, and the radio-controlled robot simulates the human's behavior. We denote this robot the "h-robot" to differentiate it from the other robot. The h-robot and the robot know the final form of the desks, but the plan for moving the desks is not fixed in the example task. Therefore the robot works cooperatively with the h-robot, using the intention inferred from the h-robot's behavior. In the experimentation, we make the following assumptions:



* Both the h-robot and the robot have a common purpose and work together.
* Communication between the h-robot and the robot is only a nonverbal one, to model the work on how humans do it.
* The motion of the robot is not optimum but is feasible in the sense of planning.

In the sequel, first we introduce the outline of the intention inference system composed of three levels: perception, recognition and intention inference. Second, we explain the details of each level corresponding to the example experimentation. Finally, we carry out the computer simulation and the experimentation to illustrate the effectiveness of the proposed system for human-robot cooperation.

II. BEHAVIOR-BASED INTENTION INFERENCE

We try to construct the behavior-based intention inference system by imitating, with fuzzy rules, what a human does in his mind, because it is impossible to model complete human behavior and human consideration. Figure 1 shows the whole structure of the system for intention inference from human's behavior. It works in the cases of specific simple cooperative tasks. The system is composed of three levels: perception, recognition and intention inference.

The perception level is a kind of sensor fusion system. The system processes the sensing data first, and gets the physical information about the environment, including the human. These are crisp data at a moment: the discernment of each object, the locations of the objects and the motion of the human. Because they have no ambiguities other than sensing error, we use no fuzziness in the perception level.

The recognition level is a translator, by fuzzy logic, from the crisp data processed in the perception level to qualitative expressions that contain vague time scales. Two types of qualitative expression are used in the recognition level. One is translated by the extraction of characteristics picked out by the fuzzy rules using the relative location of each object. The other is the human's behavior at a time or for a fixed time, reasoned by the fuzzy rules using not only the human's motions but also the relative location between the human and any object. They are represented by membership functions, and the degree of matching to each qualitative expression is calculated in this level. In other words, the qualitative expressions in the recognition level represent the broad situation of the environment, as a human perceives it.

The intention inference level has groups of fuzzy rules using qualitative expressions to infer the human's intention for the simple specific cooperative task. We classify the human's intention under 5W and 1H: when, where, who, what, why and how. If the human's task is very simple and its goal is clear, the task is reasoned by fuzzy rules matching the qualitative expressions to the specific situations in the intention inference level. The degree of rejection from the task is calculated by fuzzy rules in each group to avoid a deadlock. The rules for each task are fixed and are given by a human beforehand. In the present circumstances, the system copes with a new task by the addition of a new group of rules and new qualitative expressions for it; this is one of the main problems of the system to be solved.

Fig. 1 Behavior-based intention inference system (sensor inputs: discernment of each object, location of each object, motion of human; reasoning of human's behaviors)
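To make the three-level decomposition concrete, the following Python sketch shows one possible skeleton for the data flow between the levels. It is an illustration only, not the authors' implementation, and every name in it (Percept, recognition, intention_inference) is invented here.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Percept:
    """Crisp output of the perception level at one sensing cycle."""
    objects: List[str]                          # discernment of each object
    positions: Dict[str, Tuple[float, float]]   # location of each object
    human_moving: bool                          # motion of the human (h-robot)

def recognition(percept: Percept) -> Dict[str, float]:
    """Translate crisp data into degrees of qualitative expressions,
    e.g. {"h-robot is waiting for help": 0.7}, each in [0, 1]."""
    degrees: Dict[str, float] = {}
    # ...fuzzy rules over relative locations and simple behaviors...
    return degrees

def intention_inference(degrees: Dict[str, float]) -> Dict[str, str]:
    """Apply a task-specific group of fuzzy rules to fill the
    what/where/how/why slots of the inferred intention."""
    return {"what": "unknown", "where": "unknown",
            "how": "unknown", "why": "unknown"}
```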

III. INTENTION INFERENCE SYSTEM FOR THE EXPERIMENTATION

From here, we describe the particulars of each level of the intention inference system corresponding to the example experimentation.

A. Perception

We use only the image processing system as the sensor in the perception level for the example experimentation, as shown in figure 2. Each object, including the h-robot and the robot, is discriminated by its area and girth, and the system tracks them severally. The position and the direction of each object are calculated at about 1 Hz in the perception level. The goal positions of the desks are input beforehand by the human.

Fig. 2 Perception level for the experimentation
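The paper states only that objects are discriminated by area and girth; a minimal nearest-prototype classifier in that spirit might look as follows. The prototype values are invented for illustration and are not taken from the experiment.

```python
import math

# Hypothetical (area, girth) prototypes for each object class.
PROTOTYPES = {
    "desk":    (3600.0, 240.0),
    "h-robot": (900.0, 120.0),
    "robot":   (900.0, 110.0),
}

def discriminate(area: float, girth: float) -> str:
    """Label a tracked blob with the nearest (area, girth) prototype."""
    return min(PROTOTYPES,
               key=lambda k: math.hypot(PROTOTYPES[k][0] - area,
                                        PROTOTYPES[k][1] - girth))

print(discriminate(880.0, 118.0))  # -> h-robot
```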

B. Recognition

The basic idea of the recognition level is how to translate from the crisp data to the qualitative expressions, because a human recognizes the environment broadly and infers the partner's intention by using the recognition of the situation, as shown in figure 3. We classify the types of recognition under relative location and behavior.

Fig. 3 Recognition of environment

B-1. Relative location

The relative location between the h-robot and any object, and between any object and any goal, is very important for inference of the human's intention in the experimentation. We use the fuzzy slots and labels shown in table 1 as the qualitative expressions. They are translated from numerical data by extraction of the characteristics using relative locations, and are expressed by the membership functions shown in figure 4.

Table 1 Fuzzy slots and labels for the experimentation

SLOT      | LABEL
distance  | touch, near, away, far
direction | front, back, right, left, front-right, front-left, back-right, back-left, not-front-far, not-front-away, not-front-near
angle     | parallel, vertical, other
local_x   | width, not-width
local_y   | depth, not-depth

Fig. 4 Membership functions of relative location (panels: angle, local_x, local_y)
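As an illustration of how a slot such as distance could be fuzzified, the sketch below uses trapezoidal membership functions. The breakpoints are assumptions made here for illustration; the actual shapes are given only graphically in figure 4.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), 1 on [b, c], linear between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Degree of each distance label for a value in cm (breakpoints assumed).
DISTANCE_LABELS = {
    "touch": lambda x: trapezoid(x, -1.0, 0.0, 5.0, 15.0),
    "near":  lambda x: trapezoid(x, 5.0, 15.0, 40.0, 60.0),
    "away":  lambda x: trapezoid(x, 40.0, 60.0, 120.0, 160.0),
    "far":   lambda x: trapezoid(x, 120.0, 160.0, 1e9, 2e9),
}

def fuzzify_distance(x_cm: float) -> dict:
    return {label: f(x_cm) for label, f in DISTANCE_LABELS.items()}

print(fuzzify_distance(50.0))
# {'touch': 0.0, 'near': 0.5, 'away': 0.5, 'far': 0.0}
```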

The system derives more complex qualitative expressions by fuzzy rules using the fuzzy labels in the recognition level. For example, figure 5 shows the degree of standing face to the desk for the h-robot, where x_h - y_h is the local frame of reference of the h-robot, and x_d - y_d is the local frame of reference of the desk. When the h-robot is far from the desk, the degree of standing face to the desk is calculated using the direction only. On the other hand, when the h-robot is near or is touching the desk, the degree is calculated by processing the relative position and the relative direction between the h-robot and the desk. The degree of standing face to the desk is derived by the 17 fuzzy rules shown in table 2.

Fig. 5 Degree of standing face to the desk (cases: the distance is far; the distance is near or is touch)

Table 2 Fuzzy rules for standing face to the desk

NO | IF | THEN
1  | (Man.distance is far) and (Man.direction is front) | Man.deg_look is truth
2  | (Man.distance is far) and (Man.direction is not-front-far) | Man.deg_look is fault
3  | (Man.distance is far) and (Man.direction is front-right) | Man.deg_look is unknown
4  | (Man.distance is far) and (Man.direction is front-left) | Man.deg_look is unknown
5  | (Man.distance is away) and (Man.direction is front) | Man.deg_look is truth
6  | (Man.distance is away) and (Man.angle is vertical) and (Man.local_x is width) | Man.deg_look is truth
7  | (Man.distance is away) and (Man.local_x is width) and (Man.direction is not-front-away) | Man.deg_look is fault
8  | (Man.distance is away) and (Man.local_y is height) and (Man.direction is not-front-far) | Man.deg_look is fault
9  | (Man.distance is near) and (Man.direction is front) | Man.deg_look is truth
10 | (Man.distance is near) and (Man.angle is vertical) | Man.deg_look is truth
11 | (Man.distance is near) and (Man.direction is not-front-near) | Man.deg_look is fault
12 | (Man.distance is near) and (Man.local_y is height) and (Man.direction is not-front-far) | Man.deg_look is fault
13 | (Man.distance is touch) and (Man.angle is vertical) and (Man.local_x is width) | Man.deg_look is truth
14 | (Man.distance is touch) and (Man.angle is other) and (Man.local_x is width) | Man.deg_look is unknown
15 | (Man.distance is touch) and (Man.angle is parallel) and (Man.local_y is height) | Man.deg_look is truth
16 | (Man.distance is touch) and ... | Man.deg_look is fault
17 | ... | Man.deg_look is unknown
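A rule table of this form can be evaluated mechanically. The sketch below encodes rules 1, 9, 10 and 11 from table 2 and combines them with min for "and" and max-aggregation per consequent, a common fuzzy-inference choice; the paper does not state which operators it actually uses.

```python
# Degrees of each (slot, label) pair from the recognition level (example values).
slots = {
    ("distance", "far"): 0.2,
    ("distance", "near"): 0.7,
    ("direction", "front"): 0.8,
    ("direction", "not-front-near"): 0.1,
    ("angle", "vertical"): 0.3,
}

# (antecedent conditions, consequent label) - rules 1, 9, 10, 11 of table 2.
RULES = [
    ([("distance", "far"), ("direction", "front")], "truth"),
    ([("distance", "near"), ("direction", "front")], "truth"),
    ([("distance", "near"), ("angle", "vertical")], "truth"),
    ([("distance", "near"), ("direction", "not-front-near")], "fault"),
]

def degree_look(slot_degrees):
    """Aggregate rule firings into degrees of truth / fault / unknown."""
    out = {"truth": 0.0, "fault": 0.0, "unknown": 0.0}
    for antecedent, consequent in RULES:
        firing = min(slot_degrees.get(cond, 0.0) for cond in antecedent)
        out[consequent] = max(out[consequent], firing)
    return out

print(degree_look(slots))  # {'truth': 0.7, 'fault': 0.1, 'unknown': 0.0}
```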

B-2. Behavior

The basic motions of the h-robot are moving and stopping. The system judges whether the h-robot is moving or stopping in each sensing cycle by comparing the new position data of the h-robot with the old data. The system has four memory counters to calculate how long the h-robot has been moving or stopping. Figure 6 shows the image of the memory counters. In figure 6, moving and stopping are the counters for long times; mo-check and st-check are the counters for short times. Moving and mo-check are increased, and st-check is cleared, when the system judges that the h-robot is moving. In the same way, stopping and st-check are increased, and mo-check is cleared, when the h-robot is stopping. Stopping is cleared when mo-check is full, and moving is cleared when st-check is full. For example, we know that the h-robot is stopping now but had moved until just a few seconds ago from the data of the memory counters shown in figure 6.
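The counter logic just described can be written down directly. In the sketch below, the saturation value of the short-term counters and the cap on the long-term counters are assumptions; the paper does not give them.

```python
class MemoryCounters:
    """Four memory counters: moving/stopping record long-term history,
    mo_check/st_check are short-term counters whose saturation clears
    the opposite long-term counter."""

    def __init__(self, check_full: int = 3, limit: int = 100):
        self.moving = self.stopping = 0
        self.mo_check = self.st_check = 0
        self.check_full = check_full   # assumed saturation value
        self.limit = limit             # assumed cap on the long counters

    def update(self, is_moving: bool) -> None:
        if is_moving:
            self.moving = min(self.moving + 1, self.limit)
            self.mo_check = min(self.mo_check + 1, self.check_full)
            self.st_check = 0
            if self.mo_check == self.check_full:
                self.stopping = 0   # stopping is cleared when mo-check is full
        else:
            self.stopping = min(self.stopping + 1, self.limit)
            self.st_check = min(self.st_check + 1, self.check_full)
            self.mo_check = 0
            if self.st_check == self.check_full:
                self.moving = 0     # moving is cleared when st-check is full

counters = MemoryCounters()
for moving in [True] * 10 + [False] * 2:   # moved for a while, just stopped
    counters.update(moving)
print(counters.moving, counters.stopping)  # 10 2: stopping now, moved until just now
```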
We use some vague time scales in the experimentation. They are as yet, now, for a while and all this while; they are calculated by using the memory counters and are expressed by the membership functions shown in figure 7. Fuzzy rules containing not only relative locations but also simple behaviors are used to express more complex behaviors. We use the following fuzzy slots as the qualitative expressions in the experimentation: "The h-robot is standing still." "The h-robot is prowling." "The h-robot is waiting for help." "The h-robot is moving the desk alone." Table 3 shows the fuzzy rules to calculate the degree of waiting for help.
Fig. 7 Membership function of moving

Table 3 Fuzzy rules for waiting for help

B-3. Intention inference

The task in the experimentation, that is, the re-formation of the desks, is reasoned in the intention inference level by matching the h-robot's behavior to the specific situations, as follows: "The h-robot is waiting for help." "The h-robot is moving the desk alone." The human's intention is inferred by the fuzzy rules described for the task in the group of fuzzy rules. On the other hand, fuzzy rules calculating the degrees of rejection from the task are also included in the group of fuzzy rules. The rejection rules work in the case of changing to a new task and in the case of a mistake in reasoning about the task.
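As a sketch of how a rule group might combine its task degree with its rejection degree, assuming a simple comparison for the deadlock-avoidance decision (the group's actual rules are not reproduced in the paper):

```python
def infer_task(degrees: dict):
    """Return (task degree, rejection degree) for the desk re-formation task.
    The rule contents here are illustrative assumptions."""
    task = max(degrees.get("h-robot is waiting for help", 0.0),
               degrees.get("h-robot is moving the desk alone", 0.0))
    # Rejection fires when the observed behavior contradicts the task hypothesis.
    rejection = degrees.get("h-robot is prowling", 0.0)
    return task, rejection

degrees = {"h-robot is waiting for help": 0.8, "h-robot is prowling": 0.2}
task, rejection = infer_task(degrees)
if rejection > task:   # deadlock avoidance: drop the reasoned task
    print("reject the current task and look for a new one")
else:
    print(f"pursue the task with degree {task:.1f}")
```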
The human's intention is classified under 5W and 1H: when, where, who, what, why and how. Necessary and sufficient types of intention for each task are used in the group of fuzzy rules. We use the types of intention shown in table 4 in the experimentation: what, where, how and why. What and where are inferred severally by fuzzy rules using the qualitative expressions derived by the system in the recognition level, and how and why are inferred by fuzzy rules using the other types of intention. Table 5 shows an example of the fuzzy rules for intention inference in the experimentation. Though the results of intention inference are related to the decision of the robot's motions, motion control of the robot is entrusted to other systems in the experimentation.

Table 4 Intention for the experimentation

What  | desk_i, unknown
Where | goal_j, unknown
How   | push forward, turn together, push sideways, move alone, unknown
Why   | the h-robot would like to move the desk_i ... to the goal_j ...

Table 5 Example of fuzzy rules for intention inference

IV. EXPERIMENTATION AND SIMULATION FOR HUMAN-ROBOT COOPERATION

We set up the experimentation and the simulation of the simple cooperative task to check the functions of the system, as shown in figures 8 and 9.

Fig. 8 Experimentation of the example task

1699
In figure 9, we make the following assumptions:

* There are three desks, three goals, a robot and an h-robot in the environment.
* The h-robot and the robot can only push the desk.
* They know the final form of the desks and know the positions and the directions of all objects.
* The robot doesn't know the whole plan of moving each desk to each goal.
* They have only four ways to push the desk, as shown in figure 10.
* Active communication from the h-robot to the robot is only a non-verbal one.

Fig. 9 Simulation of the example task

Fig. 10 Forms of moving the desk: push forward, turn together, push sideways, move alone
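Under these assumptions the simulated world reduces to a small configuration. The sketch below writes it out; the coordinates are invented for illustration.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Tuple

class Push(Enum):
    """The four ways to move a desk (figure 10)."""
    FORWARD = "push forward"
    TURN_TOGETHER = "turn together"
    SIDEWAYS = "push sideways"
    MOVE_ALONE = "move alone"

Pose = Tuple[float, float, float]   # x, y, heading

@dataclass
class World:
    desks: List[Pose]   # three desks, which the agents can only push
    goals: List[Pose]   # the final form, known to both agents
    robot_knows_plan: bool = False  # the robot lacks the whole moving plan

world = World(desks=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)],
              goals=[(0.0, 3.0, 0.0), (1.0, 3.0, 0.0), (2.0, 3.0, 0.0)])
print([p.value for p in Push])
```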
V. CONCLUSIONS

We have considered the fact that robots are symbiotic and can cooperate together in human society. We proposed a system based on the concept that robots can infer the human's intention from behavior, and proposed the structure of the behavior-based intention inference system. We set up the example task, that is, the re-formation of the desks, and ascertained by simulation that the human can do the task from almost all initial locations. The human failed at the task only in the case that more than two desks were touching and there was not enough room for the robot to carry them. We will construct the experimentation system of the example task to ascertain the functions of the system in the real world. Though the experimentation system is just a toy system now, the behavior-based intention inference system can be applied to human-robot production systems to construct a more comfortable industrial environment by improving the sensors' performance in the perception level.

REFERENCES

[1] P. Huynh et al., "Experimental and Simulation Study on Mechanical Man-Machine Interaction of a Tele-manipulation System," in IEEE International Conference on Robotics and Automation, 1992, pp. 17-24.
[2] K. Kosuge, Y. Fujisawa, T. Fukuda, "Interface Design for Man-Machine Interactions," in IEEE International Workshop on Robot and Human Communication, 1992, pp. 143-148.
[3] K. Kosuge, H. Yoshida, T. Fukuda, "Dynamic Control for Robot-Human Collaboration," in IEEE International Workshop on Robot and Human Communication, 1993, pp. 398-401.
[4] R. A. Brooks, "Intelligence without Reason," in IJCAI-91, 1991, pp. 569-595.
[5] R. A. Brooks, "Intelligence without Representation," Artificial Intelligence, vol. 47, pp. 139-159, 1991.
[6] Y. Inagaki, H. Sugie, H. Aisu, T. Unemi, "A study of a method for intention inference from human's behavior," in IEEE International Workshop on Robot and Human Communication, 1993, pp. 142-145.

