
2nd IFAC Symposium on Telematics Applications
Politehnica University, Timisoara, Romania
October 5-8, 2010

An Augmented Reality Supported Control System for Remote Operation and Monitoring of an Industrial Work Cell

Markus Sauer ∗ Florian Leutert ∗∗ Klaus Schilling ∗∗

∗ Zentrum für Telematik e. V., Allesgrundweg 12, 97218 Gerbrunn (e-mail: markus.sauer@telematik-zentrum.de)
∗∗ Department of Computer Science - Robotics and Telematics, University of Würzburg, Am Hubland, 97074 Würzburg, Germany (e-mail: {leutert,schi}@informatik.uni-wuerzburg.de)

Abstract: Telematic control systems have many potential applications in the teleservicing of remote sites: these range from surveillance of regular operations and optimization of existing processes to remote error resolution. In this paper, we describe the general architecture of a telematic control system employed to survey an industrial work cell, and show how Augmented Reality (AR) can support an operator in grasping events and conditions at the remote site more clearly, thus gaining better situational awareness. After describing several specific examples from the AR tool range, potential uses of the system are presented.

Keywords: Telematics; Teleoperation; Robotic Manipulators; Augmented Reality; Remote control.

1. INTRODUCTION

For several decades now, ever more production companies have been employing manipulators in their production process to increase their productivity. However, another trend in production has been the need to provide a more flexible production palette, which requires that the robot programs be changed much more frequently than they used to be. Especially for small and medium enterprises (SME) this proves problematic: they often cannot afford to hire an expert who is solely responsible for robot programming. Consequently, disruptions in the working cycle and especially changes in the manipulators' programs are a risk: if an error occurs that cannot be resolved by the local personnel, production will be on hold until such an expert arrives to resolve the problem.

Here, a telematic application for surveying and controlling the robotic work cell remotely could provide a remedy: an expert operator in a remote control room can connect to the deployed system, get a quick overview of the conditions there and resolve the problem, all without the need to travel to the remote company. Sheridan [1992] gives a basic summary of the problems that occur when humans are included in such remote control scenarios.

Nowadays, many industrial plants with a high level of automation already offer the necessary interfaces for an expert to connect to the system, to monitor and analyze the system behavior and problems, and to adjust the system if required. However, to be able to diagnose the situation accurately, the operator needs to get a good impression and understanding of the remote situation; as he cannot walk around in the robot cell, he needs other tools that help him to assess the problem and find a suitable solution. Thus, although a lot of information and data from the remote scenario is available, it is still a large effort for the expert to integrate all this data into his mental model of the situation, to understand the spatial relations, and to draw the required conclusions in order to fix the problem or solve the task. The high mental workload imposed on the human by the necessary mental transformations between the different information sources (cf. Wickens et al. [2005]) often leads to errors or to longer times needed to understand the problem.

Fig. 1. Telematics control room at the University of Würzburg.

978-3-902661-84-5/10/$20.00 © 2010 IFAC 10.3182/20101005-4-RO-2018.00030

The approach of this work is to use mixed reality (Milgram and Kishino [1994]) or, more specifically, augmented
reality (Azuma et al. [2001]) technologies to realize three-dimensional, integrated graphical user interfaces, which make it possible to overcome these mental transformation problems. This is supported by prior research: Pettersen et al. [2003] showed an augmented reality system for programming an industrial robot arm, and Sauer et al. [2007] showed the increase in navigation performance when augmented reality is used for mobile robot teleoperation.
The telematic control system proposed here, supported by augmented reality, can aid the human in monitoring and controlling a remote industrial robotic scenario. Camera streams that are often already available inside such systems can be augmented with spatially correctly placed visualizations of machine data and sensor information. In the presented work, a framework and the implemented telematic control system at the Department of Robotics and Telematics at the University of Würzburg are introduced, in which a remote operator is enabled to monitor and control an industrial demonstration plant. The user interface provides him with multiple augmented reality views of the remote scene in order to maximize his situational awareness (Endsley [2000]). In addition, a three-dimensional stereo visualization of the scenario is provided on a 42 inch stereo screen, which presents a stereo image of the scene to multiple users and allows a better understanding of the spatial relations at the remote workspace. This telematic control room is set up and operated as a cooperation between the Zentrum für Telematik e.V. and the Department of Robotics and Telematics. Besides the usage for industrial applications it
is also used as a mission control room for satellite missions. Figure 1 shows a photograph of this control room with the wall mounted displays and the stereo display.

In the remainder of this work, first the system architecture of our industrial demonstration setup and the remote control room is explained. Then the different Augmented Reality components realized in the context of this work are introduced, and afterwards different application scenarios which can be implemented with little effort using the presented AR framework are shown. Finally, a short summary of the planned future work is given.

2. SYSTEM ARCHITECTURE

In previous work (Driewer et al. [2007]) a system for distributed three-dimensional models for telematic applications was suggested. In this work the approach is complemented by an augmented reality supported control system that is used to operate and monitor the industrial demonstration plant of the University of Würzburg; this demonstration facility consists of two Kuka KR16 robot arms, a conveyor belt, and an autonomous transport trolley. The described control system is realized in a remote control room equipped with four large-scale displays mounted to the wall, a 42 inch multi-user stereo display, and eight additional workstations for operators.

The control room and the industrial demonstration plant are connected via an IP based network. Inside the demonstration plant, the real-time communication between the different devices (robots, conveyor belt, sensors, ...) and the process control server is realized with the RSI (Remote Sensor Interface) protocol from the Kuka company over a dedicated Ethernet network. The systems exchange data with a cycle time of less than 12 ms. The real-time process control server ensures that all real-time constraints inside the plant are met and provides the interface to the public network, and thereby to the remote control and operation system. Apart from sending status and sensor data from the devices in the work cell, it is also responsible for relaying any commands back to the cell (like the movement commands for the KR16 robots), thus enabling remote control of the workspace. As no closed-loop control is required between the real-time process control server and the remote control room, no specific real-time constraints need to be considered for this external communication link. The process control server transmits the machine and sensor data with a selectable frequency over a multicast connection to any workstation which has joined the established multicast group. The advantage of this concept is that the data is sent only once by the server to the whole multicast group and not consecutively to each single workstation connected to the system. Thus, the system scales much better with more workstations listening to the data, the load on the network and the process control system is minimized, and all monitoring systems receive the data synchronously. The command channel to operate the industrial demonstration plant is realized as a TCP socket connection between the control station in the remote control room and the process control server.

Fig. 2. Overview of System Architecture and Data Flow.

Besides the machine and sensor data, several network cameras in the industrial demonstration plant provide feedback as an augmented reality view or a video-only view to the human in the remote control room. These network

cameras are directly connected to the public network. For the different views on the different screens in the remote control room, a connection to the respective network camera inside the industrial demonstration plant is established on demand, depending on the selected point of view.
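Such an on-demand connection can be kept very simple: many network cameras expose an MJPEG stream over HTTP, from which single JPEG frames can be cut. The following sketch illustrates this; the camera URL is an assumption for illustration, as the paper does not specify the camera models or their protocol, and scanning for the raw JPEG markers is a simplification of full multipart parsing.

```python
import urllib.request

CAMERA_URL = "http://192.168.0.10/mjpeg"  # assumed camera address, illustrative only

JPEG_START = b"\xff\xd8"  # JPEG start-of-image marker
JPEG_END = b"\xff\xd9"    # JPEG end-of-image marker

def extract_jpeg(buffer):
    """Cut the first complete JPEG frame out of a byte buffer.

    Returns (frame, rest_of_buffer); frame is None if no complete
    frame has arrived yet.
    """
    start = buffer.find(JPEG_START)
    if start < 0:
        return None, b""
    end = buffer.find(JPEG_END, start + 2)
    if end < 0:
        return None, buffer[start:]
    return buffer[start:end + 2], buffer[end + 2:]

def grab_frame(url=CAMERA_URL, chunk_size=4096):
    """Open the camera stream on demand and return one JPEG frame."""
    buffer = b""
    with urllib.request.urlopen(url) as stream:
        while True:
            buffer += stream.read(chunk_size)
            frame, buffer = extract_jpeg(buffer)
            if frame is not None:
                return frame
```

Because the connection is opened only when a view is selected and closed afterwards, idle cameras place no load on the public network.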
Figure 2 shows a schematic overview of the system setup
and the data flow.
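The data-distribution concept of this section can be sketched in a few lines of workstation-side code: a socket joins the multicast group once, and every datagram received afterwards is decoded into machine and sensor values. The group address, port and packet layout below are illustrative assumptions, not the values used in the actual plant.

```python
import socket
import struct

MCAST_GROUP = "239.0.0.1"   # assumed multicast group address
MCAST_PORT = 5007           # assumed port

# Illustrative packet layout: a timestamp followed by six axis angles
# (a KR16 arm has six axes), packed as big-endian doubles.
PACKET_FMT = ">d6d"

def pack_sample(timestamp, axes):
    """Encode one sensor sample for multicast transmission."""
    return struct.pack(PACKET_FMT, timestamp, *axes)

def unpack_sample(data):
    """Decode a received datagram back into (timestamp, axes)."""
    values = struct.unpack(PACKET_FMT, data)
    return values[0], list(values[1:])

def open_listener():
    """Join the multicast group; returns a socket ready for recvfrom()."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    # subscribe this socket to the group on the default interface
    mreq = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Round-trip check of the codec (no network needed):
ts, axes = unpack_sample(pack_sample(12.5, [0.0, -90.0, 90.0, 0.0, 45.0, 0.0]))
```

The command channel back to the plant would simply be a `socket.create_connection()` TCP client, matching the asymmetry described above: one-to-many UDP multicast for monitoring data, one-to-one TCP for commands.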

3. AUGMENTED REALITY COMPONENTS

The most prominent difference between Virtual and Augmented Reality (VR/AR) is that AR does not place the user in a completely artificial world, but leaves him essentially in the real world, augmenting it with correctly aligned virtual information (Milgram et al. [1994]). In telematic applications, however, the user is often a great distance away from the place he is working in, so the best way to keep him connected to the real world and give him an impression of his surroundings is to show it to him, using pictures transmitted from a remote camera.

Fig. 3. Illustration of the stereoscopic screen, where the users get a realistic depth impression of the remote scene.
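Correctly aligning virtual information with such camera pictures comes down to projecting points from the work-cell model into the image of a calibrated camera. The following sketch shows the standard pinhole projection this relies on; the camera pose and intrinsic parameters are illustrative assumptions, since the paper does not give the calibration values.

```python
# Minimal pinhole-projection sketch: maps a point from the work-cell
# (world) frame into pixel coordinates of a calibrated camera, so a
# virtual annotation can be drawn at the right spot in the video image.
# Camera pose (R, t) and intrinsics (fx, fy, cx, cy) would come from an
# offline calibration; the values below are purely illustrative.

def mat_vec(R, p):
    """Multiply a 3x3 matrix (list of rows) with a 3-vector."""
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

def project(p_world, R, t, fx, fy, cx, cy):
    """Project a world point into the image; None if behind the camera."""
    p_cam = [a + b for a, b in zip(mat_vec(R, p_world), t)]
    if p_cam[2] <= 0.0:
        return None  # point lies behind the image plane, nothing to draw
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)

# Illustrative camera: identity orientation, 2 m in front of the origin,
# 800 px focal length, principal point at the center of a 640x480 image.
R_IDENT = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
pixel = project([0.5, 0.0, 0.0], R_IDENT, [0.0, 0.0, 2.0], 800, 800, 320, 240)
```

In the system described below, this projection, combined with the depth information of the 3D model for occlusion handling, is what keeps the overlays aligned with the camera picture.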
In our scenario, several webcams are deployed throughout the robotic work cell: some cameras are placed outside the work cell and thus supply a view of it as a human standing outside would perceive it. Another camera is placed on the ceiling, giving a good overview over the complete assembly. Finally, there is a camera mounted just beyond the tip of one manipulator, which provides pictures from the perspective of the robot; this also enables viewing parts of the workspace that would not be visible with fixed cameras only. In the developed application the user can freely switch between individual cameras on every available screen, choosing the combination of views that is most helpful for him to complete the task at hand.

Besides these camera views, the user can also use the stereoscopic display to gain a better understanding of the remote scene. In the current setup at the telematic control room, the virtual model of the industrial work cell is presented to the user, which he can freely rotate and zoom, thus virtually moving around at the remote location. This is especially helpful for fine motion planning; for example, when using the manipulator to set some work piece on the table in the center of the robot cell, the user can zoom in very closely to see the remaining distance between robot and surroundings. The device uses a special technique to give its users a real depth impression of the objects shown on the screen (Figure 3). This improves the operators' spatial comprehension of the remote surroundings; as this is a very critical requirement for controlling robots over a great distance, this screen is exclusively used for this virtual representation.

The camera pictures can be shown unaugmented on several other available monitors, giving the user a good first impression of the situation in the work cell. These non-virtual views can now also be enhanced via AR with different kinds of artificial information to support the human operator in his work.

Still, even though the objects placed in the camera picture are virtual, it is necessary in an AR application that these artificial objects are perceived as realistic: first, they should appear correctly aligned and at appropriate places (not just floating in the picture), and secondly, they should occlude real objects when in the foreground, but also be occluded by real objects when behind them. This is not an easy problem to solve, and it is known as the occlusion problem in AR. As most of the objects in the chosen scenario are known a priori, in this work a model-based solution (cf. Sauer et al. [2010]) is chosen for these difficulties: an exact 3D model of the industrial work cell and all objects in it was created and used in calculating object placement and occlusion. This enables us to achieve a nearly perfect occlusion between all (real and virtual) objects (Figure 4), and also allows us to easily place virtual information where we want it to appear in the camera picture (and thus in the world), just by setting it at the corresponding spot in the developed 3D model.

This model need not be a static one: for example, for virtual information regarding the manipulators to be correct and up-to-date, it is necessary that the 3D models of the robots do not remain static, but follow the movements of the real devices. To achieve this, the axis data of the manipulators is continuously transmitted over the communication network and set in the workspace model, so that the virtual models in our application are always moving and aligned with the real ones in the picture.

With this system, all of the basic necessities for an AR application are fulfilled. Now there are several possibilities for enhancing the view of the user with virtual information; some of the AR tools developed in the context of this work for the industrial work cell scenario will be presented in the following.

A very basic necessity when working with manipulators is to avoid collisions between the robots and the surroundings. So, when analyzing the condition of the remote work cell or moving the robot remotely, it would be helpful for the operator to measure distances between different objects in order to be able to predict and avoid bottlenecks and collisions when moving the robots. Our system provides the possibility to display the distance between any two objects in the work cell (for example the robot tips), or between an object and a plane (like the ground level). This distance information is intuitively color-coded, i.e.
it switches from green over yellow to red with increasing proximity. This virtual information can be placed freely, but can also be attached to an object involved in the measurement. For example, if the distance between the robot TCP (Tool Center Point) and the second robot is displayed, the information can be attached to the robot TCP such that it is always following and floating right beneath the robot tip (Figure 4): the user sees the relevant information right where he needs it, at the object that is at risk of collision.

Fig. 4. Virtual information in the camera picture, with realistic occlusion computing turned off (top) and on (bottom). The picture shows virtual movement of one of the manipulators, with the distance from robot TCP to the table plane below.

Another powerful AR tool enables the user to move not the real robot, but only the corresponding virtual model in the combined view (Figure 4): by that, the operator is able to test positions and also to do path planning in the real work environment without actually having to move the real robot. The model of a manipulator is overlaid onto the camera image, and then moved like the corresponding real device. Completely new robot movement programs and work cycles can be planned beforehand and then be evaluated with the virtual models on the actual factory floor, without having to stop the real robots or risking collisions with a previously untested robot program. Also, by combining this virtual movement with another robot that actually executes its own movement program, one can directly study if the planned movement fits to that of the real robot. In contrast to offline programming, one does not need a complete and up-to-date model of all objects on the real factory floor (which is not always easily available), as the surrounding objects are visible in the camera picture as they are present at the moment. Other conditions of the application area that often are not modeled directly (like people in the surroundings or a mobile robot crossing certain sections of the work cell) can also be checked for here. Bottlenecks and collisions can be made visible without actually risking damage to the hardware, while at the same time knowing that the ambient activities visible are not only simulated, but really happening like that in the real work environment.

When programming a manipulator by manually moving it through the desired waypoints, one can only see where the robot is now; the path it has already taken or will take can only be remembered by the programmer. Using AR, in our system this path can be made visible (Figure 5). Specific points on the robot (e.g. the TCP) can be tracked and their movements plotted in the camera picture as a line of points. This enables the operator to see whole movement trajectories (when using simulated virtual movement, even before they are executed), which can be used to further analyze movement programs for bottlenecks, inefficient trajectories or times at which parts of the workspace are blocked.

Parts of the techniques explained here are used when remotely controlling and moving a manipulator from the control room. In order to allow for a safe movement and avoid collisions, the user first moves the virtual model to the desired location; after he is satisfied with the position, he can then order the real robot to execute that movement, and the control system guides the robot to the target position on the user-defined path. Alternatively, the user can enter coordinates for the robot; then, a virtual overlay shows the trajectory the robot will be taking. Again, if the user is satisfied with that movement, he can issue the command for transmission; the command is sent over the communication channel to the process control server at the remote industrial plant, which relays it to the corresponding robot controller, and the robot starts moving. A system that uses this overlay technique has been tested by Kim [1993]; it effectively eliminates any critical timing conditions in the remote control of the robot, i.e. the delay between issuing a command and its execution by the machine. The whole movement can be (virtually) completed before the robot even begins to move. Thus, there is no delay between starting a movement and seeing the result, which could otherwise result in overshooting and dangerous delayed feedback and compensation loops induced by the operator. Here, the operator can take time to plan the path, and also discard parts of it, should he not be satisfied.

This was only a sample of the tools that are implemented and can be employed using our Augmented Reality supported telematic control system. All kinds of information can be included in the operator's view of the remote scene: from coordinate systems that guide him when moving the manipulator, over information that reminds him of the necessary steps in his task, up to pointers guiding him to the suspected problems in the work environment that triggered an error report. The developed system is fully

open for extensions and can easily be modified to suit the needs of any telematic application.

Fig. 5. Using AR, the covered and future trajectory of the robot arm can be made visible.

4. APPLICATION SCENARIOS

The telematic system presented here can be employed in a variety of ways.

First, the most obvious and simple application scenario is tele-monitoring of a remote industrial site. The system could autonomously monitor one or possibly even multiple working cells; in the case of an error, the system could alert an operator, switch the remote view to the concerned workspace and give a preliminary analysis of the incident. Apart from reporting the event that triggered the alert, possible error sources could also be highlighted with the AR support, focusing the attention of the user on the spot that is most likely the cause of the interruption, together with showing recommended steps to take. The user could then continue this evaluation, taking direct action like changing and configuring the remote software system, resolving a hardware error or reprogramming a device, or, if that is not possible, contacting the personnel at the remote location to inform them of the error and a proposed solution for handling it.

A different advantage of this telematic surveillance is the possibility to have an expert evaluate the currently running working cycle remotely with little effort. This person does not have to travel to the individual work places, but can watch, and possibly enhance, the efficiency of the industrial work cell by changing several parameters of the site, like movement trajectories and timing. An SME that cannot afford to employ its own robot expert could benefit by requesting this service from a remote expert center without having to pay for extra travel expenses, while a larger company scattered across several locations could do this efficiency check from time to time for all of its production facilities from just one place.

Another helpful area of application for the proposed system is to assist the remote company in changing existing robot programs or creating completely new ones. If a product palette has changed and a new product is to be assembled with this system, changes to the robot program are almost inevitable. The local personnel could create a corresponding program and send this modified version to the remote operator, who can safely evaluate it using AR support in the live working environment. He can then confirm that the new program is working as intended, or make suggestions on how to improve it. As AR permits doing this in the live environment, other possible sources of trouble can also be taken into account, like mobile robots for material flow in the workspace or moving parts of the surrounding machinery. For example, the program for one manipulator could be evaluated using the virtual model to see the trajectory while the real robot next to it is running, to check for timing conditions in the whole scenario.

Also, programs for the devices can not only be evaluated, but also created completely in the control room, and then transmitted to the remote location for testing. This way, the presence of a robotic programming expert is not necessary in the remote company, as all the necessary steps (programming, evaluation and iteration) can be done from the control room. Especially for frequent but small changes, this could be an attractive alternative for an SME.

5. CONCLUSION AND FUTURE WORK

With the introduced system, the infrastructure and software basis of an AR-supported telematic control system is laid. The system (and especially the AR tools) is highly adaptable to the needs of individual application scenarios and human operators.

In the future, a concrete set of use cases of the augmented reality supported control system needs to be identified, to find out in more detail about the actual specific demands of remote operation scenarios in industry. For these, user studies will be performed which enable a quantitative assessment of the potential process improvements and economic savings.

This will lead to the design of specific application scenarios, e.g. tele-diagnosis of errors in an industrial work cell, and to user testing to evaluate the efficiency of these specific telematic applications; the user could be confronted with different types of errors in this production facility, which he has to detect and possibly correct. This would allow for a comparison of different supporting AR tools, determining how efficiently they can be deployed in the specific scenario and identifying components that can be improved. Also, an indication could be given of how fast and how reliably errors in this type of scenario can be detected using the developed system.

A continuation of the work shown here could be the adaptation of the system from the chosen scenario of an industrial work cell to other application areas; systems like the operation and control of a group of mobile robots, for example for the flow of materials or other transport tasks in logistics scenarios, could be accessed and a suitable control application could be developed to show the practicability and usability of our control system.


REFERENCES
R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre. Recent advances in augmented reality. IEEE Computer Graphics and Applications, 21(6):34–47, 2001.

F. Driewer, M. Sauer, F. Leutert, and K. Schilling. A flexible framework for distributed three-dimensional models in telematic applications. In IASTED International Conference on Telematics (TA 2007), 2007.

M. R. Endsley. Theoretical underpinnings of situation awareness: A critical review. In M. R. Endsley and D. J. Garland, editors, Situation Awareness Analysis and Measurement, chapter 1, pages 3–26. Lawrence Erlbaum Associates, 2000.

W. S. Kim. Advanced teleoperation, graphics aids, and application to time delay environments. In Proceedings of the 1st Industrial Virtual Reality Show and Conference (IVR '93), Makuhari Messe, Japan, 23–25 June 1993, pages 202–207, 1993.

P. Milgram and F. Kishino. A taxonomy of mixed reality visual displays. IEICE Transactions on Information Systems, E77-D(12):1321–1329, December 1994.

P. Milgram, H. Takemura, A. Utsumi, and F. Kishino. Augmented reality: A class of displays on the reality-virtuality continuum. In Telemanipulator and Telepresence Technologies, SPIE Vol. 2351, pages 282–292, 1994.

T. Pettersen, J. Pretlove, C. Skourup, T. Engedal, and T. Lokstad. Augmented reality for programming industrial robots. In Proc. Second IEEE and ACM International Symposium on Mixed and Augmented Reality, pages 319–320, 2003.

M. Sauer, F. Driewer, and K. Schilling. Potential and challenges of stereo augmented reality for mobile robot teleoperation. In Proc. 10th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (HMS), Seoul, Korea, 2007.

M. Sauer, F. Leutert, and K. Schilling. Occlusion handling in augmented reality user interfaces for robotic systems. In Proceedings of the 41st International Symposium on Robotics (ISR 2010) and 6th Robotik 2010, 2010.

T. B. Sheridan. Telerobotics, Automation, and Human Supervisory Control. MIT Press, Cambridge, MA, USA, 1992. ISBN 0-262-19316-7.

C. D. Wickens, M. Vincow, and M. Yeh. Design applications of visual spatial thinking: The importance of frame of reference. In Handbook of Visual Spatial Thinking, pages 383–425. Cambridge University Press, 2005.
