
Symbiosis of Human and Artifact

Y. Anzai, K. Ogawa and H. Mori (Editors)


© 1995 Elsevier Science B.V. All rights reserved.

Toward a Comprehensive Manipulation Solution on 3D Workspace

Nobuo Asahi a, Kazuhisa Okada a, Akira Maenaka a, Eun-Seok Lee b
and Keiji Kobayashi a

a Personal-use Electronics Laboratory, Mitsubishi Electric Corporation,
5-1-1 Ofuna, Kamakura, Kanagawa 247, Japan

b Department of Information Engineering,
Sung-Kyun-Kwan University, Seoul, Korea

1. INTRODUCTION

We have already developed MECOT (Metaphor Environment Construction Tool), a system for the construction and execution of 3D animated metaphor environments. It provides i) an easy and efficient construction model for designers by explicitly separating environments from graphical objects, and ii) application-transparent environments for users by arranging application functions in a consistent environment [1]. We have confirmed that it enables rapid construction of a variety of three-dimensional graphical user interfaces (hereinafter, 3D GUIs). Since MECOT can adopt an unlimited variety of 3D objects as interface metaphors and show animated presentations, including viewpoint changes triggered by users' operations and applications' events, the 3D GUIs constructed with MECOT are more intuitive to users than ordinary 2D GUIs. However, manipulation in these 3D GUIs is more difficult than in 2D GUIs.
In general, manipulation in 3D graphics is much harder than in 2D graphics, because there are six degrees of freedom in 3D while there are only three in 2D. Moreover, since MECOT assumes that the environment spreads over a wide 3D space, users are required to move their viewpoint to accomplish their tasks. Providing an easy 3D manipulation method is therefore essential for a good look and feel, because the feel of the manipulation strongly affects the usability of the user interface.
There is a lot of related work on this problem [2-6]; however, most of it addresses specific issues such as object rotation and viewpoint control. In order to give users a natural way of manipulating 3D GUIs, a comprehensive solution to the 3D manipulation issues should be investigated. We have categorized the issues in 3D manipulation into the viewpoint control issue, the short distance object placement issue, and the long distance object placement issue. In this paper, we report the results of the experiments we conducted for the viewpoint control issue and the short distance object placement issue. The experiments were carried out by implementing several manipulation methods and evaluating them in terms of the time spent to complete a given task and the error rate during its execution. For these experiments, we did not use special input devices for 3D manipulation.

2. ISSUES ON 3D MANIPULATION

2.1. Categorizing Issues


The 3D GUI assumed in this paper is a 3D graphical environment containing various 3D graphical objects. Some of the graphical objects are translatable and/or rotatable in the environment. When users want to accomplish their tasks in this environment, they have to manage a combination of object translation/rotation and viewpoint movement.
For example, consider a 3D GUI consisting of a desk object with drawers and some document objects on the desk, where the given task is to store one of the document objects on the desk in a drawer which is currently out of view. The user's manipulation strategy may be to first move one of the documents close to the drawer, change the viewpoint to see both the document and the drawer in the same view, then open the drawer and drag the document into it.
Independent of the input devices to be used, our experience suggests three categories of issues in 3D manipulation: viewpoint control, short distance object placement, and long distance object placement.
Viewpoint control is a combination of viewpoint translation and view vector rotation. Generally, the view angles used to generate scenes on screens are narrower than those of our actual eyes, so users often lose their point of interest (hereafter, POI) while changing their viewpoint and view vector, and it is usually hard to recover from this situation. Preventing users from losing the POI is an important issue.
Short distance object placement is a combination of view vector rotation, object translation and object rotation. The task in this category is to move objects to a different place and align them. In order to move and/or align objects, users need to rotate their view vector because the source and the destination cannot be seen in one fixed view. When users change their view vector, the x, y and z directions of object movement generally appear different, and this confuses users. Since the direction of movement depends on the local coordinate system in which the moving object is expressed, we assume that the choice of the local coordinate system is the key issue in this category.
Long distance object placement is essentially a combination of the above two. Depending on which input devices are used, all of the manipulations of the viewpoint, the view vector and objects need to be defined as actual operations on those devices. An optimal combination of viewpoint control and object placement manipulation, together with an optimal arrangement of the actual operations, could provide a fully comprehensive 3D manipulation solution for users.

2.2. Proposed Manipulation Methods for Each Category


We have developed a test environment for 3D manipulation in which several different manipulation methods can be evaluated. With this test environment, we have investigated the viewpoint control issue and the short distance object placement issue to find the best manipulation method for each.
As for the viewpoint control issue, one important technique is to give users a good manipulation metaphor so that they do not lose their POI. According to Ware et al., for controlling viewpoint movement in a wide-spread graphical environment such as a maze, flying vehicle control, which lets users steer the viewpoint as if driving a flying vehicle, is better than the other two metaphors, eyeball in hand and environment in hand [2]. In this paper, we examine flying vehicle control in more detail by adding some options. The proposed methods for viewpoint control are listed below (a minimal control-loop sketch follows the list):
(1) Flying vehicle control only
(2) Flying vehicle control with R&L translation
A way to translate to the right and left is added to the basic flying vehicle control.
(3) Flying vehicle control with fire-lock control
This provides a way to lock the viewpoint onto a specific object when the user hits the space key.
(4) Flying vehicle control with wide-view
This provides a wide view that shows more of the scene to the right and left.
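As a rough illustration of how fire-lock control can sit on top of basic flying vehicle control, the following sketch keeps a viewpoint position and heading, moves the viewpoint forward along the heading, and, while a lock is active, recomputes the view direction toward the locked object every frame. The class and function names are ours, not part of MECOT or the experiment system.

import math

def normalize(v):
    """Return a unit-length copy of a 3D vector."""
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

class FlyingVehicleCamera:
    """Hypothetical sketch of flying-vehicle viewpoint control with fire-lock;
    not the actual experiment-system code."""

    def __init__(self, position, view_dir):
        self.position = position             # (x, y, z) viewpoint
        self.view_dir = normalize(view_dir)  # unit view vector
        self.locked_target = None            # POI set by fire-lock, or None

    def turn(self, yaw_rad):
        """Rotate the view vector around the vertical (y) axis, as when steering."""
        x, y, z = self.view_dir
        c, s = math.cos(yaw_rad), math.sin(yaw_rad)
        self.view_dir = normalize((c * x + s * z, y, -s * x + c * z))

    def move_forward(self, distance):
        """Translate the viewpoint along the current view vector."""
        self.position = tuple(p + distance * d
                              for p, d in zip(self.position, self.view_dir))

    def fire_lock(self, target_position):
        """Lock the view onto a point of interest (e.g. when the space key is hit)."""
        self.locked_target = target_position

    def release_lock(self):
        self.locked_target = None

    def update(self):
        """Per-frame update: while locked, keep the view vector pointed at the POI."""
        if self.locked_target is not None:
            self.view_dir = normalize(tuple(t - p for t, p in
                                            zip(self.locked_target, self.position)))

# Example: fly forward, lock onto an ornament, and keep it centered while moving.
cam = FlyingVehicleCamera(position=(0.0, 1.5, 0.0), view_dir=(0.0, 0.0, -1.0))
cam.move_forward(2.0)
cam.fire_lock((3.0, 1.0, -10.0))
cam.move_forward(1.0)
cam.update()
print(cam.position, cam.view_dir)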

Regarding the short distance object placement issue, since the choice of local coordinate system is one of the important factors, we first examined several local coordinate systems. The candidates are listed below (a sketch of how the chosen frame changes a drag is given after the list):
(1) World coordinate
(2) Local coordinate in terms of moving object
(3) Local coordinate in terms of destination object
(4) Local coordinate in terms of view vector
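To make the difference between these candidates concrete, the sketch below (our own illustration, not code from the experiment system) maps a 2D mouse drag onto a 3D translation expressed in a chosen basis. The same screen-space drag produces different world-space motion depending on whether the basis comes from the world axes, the moving object, the destination object, or the current view vector.

import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def drag_to_translation(dx, dy, basis):
    """Hypothetical sketch: map a mouse drag (dx, dy) to a world-space translation.

    basis is a pair (right, up) of unit vectors taken from the chosen
    coordinate system: world axes, the moving object's axes, the
    destination object's axes, or a frame derived from the view vector.
    """
    right, up = basis
    return tuple(dx * r + dy * u for r, u in zip(right, up))

def view_basis(view_dir, world_up=(0.0, 1.0, 0.0)):
    """Derive a (right, up) pair from the current view vector."""
    right = normalize(cross(view_dir, world_up))
    up = normalize(cross(right, view_dir))
    return right, up

# The same drag under two candidate coordinate systems:
drag = (0.3, 0.1)
world_axes = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))      # candidate (1)
view_axes = view_basis(normalize((1.0, 0.0, -1.0)))  # candidate (4)
print(drag_to_translation(*drag, world_axes))
print(drag_to_translation(*drag, view_axes))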

Furthermore, since we noticed that positional guidance would be helpful for users while they are dragging objects, the following kinds of guidance were examined for each local coordinate system (a snap-to-grid sketch follows the list):
(a) Grid guidance
It shows the horizontal position of the moving object on the grid on the floor.
(b) Beam guidance
It shows the x, y and z axes extending from the center of the moving object.
(c) Both grid and beam
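Grid guidance as described in (a) essentially projects the moving object's position onto the floor plane and relates it to grid cells. The small sketch below is only our interpretation of such guidance: it computes the floor projection and the nearest grid intersection, which a renderer could then highlight.

def grid_guidance(object_pos, cell_size=1.0):
    """Hypothetical sketch of grid guidance, not the experiment-system code.

    object_pos is (x, y, z) with y as the vertical axis; the floor is the y = 0 plane.
    A renderer could highlight the returned grid point to show the object's
    horizontal position while it is being dragged.
    """
    x, _, z = object_pos
    floor_point = (x, 0.0, z)
    snapped = (round(x / cell_size) * cell_size, 0.0,
               round(z / cell_size) * cell_size)
    return floor_point, snapped

# The dragged plate hovers at (2.3, 0.8, -4.6); the guidance mark lands at (2, 0, -5).
print(grid_guidance((2.3, 0.8, -4.6)))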

3. EXPERIMENT

3.1. Experiment System


Figure 1 shows the block diagram of the experiment system used for the experiments. The experiment environment definition defines a graphical environment, so we only have to change the definition to change the experiment environment. The event manager reads an experiment environment definition and shows the environment according to the current viewpoint. The manipulation control manager is an add-on module of the event manager; in order to test various manipulation methods, one manipulation control manager is simply replaced with another.
The experiment system runs on an IRIS 4D/340VGX and currently uses only a three-button mouse as its input device.

[Figure: block diagram showing the Manipulation Control Manager attached to the Event Manager, which reads the Experiment Environment Definition]
Figure 1. Block Diagram of Experiment System
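The block diagram suggests that the event manager stays fixed while manipulation control managers are swapped in and out. A minimal sketch of such a plug-in structure might look like the following; the class names mirror the figure, but the interface itself is our assumption rather than the system's actual API.

class ManipulationControlManager:
    """Hypothetical base class for a replaceable manipulation method (sketch only)."""

    def handle_input(self, event, viewpoint):
        """Translate a raw input event into a new viewpoint (or object) state."""
        raise NotImplementedError

class FlyingVehicleManager(ManipulationControlManager):
    def handle_input(self, event, viewpoint):
        # e.g. middle mouse button held: advance along the (assumed fixed) view direction
        if event == "middle_button":
            x, y, z = viewpoint
            return (x, y, z - 0.5)
        return viewpoint

class EventManager:
    """Reads an environment definition and delegates input to the current manager."""

    def __init__(self, environment_definition, manager):
        self.environment = environment_definition  # loaded scene description
        self.manager = manager                     # pluggable manipulation method
        self.viewpoint = (0.0, 1.5, 10.0)

    def set_manager(self, manager):
        """Swap in a different manipulation method without touching anything else."""
        self.manager = manager

    def on_input(self, event):
        self.viewpoint = self.manager.handle_input(event, self.viewpoint)

# Running one experiment condition, then the manager could be swapped for the next:
em = EventManager({"objects": ["desk", "drawer"]}, FlyingVehicleManager())
em.on_input("middle_button")
print(em.viewpoint)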

3.2. Experiment Method
Five subjects were given a task for viewpoint control evaluation and a task for short distance object placement evaluation. For each task, the subjects were required to accomplish it with all the proposed manipulation methods described in section 2.2. The evaluation was done by measuring the time spent to accomplish the tasks. All subjects performed these experiments every day for a week.
The viewpoint control task was to look at six ornaments. Each of them was put in a box with only one open facet. The directions of the open facets were varied so that the subjects had to move around the boxes to see the ornaments. The short distance object placement task was to move three plates from one place to another, then pile and align them. The source place and the destination place were not far apart but could not be seen in one view. The subjects could see the destination place just by rotating their view vector, so no viewpoint translation was required.

3.3. Results
Figures 2-4 show the results of the experiments. Each graph shows the average, minimum and maximum time spent for each manipulation method. A better manipulation method has a smaller average time and a smaller difference between maximum and minimum time. The difference between maximum and minimum time can be regarded as one indicator of error rate, because manipulation errors make the completion time longer.
The results of the manipulation experiments are:
(1) Flying vehicle control with fire-lock control is the best for viewpoint movement (Figure 2).
(2) A local coordinate system based on the moving object is the best for object movement (Figure 3).
(3) As for positional guidance, the choice between beam and grid depends on individual preference (Figure 4).
[Figure: chart of task completion time in seconds (axis 0-900 sec.) for the FV, +R&L, +FL and +WV control methods]
Figure 2. Viewpoint Movement Task with Various Control Methods

[Figure: two charts of task completion time in seconds (axis 0-250 sec.); the left compares the world, view, object and destination coordinate systems, the right compares beam, grid and beam+grid guidance]
Figure 3. Object Movement Task under Various Coordinate Systems
Figure 4. Object Movement Task with Various Positional Guidances

4. DISCUSSIONS

Ware et al. investigated three types of manipulation metaphors for viewpoint control and found that flying vehicle control is the best for walk-through and environment in hand is the best for object investigation [2]. The task given to our subjects for the viewpoint control investigation can be regarded as a combination of the two: walk-through and object investigation. Flying vehicle control with fire-lock control, which was the best method according to our experiment, is a good manipulation metaphor for this type of task, because it provides an easy switch between flying vehicle control for walk-through and a control similar to environment in hand for object investigation.


Furthermore, with this manipulation, speed control of the viewpoint movement can be implemented easily. Mackinlay et al. suggested controlling the viewpoint movement speed according to the distance to the POI [3]. The subjects mentioned after the experiments that the operation for setting the POI felt so natural that they could use fire-lock control easily. This means that, with this method, the system can obtain the user's POI without disturbing the task.
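As an illustration of the kind of speed control Mackinlay et al. describe, once the POI is known from fire-lock the viewpoint can be advanced by a fixed fraction of the remaining distance each frame, so movement is fast when far away and slows as the POI is approached. The fraction and stopping distance used here are arbitrary choices for the sketch, not values from [3] or from our system.

def approach_poi(viewpoint, poi, fraction=0.1, min_distance=0.5):
    """Hypothetical sketch: move the viewpoint a fixed fraction of the remaining
    distance toward the POI each frame.

    Because the step is proportional to the distance, movement is fast far from
    the POI and slows down near it, so the user does not overshoot the target.
    """
    delta = tuple(p - v for v, p in zip(viewpoint, poi))
    distance = sum(d * d for d in delta) ** 0.5
    if distance <= min_distance:
        return viewpoint  # close enough; stop advancing
    return tuple(v + fraction * d for v, d in zip(viewpoint, delta))

# A few frames of approach toward a locked POI:
viewpoint, poi = (0.0, 1.5, 10.0), (0.0, 1.0, 0.0)
for _ in range(3):
    viewpoint = approach_poi(viewpoint, poi)
    print(viewpoint)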
With regard to the short distance object placement issue, we could not find a significant difference among the four local coordinate systems, but in their opinions after the experiments most of the subjects said that the local coordinate system based on the moving object gave the best feeling. As for positional guidance, we can say that providing some positional guidance significantly reduces the operation time compared with an environment without guidance. It might be a good idea to show guidance only while the user is grabbing an object.
Long distance object placement needs to be studied next, based on the results of this research. We are now implementing a combined method incorporating the best viewpoint control and the best short distance object placement, and applying it to MECOT.

REFERENCES

1. Asahi, N.: An Environment for Developing Metaphor Worlds - Toward a User-friendly Virtual Work Space based on Metaware, Proceedings of the 1994 FRIEND21 Symposium, 1994.
2. Ware, C. & Osborne, S.: Exploration and virtual camera control in virtual three dimensional environments, Proceedings of the 1990 Symposium on Interactive 3D Graphics, in Computer Graphics 24, 2, pp. 175-183, ACM, 1990.
3. Mackinlay, J.D., Card, S.K. & Robertson, G.G.: Rapid controlled movement through a virtual 3D workspace, SIGGRAPH '90 Conference Proceedings, in Computer Graphics 24, pp. 171-176, 1990.
4. Chen, M., Mountford, S.J. & Sellen, A.: A study in interactive 3-D rotation using 2-D control devices, Proceedings of SIGGRAPH '88, in Computer Graphics 22, 4, pp. 121-129, 1988.
5. Houde, S.: Iterative Design of an Interface for Easy 3-D Direct Manipulation, Proceedings of CHI '92, pp. 135-142, 1992.
6. Bier, E.A.: Snap-dragging in three dimensions, Proceedings of the 1990 Symposium on Interactive 3D Graphics, in Computer Graphics 24, 2, pp. 193-204, 1990.
