Review On Development of Gesture Interfaces For HCI
Abstract:
With the development of human-computer interaction technology, how to use natural, or at
least quasi-natural, intuitive interaction methods has become an important research topic.
Gesture is one of the most important communication methods of human beings and can
effectively express users' demands. This article focuses on the development of gesture
interaction technology over the past few decades and discusses the definition and classification
of gestures, input devices for gesture interaction, and gesture recognition
technology. The application of gesture interfaces in human-computer interaction is also
examined, the existing problems in current gesture interaction are summarized, and
directions for future development are suggested.
Keywords:
Gesture Interaction, Gesture Interface, Human-Computer Interaction
1. Introduction:
This paper discusses one very particular use of the term "gesture": hand gestures that
co-occur with spoken language. Why such a narrow focus, given that so
much of the work on gesture in the human-computer interface community has treated
gestures as their own language — gestures that might replace the keyboard, mouse, or speech
as a direct command language?
In real life, people use their hands to perform operations such as grasping, moving, and
rotating. In the process of communication, people spontaneously attract the attention of others
with hand movements. Gestures are a way for people to consciously express their
intentions. As Justine Cassell puts it, “I don’t believe that everyday human users have any
more experience with, or natural affinity for, a "gestural language" than they have with DOS
commands. Thus if our goal is to get away from learned, pre-defined interaction techniques
and create natural interfaces for normal human users, we should concentrate on the type of
gestures that come naturally to normal humans.” [1].
Gesture interaction is intuitive, natural, and flexible. It is therefore also very
important for users with physical disabilities, such as visual or hearing
impairment, to be able to interact through gestures. Much work has been conducted on the
investigation and development of natural interaction interfaces, including gesture interfaces.
Science fiction literature and film have also dreamed up gesture interfaces, for example in
the movies Johnny Mnemonic (1995), Final Fantasy (2001), and Minority Report (2002).
Application domains in this direction involve virtual reality, augmented reality,
wearable computing, and smart spaces, where gesturing is a possible method of interaction.
Gestures can also bring a new perspective to the creativity of designers. Besides the
advantages of personal style, speed, and preservation of perceptual-motor skills, designers
sketch because they need indeterminacy and controlled ambiguity to stimulate idea
generation [2].
1.1. Human-Computer Interaction
Human-computer interaction refers to research on the design, utilization, and
implementation of computer systems by humans. It studies humans, computers,
and the interaction between the two. With the development of
computing devices and related technologies, the interaction between humans and
computers has become a part of daily life including work, shopping, and
communication [3]. The graphical user interface based on mouse and keyboard remains the
most commonly used user interface; however, the Window, Icon, Menu, Pointer (WIMP)
interface paradigm is limited to two-dimensional (2D) planar objects and cannot interact
with objects in three-dimensional (3D) space, reducing the naturalness of the interaction.
1.2. Gesture Interface Technology in HCI
Unlike traditional WIMP interfaces, in a natural user interface or a supernatural user
interface scenario, the user interacts in a 3D environment using, for example, gloves or
high-degree-of-freedom devices. Previous research in this area can be classified into the
following approaches: indirect cursor control, direct cursor control, device-based
direct pointing, and free direct pointing through finger, hand, or body movement [4].
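Free direct pointing is commonly realized by casting a ray from the hand or finger and intersecting it with the display plane. The sketch below illustrates this basic geometry; it is a minimal, self-contained assumption of how such a mapping could work, not an implementation from the cited studies.

```python
def point_on_display(origin, direction, plane_point, plane_normal):
    """Cast a pointing ray and return where it hits the display plane.

    origin, direction: 3D position and pointing direction of the hand/finger.
    plane_point, plane_normal: any point on the display plane and its normal.
    Returns the 3D hit point, or None when the ray is parallel to the plane
    or the display lies behind the user.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:  # ray runs parallel to the display
        return None
    # Solve origin + t * direction on the plane for the ray parameter t.
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:  # intersection is behind the pointing hand
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))
```

For example, pointing straight ahead from the origin, `point_on_display((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 2.0), (0.0, 0.0, 1.0))` yields the hit point `(0.0, 0.0, 2.0)` on a display two units away.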
With continuous development, interactive technologies such as VR and AR have been
widely used in many fields, including game entertainment, medical care, and education.
In recent years, information acquisition based on high-tech equipment, such as
electromyography signal acquisition, has gradually become a research focus. Owing to
the variability and complexity of gestures, how to use existing technology to
process the input signal collected by the device, how to determine the spatial position
and posture of the hand, and how to obtain an accurate recognition result all have a great
impact on the effectiveness of subsequent gesture interactions. Gesture recognition
technology is mainly categorized into recognition based on wearable sensor
devices, recognition based on touch devices, and recognition based on
computer vision.
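For the touch-device category, a common recognition approach is template matching over recorded strokes. The sketch below is a minimal, illustrative recognizer in the spirit of the well-known $1 recognizer (resample the stroke, normalize position and scale, compare against stored templates); the point count and reference size are arbitrary choices, and the rotation-invariance step of the full algorithm is omitted for brevity.

```python
import math

def resample(points, n=32):
    """Resample a stroke to n points evenly spaced along its path length."""
    path = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path / (n - 1)
    pts = list(points)
    resampled = [pts[0]]
    accum = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and accum + d >= interval:
            t = (interval - accum) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            accum = 0.0
        else:
            accum += d
        i += 1
    while len(resampled) < n:  # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points, size=100.0):
    """Translate the stroke to its centroid and scale it to a reference box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    s = size / max(w, h)
    return [((x - cx) * s, (y - cy) * s) for x, y in points]

def recognize(stroke, templates):
    """Return the name of the template closest to the stroke on average."""
    query = normalize(resample(stroke))
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        ref = normalize(resample(tmpl))
        d = sum(math.dist(a, b) for a, b in zip(query, ref)) / len(query)
        if d < best_d:
            best, best_d = name, d
    return best
```

For instance, with templates `{"line": [(0, 0), (100, 0)], "vee": [(0, 0), (50, 50), (100, 0)]}`, a noisy horizontal swipe is classified as `"line"` because its resampled, normalized shape lies closest to that template.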
Most interactive user interfaces (UIs) in HCI are based on the traditional eye-centred
UI design principle, which primarily considers the user's visual searching efficiency and
comfort, while hand operation performance and ergonomics are relatively less considered.
As a result, hand interaction in VR is often criticized as being inefficient and
imprecise. Hand operation in the downward direction has been found to be more efficient
and accurate than in the upward direction; thus, in a VR scene, targets at lower positions
relative to the hand can be selected more efficiently and accurately than those at higher
positions. It has also been found that the choice of hand has a crucial impact on
free-hand interaction in VR: the left hand selects targets on the left side more promptly
and accurately, while the right hand performs better at selecting targets on the right side [11].
4. Potential issues
Firstly, although gesture interaction simplifies the interactive input method, there is
no standardized operation specification. Since gestures and tasks do not have a one-to-one
correspondence, it is necessary to select appropriate gesture types according to the
characteristics of the task. Owing to the diversity and complexity of gestures, it is difficult
for developers to build a consistent operation platform. Therefore, when users apply gesture
interaction to different products, they need to be trained for a period of time, which
increases the difficulty of learning and cognition.
Secondly, gesture-based interaction is closer to natural human expression, but some
HCI scenarios require wearing cumbersome gesture-collection devices,
which reduces the comfort and naturalness of the interaction, places users in an
uncomfortable situation, and affects their mood.
Thirdly, compared with the point-to-point precision of mouse and keyboard
operations, gesture interaction is not a precise operation, and its application range is
affected by many factors, such as the interaction device, the recognition method, and user
proficiency.
5. Conclusion
This paper summarized the following aspects of gesture interaction technology in HCI.
Firstly, the definition of gestures was provided, and existing gesture classification
methods were investigated and summarized.
Secondly, methods for finding ergonomic and intuitive gestures were discussed.
Finally, potential issues with gesture interfaces were presented.
References:
1. Cassell, J., 1998. A framework for gesture generation and interpretation. Computer vision in
human-machine interaction, pp.191-215.
2. Fish, J. and Scrivener, S. (1990). Amplifying the mind’s eye: sketching and visual recognition.
Leonardo, Vol. 23, No.1, pp. 117-126.
3. Pantic M, Nijholt A, Pentland A, Huang T S. Human-Centred Intelligent Human Computer
Interaction (HCI²): how far are we from attaining it? International Journal of Autonomous and
Adaptive Communication Systems, 2008, 1(2): 168-187. DOI: 10.1504/IJAACS.2008.019799
4. Hespanhol, Luke, Martin Tomitsch, Kazjon Grace, Anthony Collins, and Judy Kay.
"Investigating intuitiveness and effectiveness of gestures for free spatial interaction with large
displays." In Proceedings of the 2012 International Symposium on Pervasive Displays, pp. 1-
6. 2012.
5. Hummels, Caroline, and Pieter Jan Stappers. "Meaningful gestures for human computer
interaction: beyond hand postures." In Proceedings Third IEEE International Conference on
Automatic Face and Gesture Recognition, pp. 591-596. IEEE, 1998.
6. Cassell, J. (1998). A framework for gesture generation and interpretation. Computer vision in
human-machine interaction, 191-215.
7. Yang LI, Jin HUANG, Feng TIAN, Hong-An WANG, Guo-Zhong DAI. Gesture interaction in
virtual reality. Virtual Reality & Intelligent Hardware, 2019, 1(1): 84-112. DOI:
10.3724/SP.J.2096-5796.2018.0006
8. Nielsen, M., Störring, M., Moeslund, T. B., & Granum, E. (2003, April). A procedure for
developing intuitive and ergonomic gesture interfaces for HCI. In International gesture
workshop (pp. 409-420). Springer, Berlin, Heidelberg.
9. Beringer, N., 2001, April. Evoking gestures in SmartKom-Design of the graphical user
interface. In International Gesture Workshop (pp. 228-240). Springer, Berlin, Heidelberg.
10. Nielsen, Jakob. Usability engineering. Morgan Kaufmann, 1994.
11. Lou, Xiaolong, Xiangdong A. Li, Preben Hansen, and Peng Du. "Hand-adaptive user
interface: improved gestural interaction in virtual reality." Virtual Reality (2020): 1-16.