
2015 7th International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT)

Sign Language Learning using the Hangman Videogame

Filomena Soares, João Sena Esteves, Vitor Carvalho
Industrial Electronics Department
Algoritmi Research Centre
University of Minho – Campus of Azurém
4800-058 Guimarães, Portugal, +351253510180
E-mail address: fsoares@dei.uminho.pt

Carlos Moreira, Pedro Lourenço
Industrial Electronics Department
University of Minho – Campus of Azurém
Guimarães, Portugal

Abstract — This paper describes the preliminary study and development of a videogame aimed at learning the alphabet in Portuguese sign language through gestures. The Leap Motion Controller is used, allowing the detection of fingers and hands with high accuracy and resolution. The project is based on the well-known hangman game. The player inputs the gestures that represent the letters of the alphabet in sign language. The Leap Trainer framework for gesture and pose learning and recognition was used. This framework is helpful to the development of the proposed game since it allows the recording of a gesture and a subsequent comparison with the recorded gesture, giving a match percentage as a result. There are many sign language learning games, but they are not interactive. The proposed game is primarily intended for children but is also adequate for adults.

Keywords — Deaf people; Gesture recognition; Hangman game; Leap Motion; Sign language.

I. INTRODUCTION

This project aims to develop a game that allows the learning of the alphabet in sign language. Sign language is produced through gestures and perceived visually. Thus, people with hearing problems, as well as the surrounding community, use it as a means of communication. Sign language is not universal, and this project is based on the Portuguese Sign Language.

These are some requisites of the new game:
• Definition and detection of hand gesture symbols;
• Definition of an appealing interface;
• Possibility of keeping constant game development and improvement.

Parameters like the movement and orientation of the hand and the articulations must be accurately determined in order to ensure correct gesture recognition.

Non-handed language, like body language, is also a relevant factor in communication, but it is beyond the scope of the new game.

The Leap Motion Controller, a sensor developed by the American company Leap Motion, Inc. [1], is used as the means of interaction with the user. With two monochromatic cameras and three infrared LEDs, it observes a hemispherical volume with a radius of about one meter, detecting the movements of the hands and fingers accurately. It has a high resolution because the observation volume is small. The sensor outputs 3D coordinates. Joining the Leap Motion Controller features and sign language, it is expected to obtain an educational game that helps in learning this gestural language, mainly by children. The game will allow learning only the alphabet, but it may be a starting point to the learning of the whole sign language, including the use of other body parts.

This paper is structured as follows: Section II presents the related work, the developed project is introduced in Section III, and the final remarks are presented in Section IV.

II. RELATED WORK

The learning of sign language is very important for the interaction, integration and sociability of deaf people. This is a topic that deserves much attention and investment.

The Leap Motion Controller has an application ecosystem: it is possible to develop a dedicated application, using the sensor as input, and place it in an online store. Of all the applications already available in the Leap Motion store, the one closest to this project only deals with the learning of letters [2]. Writing is performed by drawing letter shapes in the air with a finger. In this application, the relation to sign language is minimal.

The project described in [3] focused on the relation between sign language and the Leap Motion Controller and concluded that, although the sensor has enough potential, the Application Programming Interface (API) is not yet ready to support sign language. The authors of that paper believe complete sign language learning will become possible once the calibration and configuration of the Leap Motion Controller are adequate, as well as the position of the hands relative to the sensor.

This work has been supported by FCT - Fundação para a Ciência e Tecnologia in the scope of the project UID/CEC/00319/2013.



Leap Trainer [4] is a gesture and pose learning and recognition framework for the Leap Motion Controller that can serve as a starting point. This framework, developed in JavaScript, allows the user to record various gestures and train them; then, when the user makes a gesture, the application returns a matching percentage relative to a previously recorded gesture. Performing such a task requires an Artificial Neural Network (ANN), intended to process data in a manner similar to the human brain. It may be interpreted as a processing scheme capable of storing learning-based knowledge (experience) and providing this knowledge to the application.
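As an illustration of this workflow, the sketch below records and trains one gesture and then listens for matches. The method and event names follow our reading of the LeapTrainer.js documentation [4] and may differ between versions; the "letter-A" gesture name is a convention assumed here.

    var trainer = new LeapTrainer.Controller();

    // Record and train a gesture for the letter "A" (performed above the sensor).
    trainer.create('letter-A');

    trainer.on('training-complete', function () {
      console.log('Training finished; the gesture can now be recognized.');
    });

    // "hit" is the match value (0.0 to 1.0) against a trained gesture.
    trainer.on('gesture-recognized', function (hit, gestureName) {
      console.log(gestureName + ' matched at ' + Math.round(hit * 100) + '%');
    });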
One application of the Leap Motion Controller API is leap-playback.js [5]. This application shows the recorded frames and makes it possible to download them. Its interface presents a rigid model of a hand with all the joints. It is possible to record not only a single gesture but also a sequence of frames with motion. Finally, the user may download the gesture in a compressed or JSON format.
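Frame recording of this kind can be pictured with the plain leap.js API, as in the following sketch. It is only illustrative, not the leapjs-playback implementation; keeping just the fingertip positions is a simplification assumed here.

    var recordedFrames = [];

    Leap.loop(function (frame) {
      // Keep only what the game needs: one 3D fingertip position per finger.
      recordedFrames.push(frame.fingers.map(function (finger) {
        return { type: finger.type, tip: finger.tipPosition }; // [x, y, z] in mm
      }));
    });

    // Later, e.g. behind a "save" button: serialize the recording for download.
    function exportRecording() {
      return JSON.stringify(recordedFrames);
    }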
A solution for the translation of sign language into text will be marketed in the near future. This product, developed by MotionSavvy [6] and called UNI, is able to communicate in two different ways: sign to speech and speech to text. The first translates the gestures captured by the Leap Motion Controller into text and speech. The second uses a voice recognition engine that translates speech into text. In the beginning, the dictionary will have at least 2,000 signs. Users will be able to add their own signs, so the number of signs available will grow over time. Right now, UNI can only recognize ASL (American Sign Language), but incorporating other languages is a future goal. After that, the developers intend to add body expression recognition.

Regarding related products, Microsoft Kinect is a sensor that allows the recognition of the hands, arms or even the entire human body. The work in [7] presents a system for the recognition of gestures and their translation into text or speech using a Kinect sensor. First, based on the depth and colour images, a 3D trajectory of the hand corresponding to the input word is generated. Then, that trajectory is normalized by linear resampling before trajectory alignment. Recognition is the result of a matching score between the input and a gallery of trajectories. Once recognition is made, letters and words are known and the translation is done.
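The resampling and matching steps of such a pipeline can be sketched as follows. This is an illustrative reconstruction of the idea, not the code of [7]; the inverse-distance score is an assumption.

    // Resample a 3D trajectory to n points (n >= 2) by linear interpolation.
    function resample(traj, n) {
      var out = [];
      for (var i = 0; i < n; i++) {
        var t = i * (traj.length - 1) / (n - 1);
        var j = Math.floor(t), f = t - j;
        var a = traj[j], b = traj[Math.min(j + 1, traj.length - 1)];
        out.push([0, 1, 2].map(function (k) { return a[k] + f * (b[k] - a[k]); }));
      }
      return out;
    }

    // Score two equal-length trajectories: smaller mean distance, higher score.
    function matchScore(a, b) {
      var sum = 0;
      for (var i = 0; i < a.length; i++) {
        sum += Math.hypot(a[i][0] - b[i][0], a[i][1] - b[i][1], a[i][2] - b[i][2]);
      }
      return 1 / (1 + sum / a.length);
    }

The gallery entry with the highest score then gives the recognized word.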
With an existing game application for smartphones [8], the user can learn the alphabet in American Sign Language through images (Fig. 1 – left). Another application [9], developed for the iOS system, allows the learning of a set of pre-defined, commonly used expressions through an animation with a doll (Fig. 1 – right).

Fig. 1 – American Sign Language App [8] (left) and Sign4Me iOS App [9] (right)

There are many other gaming applications, more or less animated, which allow the learning of sign language [10]. However, in all of them, the learning is performed through images, videos or animations. The aim of this project is the development of a game that allows learning sign language through gestures.
III. DEVELOPED WORK

Fig. 2 shows the experimental set-up. The Leap Motion Controller is connected to a computer and installed. Light intensity in the environment does not interfere with the hand gesture detection.

Fig. 2 – Leap Motion Controller and computer set-up

The Leap Motion Controller transforms the gesture into data and transmits it to the browser via the Leap Motion JavaScript API.
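In the browser, these data arrive as a stream of frames. A minimal sketch using the standard leap.js Controller events:

    var controller = new Leap.Controller();

    controller.on('connect', function () {
      console.log('Connected to the Leap Motion service');
    });

    // Each frame carries 3D coordinates for the detected hands and fingers.
    controller.on('frame', function (frame) {
      if (frame.hands.length > 0) {
        console.log('Palm position:', frame.hands[0].palmPosition);
      }
    });

    controller.connect();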
The Leap Trainer analyses the data and uses it to learn gestures through its ANN-based recognition system. The recognition system converts a gesture into an array, which is then compared with the arrays of recorded gestures from a training set. ANNs require a uniform input. This is ensured by brain.js, a library used by Leap Trainer, which truncates all training data arrays to the length of the shortest one. The number of iterations is set to 20000. The number of nodes depends on the number of layers. Layers are defined based on the number of "active" fingers of the hand and their positions. Thus, the ANN is adjusted to each gesture.
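A rough sketch of this training step as just described: inputs truncated to a common length, then a brain.js network trained for 20000 iterations. The gesture data and labels are placeholders; Leap Trainer performs this wiring internally.

    // Truncate all training arrays to the length of the shortest one,
    // so the network always receives a uniform input size.
    function truncateAll(samples) {
      var minLen = Math.min.apply(null, samples.map(function (s) {
        return s.input.length;
      }));
      return samples.map(function (s) {
        return { input: s.input.slice(0, minLen), output: s.output };
      });
    }

    var net = new brain.NeuralNetwork();

    // Each sample: a flattened array of finger/hand coordinates, labelled
    // with the letter it represents (placeholder values shown).
    var trainingData = truncateAll([
      { input: [0.1, 0.8, 0.3 /* ... */], output: { A: 1 } },
      { input: [0.7, 0.2, 0.9 /* ... */], output: { B: 1 } }
    ]);

    net.train(trainingData, { iterations: 20000 });

    // Running the network returns a match value per known letter.
    var result = net.run([0.1, 0.8, 0.3 /* ... */]);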

The number of trainings performed by the ANN may be configured in the Options menu of the game interface. Then, events are triggered when known gestures are detected.

With the Leap Motion Controller, it is possible to obtain information on the position of the fingers and hand through coordinates. It is possible to store the various configurations of gestures through their three-dimensional coordinates. An application was developed to interpret these values and assign them to a particular previously recorded gesture, within a margin of error. Thus, all possible gestures may be stored in order to form a gesture library.

When a gesture is recognized, an event is triggered. The correlation between the input gesture and a data set of gestures is characterized by a numeric value between 0.0 and 1.0 for each known gesture, describing how similar the gesture is to the data set.
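To illustrate the use of this value, a handler might accept only the best match above a confidence threshold. The 0.8 cut-off and the shape of the correlations object are assumptions:

    // Pick the known gesture with the highest correlation to the input,
    // accepting it only above a confidence threshold.
    function bestMatch(correlations, threshold) {
      var best = null;
      Object.keys(correlations).forEach(function (name) {
        if (correlations[name] >= threshold &&
            (best === null || correlations[name] > correlations[best])) {
          best = name;
        }
      });
      return best; // e.g. 'letter-A', or null if nothing is close enough
    }

    bestMatch({ 'letter-A': 0.93, 'letter-B': 0.41 }, 0.8); // => 'letter-A'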
The well-known hangman game was chosen as a suitable application for the sign language learning videogame. In this game, the player must guess a word based on its category and its number of letters. When a wrong letter is suggested, a new member is added to the hanged body. If the hanged body is completed, the player loses the game. In the new game, the player is supposed to suggest the letters by applying the sign gesture associated with each letter, therefore practicing, in a didactic way, the knowledge obtained. The player will also be given the opportunity to build their own word/gesture database. A minimal sketch of this game logic, independent of the gesture input layer, is given below.
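The sketch keeps only the hangman state; the names and the error limit are choices made here.

    // Minimal hangman state: guessing letters against a hidden word.
    function Hangman(word, maxErrors) {
      this.word = word.toUpperCase();
      this.guessed = [];
      this.errors = 0;
      this.maxErrors = maxErrors; // number of members in the hanged body
    }

    Hangman.prototype.guess = function (letter) {
      letter = letter.toUpperCase();
      if (this.guessed.indexOf(letter) !== -1) { return; } // already tried
      this.guessed.push(letter);
      if (this.word.indexOf(letter) === -1) { this.errors++; } // add a body member
    };

    Hangman.prototype.display = function () {
      var guessed = this.guessed;
      return this.word.split('').map(function (c) {
        return guessed.indexOf(c) !== -1 ? c : '_';
      }).join(' ');
    };

    Hangman.prototype.isLost = function () { return this.errors >= this.maxErrors; };

    Hangman.prototype.isWon = function () { return this.display().indexOf('_') === -1; };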
Fig. 3 shows the game environment. The left side lists the letters of the alphabet. By clicking on each letter, it is possible to see the corresponding gesture.

Fig. 3 – Game environment. Learning a letter.

When the game starts, the gallows and the word to guess are shown (Fig. 4). When a gesture corresponding to a letter that exists in the word is applied, the letter is inserted. When a gesture corresponding to a letter that does not exist in the word is applied, a new member is added to the hanged body (Fig. 5).

Fig. 4 – Game environment. Inserting a letter.

Fig. 5 – Game environment. Inserting a wrong letter.
A schematic of the game is shown in Fig. 6. In this scheme
it is possible to see that learning and gaming are performed in
parallel. Thus, at any time, the user can see how a gesture is
done and then apply it.

Fig. 6 - Schematic of the game's mechanism.
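Tying the pieces together as in Fig. 6, recognition events can drive the hangman state directly. This reuses the trainer and Hangman objects sketched above; the "letter-" naming convention and the 0.8 threshold remain assumptions.

    var game = new Hangman('GESTO', 6); // word and error limit are examples

    trainer.on('gesture-recognized', function (hit, gestureName) {
      if (hit >= 0.8 && gestureName.indexOf('letter-') === 0) {
        game.guess(gestureName.slice('letter-'.length));
        console.log(game.display()); // update the word shown to the player
      }
    });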

IV. RESULTS

The preliminary version of the game was tested in a laboratory environment. With only one learning pass performed by the neural network, it was quite difficult to distinguish the different letters. Increasing the number of learning passes performed by the neural network reduced the initial problem, but not yet to the required level. The authors believe that the problem lies in the hand model. With the current configuration, the hand model only has one joint per finger. Therefore, the ongoing step is to develop/modify the hand model in order to obtain a new one with three joints per finger. This model will give a more realistic visualization of the hand, and a better and more specific learning by the neural network.

The system was tested with five different gestures. It takes about one second to identify a letter, with an accuracy around 95%.

V. FINAL REMARKS

This paper presented the preliminary study and development of a project whose goal is the development of a game that allows learning the Portuguese sign language. The selected sensor was the Leap Motion Controller, which is capable of detecting the joints of the fingers with enough resolution to be used in this project. The Leap Trainer allows the recording of a gesture and the comparison of gestures in real time through the training of a neural network. It is very useful for the development of the game, since the goal is allowing players to check whether they are doing the correct gestures, as well as to compare those gestures with predefined recorded gestures (used as references). The well-known hangman game, with letters input through gestures, is under development. Using a hand model with only one joint per finger and after some training, the system is capable of identifying all the letters of the alphabet and numbers, but not with a high success rate. Possibly, a new hand model with three joints per finger would improve this rate.

As future work, it is suggested to put this game in the Leap Motion applications store and in schools, so that children and also adults can learn the Portuguese sign language. A major goal of this project is developing a useful game not only for deaf people but also for anyone wanting to learn sign language.

REFERENCES

[1] Axes system on Leap Motion Controller. https://gallery.mailchimp.com/b6db0e3e157a48749e1048615/images/Leap_Axes.png (accessed in May 2015).
[2] Nayi Disha Studios Pvt. Ltd. Skywriting Alphabets. Leap Motion App Store. https://apps.leapmotion.com/apps/skywriting-alphabets-trial/windows (accessed in March 2015).
[3] L. E. Potter, J. Araullo, and L. Carter, "The Leap Motion controller: a view on sign language", Proceedings of the 25th Australian Computer-Human Interaction Conference (OzCHI '13) – Augmentation, Application, Innovation, Collaboration, Adelaide, Australia, November 25-29, 2013.
[4] Robert O'Leary. Leap Trainer. https://github.com/roboleary/LeapTrainer.js/tree/master#sub-classing-the-leaptrainercontroller (accessed in March 2015).
[5] Leap Motion – playback. https://github.com/leapmotion/leapjs-playback (accessed in May 2015).
[6] MotionSavvy. http://www.motionsavvy.com/ (accessed in June 2015).
[7] X. Chai, G. Li, Y. Lin, Z. Xu, Y. Tang, and X. Chen, "Sign language recognition and translation with Kinect", Proceedings of the 10th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2013), Shanghai, China, April 22-26, 2013.
[8] TeachersParadise.com, Inc. ASL American Sign Language. Google Play. https://play.google.com/store/apps/details?id=com.teachersparadise.aslamericansignlanguage (accessed in March 2015).
[9] Vcom3D, Inc. Sign 4 Me for iPad - A Signed English Translator. iTunes Store. https://itunes.apple.com/pt/app/sign-4-me-for-ipad-signed/id383462870?mt=8 (accessed in March 2015).
[10] Google Play. https://play.google.com/store

