Dissertation - Eye Tracking As A Primary Input in Video Games
Date: 4th August 2021
Name of Candidate: George Cremen-Merrill
Title of Dissertation: Eye Tracking As A Primary Input in Video Games
Word Count: Approx. 5250
DECLARATION: I declare that the above work is my own and that the material herein has
not been substantially used in any other submission for academic award.
SIGNED: DATE:
All dissertations submitted as part of an assessment process for a degree become University
property once handed in, and are not normally available to be returned. It is therefore
recommended that candidates retain a personal copy. The submitted copy may be retained by the
Library for reference by others.
I would also like to thank the supporting staff of University of Suffolk who have helped me with
managing my time and with accepting outside help due to mental health struggles.
This project would also not be possible without the hard work and dedication of the Tobii team who
first introduced me to eye tracking and have helped support this project with the software and
documentation they provide.
Table of Figures
Figure 1: Studies by Year of Eye Tracking in Video Games
Figure 4: Tobii's demo scene of a first-person shooter style game using eye tracking features
Figure 5: The Gaze Awareness script supplied from the Tobii SDK, adapted for this project
Figure 6: First 50 lines of the TobiiHost script, which is over 200 lines
Figure 9: The Kill Plane game object, placed just behind the main camera
Figure 10: The UserPresence script that pauses the game when the user's eyes are not looking at the screen
Figure 11: The debug log displaying the score once the game ends
Figure 12: Data table of collected scores from the different difficulty tests
It was found that creating a game with only eye tracking as an input is a unique challenge, one that requires not only coding solutions but also changes to the game design that make it possible. It also opens unique opportunities that would not be possible without the speed and precision of today's eye tracking hardware.
Keywords
Tobii, Eye Tracking, Gaze Interaction, Inputs, Unity.
1. INTRODUCTION
Eye tracking has been used in many ways within video games, most often as an enhancement of traditional control methods such as keyboard and mouse or a standard controller. It has been used by organisations such as Special Effect (2021) to allow those with impaired motor skills to play video games through their project named Stargaze. With companies such as Tobii targeting products specifically at gaming, developers have used the features that Tobii have worked on to create games with more immersive uses of eye tracking, such as "Dynamic Lighting Adaption" and "Clean UI" (Velloso, 2016).
This study aims to identify whether it is possible, with current Tobii eye tracking hardware and software, and with the interaction between the Tobii Eye Tracker and current industry-standard game engines such as Unity (2021), to create a game entirely controlled via eye tracking. While organisations such as Special Effect use proprietary software to make their eye gaze games work, and are limited to web games at the time of writing, this project aims to create a game functioning entirely with eye tracking in an industry-standard game engine, using only the Tobii software that is provided. The objectives of this project are:
1. To create a functioning game loop that only uses eye tracking as an input.
2. To investigate how feasible it is to create an entire game using only eye tracking.
3. To discuss the impact that eye tracking in video games could have on both the user's and the developer's experience.
[Figure 1: Yearly distribution of eye tracking in video games studies, 2006–2019]
The first thing to assess from the SLR is the yearly distribution shown in figure 1, which shows an increasing trend in published studies involving eye tracking in the creation or use of video games. Sundstedt, Navarro and Mautner (2016) also commented on the increased production of affordable, quality eye tracking products for use in gaming, and the increased support that leading retail companies in the field, such as Tobii, have given to developers, allowing them to directly support a list of 163 games (as of writing) (Tobii [3], 2021).
The affordability and ease of use for developers (discussed further later), specifically by Tobii, has led
to eye tracking being used by big game companies such as Ubisoft (2020) as discussed by Gomez and
Gellersen (2019) regarding “Tom Clancy’s: The Division 2”.
[Figure 2: Pie chart of eye tracking technologies used in the reviewed studies: Tobii 65%, HTC Legend / Android 7%, FOVE, N/A]
Figure 2 shows the overwhelming dominance that Tobii holds over the eye tracking in video games market. Sundstedt, Navarro and Mautner (2016) note an increase in affordable, reliable and non-intrusive eye trackers on the market, specifically referencing Tobii as paving the way forward with leaps in retail technology. This sentiment is echoed by Lankes et al. (2018), who state that Tobii Gaming have been leading the charge for affordable eye trackers for use in video games.
The other technologies mentioned in figure 2 are FOVE and an HTC Legend / Android eye tracker. The latter, used by Dybdal, Agustin and Hansen (2012), paired an HTC Legend smartphone running Android with a USB night vision camera and the open-source software ITU Gaze Tracker (EyeComTec, 2019) to track the user's eye movements on the screen and convert them into inputs. Essentially, this method did not use any dedicated eye tracking hardware but used software to adapt existing hardware for the task.
FOVE is the second and last named brand of eye tracking peripheral used: an add-on that enables eye tracking in virtual reality headsets. Khamis et al (2018) comment on the emergence of various similar add-ons that enhance VR gameplay in much the same way that Tobii's peripherals do outside of virtual reality, stating that "Eye tracking is a key technology for VR headsets and has therefore been integrated…".
As mentioned above, Tobii currently support 163 games with varying eye tracking features. These games range in genre, age and the features they use. One of the more notable features is "dynamic lighting adaption", which changes the brightness of the scene depending on where the user is looking; as Tobii describe it, "The human eye adapts amazingly to different light conditions so let the in-game environment take advantage of that". Another, less game-changing, feature is "clean UI", which hides UI elements such as the mini-map, health bar or ammo count until the user looks at where they would normally be, which can help with immersion. Most of the features offered by Tobii are used as enhancements of traditional controls; however, some act as a primary input, such as "extended view", which pans the camera in the direction the player is looking, or "mark at gaze", which marks an enemy or objective that the player is looking at.
In 2010, Vickers, Istance and Smalley discussed the possibility of those without full use of their motor functions using gaze interaction and gestures to participate in online communities such as World of Warcraft (Blizzard, 2021). Their conclusion was that emulating mouse movements alone was not enough for players to access all features of the game, and that features such as gaze gestures would be needed to make the game function at full capacity. In doing this, they were able to get 12 experienced World of Warcraft players to move and engage in combat via gaze gestures; the gestures created excelled in spell casting but struggled with continuous control of movement.
Vickers, Istance and Smalley (2010) state that “Eye movements are both fast as an input device and
are natural as means of pointing when compared to other input devices”. Gomez and Gellersen
(2019) back this sentiment, noting that it is almost too easy to aim in games using Tobii’s gaze
control.
This suggests that eye tracking as a primary input could be used in games that require a very fast reaction speed, such as an arcade-style shooter or rhythm game, since these games generally do not require movement at the same time as aiming, something many implementations of eye tracking struggle with because it is simply too complicated both for the hardware to reliably track and for the user to input.
Velloso and Carter (2016) go in depth on eye tracking as an input for several game mechanics, such as movement and aiming. They note many different methods for both of these mechanics and the use case each method fits best. For example, the mapping method, in which the player's avatar is positioned exactly where the user is looking, is noted to be best for games such as "Breakout" (Dorr et al, 2009) or others where the camera is fixed and the user is controlling an object in view. Virtual buttons, which work by mimicking the inputs of the game's regular controls, could also be used to control a more complex style of game, such as an MMO or RPG. This style was the one implemented by Vickers et al (2013), as mentioned above.
The counter to this point is brought up by Ekman, Poikola and Makarainen (2008) and Dybdal, Agustin and Hansen (2012), who state a general lack of precision in the eye trackers used, making it hard for specific eye movements to be used as inputs. While, at the time, these criticisms and shortcomings of the technology were justified, due in part to the infancy of the hardware and the lack of software support, these issues have since been solved through the continued innovation and support of Tobii and other companies such as Special Effect.
A game controlled completely via eye tracking is absolutely feasible, and has been achieved in previous studies such as Vickers et al (2013) and Dorr et al (2009). With the additional support of the SDK that Tobii provide for developers in the most popular game engines (Unity and Unreal), creating a game controlled exclusively with eye tracking is well within the scope of this project.
3. METHODOLOGY
This section describes the decisions made when planning the development work below and explains why those decisions were made.
This made the choice of hardware for this project straightforward: with no other company seriously challenging Tobii's dominance of the eye tracking market, the Tobii Eye Tracker 5 was the obvious way to go.
Included with the Tobii Eye Tracker 5 are all the software benefits that come with owning a Tobii product, such as Tobii Ghost (Tobii [2], 2021), which allows the user to record footage with an overlay showing the exact position of the user's gaze. This will be extremely useful for this project in demonstrating how the game loop in Unity functions with the user's eye inputs.
The Tobii development license states that “This license is for non-commercial use and not for
distribution. It’s a limited license that provides development rights with Tobii SDKs and eye tracking
related software and APIs”. Since the project is purely research, there is no conflict with this license.
The license also states that there can be no storage or transfer of eye tracking data and currently
there are no plans to store eye tracking data.
All assets and code that are not part of the Unity base assets or the Tobii SDK are of original creation. (Tobii Technology, 2021)
Following the research of Velloso and Carter (2016), it is clear that movement using just eye tracking is a difficult task with many possible solutions. While creating a game in which the user controls a character's movement was enticing and opens up many different aspects of games, it was eventually decided to create an "on-rails shooter" similar to House of The Dead (Nintendo, 2021). These games control the user's movement for them, and the only input involved is aiming and shooting.
According to Velloso and Carter (2016), the most common way to aim is to assign the gaze point to
control exactly where the crosshair would be. This is extremely natural for the user since, in most
cases, a player would look at an object before they shoot. The use of eye tracking in this scenario can
lead to a faster paced game being created than using traditional aiming inputs, as discussed by
Gomez and Gellersen (2019).
The final problem to solve is shooting. In most conventionally controlled games, a trigger action such as a mouse press or button input is used. There are some eye gestures that can be used to input a shot; for example, a blink or wink command is supported by Tobii. However, since the aim is to create a fast-paced shooting game, a small dwell time can be used instead: the user quickly scans the screen to find the object they aim to fire at, and after a short amount of time the shot fires at that object.
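The dwell-time approach described above can be sketched as a Unity script. This is a minimal, illustrative sketch only, not the project's actual implementation: it assumes the Tobii Unity SDK's TobiiAPI.GetGazePoint() call, and the class, tag and field names are hypothetical.

```csharp
using Tobii.Gaming;   // Tobii Unity SDK namespace (assumed)
using UnityEngine;

// Illustrative sketch: if the gaze rests on the same enemy for
// `dwellTime` seconds, a shot is registered on that enemy.
public class DwellFire : MonoBehaviour
{
    public float dwellTime = 0.3f;      // seconds the gaze must rest on a target
    private GameObject currentTarget;
    private float dwellTimer;

    void Update()
    {
        GazePoint gaze = TobiiAPI.GetGazePoint();
        if (!gaze.IsRecent()) return;   // no fresh gaze data this frame

        // Cast a ray from the camera through the on-screen gaze position.
        Ray ray = Camera.main.ScreenPointToRay(gaze.Screen);
        if (Physics.Raycast(ray, out RaycastHit hit) && hit.collider.CompareTag("Enemy"))
        {
            if (hit.collider.gameObject == currentTarget)
            {
                dwellTimer += Time.deltaTime;
                if (dwellTimer >= dwellTime)
                    Destroy(currentTarget);   // "fire" at the dwelled-on enemy
            }
            else
            {
                currentTarget = hit.collider.gameObject;   // new target, restart dwell
                dwellTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            dwellTimer = 0f;
        }
    }
}
```

A short dwell keeps the pace fast while still preventing every glance from counting as a shot; tuning `dwellTime` trades speed against accidental fires.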
Figure 4, Tobii's demo scene of a first-person shooter style game using eye tracking features.
These demo scenes provide a lot of insight into how the Tobii SDK and the Tobii Eye Tracker itself interact with Unity; for example, they show which scripts must be placed on certain objects, such as extended view needing to be placed on the camera object. The demo scenes were integral in clearly showing how to use eye tracking in Unity, and they make it a lot more feasible to create a game entirely with eye controls. They also inspire a lot of creativity in the ways they use eye tracking.
4. DEVELOPMENT
This section of the report aims to document the development of the game loop in Unity.
The first step was to import the Tobii SDK into the Unity project. The SDK includes many demo scenes and scripts that are very helpful for seeing how Tobii themselves integrate eye tracking into Unity; however, the demo scenes are not used in this project, and only a few scripts from Tobii are, namely the Gaze Awareness script and the TobiiHost script.
The Gaze Awareness script determines whether the object it is applied to is being looked at; in this project, if the game object intersects the gaze point, the object is deleted. This is the main interaction the player has with the game: aiming with their eyes.
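The shape of this adaptation can be sketched as follows. This is a simplified illustration rather than the project's exact script: it assumes the Tobii Unity SDK's GazeAware component and its HasGazeFocus property, and the class name is hypothetical.

```csharp
using Tobii.Gaming;   // Tobii Unity SDK namespace (assumed)
using UnityEngine;

// Sketch of the adaptation described above: the SDK's GazeAware component
// reports whether this object currently has gaze focus, and a Destroy call
// is added so a looked-at enemy is removed from the scene.
[RequireComponent(typeof(GazeAware))]
public class DestroyOnGaze : MonoBehaviour
{
    private GazeAware gazeAware;

    void Start()
    {
        gazeAware = GetComponent<GazeAware>();
    }

    void Update()
    {
        if (gazeAware.HasGazeFocus)
            Destroy(gameObject);   // the added behaviour: remove the enemy when looked at
    }
}
```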
The other script taken from the Tobii SDK is the "TobiiHost" script, part of which is shown in figure 6. This script essentially allows the game object it is placed on to become the "host" of the eye tracking. In the project it is placed on the camera, so the user's eyes are on the same plane as what they see. Without the TobiiHost script the eye tracking would not work at all.
The next step in development was to create an enemy game object. The basic Unity cuboid asset was used to create a simple shape that is easy to follow and has very normalised hit detection, so the eye tracking does not have to be extremely accurate. The object was also made to stand out from the background so the player can identify it immediately; this was done by adding an emissive mask so it glows a neon blue colour.
As shown in figure 7, the collider of the enemy game object has been made bigger. During original testing it was relatively hard to accurately hit the enemies using eye tracking, especially when they were further away and therefore smaller. This change makes the game feel more accurate and responsive, while also making it easier.
The resulting difficulty shift was accounted for by making the enemy objects spawn more quickly. The "SpawnEnemy" script is a simple one, created in a way that allows quick changes to the enemies' spawn time and locations.
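A spawner of the kind described above might look like the following sketch. The actual SpawnEnemy script is not reproduced in this text, so all names and values here are illustrative assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the simple spawner: the spawn interval and spawn
// points are public fields so difficulty can be tuned quickly in the
// Unity Inspector without code changes.
public class SpawnEnemy : MonoBehaviour
{
    public GameObject enemyPrefab;
    public Transform[] spawnPoints;    // positions within the camera's view cone
    public float spawnInterval = 2f;   // lower value = faster spawns = harder game

    private float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= spawnInterval)
        {
            timer = 0f;
            // Pick a random spawn point and create a new enemy there.
            Transform point = spawnPoints[Random.Range(0, spawnPoints.Length)];
            Instantiate(enemyPrefab, point.position, point.rotation);
        }
    }
}
```

Exposing `spawnInterval` as a public field is what allows the difficulty variations in the testing section to be set up quickly.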
The basic idea of the game is that the enemy cube objects spawn in and move towards the player; if they hit the wall behind the player, the player loses a life, and the longer the player survives, the higher their score.
Figure 9, The Kill Plane game object, placed just behind the main camera.
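The Kill Plane mechanic can be sketched as a simple trigger script. This is an assumed illustration: the GameManager singleton, the LoseLife method and the "Enemy" tag are hypothetical names, not confirmed details of the project.

```csharp
using UnityEngine;

// Illustrative sketch of the Kill Plane behaviour: a trigger volume just
// behind the camera removes any enemy that reaches it and costs a life.
public class KillPlane : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Enemy"))
        {
            Destroy(other.gameObject);       // the enemy got past the player
            GameManager.Instance.LoseLife(); // assumed game-state singleton
        }
    }
}
```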
Since the game is completely controlled via eye tracking, it is important for the game to know whether the user is looking at the screen. This is checked by the "UserPresence" script, which uses Tobii's SDK to check whether the user's eyes are focused on the screen; if they are not, the game is paused.
Figure 10, The UserPresence script that pauses the game when the users’ eyes are not looking
at the screen.
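The pause behaviour described above can be sketched roughly as follows, assuming the Tobii Unity SDK reports presence via TobiiAPI.GetUserPresence(); the class name is illustrative.

```csharp
using Tobii.Gaming;   // Tobii Unity SDK namespace (assumed)
using UnityEngine;

// Sketch of the UserPresence idea: freeze the game clock whenever the
// eye tracker reports that the user's eyes are not on the screen.
public class UserPresencePause : MonoBehaviour
{
    void Update()
    {
        bool present = TobiiAPI.GetUserPresence() == UserPresence.Present;
        Time.timeScale = present ? 1f : 0f;   // pause when the eyes leave the screen
    }
}
```

Driving the pause through `Time.timeScale` stops enemy movement and spawning in one place, since both are driven by `Time.deltaTime`.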
A scoring mechanism was also added in the last stages of development, as well as a lives counter. The scoring works by adding score for as long as the player is alive and the game is continuing. The score increase per second grows with the number of lives the player has, so they are rewarded for letting no enemies through.
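The scoring rule of score per second scaled by remaining lives might be sketched like this; field names and the exact per-second award are illustrative assumptions, not the project's actual values.

```csharp
using UnityEngine;

// Hypothetical sketch of the scoring rule: score accumulates every second,
// scaled by the number of lives remaining, so keeping all lives is rewarded.
public class ScoreCounter : MonoBehaviour
{
    public int lives = 3;
    public int score;

    private float secondTimer;

    void Update()
    {
        secondTimer += Time.deltaTime;
        if (secondTimer >= 1f)
        {
            secondTimer -= 1f;
            score += lives;   // more lives remaining = more points per second
        }
    }
}
```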
The accuracy still suffered from issues, though: when enemies reached the sides of the screen it was hard to hit them consistently, which could be frustrating for the player. This is probably because of the game's camera setup rather than the eye tracker's accuracy, although the reasoning is not entirely clear.
Tobii also provide many scripts in the SDK, some of which are required to make the eye tracker interact with Unity. These scripts are useful to implement, and some have been used in the project, as discussed above.
This was tested by changing both the spawn speed and the enemy speed and comparing the resulting scores. Each of the 4 variations was run 3 times, and the scores were averaged to smooth out any anomalies.
Figure 11, the debug log displaying the score once the game ends.
Test 1 had the fastest spawn and enemy speed, test 2 had the fastest spawn speed but slower
enemy speed, test 3 had the slowest spawn speed and fastest enemy speed and test 4 had the
slowest of both. Below are the results (Average score is rounded to the nearest whole number):
6. DISCUSSION
As mentioned in the conclusion of the SLR, it is completely possible to create a fully functioning game that uses only eye tracking as a primary input. Following the findings of Velloso and Carter (2016) and Vickers et al (2013) made this project substantially easier, as they clearly define what works for eye tracking in video games. The game loop that has been developed, while primitive, provides clear building blocks for a bigger, fully fledged game that could still be controlled exclusively via eye tracking.
Vickers et al (2013) have had the best documented success when it comes to adapting eye tracking controls to commercial games, managing to create controls for the massively multiplayer online game "World of Warcraft" with features such as gaze gestures and virtual buttons that the user could press with their eyes. While these features did not make it into this project, a lot of useful information was gathered, such as the statement by Vickers et al that emulating mouse movements was not enough to create a game with many features. Velloso and Carter (2016) provided a great culmination of eye tracking techniques, and while most of them have not been used in this project, they give a clear picture of what is possible.
While the enemy game objects are just simple cubes, the addition of the emissive material made them a lot easier to "lock" onto with one's eyes. This was also found by Sundstedt (2010), who notes that "Building scenes for the purposes of eye selection required careful positioning of cameras and making selectable objects larger and spaced apart". In this vein, the camera was placed statically and all objects were spawned within a relatively small cone of the camera's vision, allowing the user to notice them more easily since they are within their direct line of sight. The same approach motivated the emissive material discussed above and the background's distinct lack of "character", which lets the enemy game objects become the main feature of the scene with minimal distraction.
The programming for this project was easier than anticipated, thanks to the Tobii SDK, which provides much of the interaction between the eye tracker and the game engine via scripts. This meant that many of the scripts authored for this project are simple and work around the Tobii SDK. Small adjustments were made to some of Tobii's scripts, such as the "Gaze Awareness" script, to which a line destroying game objects was added so that an enemy the user looks at is destroyed. While the online documentation for the Tobii SDK is lacking, as previously stated, the forum support and the SDK demo scenes provide great insight into how other developers have used eye tracking and how it can be utilised.
During testing it was found that the eye tracker could be slightly inaccurate with some of the enemy objects, as discussed in the results section above. Sundstedt (2010) ran into similar issues when attempting to create a point-and-click style game using eye tracking; the workaround used in that case was to snap the gaze cursor to the nearest object. While that specific solution could have been used, eye trackers have come a long way since 2010, and a simple change to the enemy's hitbox, as discussed in the development section, made the hit detection a lot more consistent. However, there were still problems with enemies that moved to the far parts of the screen, as discussed previously; this may be due to several factors, from the calibration to the author's preferred eye.
For future work, it would be interesting to see different genres of games adapted to work with only eye tracking, to find where the limits of the technology currently lie, as well as to see the different control methods described by Velloso and Carter (2016) in action and how they compare to the eye tracking methods used in this project. Continued development of the game in this project would also be interesting, particularly whether some of Tobii's current features such as "Dynamic Lighting" and "Extended View" would fit into a game completely controlled with one's eyes.
8. REFERENCES
Blizzard (2021) World of Warcraft. Available at: https://worldofwarcraft.com/en-us/ (Accessed: 4th
August 2021).
Dorr, M., Böhme, M., Martinetz, T., Barth, E. (2009) Gaze beats mouse: a case study. Lubeck,
Germany. July 7th 2009.
Dybdal, M. Agustin, J and Hansen, J. (2012) Gaze input for mobile devices by dwell and gestures.
Barbara, California, March 28th-30th. Doi 10.1145/2168556.2168601
Ekman, I. Poikola, W and Makarainen, M. (2008) Invisible eni: using gaze and pupil size to control a
game. Florence, Italy, April 5th-10th. Doi 10.1145/1358628.1358820
Gomez, A and Gellersen, H. (2019) Looking Outside the Box: Reflecting on Gaze Interaction in
Gameplay. Barcelona, Spain, October 22nd – 25th. Doi 10.1145/3311350.3347150
Jacob, R. (1991) The Use of Eye Movements in Human-Computer Interaction Techniques: What You
Look At is What You Get. Washington, D.C. April. Doi 10.1145/123078.128728.
Khamis, M. Oechsner, C. Alt, F and Bulling, A. (2018) VRpursuits: interaction in virtual reality using
smooth pursuit eye movements. Grosseto, Italy. May 29th – June 1st. Doi 10.1145/3206505.3206522
Lankes, M. Newn, J. Maurer, B. Velloso, E. Dechant, M and Gellersen, H. (2018). EyePlay Revisited:
Past, Present and Future Challenges for Eye-Based Interaction in Games. Melbourne, Australia,
October 28th-31st. Doi 10.1145/3270316.3271549
Sundstedt, V. Navarro, D and Mautner, J. (2016). Possibilities and challenges with eye tracking in
video games and virtual reality applications. Macau, December 5th – 8th. Doi
10.1145/2988458.2988466
Sundstedt, V. (2010). Gazing at games: using eye tracking to control virtual characters. Dublin,
Ireland. July 28th. Doi 10.1145/1837101.1837106
Tobii (2021). Tobii Gaming. Available at: https://gaming.tobii.com/ (Accessed: 4th June 2021)
Tobii [2] (2021). Tobii Ghost – Stream with Eye Tracking. Available at:
https://gaming.tobii.com/software/ghost/ (Accessed: 4th June 2021)
Tobii [3] (2021). Tobii Gaming | PC games with eye tracking. Available at:
https://gaming.tobii.com/games/ (Accessed 4th June 2021)
Velloso, E and Carter, M. (2016) The Emergence of EyePlay: A Survey of Eye Interaction in Games.
Austin, Texas, October 16th-19th. Doi 10.1145/2967934.2968084
Unity (2021) Unity Real-Time Development Platform. Available at: https://unity.com/ (Accessed 4th
June 2021).
Vickers, S. Istance, H and Smalley, M. (2010) EyeGuitar: making rhythm based music video games
accessible using only eye movements. Taipei, Taiwan, November 17th-19th. Doi
10.1145/1971630.1971641
Vickers, S. Istance, H and Heron, M. (2013) Accessible gaming for people with physical and cognitive
disabilities: a framework for dynamic adaptation. Paris, France, April 27th – May 2nd. Doi
10.1145/2468356.2468361
9. BIBLIOGRAPHY
Blizzard (2021) World of Warcraft. Available at: https://worldofwarcraft.com/en-us/ (Accessed: 4th
August 2021).
Burch, M and Kurzhals, K. (2020) Visual Analysis of Eye Movements During Game Play. Eindhoven,
Netherlands. June. Doi 10.1145/3379156.3391839
Dorr, M., Böhme, M., Martinetz, T., Barth, E. (2009) Gaze beats mouse: a case study. Lubeck,
Germany. July 7th 2009.
Dybdal, M. Agustin, J and Hansen, J. (2012) Gaze input for mobile devices by dwell and gestures.
Barbara, California, March 28th-30th. Doi 10.1145/2168556.2168601
Ekman, I. Poikola, W and Makarainen, M. (2008) Invisible eni: using gaze and pupil size to control a
game. Florence, Italy, April 5th-10th. Doi 10.1145/1358628.1358820
Gomez, A and Gellersen, H. (2019) Looking Outside the Box: Reflecting on Gaze Interaction in
Gameplay. Barcelona, Spain, October 22nd – 25th. Doi 10.1145/3311350.3347150
Jacob, R. (1991) The Use of Eye Movements in Human-Computer Interaction Techniques: What You
Look At is What You Get. Washington, D.C. April. Doi 10.1145/123078.128728.
Lankes, M. Newn, J. Maurer, B. Velloso, E. Dechant, M and Gellersen, H. (2018). EyePlay Revisited:
Past, Present and Future Challenges for Eye-Based Interaction in Games. Melbourne, Australia,
October 28th-31st. Doi 10.1145/3270316.3271549
Mohamed, A. Da Silva, M and Courboulay, V. (2007) A history of eye gaze tracking. Available at:
https://www.researchgate.net/publication/29642053_A_history_of_eye_gaze_tracking (Accessed
4th June 2021).
Smith, J and Graham, J. (2006) Use of eye movements for video game control. Hollywood, California,
June 14th – 16th. Doi 10.1145/1178823.1178847
Sundstedt, V. Navarro, D and Mautner, J. (2016). Possibilities and challenges with eye tracking in
video games and virtual reality applications. Macau, December 5th – 8th. Doi
10.1145/2988458.2988466
Sundstedt, V. (2010). Gazing at games: using eye tracking to control virtual characters. Dublin,
Ireland. July 28th. Doi 10.1145/1837101.1837106
Tobii (2021). Tobii Gaming. Available at: https://gaming.tobii.com/ (Accessed: 4th June 2021)
Tobii [2] (2021). Tobii Ghost – Stream with Eye Tracking. Available at:
https://gaming.tobii.com/software/ghost/ (Accessed: 4th June 2021)
Tobii [3] (2021). Tobii Gaming | PC games with eye tracking. Available at:
https://gaming.tobii.com/games/ (Accessed 4th June 2021)
Tobii Technology (2021) License agreement for developing apps and games with Tobii Engine.
Available at: https://developer.tobii.com/license-agreement/ (Accessed 1st August 2021)
Velloso, E and Carter, M. (2016) The Emergence of EyePlay: A Survey of Eye Interaction in Games.
Austin, Texas, October 16th-19th. Doi 10.1145/2967934.2968084
Unity (2021) Unity Real-Time Development Platform. Available at: https://unity.com/ (Accessed 4th
June 2021).
Vickers, S. Istance, H and Heron, M. (2013) Accessible gaming for people with physical and cognitive
disabilities: a framework for dynamic adaptation. Paris, France, April 27th – May 2nd. Doi
10.1145/2468356.2468361
Wetzel, S. Spiel, K and Bertel, S. (2014) Dynamically adapting an AI game engine based on players’
eye movements and strategies. Rome, Italy, June 17th -20th. Doi 10.1145/2607023.2607029
10. APPENDICES
10.1 Search Strategy
This section of the SLR aims to show how a clear and unbiased foundation of research was collected
for use in the following sections.
The ACM Digital Library was the main research resource for this review. Some sources outside of it may be used in support of the main articles found on the ACM Digital Library. The references in the collected articles will also be searched for suitable information relevant to the research questions.
Initial searches will be conducted on the ACM Digital Library using keywords and phrases such as
“Tobii” and “eye-tracking in video games”.
Inclusion Criteria
• Studies that discuss eye tracking technology past the year 2004.
• Studies that discuss the use of eye tracking in video games in any form, whether that be for
developer or consumer uses.
• Studies that speculate on how eye tracking can be used in the future of video games.
• Studies that discuss the accessibility impacts of eye tracking technology in gaming.
Exclusion Criteria
Data Extraction
The form below is used to extract data for addressing the research questions within this SLR:
Quality Assessment
This section is used to examine the credibility of the research methods and the relevance of the
citations.
1. Are the citations relevant? The citations should answer or discuss the research questions
highlighted at the beginning of this SLR, otherwise it is not relevant. Also, the citations have
been checked against the inclusion and exclusion data by the researcher who has undertaken
this SLR.
2. Does the research cover all relevant citations? The use of relevant and specific keywords as
well as the “snowballing” technique should ensure that only relevant citations are used.
3. Does each citation contain adequate information? Each citation should contain sufficient
information to answer one or more of the research questions.
Yearly Distribution
The first of the collected research data stems from the year 2006, as per the inclusion and exclusion criteria. As shown by figure 1 below, there has been an influx of citations since 2010.
Figure 1 – Yearly distribution graph
Technology Used
Over the 13 citations found on the ACM digital library that have been used for this SLR, 65% of them
listed Tobii (Tobii, 2019) or a Tobii product as the technology that was used or mentioned for eye
tracking technology. This is shown in the figure below.
[Figure 2: Pie chart of eye tracking technologies used: Tobii 65%, HTC Legend / Android 7%, FOVE, N/A]
Velloso and Carter (2016) mention Tobii as an affordable device being implemented into gaming laptops to reach a wider audience, and thus having the potential to meaningfully impact game development. Smith and Graham (2006) also used a Tobii product, albeit a much older version, the Tobii 1750 eye tracker, to complete their study comparing player performance across different inputs.
Sundstedt, Navarro and Mautner (2016) state that there has been an increase in affordable, reliable
and non-intrusive eye trackers and specifically reference Tobii as a leading manufacturer of eye
tracking technology in the gaming industry.
There were 2 other eye tracking technologies mentioned across the 13 citations: FOVE, and HTC Legend / Android eye tracking. FOVE (FOVE, 2019) is a VR headset addition that allows users with VR headsets such as the HTC Vive (Vive, 2019) to use eye tracking. More devices like FOVE are being designed and produced due to the emergence and popularity of VR and AR devices that would benefit from the incorporation of eye tracking technology (Velloso, 2016). Khamis et al (2018) state: 'The advent of affordable and high-quality VR headsets has incited the development of various VR applications. Eye tracking is a key technology for VR headsets and has therefore been integrated…'.
The final eye tracking technology mentioned is HTC Legend / Android eye tracking. Dybdal, Agustin and Hansen (2012) used the Android OS on an HTC Legend smartphone, held in landscape mode, to run the game that was to be controlled by the user’s gaze. The eye tracking itself was done using a 30 Hz USB night-vision camera and open-source software known as ITU Gaze Tracker (EyeComTec, 2019). This system sent 30 samples of gaze data a second to the mobile device via Wi-Fi.
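To illustrate the kind of pipeline Dybdal, Agustin and Hansen describe, the sketch below receives gaze samples streamed over Wi-Fi as UDP datagrams and parses them into screen coordinates. This is an assumption for illustration only, not the actual ITU Gaze Tracker protocol: the "x,y" payload format, the port number, and the sample count are all hypothetical.

```python
import socket

def parse_gaze_packet(payload: bytes):
    """Decode one hypothetical 'x,y' datagram into an (x, y) float pair."""
    x_str, y_str = payload.decode("ascii").split(",")
    return float(x_str), float(y_str)

def receive_gaze_samples(port=5555, count=30):
    """Collect `count` gaze samples (~one second at 30 Hz) over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    samples = []
    try:
        while len(samples) < count:
            payload, _addr = sock.recvfrom(64)  # small datagrams per sample
            samples.append(parse_gaze_packet(payload))
    finally:
        sock.close()
    return samples
```

At 30 Hz the receiving game loop only needs to drain a handful of datagrams per frame, which is why a lightweight connectionless transport such as UDP is a plausible fit for this kind of tracker-to-device link.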
Lankes et al. (2018) state that Tobii Gaming has been leading the charge on affordable eye trackers aimed at the gaming market in recent years, with compatibility with 149 games (as of the writing of this SLR).
Impacts on Accessibility
Of the 13 citations, only 4 mentioned the possible accessibility benefits that eye tracking technology in gaming could provide (as shown in the figure below).
Figure 3 – Bar chart on the discussion of accessibility using eye tracking technology
Istance et al (2010) discusses the ability to participate in online communities such as World of Warcraft
(Blizzard, 2019) using eye gaze only. They state that the use of mouse emulation by gaze is not enough
for users with motor impairments to be able to play and access all features of the game. A heavily
discussed topic in this citation is the ability to enter text using only gaze gestures, to allow
communication for motor impaired users who wish to play and participate in online communities.
They found that 12 experienced players of World of Warcraft were able to move around efficiently using only gaze control with a Tobii X120 eye tracker. The participants could successfully engage other characters in combat, and the gaze gestures they developed were well suited to issuing discrete commands such as spell casting or attacking, but less suited to continuous control of movement. The citation concludes: ‘We believe that gaze gestures are an effective means of interacting with MMORPGs…’.
Kumar, Burch and Mueller (2018) also discuss the use of eye tracking as an input to make games more accessible to those with motor impairments or difficulties. They state that, at present, the application of eye tracking in gaming has mostly been as an input: a substitute for traditional input methods such as a mouse and keyboard or a controller.
Eye tracking as an input to help people with motor impairments has also been discussed by Vickers, Istance and Smalley (2010), who state that ‘Eye movements are both fast as an input device and natural as a means of pointing when compared to other input devices’. They propose a test using the Snap Clutch framework, which acts as middleware and runs independently of both the Tobii eye tracker and the game; it takes the user’s gaze position and processes that data to detect gestures.
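To make the middleware idea concrete, the sketch below watches a stream of raw gaze positions and flags a simple "glance off the top edge and back" gesture, similar in spirit to the off-screen glances Snap Clutch uses to switch modes. This is an illustrative sketch only, not the actual Snap Clutch implementation; the threshold values and coordinate convention are assumptions.

```python
def detect_edge_glance(samples, edge_y=0.02, min_off=2):
    """Return True if the gaze left via the top edge and came back.

    samples: chronological (x, y) gaze points in normalised screen
    coordinates (y = 0 at the top of the screen). A gesture counts
    when at least `min_off` consecutive samples sit above the top
    edge and the gaze then returns on-screen.
    """
    off_count = 0
    for _x, y in samples:
        if y < edge_y:
            off_count += 1  # gaze is above the top edge of the screen
        else:
            if off_count >= min_off:
                return True  # gaze went off the edge, then returned
            off_count = 0
    return False
```

Because a check like this runs on the raw gaze stream rather than inside the game, it can translate gestures into ordinary input events for any unmodified game, which is what makes the middleware approach attractive for accessibility.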
Developer Benefits
As stated in the previous section, Kumar, Burch and Mueller (2018) discuss and study eye behaviour in a gaming environment. They note in their results that players tended to ignore the edges of the screen in their checkers game; this could be because the edges contain no relevant information in this type of game or, as the authors suggest, because people psychologically tend to focus on the centre of screens.
Wetzel, Spiel and Bertel (2009) discuss the possibility of using eye tracking data to significantly improve the gameplay experience by adapting an AI agent to a user’s play. This is done by tracking the user’s eye movements as fixations and saccades (a fixation being a point at which the eye stays still, and a saccade being a rapid movement between fixation points). This data was used to make the AI of the tested game adapt to users’ strategies; players who knew their eye movements were being tracked by the AI reported a decrease in frustration with the game.
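Fixations and saccades are typically separated algorithmically before any adaptation can happen. The sketch below is a minimal dispersion-threshold (I-DT style) fixation detector, offered purely as an illustration of the concept described above; the thresholds, coordinate convention, and sample format are assumptions, not values from the cited study.

```python
def _dispersion(window):
    """Spread of a window of gaze points: x-range plus y-range."""
    xs = [x for x, _ in window]
    ys = [y for _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=0.05, min_samples=6):
    """Group consecutive gaze samples into fixations.

    samples: chronological (x, y) gaze points in normalised screen
    coordinates. A window counts as a fixation while its dispersion
    stays under `max_dispersion` and it spans at least `min_samples`
    points; movement between fixations is treated as saccadic.
    Returns (centre_x, centre_y, n_samples) per fixation.
    """
    fixations = []
    start = 0
    while start + min_samples <= len(samples):
        end = start + min_samples
        if _dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while the dispersion stays acceptable.
            while end < len(samples) and \
                    _dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            window = samples[start:end]
            cx = sum(x for x, _ in window) / len(window)
            cy = sum(y for _, y in window) / len(window)
            fixations.append((cx, cy, len(window)))
            start = end
        else:
            start += 1  # no fixation starts here; slide the window on
    return fixations
```

A game AI in the style Wetzel, Spiel and Bertel describe would then treat the fixation centres as the regions of the board the player is attending to, and bias its own decisions accordingly.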
Consumer Benefits
Lankes et al. (2018) state that ‘In recent years, gaming has been at the forefront of the commercial popularization of eye tracking’. They go on to discuss how the continued development of affordable and easy-to-integrate eye trackers has led to new and interesting game mechanics, which is also discussed by Velloso and Carter (2016) in their survey.
Gomez and Gellersen (2019) mention how gaze interaction has progressed from a tool used mainly for accessibility purposes to one that offers users enhanced controller performance and greater immersion. They also mention that eSports athletes sometimes stream their games with eye tracking software enabled, showing their audience where they are looking, which can help viewers understand the strategies they may be using.
Vickers, Istance and Smalley (2010) observe during an experiment that 90% of participants scored
higher using the eye gaze input versus the traditional keyboard.
On the other hand, Ekman, Poikola and Makarainen (2008) observe that pupil size is too irregular to work as an input for most games; the teenagers who took part in the study found it nearly impossible to control their pupil size in order to manipulate events in the game.
DISCUSSION
As shown in Figure 1, there has been an increase in citations on eye tracking technology in video games since 2010. This is also discussed by Sundstedt, Navarro and Mautner (2016): as previously stated, there has been an increase in the production of affordable, high-quality eye tracking products for use in gaming, as well as increased support, shown by Tobii’s frequently updated list of 149 compatible games. Triple-A gaming companies are also implementing eye tracking features into their games, as Gomez and Gellersen (2019) discuss in relation to Tom Clancy’s The Division 2.
Based on this SLR, eye tracking and gaze control in gaming have mostly been used as an input device to help those with motor impairments, as discussed by Vickers, Istance and Smalley (2010). More recently, however, gaze control has been used as an enhancement to traditional methods of controlling games, or even as a tool to help developers create sophisticated, adaptive AI agents, as discussed by Wetzel, Spiel and Bertel (2009).
It is also clear from this SLR that eye tracking and gaze control help those with motor impairments or difficulties to play and be a part of the gaming community, as discussed by Istance et al. (2010). They also allow such players to take part in games that usually require not only full body control but also fast reaction times to achieve high scores. Eye gaze control has the benefit of being a natural movement and of reacting to on-screen movement extremely quickly, faster than almost any hand press of a button. This is all discussed by Vickers, Istance and Smalley (2010).