
Int. J. Human–Computer Studies 170 (2023) 102966


Eyes can draw: A high-fidelity free-eye drawing method with unimodal gaze control

Lida Huang a,∗, Thomas Westin a, Mirjam Palosaari Eladhari a, Sindri Magnússon a, Hao Chen b

a Department of Computer and Systems Sciences, Stockholm University, Stockholm, Sweden
b People's Hospital of Guangxi Zhuang Autonomous Region, Nanning, China

ARTICLE INFO

MSC: 00-01; 99-00
Keywords: Free-eye drawing; Eye tracking; Unimodal gaze control

ABSTRACT

EyeCompass is a novel free-eye drawing system enabling high-fidelity and efficient free-eye drawing through unimodal gaze control, addressing the bottlenecks of gaze-control drawing. EyeCompass helps people to draw using only their eyes, which is of value to people with motor disabilities. Currently, there is no effective gaze-control drawing application due to multiple challenges including involuntary eye movements, conflicts between visuomotor transformation and ocular observation, gaze trajectory control, and inherent eye-tracking errors. EyeCompass addresses these challenges using two initial gaze-control drawing mechanisms: brush-damping dynamics and the gaze-oriented method. The user experiments compare the existing gaze-control drawing method and EyeCompass, showing significant improvements in the drawing performance of the mechanisms concerned. The field study conducted with motor-disabled people produced various creative graphics and indicates good usability of the system. Our studies indicate that EyeCompass is a high-fidelity, accurate, feasible free-eye drawing method for creating artistic works via unimodal gaze control.

1. Introduction

Free-hand drawing is the primary approach employed in able-bodied drawing and an important component of creative and artistic activities, especially in the first conceptual phase. Free-hand drawing stimulated by eyes is known as ''free-eye drawing'' (Tchalenko, 2001). For motor-disabled people, free-eye drawing enables them to express their ideas visually; thus, it is important to develop accessible free-eye drawing methods for the motor disabled.

However, currently, no research has been able to facilitate gaze-control or free-eye drawings with high-fidelity graphics and system accessibility. There are many challenges related to drawing high-fidelity graphics via gaze control. For instance, early gaze-control drawing applications cannot distinguish between viewing and drawing, as the gaze signal is continuous (Gips and Olivieri, 1996; Heikkilä, 2013a,b; Hornof et al., 2003; Meyer and Dittmar, 2009; Van der Kamp and Sundstedt, 2011). In addition, unintentional eye movements and eye jitters can cause involuntary selections, the so-called ''Midas-Touch'' problem (Chin et al., 2008; Istance et al., 2008; Jacob, 1991). As the current gaze-control drawing method generates drawn lines from the position of the user's gaze points on screen in real time, any involuntary saccades will cause drawing errors. Furthermore, drawing by hand is a dual task, where the eyes are used for viewing and the hands for drawing. By contrast, gaze-control drawing involves performing a dual task with the eyes only, causing conflict between visuomotor transformation and ocular observation (Gips and Olivieri, 1996; Hornof et al., 2003).

Fundamentally, free-eye drawing requires emulation of the click-and-drag mouse event. To the best of the authors' knowledge, there is a scarcity of recent studies focusing on the interactive modes or methods to facilitate accurate control of gaze trajectories in real time. Consequently, there is neither an effective gaze-control interaction method nor an application for conducting drawing and related graphic arts. Lastly, eye-tracking techniques make unavoidable errors when estimating the actual gaze position, causing a disparity between what the user's gaze produces and what they are seeking to achieve.

Gaze-control modality is also a challenge for free-eye drawing. Unimodal gaze control has many advantages; for example, inclusion for motor-disabled people. As a single modality, it is also convenient to integrate with additional input devices to extend its interaction. However, unimodal gaze control is limited in that it cannot facilitate simultaneous viewing and drawing. Many related studies have employed multimodal gaze control (Van der Kamp and Sundstedt, 2011; Dziemian et al., 2016; Creed et al., 2020) to allow eyes to accomplish the dual task. However, multimodal gaze control is less accessible, as it excludes severely motor-disabled people whose hand or foot use is

∗ Corresponding author.
E-mail address: lyyda@hotmail.com (L. Huang).

https://doi.org/10.1016/j.ijhcs.2022.102966
Received 24 January 2022; Received in revised form 4 November 2022; Accepted 13 November 2022
Available online 21 November 2022
1071-5819/© 2022 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

impaired (Cecotti, 2016; Dziemian et al., 2016; Parisay et al., 2021). It also has a greater requirement for equipment.

Almost a decade has passed since the latest gaze-control drawing literature was published in 2013 (Heikkilä, 2013a,b); there is currently neither a solution nor a viable free-eye drawing system to address the aforementioned problems (i.e., involuntary eye movements, the conflict between visuomotor transformation and ocular observation, gaze trajectory control, inherent eye-tracking errors, and unimodal gaze control). Unfortunately, existing gaze-control methods reported in the earlier literature (Gips and Olivieri, 1996; Heikkilä, 2013a,b; Hornof et al., 2003; Meyer and Dittmar, 2009; Van der Kamp and Sundstedt, 2011) are feasible neither for visual creativity at a free-hand drawing level nor with modern technological advancements.

This paper therefore proposes a novel free-eye drawing system (EyeCompass) that enables high-fidelity and efficient free-eye drawing via unimodal gaze control. This system aims to address the bottlenecks of accurate click-and-drag for gaze interaction and innovate gaze-control drawing methods. There are three important components in the techniques used by EyeCompass: (1) the ability of the eyes to 'acquire' the brush and navigate the input, rather than acting as direct input; (2) the use of visual cursors in the system interface; and (3) the way in which smooth pursuit is triggered and employed in gaze interaction. These three components integrate within EyeCompass and can also be generalized to universal eye-tracking applications.

The structure of this paper is as follows. First, the previous literature in gaze-control drawing is reviewed in Section 2. Then, the rationale of gaze-control drawing is discussed in Section 3, while the design principles and the techniques used in EyeCompass are introduced in Sections 4 and 5. Then, in Sections 6 to 8, the implementation of two user experiments is presented to explore how our method affects graphical fidelity. In Section 9, a field study among motor-disabled people demonstrating the usability of EyeCompass in context by its intended users is reported. Finally, the implications of this study's findings for future gaze-control interaction design are discussed in Sections 10 and 11.

As stated, the overall aim of the work presented in this paper is to create a system that can help people to draw using their eyes instead of using their limbs. In particular, the paper makes the following five contributions:
(1) A retrospective review of free-eye drawing techniques across disciplines, including eye physiology, eye movement patterns in drawing, and eye-tracking technologies, and systematic identification of multiple challenges in the field in terms of the usability of existing free-eye drawing techniques.
(2) A novel and open-source unimodal gaze-control drawing system and drag-and-drop emulation that can be generalized to common graphics software and similar gaze interaction. The system has been made publicly available to enable future researchers to build new functionalities on top of our results.
(3) Brush-damping dynamics that can stabilize gaze-trajectory control, enhance drawing accuracy, and enable the user to create smooth curves that were previously unattainable.
(4) A gaze-oriented mechanism leveraging smooth pursuits in an innovative way so that the user can control the visual stimulus, reduce involuntary eye movements, and navigate the gaze point freely.
(5) A comparison of our methods to the existing free-eye drawing methods in terms of: a) drawing performance and system robustness, and b) usability and accessibility among motor-disabled people.

2. Related research

2.1. Unimodal gaze-control drawing methods

The earliest gaze-control drawing function was created in the EagleEyes system (Gips and Olivieri, 1996) in 1996. This system employed a simple free-eye drawing (Tchalenko, 2001) function using electrooculography (EOG) to translate gaze-trajectory signals into colored lines. The study identified two significant problems with the free-eye drawing method. First, it could not distinguish viewing and drawing, as the eye-tracking signal was continuous. Second, the study noted the difficulty in accurately controlling the gaze for drawing. Further studies of free-eye drawing subsequently stagnated, as the approach was considered error-prone and impracticable.

To address the conflicts between viewing and drawing, EyeDraw proposed a state transition between ''Looking'' and ''Drawing'' for eyes to enable ocular observation (Hornof et al., 2003). However, precise control of the cursor was indicated to be problematic in the user feedback on EyeDraw (Hornof et al., 2003). Later, Yeo and Chiu (2006) classified the viewing and drawing involved in eye drawing by interpreting natural eye movement patterns and distinguishing the distances between gaze shifts and eye-behavior jitters; however, they did not report their results.

Due to the lack of control of gaze trajectories, previous researchers halted exploration of the free-eye drawing approach and adopted other eye drawing strategies, the most common of which provided a set of basic shapes to be applied to a canvas. For example, the EyeDraw drawing tool was based on a system of built-in geometries, namely straight lines, rectangles, and ellipses, as well as colors. Another application, EyeArt (Meyer and Dittmar, 2009), extended the preset geometries to include, for example, circles, text, polylines, and polygons. It also supported broader canvas editing functions, such as eraser, color fill, and undo. The most recent gaze-control drawing prototype was EyeSketch (Heikkilä, 2013a,b), developed in 2013, which utilizes a similar drawing tool to EyeDraw for rectangles, ellipses, and lines.

To our knowledge, none of the applications developed in the previous studies are available. Currently, the most used gaze-control drawing applications among people with motor disabilities are traditional graphics software, such as Photoshop, Windows Paint, and Inkscape, as revealed by the latest gaze interaction research among individuals with motor disabilities (Huang et al., 2022). As conventional software interfaces cannot support gaze control directly, motor-disabled people control these applications via an indirect control, specifically, emulating mouse and keyboard devices through a toolbar at the desktop borders and replicating mouse and keyboard events using dwell click (Menges et al., 2019). Tobii Windows Control¹ and Optikey² are typical indirect gaze control tools available in the market.

Both the results published for previous applications and indirect gaze control in practice have shown a lack of precision and creative flexibility (Huang et al., 2022). The reason for this is that the existing gaze-control drawing methods are all based on a common form of gaze-control mechanism: ''what you look at is what you get'' (Jacob, 1991). Specifically, this means that the gaze position of the user is where the pixel is generated on the canvas. Recently, Streichert et al. (2020) compared the drawing performance via this type of gaze-control modality to the performance via touch input. The error rate with the eye tracker (mean = 0.274 ± 0.148) was significantly higher than the error rate with touch input (mean = 0.098 ± 0.020), revealing the infeasibility of precise drawing tasks via current modalities.

In summary, the common problem of existing studies is the inability to replicate free-hand drawing and control real-time gaze trajectories accurately. To the best of the authors' knowledge, there have been no further improvements in gaze-based free-eye brush control, especially for drawing high-fidelity sketches via gaze-control drawing.

¹ https://www.tobiidynavox.com/products/windows-control/
² http://www.optikey.org/
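Indirect gaze control of conventional software typically rests on dwell clicking: a click fires once the gaze has stayed within a small neighborhood for a set time. The sketch below illustrates that general mechanism only; the function name, the 30-pixel radius, the 800 ms dwell time, and the 40 ms sample interval are illustrative assumptions, not parameters from Menges et al. (2019) or any shipped tool.

```python
import math

def detect_dwell_clicks(samples, radius=30.0, dwell_ms=800, sample_ms=40):
    """Emit (x, y) click positions when the gaze dwells inside a
    `radius`-pixel neighborhood for at least `dwell_ms` milliseconds.
    `samples` is a sequence of (x, y) gaze points taken every `sample_ms`."""
    clicks = []
    anchor = None   # center of the current dwell candidate
    held_ms = 0     # how long the gaze has stayed near the anchor
    for x, y in samples:
        if anchor and math.dist(anchor, (x, y)) <= radius:
            held_ms += sample_ms
            if held_ms >= dwell_ms:
                clicks.append(anchor)       # fire one click per dwell
                anchor, held_ms = None, 0   # reset to avoid repeated clicks
        else:
            anchor, held_ms = (x, y), 0     # gaze jumped: restart the timer
    return clicks

# 25 samples (~1 s) hovering near (100, 100), then a saccade away
gaze = [(100 + i % 3, 100) for i in range(25)] + [(400, 300)]
print(detect_dwell_clicks(gaze))  # → [(100, 100)]
```

Note how the timer restarts on any saccade out of the neighborhood: this is precisely why dwell clicking is slow and why involuntary jitter (the Midas-Touch problem) makes it imprecise for drawing.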


2.2. Multimodal gaze-control drawing methods

In 2011, Van der Kamp and Sundstedt (2011) introduced a multimodal drawing program managed via gaze and voice input. The voice commands started and stopped the drawing, while the gaze commands determined the cursor position. This approach avoided activating drawing using dwell click and aimed to negate the ''Midas touch'' problem. The program included six features: line, rectangle, ellipse, polyline, flood fill, and curves. However, this multimodal approach raised system demands, including voice recognition and hardware issues, namely with the microphone and graphics processing unit (GPU). It also excluded severely motor-disabled people who are incapable of speech. Nevertheless, this approach did address the aforementioned difficulties with gaze-controlled drawing.

Dziemian et al. (2016) presented a gaze-control robotic arm for writing and drawing in a large three-dimensional space, 87 cm from the eye tracker, and with an approximately 84 × 59 cm workspace area. The design utilized a 3D eye-movement data flow to control the drawing task and gaze positions to determine the robot's endpoints on the canvas. The attachment or detachment of the robotic arm from the canvas was executed using displacements of eye-tracking data via tilting the head forward (toward) and backward from the canvas. This technology demonstrated that eye-tracking is a powerful tool for assisting motor-disabled people in performing graphic activities. However, the high-end nature of the gaze-based tele-prosthetic (UR10 robot arm) used meant that it was unavailable to ordinary users. The authors also noted that the need to move the head was a disadvantage that would particularly apply to severely paralyzed people.

In terms of other similar extant research on gaze-control visual design, Creed et al. (2020) proposed a novel creative design application called Sakura that used multimodal gaze interaction, in which gaze control cooperated with a mechanical switch, and included a gaze-based interface with 100 pixels of padding and button size. Its uses and functions included adding texts, shapes (squares, circles, lines, and vectors), importing images, scrolling the canvas, alignment, and preview. Nevertheless, Sakura was not tailored for specialist drawing functions, and it could not facilitate free-hand design. The results indicated that it focused on interaction design, such as webpages and vector-based wireframing.

In summary, to date, multimodal gaze drawing methods have not been able to overcome the obstacles associated with gaze-control drawing methods. Multiple modalities also increase the complexity of gaze interaction and create barriers for some severely motor-disabled people in accessing the applications.

2.3. Summary

The aforementioned studies have addressed the lack of technologies available in gaze-control drawing applications, yet challenges remain in conducting graphic tasks at high fidelity and a creative drawing level. Table 1 compares the capacities of previous gaze-control drawing studies and the differences between approaches in this field of research.

3. Rationales for gaze-control drawing

This section will discuss the rationales behind gaze-control drawing, including the eye-movement patterns and eye physiology in eye-hand drawing, to comprehend in detail how gaze-control drawing functions and how this corresponds to the solutions provided in EyeCompass.

3.1. Voluntary eye movements

To draw via the eyes, the drawing system inputs gaze trajectories and outputs pixels corresponding to those trajectories; therefore, free-eye drawing depends on voluntary eye movements to constitute the gaze trajectories and conduct drawing.

There are two types of voluntary eye movements: saccades and smooth pursuits (Duchowski and Duchowski, 2017; Verghese and Beutter, 2002). Saccades are rapid, step-like, ballistic movements at very high velocities (e.g., 300° per second) (Verghese and Beutter, 2002). Saccadic scan paths are polylines, and the endpoints (fixations) naturally leap toward task-relevant objects (Jacob, 1991; Kasprowski et al., 2016). In contrast, smooth pursuits are smooth eye movements that follow moving objects; hence, the velocities and positions of the eye movements vary according to the visual targets (Duchowski and Duchowski, 2017; Verghese and Beutter, 2002).

Existing gaze-control drawing methods depend on saccades to conduct drawing. Previous free-eye drawing methods can be described as ''where you gaze is where you draw'' (Jacob, 1991), in which the drawn lines are exactly the saccadic gaze path. Therefore, these methods only generate irregular polylines. Due to eye physiology, it is impossible to draw smooth curves without seeing the target stimulus, as smooth pursuits are not involved. Additionally, ballistic saccades cannot draw pixel-level figures. The human eye is an organ of perception. Voluntary eye movements are naturally triggered by the need to deposit visual information into the foveal region (Duchowski and Duchowski, 2017; Snowden et al., 2012). If the eyes look at an object within the foveal region, the eye movements become subtle (Majaranta and Bulling, 2014). For instance, the eyes cannot make saccadic movements when drawing pixel-level figures.

Studies show that the paths of smooth pursuits are fluent curves when the eyes are pursuing a smooth motion (Soechting et al., 2010). Therefore, this paper seeks a solution using smooth pursuits as the eye movements in the gaze-control drawing system. In Section 4.2, a gaze-oriented mechanism, namely the ''Compass'', is proposed to trigger smooth pursuits based on what the user wants to draw, so that the user can achieve fluent and high-fidelity graphics.

3.2. The stimulus-driven pattern

Characterized by their perceptive nature, eye movements follow a stimulus-driven pattern; in other words, a goal-driven modality (Hayhoe and Ballard, 2005; Yantis, 1998). The occurrence of fixation is intimately related to the visual stimulus (Hayhoe, 2000; Hayhoe and Ballard, 2005; Roelfsema et al., 2000; Ullman, 1987). Similarly, Slobodenyuk (2016) argued that eye movements toward the target are motivated by the need to see external objects and are intimately related to contextual stimuli. Furthermore, eyes are more sensitive to a moving target, as the peripheral area of the fovea is physiologically sensitive to motion. In vision, moving targets are ascribed a greater degree of visual attention than static targets (Johansson et al., 2001). For example, a viewer's gaze can be attracted quickly by a sudden movement in their periphery (Hillstrom and Yantis, 1994; Johansson et al., 2001).

In the case of free-eye drawing, the drawn lines and the moving brush cursor are target stimuli. It is the perceptive nature of eyes to saccade at these target stimuli in the drawing process; the subsequent involuntary eye movements cause unexpected lines to be drawn. It is also challenging for the user to move their gaze away from the target stimuli to a to-be-drawn position in the blank space, due to the lack of gaze-related targets. Therefore, drawing with the eyes cannot be physiologically controlled as freely as with the hands.

In order to remove the visual distractions and enhance controllability, the current project built a cursor interface that adheres to the brush point (described in full in Section 4.2). The cursor interface places a moving target stimulus in the top priority position, above the canvas elements, and enables the user to focus their visual attention on the correct to-be-drawn position by naturally following its path in the drawing process.
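The two voluntary movement classes above are commonly separated in eye-tracking pipelines by a simple velocity threshold (an I-VT-style classifier). The sketch below is illustrative only; the 100°/s threshold, the 250 Hz sampling rate, and the function name are assumptions for the example, not values taken from this paper.

```python
def classify_samples(angles_deg, hz=250, threshold_deg_s=100.0):
    """Label each inter-sample interval 'saccade' or 'pursuit/fixation'
    from its instantaneous angular velocity (I-VT-style classification).
    `angles_deg`: successive gaze angles in degrees; `hz`: sampling rate."""
    dt = 1.0 / hz
    labels = []
    for prev, cur in zip(angles_deg, angles_deg[1:]):
        velocity = abs(cur - prev) / dt  # deg/s between adjacent samples
        labels.append("saccade" if velocity > threshold_deg_s
                      else "pursuit/fixation")
    return labels

# Slow 0.1°-per-sample drift (25°/s: pursuit) followed by a 4° jump (1000°/s)
trace = [0.0, 0.1, 0.2, 0.3, 4.3]
print(classify_samples(trace))
# → ['pursuit/fixation', 'pursuit/fixation', 'pursuit/fixation', 'saccade']
```

The velocity gap between the two classes is what makes a design that deliberately elicits smooth pursuit (as the Compass does) detectable and controllable, while raw saccadic input yields only polyline endpoints.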


Table 1
Comparison of the related research.

Project                            | Drawing Functions                                                                   | Free-hand | Gaze-control Modality
EagleEyes (Gips and Olivieri, 1996) | Free-eye drawing                                                                   | Yes       | Unimodal (Gaze)
EyeDraw (Hornof et al., 2003)       | Adding straight lines, rectangles, and ellipses                                    | No        | Unimodal (Gaze)
EyeArt (Meyer and Dittmar, 2009)    | Adding straight lines, rectangles, ellipses, circles, text, polylines, and polygons | No        | Unimodal (Gaze)
EyeSketch (Heikkilä, 2013a,b)       | Adding straight lines, rectangles, and ellipses                                    | No        | Unimodal (Gaze)
Van der Kamp and Sundstedt (2011)   | Adding straight lines, rectangles, ellipses, polylines, and curves                 | No        | Multimodal (Gaze and Voice Input)
Dziemian et al. (2016)              | Free-eye drawing                                                                   | Yes       | Multimodal (Gaze and Robotic Arm)
Sakura (Creed et al., 2020)         | Adding texts, squares, circles, lines, vectors, and images                         | No        | Multimodal (Gaze and a Mechanical Switch)

3.3. Gaze pattern in eye-hand drawing

Those involved in gaze-control drawing research must understand the eye-hand interaction involved when drawing by hand. Tchalenko (2007) discovered that there are two eye movement modes in eye-hand drawing, namely ''close pursuit'' and ''target locking''. Close pursuit characterizes the fixations that fall within the vicinity of the pen tip, coinciding with small saccades around the paper (canvas) to help situate the drawn lines on the correct locations. Target locking characterizes consistent fixations that chase the drawn line's endpoint when the line is drawn. He also found that the eyes played two roles during the drawing process: (1) capturing and cognizing visual data and determining the starting point on the canvas, before positioning the pen; and (2) assisting the hand in positioning the pen and executing the drawing precisely. Similarly, saccades between drawn lines and hands were observed by Gowen and Miall (2006), whose findings indicated that eyes intuitively fixate on the drawn lines in observation, and integrate saccading to capture external visual information, with the fixation paths not usually overlapping the drawn lines.

The eyes are naturally inclined to perceive information during eye-hand drawing. Previous free-eye drawing methods were error-prone, as they required the eyes to carry out conflicting tasks simultaneously (i.e., visuomotor transformation and ocular observation). Furthermore, those methods translated the user's fixations directly into brush trajectories in real time. For example, suppose the user tries to observe the to-be-drawn position with, for instance, 'close pursuit' or 'target locking'. In that case, the program recognizes the eye movements as drawn lines, causing involuntary strokes. To prevent unexpected strokes, the user must continuously draw without saccades and breaks, and cannot encode the structures of the to-be-drawn graphics to omit observational eye movements. Hence, this drawing mechanism opposes natural drawing activities.

To preserve the observational eye interaction and provide a positional tolerance of the eye movements necessary in drawing, we introduced brush-damping dynamics (described in full in Section 4.1). Our method enables the user to drag the brush remotely with eye movements via a damping function, thus providing a time latency with which to make observations and encode the to-be-drawn position on the canvas for the user.

3.4. Eye-tracking errors

Multiple factors cause eye-tracking errors; these can be categorized into device-related factors, such as pupil detection and mapping algorithms that map the pupils to the screen coordinates (Lander et al., 2018), and human-related factors (Kasprowski et al., 2016), such as angle kappa, cornea curvature, and relative head position and orientation to the eye trackers (Hansen and Ji, 2009). Thus, eye-tracking errors vary in different gaze-control cases. Even when repeatedly implementing rigorous calibration, eye-tracking errors cannot be eliminated when the eye coordinates are computed (Bozomitu et al., 2017; Feit et al., 2017).

There are several free-eye drawing problems associated with inherent eye-tracking errors. Specifically, there is always a deviation between the actual and estimated gaze points. When the user draws using the raw free-eye drawing method, the position where the user fixates differs from where the brush draws on the canvas. As the inherent eye-tracking errors in the eye-tracking program are unpredictable due to their multiple causes, it is challenging for the user to know where the endpoint of drawn lines will be located.

In order to alleviate the effects of inherent eye-tracking errors, this paper proposes a solution that employs a user interface (UI) offsetting method in the EyeCompass system design.

4. System design

Based on the findings of previous related research, this paper initiated a hybrid approach that enhanced the flexibility of drawing, and optimized the graphic fidelity and accuracy of gaze control. First, the previous free-eye drawing dynamic, in which viewing and drawing are conjoined, was updated by adding a time lapse between the gaze point and the brush velocities. Second, a gaze-oriented drawing interface leveraging smooth pursuits was employed to navigate the eye movements, thereby overcoming the aforementioned issues associated with free-eye drawing. Unimodal gaze control was selected as it provided the most straightforward mechanism and optimal compatibility for combining external sensors, such as a foot paddle, haptic touch bar, and voice input. The resultant accessibility of this design is discussed in the following sub-sections.

4.1. Damping function

A brush-damping function was employed on the brush point so that the positions and velocities of the brush point and the gaze point were differentiated. The gaze position was not used as a direct input, as is usually the case, but as a remote controller to pilot the brush point. Based on Eq. (1), the brush velocity is an instantaneous speed determined by the gaze position in real time. When the user moves the gaze point, the brush moves toward that point with its velocity decreasing linearly, following the decreasing distance between the brush point and the gaze point (Fig. 1). This type of brush dynamics emulates the eye-hand drawing motion, as the brush can be freely accelerated, decelerated, and ceased. The damping ratio 𝜕 provides a buffer for the user to think and observe the ongoing sketching, which mimics the intuitive eye movement patterns of eye-hand drawing.
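The brush side of this damping law can be sketched as a per-frame update: the brush velocity is the gaze-to-brush offset scaled by the damping ratio, so the brush decelerates linearly as it closes on the gaze point. This is a minimal illustration only; the ratio 0.2, the frame loop, and the function name are assumptions for the example, not the system's actual parameters.

```python
def step_brush(brush, gaze, damping=0.2):
    """One frame of brush damping: velocity is proportional to the
    gaze-brush offset, so speed decays linearly as the gap closes."""
    vx = damping * (gaze[0] - brush[0])
    vy = damping * (gaze[1] - brush[1])
    return (brush[0] + vx, brush[1] + vy)

brush, gaze = (0.0, 0.0), (100.0, 0.0)
for frame in range(5):
    brush = step_brush(brush, gaze)
    print(f"frame {frame}: x = {brush[0]:.1f}")
# x: 20.0, 36.0, 48.8, 59.0, 67.2 — each frame covers 20% of the remaining gap
```

Because the brush never snaps to the gaze point, the user can saccade away to inspect the sketch and re-acquire the brush before it drifts far, which is the latency buffer the text describes.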


Fig. 1. Description of the damping function. It is evident that the brush velocity is proportional to the distance between the brush point and the gaze point.

In traditional eye-tracking programs, the Exponentially Weighted Moving Average (EWMA) is employed to filter out eye jitters (Van der Kamp and Sundstedt, 2011; Zhang et al., 2020; Stawicki et al., 2017). This algorithm uses time-series gaze-position data to smooth out short-term fluctuations and applies exponentially decreasing weighting factors (Hunter, 1986). Our drawing system did not employ EWMA, as it is affected by the older datum points. Instead, another damping ratio 𝜎 (Eq. (1)) was applied to the velocity of the eye position, which meant that the gaze point had a minimum time lag and damping from the actual eye movement. This damping ratio functions as a filter to soften eye jitters.

𝑣brush(𝑥,𝑦) = 𝜕(𝑥gaze − 𝑥brush , 𝑦gaze − 𝑦brush ), 0 < 𝜕 < 1
𝑣gaze(𝑥,𝑦) = 𝜎(𝑥eye − 𝑥gaze , 𝑦eye − 𝑦gaze ), 0.1 < 𝜎 < 1    (1)

where 𝑣brush(𝑥,𝑦) is the vector of brush velocity; (𝑥gaze , 𝑦gaze ) is the gaze point coordinates; (𝑥brush , 𝑦brush ) is the brush point coordinates; 𝑣gaze(𝑥,𝑦) is the vector of gaze velocity; (𝑥eye , 𝑦eye ) is the eye position; and 𝜕 and 𝜎 are the damping ratios. The value interval (0.1, 1) for 𝜎 is an imperceptible range of the time lapse. Detachment between the two points is evident when the damping ratio is less than 0.1 (see Section 7.6.2).

4.2. The Compass cursor design

The interface of the brush is composed of a circular cursor (CC), of which the circle center is the brush point, and a directional needle (DN) pointing from the brush point (circle center) to the gaze point, namely the direction of the brush movements (Fig. 2). The interface functions similarly to a compass, which is why it was named ''Compass''. It has three novel functions.

First, the Compass serves as a prior visual stimulus on the canvas. With the damping function, the brush obtains velocity when there is a positional difference between the gaze point and the brush point (Eq. (2)). The brush velocity is proportional to the distance between the brush point and the gaze point, according to Eq. (1). Thus, the brush is able to move at a constant speed when the user's gaze follows the outline (the radius) of CC (Eq. (3)). Then, triggered by the stimulus-driven pattern, the eyes will continue to smoothly pursue the outline of CC. This method follows a gaze-oriented mechanism and minimizes the uncontrollable saccadic movements and unintentional gaze of the eyes during free-eye drawing.

‖𝑝gaze − 𝑝brush ‖ = 𝑟    (2)

𝑣brush ∝ 𝑟    (3)

where 𝑝gaze is the gaze position; 𝑝brush is the brush position; 𝑟 is the radius of the Compass; and 𝑣brush is the brush velocity.

Second, the circle engages with a range of shapes with ease, based on the Fourier Transform (FT) (Boashash, 2015; Bochner et al., 1949; Hassanieh et al., 2012). Unlike drawing via the former gaze-control drawing method, i.e., strenuously saccading anywhere on the canvas, users can create various shapes simply by fixating on the outline of the circle when using CC (Fig. 2). The graphics drawn can thus be defined as a figure by coordinates in two dimensions, namely an integral function dependent on space. In the FT, the brush trajectories can be decomposed into a function of spatial frequency, that is, trigonometric functions or a linear combination of their integrals. Any shape could, therefore, be drawn by tracing the circular outline in different time lags (Fig. 3).

Third, the DN enables the precise control of the brush direction. Since the brush point is separated from the gaze, the brush may fall beyond the view area of fixation. The DN displays a line segment pointing to the gaze point from the brush center, preventing the user from glancing back at the brush center for the drawing direction and causing unwanted sketches. The line segment signifies the direction of the brush movement and can accurately create vertical and horizontal lines and specific angles (e.g., 30°, 60°).

4.2.1. Eye-tracking error offset

The integration of the brush-damping dynamics and the gaze-oriented mechanism enabled the user to identify the correct spatial position of the shape to be drawn by observing information from the Compass. In an environment without eye-tracking errors, the CC radius was able to generate a constant brush speed (Eq. (3)), while the DN showed the brush direction.

However, eye-tracking errors continued to exist even after careful calibration (Section 3.4). The bias of the estimated gaze position can mislead the gaze and cause deviation in the gaze trajectory. Fig. 4 illustrates how the eye-tracking error causes drawing mistakes and the bias of brush velocity 𝑒⃗. As shown in Fig. 4, gazing horizontally at the Compass did not achieve horizontal segments; instead, it produced a slanting segment connecting the brush point and the estimated gaze point. The deviation is generated because the system computed the brush velocity with the estimated gaze position (Eq. (4)).

𝑣⃗ = 𝑣⃗′ + 𝑒⃗    (4)

where 𝑣⃗ is the brush velocity driven by the real gaze position; 𝑣⃗′ is the brush velocity driven by the estimated gaze position; and 𝑒⃗ is the brush velocity bias from the eye-tracking accuracy error between 𝑣⃗ and 𝑣⃗′.

Based on Eqs. (3) and (4), the brush speed changes as the gaze direction changes; the bias of brush speed will be highest when the

L. Huang et al. International Journal of Human - Computer Studies 170 (2023) 102966

Fig. 2. Description of the Compass. The dashed circle denotes a circumference of 360◦ , and DN denotes the angle of the brush velocity. When the gaze rests on the outline of
the circle, the brush obtains a velocity toward the gaze point. This mechanism enables the user to manipulate the brush and sketch more accurately.
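For illustration, the damping-driven brush motion of Eqs. (2) and (3) can be sketched as a per-frame update. This is a Python reconstruction for exposition, not the system's Unity/C# implementation; the proportional update rule stands in for Eq. (1), and the gain value is an assumption:

```python
import math

def damped_brush_step(p_brush, p_gaze, sigma, dt=1.0):
    """Move the brush toward the gaze point, scaled by the damping ratio.

    Illustrative reconstruction: the brush velocity is proportional to the
    brush-gaze distance (cf. Eqs. (2)-(3)), so a gaze resting on the CC
    outline (constant radius r) yields a constant brush speed.
    """
    vx = sigma * (p_gaze[0] - p_brush[0])
    vy = sigma * (p_gaze[1] - p_brush[1])
    return (p_brush[0] + vx * dt, p_brush[1] + vy * dt)

def brush_speed(p_brush, p_gaze, sigma):
    # Speed proportional to r = ||p_gaze - p_brush|| (Eq. (3)).
    return sigma * math.dist(p_brush, p_gaze)
```

With sigma = 0.01, a gaze point 100 px from the brush moves it 1 px per frame; as long as the gaze stays on the cursor outline, the brush speed stays constant.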

Fig. 3. Description of drawing different wavy lines using the Compass. For example, by constraining their gaze to the Compass at the location of the red dots, the user can create
smooth curves.

Fig. 4. The drawing effects with a hypothetical deviation of the estimated gaze point. (a) If the user gazed at the right side of the CC horizontally, the brush would move slower
than the actual speed and go toward the upper right; (b) if the user gazed at the left side horizontally, the brush would move faster than the actual speed and go toward the
upper left.
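The decomposition in Eq. (4) can be checked numerically. The proportional velocity rule and the gain below are illustrative assumptions; only the additive relation 𝑣⃗ = 𝑣⃗′ + 𝑒⃗ comes from the text:

```python
K = 0.01  # assumed damping gain

def brush_velocity(p_brush, p_gaze, gain=K):
    """Brush velocity implied by a gaze position (illustrative rule)."""
    return (gain * (p_gaze[0] - p_brush[0]),
            gain * (p_gaze[1] - p_brush[1]))

p_brush = (0.0, 0.0)
p_real = (100.0, 0.0)   # the user gazes horizontally
p_est = (95.0, 20.0)    # estimated gaze, offset by a tracking error

v = brush_velocity(p_brush, p_real)       # velocity the real gaze implies
v_prime = brush_velocity(p_brush, p_est)  # velocity the system applies
e = (v[0] - v_prime[0], v[1] - v_prime[1])  # bias term of Eq. (4)
# v_prime has a vertical component, so the drawn segment slants even
# though the user gazed horizontally, as in Fig. 4.
```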

gaze direction and the eye-tracking error are equiangular (Fig. 5). Consequently, the CC cannot control the brush with a constant speed, and the brush movement no longer aligns with the DN.

Based on the stimulus-driven pattern (Section 3.2), we used two transformed UIs to orient the gaze to make a visual adjustment: an Archimedean spiral cursor (SC) to offset the speed bias, and a directional needle array (ADN) to offset the angular bias. Generally, in practice the eye-tracking errors were integrated with both speed bias and angular bias. A combination of the SC and ADN was able to resolve the mixed errors, e.g., the case shown in Fig. 4.

(1) The spiral cursor (SC). As shown in Fig. 5, the consequence of the bias was that the brush would be almost still in a particular direction while moving extremely fast in the opposite direction. The loop of an Archimedean spiral can compensate for the difference of 𝑟 (Eq. (2)) between the two opposite directions


Fig. 5. The bias of brush speed with an eye-tracking error. When the gaze direction and the eye-tracking error are equiangular, the bias of brush speed is the highest. (a) The
brush speed is particularly low; (b) the brush speed is particularly high.

Fig. 6. Description of the spiral cursor theory. It is represented by a hypothetical deviation of the estimated gaze point and applied to all eye-tracking error scenarios.

(𝜃 = 180◦). The SC is denoted by the polar coordinates (𝑟, 𝜃) (Eq. (5)).

𝑟 = 𝑝brush + 𝑏𝜃 (5)

where 𝑝brush is the brush center position on the screen; 𝑟 is the radius of the spiral cursor; 𝜃 is the polar angle in the polar coordinate; and 𝑏 is the parameter controlling the distance between loops. In the system, the 𝑏 value follows Eq. (6).

𝑏 = 2△𝑟∕𝜋 ≈ 0.64△𝑟 (6)

where △𝑟 is the deviation of the estimated gaze position from the real gaze position due to inherent eye-tracking errors. The calculation of the 𝑏 value is shown in Fig. 6.

(2) The directional needle array (ADN). As shown in Fig. 4, the user tried to gaze horizontally; however, the DN went to the upper right due to the angular bias between the actual gaze and the estimated gaze position. Therefore, the user needs to make a visual change when gazing at the DN if the brush direction is horizontal. Since the eyes have a small high-visual-acuity region (a foveal region of approximately 2◦) (Snowden et al., 2012), a single DN would fall outside the fixation area if the eye-tracking error was considerable. A user would, therefore, lose the visual target and the information for conceiving the next drawing step. Thus, as a visual stimulus beyond the fixation area, the deviated DN is likely to distract visual attention and trigger involuntary eye movements.

In light of the stimulus-driven attention principle of the eyes, again, the ADN aims to retain the DN within the perception area of the eyes when the user makes a visual change, and to prevent the user’s eyes from falling outside the oriented cursor. Instead of directly gazing at the required angle, the user develops drawing strategies by resting their gaze on a particular line of the ADN until the DNs perceive the desired direction (Fig. 7). With the gaze-oriented array, the user can easily change their gaze position. This method removes the limitation inherent in the notion of ‘‘where you gaze is where you draw’’, ultimately maintaining the ocular observation and reducing the angular bias in drawing.
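The role of the loop-spacing parameter 𝑏 in Eq. (6) can be verified with a short computation: reading Eq. (5) as r(θ) = r0 + bθ (an illustrative simplification, since 𝑝brush in Eq. (5) denotes a screen position), a half turn (θ increased by 180◦) changes the radius by bπ = 2△𝑟, exactly the radius difference needed between the two opposite gaze directions:

```python
import math

def spiral_radius(r0, b, theta):
    """Archimedean spiral radius, r(theta) = r0 + b * theta (cf. Eq. (5))."""
    return r0 + b * theta

delta_r = 20.0             # assumed eye-tracking deviation, in pixels
b = 2 * delta_r / math.pi  # Eq. (6): b = 2*delta_r/pi ~ 0.64*delta_r

# Opposite sides of the loop (theta and theta + pi) differ in radius by
# b * pi = 2 * delta_r, offsetting the speed bias in the two directions.
gap = spiral_radius(50.0, b, math.pi) - spiral_radius(50.0, b, 0.0)
```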


Fig. 7. Description of the directional needle array.

5. System implementation

The EyeCompass system was built with Unity using C# and the Tobii API3 and runs in full-screen kiosk mode with a resolution of 1920 × 1080 pixels. It supports Tobii gaming eye trackers and can accommodate other types of eye trackers through their APIs.

In the system, the dwell click was adopted to initiate commands, including ‘start drawing’ and ‘end drawing’, and to click on the buttons, as previous research has shown this to be the most sophisticated gaze selection approach (Lankford, 2000; Menges et al., 2019; Urbina and Huckauf, 2010; Zhang et al., 2020). After dwell clicking on a certain position on the canvas, the brush is activated and the compass cursor appears at the brush point. The system adopts a default dwell time of 500 ms.

The system interface comprises the fundamental free-hand drawing functions such as pen, eraser, undo, redo, brush settings, grid settings, zooming, and nudging (Fig. 8). It also contains three unique functions comprising the compass mechanism: the brush speed setting (a medium brush speed of 1x and a decelerated speed of 0.5x); the cursor UIs (CC and SC); and the directional needle styles (DN and ADN). This enables the user to draw at different speeds and with four compounded compass UIs.

Our system maintains two modes for drawing: a drawing mode (default) and a viewing mode (Fig. 8), and the user can switch between modes through the left-hand menu button View/Draw. In the Draw mode, the user can draw on the canvas and access the brush settings in the middle of the right-hand menu. In the View mode, the user cannot draw but can nudge the canvas (left, right, up, and down) and examine the sketches. The two modes help the user differentiate between ocular observation and drawing, which prevents the user from activating the brush when observing the drawing (i.e., the ‘Midas-touch’ problem (Chin et al., 2008; Jacob, 1991; Velichkovsky et al., 2014)). The menus are built with gaze-sensitive buttons with a width and height of 100 pixels (Hansen et al., 2018) for gaze accuracy.

Gaze-selection mistakes are a common problem for users of eye-tracking interaction applications, and hence are addressed in our design and implementation. Among previous studies, Creed et al. (2020) identified a button-selection issue leading to users clicking on the wrong buttons where they are placed in close proximity on an interface. Similarly, Zhang et al. (2020) discovered that increasing the key gap from 0.2 cm to 1.2 cm between frequently typed keys on a 23-inch screen significantly improved (nearly doubled) typing speed and reduced the error rate (to between one-third and one-fifth). In order to address this challenge, we placed the most frequently selected pairwise controls, i.e., Undo and Redo, Zoom In and Zoom Out, at the left- and right-side corners of the screen, respectively. The intention of this design was to minimize the selection error rate.

6. Pilot study

Before conducting the drawing experiments, we tested the error distributions after careful calibration with a Tobii 4C eye tracker to ground the accuracy for this particular eye tracker in the specific study setup.

The data was collected via a Tobii 4C eye tracker with a 27-inch monitor at 1920 × 1080 pixels (81.59 PPI). The lighting environment was a windowless room with 300 lux.4 The participants (N=100) sat 70 cm from the screen. None of them had any eye-related diseases or visual deficiencies. Before the test, the participants were informed of the goal of the experiment and were asked to sign a consent form. Then, they were given a five-minute instruction on eye-tracking calibration. After calibration, the participants were asked to gaze at nine dots (benchmarks) on the screen for two seconds each.

The results showed that the average error of the Tobii 4C was 1.33◦ (52.06 pixels). As Fig. 9 shows, the eye-tracking errors were lower than the foveal region of human eyes (approximately 2◦ (Snowden et al., 2012)), implying that the eyes cannot perform subtle control within this degree of visual angle.

7. Experiment - User study 1

We carried out two experiments (User Studies 1 and 2) to compare our method to the existing free-eye drawing method based on the paradigm of ‘‘where one looks is where one draws’’ (Jacob, 1991; Gips and Olivieri, 1996; Streichert et al., 2020).

In User Study 1, we compared the performance of the free-eye drawing method with one of the mechanisms introduced in the EyeCompass: the damping dynamics. This experiment aimed to explore the hypotheses that (1) the brush-damping dynamics allow users to create figures of a higher fidelity than the previous free-eye drawing method; and (2) the brush dynamics render the system more intuitive and efficient by providing time lapses for drawing observation and reducing involuntary eye movement errors. Two research questions were set:

RQ1: Do the damping ratios improve the drawing performance compared to the existing free-eye drawing method?

RQ2: What are the optimal damping ratio settings?

7.1. Apparatus

The user study was carried out in the office of the Department of Computer and Systems Sciences, Stockholm University. We used the low-cost Tobii Eye Tracker 4C as this (or similar trackers) is what consumers would use. The eye tracker was mounted on the bottom of the monitor. The participants sat in front of the monitor at a distance of 69 ± 15 cm. The system ran on a desktop computer with an Intel Core i5-9400F CPU @ 2.90 GHz, 16 GB RAM, a Gigabyte GeForce GTX 1660 Super Gaming OC 6G graphics card, and Windows 10. The interface was displayed on a 27-inch monitor with a 16:9 aspect ratio and a resolution of 1920 × 1080 pixels.

7.2. Participants

In total, 60 participants (28 females and 32 males, aged between 20–35 years, non-disabled) from Stockholm University, KTH Royal Institute of Technology, and Karolinska Institutet were recruited via email, word of mouth, and social media. None of them had any eye-related diseases or visual deficiencies; 38 of them had normal visual acuity, 16 wore glasses, and 6 wore contact lenses. Each participant was given 100 SEK (∼9 USD) as a token of appreciation for their participation. All of the participants in the drawing tasks had drawing

3 https://developer.tobii.com/pc-gaming/unity-sdk/api-overview/
4 EN 12464-1:2021 (Light and lighting. Lighting of work places — Classroom)
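The dwell-click command of Section 5 (default dwell time 500 ms) can be sketched as a small state machine. This is an illustrative Python reconstruction, not the system's C#/Unity code; the fixation-radius tolerance is an assumed parameter:

```python
import math

class DwellClick:
    """Fire a click once the gaze rests within a small radius for dwell_ms."""

    def __init__(self, dwell_ms=500, radius_px=50):
        self.dwell_ms = dwell_ms    # default dwell time from Section 5
        self.radius_px = radius_px  # assumed fixation tolerance
        self._anchor = None
        self._elapsed = 0.0

    def update(self, gaze, dt_ms):
        """Feed one gaze sample; return True when a dwell click fires."""
        if self._anchor is None or math.dist(gaze, self._anchor) > self.radius_px:
            self._anchor = gaze     # gaze moved: restart the dwell timer
            self._elapsed = 0.0
            return False
        self._elapsed += dt_ms
        if self._elapsed >= self.dwell_ms:
            self._anchor = None     # reset so each fixation clicks once
            self._elapsed = 0.0
            return True
        return False
```

Fed with gaze samples every 100 ms, the detector fires on the sixth consecutive sample at the same position, i.e., after 500 ms of dwelling.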


Fig. 8. EyeCompass interface overview. CC stands for the circular cursor; SC stands for the spiral cursor; DN stands for the directional needle; ADN stands for the directional
needle array.

Fig. 9. The eye-tracking error distribution after calibration (in degrees of visual angle and pixels). The pixel coordinates of the nine benchmarks on the screen are (180, 180), (180, 530), (180, 880), (960, 180), (960, 530), (960, 880), (1740, 180), (1740, 530), (1740, 880). The heatmap denotes the overlap percentages of the gaze points in the distribution area and shows that most gaze points fell at the center of the benchmarks.
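The reported average error of 1.33◦ (52.06 px) is consistent with the stated geometry (70 cm viewing distance, 81.59 PPI). As a sanity check, the standard visual-angle-to-pixel conversion (the formula below is not taken from the paper) gives approximately 52 px:

```python
import math

def degrees_to_pixels(deg, distance_cm, ppi):
    """Convert a visual angle to on-screen pixels for a viewer at distance_cm."""
    size_mm = 2 * distance_cm * 10 * math.tan(math.radians(deg / 2))
    return size_mm * ppi / 25.4  # 25.4 mm per inch

error_px = degrees_to_pixels(1.33, 70, 81.59)  # ~52 px
```

The result (about 52.2 px) closely matches the reported 52.06 px; the small difference plausibly comes from rounding in the reported values.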

experience but had no experience using eye trackers. None of the participants in the drawing task participated in both studies (i.e., User Studies 1 and 2). Five external drawing experts (beyond the 60 participants) were recruited via the same approach; they were responsible for rating the sketches drawn via the Mean Opinion Score (MOS) (Streijl et al., 2016).

7.3. Experimental setup

A between-subjects experiment was conducted to compare the performance of the damping function to the existing free-eye drawing method. If a participant is exposed to multiple drawing conditions, this will cause a carryover effect, such as practice or learning effects, in subsequent tests (Maxwell et al., 2017). Carryover effects threaten the internal validity of the drawing task results. Thus, a between-subjects design was adopted to minimize the learning and transfer across groups.

There were four groups, each comprising 15 participants. The independent variable was the damping ratio between groups. Group 1 was the control group, which used the existing free-eye drawing method (in the form of drawing where you look) (Jacob, 1991). Groups 2, 3, and 4 were experimental groups with damping ratios of 10−1 (a fast brush speed), 10−2 (a medium brush speed), and 10−3 (a slow brush speed), respectively. The drawing task was to draw five types of geometric primitives (Barasch, 2013; Goodrich, 1954; Ruskin, 1876): a circle, a square, a wavy line, a zigzag line, and a five-pointed star. The circle and the wavy lines were included to study the feasibility of


controlling direction and drawing smooth curves. The square and the five-pointed star were included to study the feasibility of controlling direction and drawing corners.

The 60 participants were randomly assigned to one of the groups. All participants were informed of the goal of the experiment and were asked to sign a consent form. Each participant was given a 10-minute training session, during which they were instructed on how to use the eye tracker in the test version for free-eye drawing, including how to use the eye-tracking calibration, gaze-control modes, and free-eye drawing commands based on dwell click and dragging the cursor (Fig. 10). After a 2-minute break, the participant performed a 7-point eye-tracking calibration and completed the drawing task using only their eyes. The drawing processes were recorded using the Windows 10 built-in screen recorder. The drawing results were taken as the final images when the participants confirmed that they had finished drawing.

At the end of the experiment, each participant took part in a semi-structured interview with open-ended questions, such as ‘‘What did you think?’’ and ‘‘How did you get on with the drawing?’’ Participants were asked to give feedback on the method used and their user experience. The interview lasted five minutes for each participant.

Fig. 10. A sample of the experiment implementation with the Tobii 4C eye tracker below the screen.

7.4. Data analysis

The drawing performance and graphical fidelity of the results were evaluated using the Mean Opinion Score (MOS), a well-known method of quantifying the subjective quality of visualization (Aldridge et al., 1995; Cranley et al., 2006; Sun and Huang, 2011; Vázquez and Steinfeld, 2014). The drawing results were randomly shuffled and rated blind based on an 11-point MOS standard (Table 2). The reviewers were fully aware of the evaluation standard and were asked to evaluate which painting most resembled the shape sample, without knowing the method used to create it. Each figure was reviewed by the five reviewers, and an average score was calculated. The MOSs of each shape were analyzed using one-way ANOVA and descriptive analysis. The drawing outputs were re-evaluated by post-hoc analysis of the screen recordings of the drawing processes. The system automatically recorded the length of each line and the task completion time, defined as the duration between ‘brush activated’ and ‘brush end’. The drawing speed of each group was calculated and analyzed using one-way ANOVA. The interview records were assessed using content analysis.

7.5. Results

7.5.1. Quantitative results

As shown in Fig. 11, the average MOS of each shape increased significantly with the decreasing damping ratio, reaching the highest value when the damping ratio equaled 10−3. The MOS of each shape produced via the method of ‘‘where you look at is what you get (draw)’’ was the lowest within the four groups. Specifically, the MOSs of the circle and the wavy line were the lowest (approx. 2). The square, the zigzag, and the star obtained higher scores (around 4 to 5.5). For all the shapes, the mean MOSs increased when the participants used our method. The scores increased significantly when the damping ratio was 10−1 (a fast brush speed) and 10−2 (a medium brush speed), and increased gradually when the damping ratio was 10−3 (a slow brush speed).

Fig. 12 shows that the drawing speeds and the standard deviations (SDs) of the previous method were significantly high among all the tested figures. The drawing speed decreased when our method was used, decreasing exponentially with the decreasing damping ratio. The SD of the drawing speed also decreased with the decreasing damping ratios. The drawing speeds were almost the same for the different shapes drawn, particularly when the damping ratios were 10−2 (a medium brush speed) and 10−3 (a slow brush speed).

7.5.2. Qualitative results

Fig. 13 represents the drawing performances of the different groups. The drawing outputs improved with each group with the increased damping ratio. Specifically, all the shapes drawn via the previous method were distorted and had many sharp corners. The shapes improved with each group using the damping function and were almost as well-proportioned and smooth as the reference shapes once the damping ratio reached 10−3 (a slow brush speed). The images drawn using a lower damping ratio had fewer involuntary lines, more gliding details, and higher fidelity. In general, it could be concluded that the drawings that most closely resembled the reference images were drawn by the group using our method with a damping ratio of 10−3 (a slow brush speed).

During the interviews, all the participants who used the previous method reported extreme difficulty in controlling the brush and unsatisfactory drawing results, as can be seen in the following quotes: ‘‘I could not control eye movements without looking around’’, and ‘‘The brush was totally out of control’’. They were also annoyed by serious brush jitters.

The group with the damping ratio of 10−1 gave similar comments to those who used the previous method. In contrast, in the groups with the damping ratios of 10−2 (a medium brush speed) and 10−3 (a slow brush speed), the participants stated that they had felt comfortable with the time lapse between the brush and the gaze: ‘‘The speed was useful to control the brush as I could direct the brush easily’’. They also highlighted that they were able to glance at the drawn lines within the time lapse between the brush point and the gaze. However, most of the participants in the group with a damping ratio of 10−3 reported that the slow brush speed caused severe eye strain: ‘‘My eyes were exhausted when I gazed at the screen for such a long time’’.

Among the four groups, some of the participants suffered from a ‘‘cursor drifting’’ problem: ‘‘I could not gaze straight at the brush point, or the brush would move away. In this way, the cursor kept drifting and my gaze could never reach it’’, and ‘‘I found the brush was not in the right position, wherever I looked’’. Drawing mistakes caused by involuntary eye movements were also reported: ‘‘When I looked at what I draw, it produced redundant lines on the canvas’’.

7.6. Discussion

7.6.1. The effect of damping function

The shapes drawn using the previous method could hardly be recognized. Notably, smooth curves were very difficult to draw. Thus, the previous method for gaze-control drawing is evidently not feasible. In contrast, our method using the damping function successfully created smooth and geometrically well-drawn shapes. Thus, the damping ratios helped to obtain positive drawing results in free-eye drawing


Table 2
The evaluation standard for the drawing results. Each row answers the question: which painting is more similar to the shape sample?

MOS >8, ≤10: The shape drawn is geometrically perfect, almost coincides with the reference image, and its details are drawn precisely (smooth and gliding lines, sharp corners, well-proportioned circles, and horizontal or vertical-sided squares).

MOS >6, ≤8: It is easy to recognize the shape drawn, the shape is well-proportioned, the curves are smooth, and the straight lines are horizontal or vertical-sided. The shape is skewed or distorted slightly. The drawn lines in the shape are tidy and gliding.

MOS >4, ≤6: The shape can be recognized correctly, but its details are not well drawn. The shape is asymmetrical, twisted, skewed, or distorted. There are small amounts of irrelevant lines in the shape.

MOS >2, ≤4: The shape can just about be recognized, but the details are cluttered. The shape is twisted, skewed or distorted significantly, and there are many irrelevant lines in the shape.

MOS >0, ≤2: It is very difficult to recognize the shape and the quality of its details is inferior.

MOS = 0: The shape drawn cannot be recognized at all.

A higher MOS indicates a better image quality. A score lower than 4 signifies an unrecognizable and low-fidelity image, and a score higher than 6 signifies a recognizable, high-fidelity image. This MOS standard was used to quantify the drawing outputs in User Study 1 and User Study 2.

Fig. 11. The Mean Opinion Scores (MOS) of the groups. The 𝑥-axis denotes the four groups: (1) the previous free-eye drawing method, (2) a damping ratio of 10−1 (a fast brush
speed), (3) a damping ratio of 10−2 (a medium brush speed), (4) a damping ratio of 10−3 (a slow brush speed).

Fig. 12. The drawing speed of the four groups. The 𝑥-axis denotes the four groups: (1) the previous free-eye drawing method, (2) a damping ratio of 10−1 (a fast brush speed),
(3) a damping ratio of 10−2 (a medium brush speed), (4) a damping ratio of 10−3 (a slow brush speed).

and improved the drawing performance remarkably. Specifically, two factors related to the damping function were shown to be effective in controlling the brush and refining the lines drawn in free-eye drawing. First, this method suits the perceptional nature of the eyes when drawing. Second, the brush damping dynamics stabilize the brush by alleviating the influence of involuntary eye movements.

(1) Emulation of Eye Movement Patterns in Eye-Hand Drawing. The inability to observe the sketches during the drawing process was noted as an issue in the previous gaze-control drawing research (Gips and Olivieri, 1996; Hornof et al., 2003). This issue originates from natural eye behavior when drawing. In a real-world drawing process using hands, the eyes fulfill two tasks (Tchalenko, 2007): one is obtaining visual information to help the painter decide how and what lines to draw, and the other is assisting the hand movements to execute the drawing precisely. To implement the dual task, the eyes engage in two primary activities throughout the drawing process: ‘‘close pursuit’’ and ‘‘target locking’’ (Tchalenko, 2007). When the brush point approximately equals the gaze point, as in the previous method, the user cannot make saccades to observe and conceive the next position to draw, as the gaze trajectories are transmitted immediately to drawn lines. This type of drawing mechanism goes against physiological eye movement patterns and hinders usability.

Furthermore, it is rather difficult for the user to fix their eyes precisely on a blank space on the canvas. Natural eye movements happen in response to contextual stimuli in external environments,


Fig. 13. Comparison of the drawing results between groups. Each row shows one subject’s results, sampled such that the subject’s MOS was similar to the average MOS of the corresponding group.

following a stimulus-driven pattern (Kasprowski et al., 2016; Slobodenyuk, 2016). In this case, the drawn lines are the only contextual stimulus on the canvas. Consequently, the user’s eyes are likely to be instinctively attracted to the drawn lines and, thus, cause involuntary lines while drawing.

Our method solves this issue by separating the positions of the brush and the gaze point and using the damping ratio to constrain the displacement of the brush position toward the gaze point. In our method, the gaze path of observation is not immediately outputted as pixels on the canvas. Minor eye movements, for example, close pursuit (Tchalenko, 2007), are filtered by the damping functions so that the user can observe the drawn line while drawing. The experimental results show that a relatively low damping ratio generates a considerable time lag between the brush point and the gaze and provides the user with more chance to observe the drawing outputs. The interview feedback confirmed that an appropriate time lapse facilitates the dual task of the eyes (‘‘close pursuit’’ and ‘‘target locking’’ (Tchalenko, 2007)) by enabling the user to make observations through glances within a small range and to make decisions on the next drawing steps within the brush latency.

(2) Brush Stabilization. The lower the damping ratio, the lower the brush speed and the better the drawing performance. This is demonstrated by the results: the groups using fast brush speeds lacked control over the direction of the brush, leading to skewed lines. In contrast, the results obtained by participants using slower brush speeds were better proportioned.

The SD trend of the brush speed explains why the damping function can enhance brush control. The brush speed is entirely synchronized with eye movement in the previous method. Thus, it is extremely affected by saccades and eye jitters (as indicated by the high SD), which make it unsteady. The brush speed decreases exponentially with a decreasing damping ratio, and the SD of the brush speed decreases correspondingly. The decreasing brush speed and SD show that the damping ratio reduces the negative impact of unintentional eye movements and eye jitters on brush stability. Thus, the damping function enables the user to control the brush at an appropriate and stable speed and enhances the competence of direction control.

Additionally, the damping function acts as a noise filter, reducing redundant lines caused by eye jitters. The images drawn indicate that drawing with a damping ratio of 10−2 (a medium brush speed) or 10−3 (a slow brush speed) effectively improves drawing performance and reduces drawing jitters compared to the previous free-eye drawing method. The errors in the experimental results decreased significantly with lower brush speeds, validating the plausibility of the damping function.

7.6.2. The optimal damping ratio

According to the results based on the MOS standard, the shapes drawn using the damping ratios of 10−2 and 10−3 were recognizable, which indicates that the damping ratio should not be higher than 10−2 to achieve acceptable drawing results. However, there are several issues with a damping ratio of 10−3. Although involuntary sketches were reduced with a 10−3 damping ratio (a slow brush speed), they did not significantly differ from those obtained using a damping ratio of 10−2. Nevertheless, the task completion time increased remarkably. Furthermore, the participants drawing with the lowest brush speed reported severe eye fatigue, as they needed to fixate on a point for a long time until the brush moved toward their gaze point. Most importantly, according to Fig. 12, the drawing speed when using a damping ratio of 10−3 is lower than the normal free-hand drawing speed, which varies from approximately 28 mm/s to 150 mm/s (Tchalenko, 2007), while the drawing speed when using a damping ratio of 10−2 is within this range and is helpful for drawing performance, eye comfort, the dwell-time command, and human drawing nature. Hence, it can be concluded that the optimal damping ratio, most effective in filtering out eye jitters, is around 10−2.

The damping dynamics can also be generalized by assigning non-linear damping ratios to control the brush speed under different brush-gaze distances. For example, in Fig. 14 the damping ratio is 0 when the brush-gaze distance is less than 97 px (a gaze-based precision limit (Hansen et al., 2018)) and is 10−2 when the brush-gaze distance is more than 97 px. In this case, the brush would be still when the user makes close pursuits (Tchalenko, 2007) on the canvas, simulating the start-stop action inherent to real-world drawing scenarios.
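The non-linear variant of Fig. 14 can be written as a piecewise damping ratio. The 97 px threshold and the 10−2 ratio come from the text above; the function form itself is an illustrative sketch:

```python
def damping_ratio(brush_gaze_distance_px, threshold_px=97):
    """Piecewise damping ratio as in Fig. 14: the brush stays still during
    close pursuit (distance below the gaze-precision limit) and moves with
    a ratio of 1e-2 otherwise, emulating the start-stop action of hand
    drawing."""
    return 0.0 if brush_gaze_distance_px < threshold_px else 1e-2
```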


Fig. 14. A non-linear brush damping dynamic.

Fig. 15. The cursor drifting problem, which hypothesizes that the eye-tracking program locates the user’s gaze at the point 𝑝gaze while the user is looking at the position 𝑝real, with an eye-tracking error. (a) The bias from the inherent eye-tracking error. (b) When the user’s eyes look at the brush point, the eye-tracking program computes the offset as the fixation moving from 𝑝gaze to 𝑝gaze′.

The damping function (Eq. (1), Section 4.1) summarizes a universal type of gaze-control dynamics and can describe many of the existing gaze-control mechanisms. It can be generalized to various eye-tracking applications and helps adjust the dynamics of the cursor to control the gaze. For example, common mouse selection events (selection and command) (Ramirez Gomez and Gellersen, 2019; Velloso and Carter, 2016) in eye-tracking games use a damping ratio of 0, i.e., the cursor's position and speed are equal to the position and speed of the gaze. In some aiming games where the gaze is used for navigation or orientation control (Smith and Graham, 2006), the game camera (field of vision) follows where the eyes look (Nielsen et al., 2012). The damping function makes gaze-adaptive interaction feasible by controlling the speed and trajectories of vision transformations in game scenes.

7.6.3. Cursor drifting

The participants frequently reported that the cursor drifted in a specific direction when their eyes pursued the brush cursor. This issue points to an eye-tracking error, highlighted by Hornof et al. (2003), where the user's eyes chase their fixation (the cursor) on the screen and cause the gaze cursor to drift.

According to research on natural eye movement patterns, eyes are target-oriented and visual attention is sensitive to moving targets. When the user is drawing, the moving brush is a significant target stimulus that appeals to the user's visual attention. Furthermore, on a blank canvas without auxiliary UI, the user must gaze at the endpoint of the drawn lines to encode the to-be-drawn contents. However, due to inherent eye-tracking errors, the estimated gaze point and the actual gaze position are consistently offset. In the previous gaze-control drawing method, where the brush point coincides with the gaze, the brush point equals the estimated gaze point, but not the actual point the user gazes at (Fig. 15). When the user tries to look at the brush point, the system will interpret the deviation between the estimated gaze point and the real gaze point as gaze movement; consequently, the brush will drift away. This means the user is always chasing the drifting brush cursor when drawing, much like a person chasing their shadow.

Common eye-tracking applications use selection-based interfaces designed with a positional tolerance for the deviation between the actual gaze position and the estimated gaze point (cursor) (Hansen et al., 2018; Menges et al., 2019). In addition, UI elements for dwell clicks are fixed, so the user can adjust their gaze to a proper position where the fixed buttons can be clicked on. However, in click-and-drag-based interfaces (e.g., free-eye drawing), a moving cursor (e.g., the brush) has less positional tolerance for the eye-tracking deviation. For example, the user needs to gaze at the brush to observe what they are drawing and conceive the next actions. Meanwhile, their gaze trajectories are translated to drawn lines in real time. Even a tiny deviation can affect the user's judgement and create performance inaccuracy. This issue will be addressed using the gaze-oriented method (the Compass) in User Study 2.

8. Experiment: User Study 2

This study tested the hypothesis that using the Compass could improve the graphical fidelity of free-eye drawing and mitigate the negative impact of eye-tracking errors and involuntary eye behaviors resulting from the eyes' stimulus-driven pattern. It also examined whether the gaze-oriented mechanism is aligned with the eye's perceptive nature and physiology. The experiment compared the graphical fidelity, drawing performance, brush controllability, and usability of our gaze-oriented method (i.e., different combinations of the Compass) with the existing free-eye drawing method, and validated the results of User Study 1 with the damping ratio dynamics.

8.1. Apparatus

The study was conducted in a laboratory at Guangxi University, China, for the convenience of recruiting a large number of participants. The eye tracker used was a Tobii Eye Tracker 4C, and the interface was displayed on a 27-inch monitor with a 16:9 aspect ratio and a resolution of 1920 × 1080 pixels (the same apparatus as in User Study 1). Participants sat in front of the monitor at a distance of 73.4 ± 10.1 cm. The drawing system ran on a desktop computer with an Intel Core i5-10400F CPU @ 2.90 GHz, 16 GB RAM, an ASUS DUAL-RX580-8G graphics card, and a Windows 10 operating system.

8.2. Participants

Participants (N=90, 47 females and 43 males, aged between 25 and 35, nondisabled) were recruited from the students of Guangxi University, China. None of the participants had any eye-related diseases or visual deficiencies; 45 of them had normal visual acuity, 32 wore glasses, and 13 wore contact lenses. Each participant was given 100 CNY (∼15.5 USD) as a token of appreciation for their participation in the study. The participants all had drawing experience but were inexperienced in using eye trackers. To maintain the same individual factor level, the MOS reviewers were the same as those in User Study 1, and the MOS standard was the same as that used in User Study 1.


8.3. Experimental setup

A between-subjects study was conducted. The 90 participants were randomly divided into six groups, which were tested under different conditions. There were two control groups without the Compass: the first group used the existing free-eye drawing method ''what you gaze at is what you get (draw)''; the second group used the conclusive damping ratio of 10−2 without the Compass. The experimental groups used the damping ratio of 10−2 and different Compass UIs, namely: (1) the circular cursor with a directional needle (CC+DN); (2) the circular cursor with the directional needle array (CC+ADN); (3) the spiral cursor with a directional needle (SC+DN); and (4) the spiral cursor with the directional needle array (SC+ADN), respectively.

To eliminate errors caused by the differences in individual drawing skills, the participants were assigned to draw universal figures such as English letters (A, B, C, a, b), Arabic numerals (1, 2, 3), and symbols (%, &, ⊠), and four simple icon-like figures (a smiley face, a leaf, a cloud, and a house). A handwriting font (Lucida Handwriting) was chosen for the letters and numerals, as this font most closely resembles human handwriting. Each participant was shown the reference images and asked to reproduce them to the best of their ability using free-eye drawing. The figures contained the geometric compositions of the figures used in Experiment 1, that is, curved, straight, horizontal, vertical, zigzag, and diagonal lines, but comprised more variables that rendered them more similar to hand-drawn figures.

Before the test, the participants were informed of the goal of the experiment and asked to sign a consent form. Before the test session, the participants were introduced to the drawing system and the Compass UI they would use, depending on their assigned group. They each spent 10 min practicing using the UI tool. Before starting to draw, each participant performed a seven-point eye-tracking calibration. To ensure that they were familiar with the figures used, they were asked to imitate the reference images on paper by hand before they completed the test. After a 1-minute break, each participant re-calibrated the eye tracker and initiated the drawing test using only their eyes. The drawing process was recorded using the Windows 10 built-in screen recorder. The drawings were taken as final images when the participants confirmed that they had finished drawing. Following the drawing tests, each participant took part in a 5-minute semi-structured interview, in which they were asked open-ended questions, including about drawing competence, accuracy, and the challenges they faced, such as ''Did you make mistakes when you were drawing? If yes, why?'' and ''What do you think of the method, and why?'' Lastly, a System Usability Scale (SUS) test (Brooke, 1996) for the drawing usability was conducted with each participant.

8.4. Data analysis

The drawn images were assessed in a blind review based on the evaluation standard (Table 2 in Section 7.4) by the same reviewers used in User Study 1. The figures drawn by each participant were rated with one MOS by each of the five reviewers, and the mean MOS was calculated. The drawing outputs and drawing process recordings were re-examined and analyzed. The results of the interview were assessed using content analysis.

8.5. Results

8.5.1. Quantitative results

Fig. 16 shows the MOS and SUS scores (mean and SD) for the different groups. The average MOS using the previous free-eye drawing method was the lowest (approx. 2), which is deemed unrecognizable according to the MOS standard (see Table 2). The mean MOS increased significantly with a damping ratio of 10−2 (a medium brush speed) (approx. 5). The experimental groups using the Compass obtained higher MOS values. The mean MOS values were higher than 6 with the Compass, which denotes that the figures are highly recognizable according to the MOS standard.

The mean MOS obtained using the Compass with CC+DN was approx. 6, and increased significantly in the CC+ADN and SC+DN groups. The SC+ADN group obtained the highest MOS, which was only slightly higher than the results obtained by the groups using CC+ADN and SC+DN.

The upward trend of the SUS scores was aligned with the trend of the MOS. The average SUS score with the previous gaze-control drawing method was the lowest (approx. 46), indicating poor usability according to the usability rankings (Bangor et al., 2009). The mean SUS score using no Compass but a damping ratio of 10−2 increased slightly (approx. 57), which is considered ''OK'' in the SUS ranking (Bangor et al., 2009). In contrast, the SUS scores of the drawings made using the Compass increased significantly. The group using CC+DN obtained an average usability score of 75, which is rated ''Good'' (Bangor et al., 2009). The other three groups using different combinations of the Compass obtained usability scores higher than 80, rated ''Excellent'' (Bangor et al., 2009). The highest SUS score was obtained by the group using SC+ADN (approx. 83).

8.5.2. Qualitative results

Fig. 17 presents the drawing results of the different groups. The drawings were retrospectively compared with the interview results to evaluate the drawing performance, as below.

(1) Previous free-eye drawing method. The results were severely distorted and out of shape; they also contained a large number of redundant lines. Most of the figures were unrecognizable, for example, the figures of ''C'', ''3'', the leaf, the cloud, and the house.

The participants found it ''very difficult to draw''. They emphasized the difficulty of moving their gaze precisely to the desired point, highlighting the fact that the interaction was unnatural. The other notable issue was cursor drifting: ''the position of the lines was not the place I gazed at, and every time I gazed at the brush, there was a wrong sketch.''

(2) No Compass. The results showed a slight improvement on those obtained with the previous method. However, the results still revealed many ill-drawn strokes, obvious distortion, and significant skewness, especially in the curved and round figures, such as the figures of ''a'', ''3'', the smiley face, and the leaf.

The participants reported drawing mistakes due to unintentional strokes: ''I could not keep myself from looking at my sketches in the middle of the drawing.'' Another factor causing unintentional drawing mistakes was cursor drifting: ''When I looked at the endpoint of the lines to examine what I had drawn, the endpoint drifted away and drew lines everywhere''. Notably, several participants expressed their desire to have some assistive tool to guide the gaze: ''It was difficult to stabilize my gaze without anything to gaze at'', and ''I wish there was a grid to help me look at the canvas''.

(3) CC+DN. The drawn figures showed fewer mistakes and less redundancy. The lines were tidier than those drawn by the first two control groups. All the figures were recognizable, though they were slightly distorted with twisted strokes and skewness.

The participants expressed their appreciation of the interface: ''It was very helpful to navigate the brush''. Two major problems were reported. One was distortion: ''The circle I drew often turned out to be spiral, the beginning and the end points of the circle could not close up'', and ''When I drew by gazing at the edge of the circular cursor, the brush always moved faster and was too sensitive on the right-hand side and could hardly move on the left-hand side''. The other problem was the angular bias of straight lines: ''The vertical lines were skewed when I draw the square''.

(4) CC+ADN. There was a significant improvement compared to the results in the three aforementioned groups. All the figures were well-proportioned and had tidy lines. There were no twisted strokes or distortion in the figures. Some figures had short redundant lines at


Fig. 16. (a) MOS of the different groups. (b) SUS scores of the different groups. In the two figures, (1)–(6) denote the six groups: (1) the existing free-eye drawing method ''what you look at is what you get (draw)''; (2) no Compass and with the damping ratio 10−2; (3) the circular cursor with a directional needle with a damping ratio of 10−2 (CC+DN); (4) the circular cursor with the directional needle array with a damping ratio of 10−2 (CC+ADN); (5) the spiral cursor with a directional needle with a damping ratio of 10−2 (SC+DN); and (6) the spiral cursor with the directional needle array with a damping ratio of 10−2 (SC+ADN).

Fig. 17. Comparison of the drawing results between groups. Each row samples the results of one subject whose MOS was close to the average MOS of the corresponding group.

the endpoint of the strokes, but the redundant lines did not hinder the overall graphical structure.

The participants revealed a distortion issue similar to that of the group with CC+DN. However, they did not report any errors related to angular bias.

(5) SC+DN. The figures presented a similar completion level to those drawn by the CC+ADN group. The arcs were slightly rounder, for example, in the figures of ''B'', ''3'', ''&'', and the smiley face.

The participants commented that the SC made it easy to draw circles and arcs: ''The spiral was a useful tool to draw full circles, as I could shift the speed by moving my gaze between the inner loop and the outer loop''. However, they reported difficulty in drawing straight lines: ''I always looked along the loops of the spiral cursor subconsciously when I was drawing straight lines, then the straight lines often bent.''

(6) SC+ADN. The figures were well-proportioned, with gliding lines and the highest level of completion out of the experimental groups. The details in the figures closely reflected the reference images.

The participants reported very few, if any, drawing mistakes with the interface: ''It [the Compass] is a smart design, I have never imagined how people can draw with eyes before'', and ''I figured out some skills in how to gaze through the Compass to draw satisfactory figures.''

8.6. Discussion

The drawing tests and the interviews validated the hypothesis that the gaze-oriented Compass cursor can significantly improve brush control, graphical fidelity, and user experience in free-eye drawing.

8.6.1. Methods without gaze-oriented interfaces

The previous free-eye drawing method is the mechanical form of ''what you look at is what you get (draw)''. The drawn results were similar to those obtained in Experiment 1, with the conclusion being that it is challenging to create precise figures using this method. In the drawing results, the drawn images included many distortions,


skewness, and redundant lines at the endpoints of the strokes. Most of the drawn images with the previous gaze-control drawing method were unrecognizable. Furthermore, this method demonstrated poor usability and unnatural interaction.

The second group, drawing using the damping ratio of 10−2, validated the conclusion drawn in Experiment 1 that the damping ratio effectively improves free-eye drawing accuracy. Notably, the results using this damping ratio revealed different scores in the two experiments. The score in Experiment 2 was lower, as the figures tested in Experiment 2 were compounded and consisted of many strokes, while the figures tested in Experiment 1 were mostly completed with one stroke; thus, the figures tested in Experiment 2 required more precise control of the brush position, direction, and graphical proportion.

In the figures drawn by the two groups without the Compass, the polyline-like strokes were similar to the scanpath of saccades (Jacob, 1991). This phenomenon verifies that eye movements are mainly saccades when drawing without a gaze-oriented visual stimulus. Thus, it is challenging to obtain smooth and well-proportioned shapes, as the user finds it difficult to locate the gaze precisely on the desired spatial position. The user is also likely to look at what they have drawn and at the endpoint of the stroke, leading to issues such as cursor drifting and involuntary eye movements that produce redundant lines.

At the commencement of free-eye drawing on a blank canvas, there are several constant target stimuli for eye movements: the endpoint of the brush (a moving target, prioritized by visual attention) and the drawn lines (a still target). The eyes naturally fix on the brush point (a moving target) and the drawn lines (a target involving specific information) (Hayhoe and Ballard, 2005; Johansson et al., 2001; Yantis, 1998), as eye movements are physiologically stimulus-driven. Hence, free-eye drawing is error-prone via the previous method, as the user must force their eyes to move against the eye's perceptive nature and gaze away from the visual stimuli. The other challenge is that in the previous method, the user cannot observe what they have drawn as the gaze trajectories are generated as pixels in real time. This is why the participants using this method intuited the need for visual aids for gaze control in the follow-up interview. Therefore, a method without a gaze-oriented target stimulus in sight is infeasible when seeking to draw complicated (compounded) figures at high fidelity and to a precise level.

8.6.2. Use of the Compass

The experimental groups using the Compass successfully achieved precise and high-fidelity figures, including variable figures. The usability and graphical fidelity of free-eye drawing with the Compass increased significantly, and the participants in these groups reported having to make less effort and experiencing less frustration than the participants in the previous groups. Moreover, the cursor drifting issue mentioned by the previous two groups was not reported by any of the users drawing with the Compass. There were rare reports of loss of observation and of mistakes in the form of unintentional strokes.

The basic UI combination was a circular cursor with a single directional needle. When the results are compared to those achieved without using gaze-oriented interfaces, it is evident that this type of UI notably improves free-eye drawing. However, the drawing performance of this basic UI combination was the lowest among the Compass groups. The images drawn via this method had twisted strokes and slight skewness. According to the user feedback on the CC, the brush velocity when gazing at the left-hand side of the cursor was different from that of the right-hand side. This meant that the lines drawn toward the left and right sides were of different lengths, even when the user's gaze was held for the same duration and at the same distance from the brush center. Conclusively, the basic UI is usable only when the eye-tracking error is minor, and it is feasible for drawing smooth curves.

In comparison to the groups using the circular cursor or a single directional needle, the users drawing with the spiral cursor and directional needle array combination achieved the highest competence in free-eye drawing. The spiral cursor made it significantly easier for the user to correct distortion and skewness and to draw well-rounded circles and arcs when the eye-tracking error was relatively severe, while the directional needle array enhanced directional control and gave the user the ability to correct the graphical angular bias caused by the eye-tracking error. Hence, there was less graphical skewness in the drawn lines in the results produced by the groups using the spiral cursor and directional needle array.

Specifically, the Compass works as a moving target triggering the eyes' smooth pursuit and generating fluent gaze trajectories (Soechting et al., 2010). This mechanism is based on the perceptive nature of eye movements and their stimulus-driven pattern (Duchowski and Duchowski, 2017; Verghese and Beutter, 2002). Since the paths of smooth pursuits closely align with the trajectories of a target-oriented stimulus (Soechting et al., 2010), the gaze trajectories are affected primarily by the outlines of the cursor's UI. This is why the directional needle and directional needle array are better at guiding the user to draw straight lines, while the circular and spiral cursors enable the user to draw curves and circles. It was also demonstrated that the Compass enabled the user's eyes to correct the observational error associated with eye-tracking bias.

Additionally, the Compass is crucial in guiding the user with drawing information, including the brush direction and brush trajectories, so that the user is aware of what they have drawn and what needs to be drawn next, making it possible for the user to encode the correct spatial position of the to-be-drawn shape without gazing toward the brush point. Therefore, the Compass can eliminate the cursor drifting in free-eye drawing and achieve accurate graphics, and different UI combinations can be tailored to and used in different drawing scenarios.

8.6.3. Smooth pursuit-based gaze interaction

Smooth pursuit eye movement has been the subject of studies for many years (Robinson, 1965), but smooth pursuit-based gaze interaction is a fairly new focus of study. Previous research has primarily employed smooth pursuits as a visual guide for swift target selection and has demonstrated smooth pursuits as a viable method to improve selection accuracy (Lohr and Komogortsev, 2017; Pantanowitz et al., 2021; Velloso et al., 2017; Vidal et al., 2015). The principle behind smooth-pursuit-based selection is categorized as ''motion correlation'': to represent a target by motion and have users select the target by replicating its motion (Velloso et al., 2017).

The present studies mostly focus on target selection, in which the smooth pursuit motions and the target locations are pre-determined. Additionally, smooth pursuit eye movements in these methods rely on an externally driven stimulus. However, gaze-control drawing is a completely different interaction that requires a swipe mouse motion, namely, click-and-drag. In the free-eye drawing zone, it is impossible to list all potential target locations for the brush cursor.

Our method leverages a novel mechanism for gaze interaction, i.e., self-driven smooth pursuits. In detail, the Compass enables the user to self-drive the stimulus while also following it in a smooth pursuit fashion. Smooth pursuits are used to control the gaze trajectories in the click-and-drag mouse event. To the best of our knowledge, this type of smooth pursuit-based interaction has not been discussed in the eye-tracking literature to date. We prove that smooth pursuits can be applied in gaze interaction that requires click-and-drag, thus bridging a gap in the gaze-control modality.

Our method also provides a generalizable smooth pursuit-based paradigm using a constant geometric interface and eliminates the need to predetermine the interfaces in different scenarios. Specifically, based on Fourier Transform (FT) theory (Boashash, 2015; Bochner et al., 1949; Hassanieh et al., 2012), in which various figures can be deformed into a function depending on space (gaze position) or time (gaze duration), the gaze position and duration, along with the outlines of the Compass, can create unlimited shapes. The drawing results demonstrate the competence of the Compass in creating varied shapes, including but not limited to letters, icons, and graphical figures, via free-eye drawing.
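For contrast with the self-driven pursuits of the Compass, the conventional ''motion correlation'' selection principle (Velloso et al., 2017) can be sketched as follows; the per-axis correlation rule and the threshold value are illustrative choices, not taken from the paper.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation of two equal-length coordinate sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def select_target(gaze_x, gaze_y, targets, threshold=0.9):
    """Pick the moving target whose trajectory best matches the gaze.

    `targets` maps a name to an (xs, ys) trajectory sampled over the same
    time window as the gaze; a target is selected only if the weaker of
    its per-axis correlations with the gaze exceeds `threshold`.
    """
    best_name, best_r = None, threshold
    for name, (tx, ty) in targets.items():
        r = min(pearson(gaze_x, tx), pearson(gaze_y, ty))
        if r > best_r:
            best_name, best_r = name, r
    return best_name
```

Pursuit-based selection of this kind needs every target's trajectory in advance, which is exactly what a blank drawing canvas cannot provide; this is the gap that the self-driven stimulus of the Compass is meant to close.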


9. Field study

This section evaluates the system usability among disabled people. To achieve this, we arranged a field study that involved eleven motor-disabled participants who could not use their hands and used eye trackers to access computers. The study was conducted at two locations simultaneously, namely, Guangxi Province and Hubei Province, China, in collaboration with the People's Hospital of Guangxi Zhuang Autonomous Region, the local day centers, and the Disabled Persons' Federation.

9.1. Apparatus

We used the same hardware in both locations to maintain study consistency: a Lenovo Legion R7000 laptop with an AMD Ryzen™ 5 4600H 3.0 GHz CPU, a Gigabyte GeForce RTX 2060 OC HDMI 3xDP 6 GB graphics card, a 15.6-inch display, and 16 GB RAM. The laptop had the Tobii eye-tracking software, EyeCompass, and Windows 10 installed. The screen resolution was 1920 × 1080 pixels. The laptop was set on a table and adapted to the participants' needs. The eye tracker used was a Tobii 4C, which was mounted at the bottom of the laptop screen. The participants sat at an approximate distance of 65 cm from the laptop.

9.2. Participants

Eleven non-paid volunteers (motor-disabled, three females and eight males, aged 25 to 70 years) with a hand or total-body physical impairment were recruited via partnerships with local hospitals, day centers, and the Disabled Persons' Federation. Four participants had had strokes, three had cerebral palsy, two had repetitive strain injuries in their wrists, and two had high paraplegia (severe motor disability). None of them had any eye-related diseases or visual deficiencies. Five of them had normal visual acuity, and six wore glasses. All the participants were experienced in using eye trackers to control a computer. Three participants were skilled in computer graphics, four were amateurs, and the other four were novices. Those experienced in drawing commonly used Adobe Photoshop or Windows Paint controlled via the widgets of eye-tracking software.

9.3. Implementation

This study was approved by the Institutional Review Board (IRB). The study was implemented in the participants' residences, workplaces, or local day centers. Before the test, the participants were informed of the procedures to be used and were asked to sign a consent form. Then, demographic data, including their use of eye trackers, drawing skills, and computer skills, were collected. Next, the participants underwent the eye-tracking calibration process and were introduced to the EyeCompass interface in a 10-minute training session. They were also tutored on the use of the dwell click and the manipulation of the EyeCompass interface to ensure that they were familiar with unimodal gaze control and the system.

The test commenced after the participants had confirmed that they had learned how to control the system, following a second eye-tracking calibration. During the test session, they were encouraged to draw simple sketches they had designed themselves. The drawing novices were permitted to look up reference images on the Internet. During the drawing process, they were free to edit the figures using all of the functions (e.g., Draw, Erase, Undo, Redo, Move, Grid) and the different Compass interfaces of the system, and could modify or withdraw the figures at any time. The participants were also allowed to take short breaks (approx. 5 min) during the drawing test. Following each break, they were asked to re-calibrate the eye trackers before starting to draw again. The drawn images were saved at the end of the drawing process once the participants were satisfied with the results.

The system usability was tested using the Chinese version of the SUS questionnaire (Brooke, 1996). Following the usability test, a 10-minute semi-structured interview was conducted with each participant, which included open-ended questions such as ''What challenges did you have while drawing?'' and ''How do you feel that this system compares to current eye-tracking tools and related applications?'' The duration of the entire process varied from 35 to 180 min in total, depending on the length of time it took the participant to finish the drawing.

9.4. Results

9.4.1. System usability

The average SUS score of the participants in this experiment was 82.5 (SD=4.47), which, according to the SUS rating scales, falls within the ''Excellent'' usability bracket (Bangor et al., 2009). All the participants rated the system positively. Eight participants gave scores above or equal to 80.0. The participant who gave the highest score (90.0) showed great interest in the system and stated: ''I am very curious about how to draw with eyes using the system, and it is amazing for disabled people to draw''. The lowest score, 77.5 (which falls within the ''Good'' usability bracket (Bangor et al., 2009)), was given by the three participants who were drawing novices. The participants commented that the gaze-control methods available in EyeCompass were ''very helpful for people who have special needs'', ''novel'', and ''innovative'', and that the gaze-control mechanisms of the system were ''emerging'' compared to the eye-tracking applications they had used. For example: ''I've never seen this method before, most eye-tracking applications can only make selections through dwell clicks. It has been a profound experience to create a drawing with my eyes.'' One of the three drawing experts stated: ''I used to draw with Photoshop via Tobii Windows Control, but it was challenging to draw accurately with the eyes. In comparison, this system allowed me to draw in a comfortable and accurate way.'' Another drawing expert stated: ''The drawing software I have used is Windows Paint, which has to be accessed via the eye-tracking widget Tobii Computer Control. This system created better graphics with more ease and fidelity than the existing approach.''

9.4.2. Interface

The participants had positive things to say about the interface, including that it was ''intuitive'', ''straightforward'', ''easy to learn'', and that it ''worked well''. One of the participants also highlighted that ''The flat design of the interface is concise and professional.'' Additionally, the participants all preferred to draw with the visually assistive grid on the canvas: ''It was more comfortable to control the gaze than a blank canvas'', especially when executing dwell clicks (e.g., ''I could at least gaze at the grid when I wanted to start or end the brush, or I could not stabilize my gaze on the canvas without looking at anything''). The users had different preferences regarding the possible combinations offered by the Compass. They also found the customized settings for the different combinations of the Compass and the brush speed setting ''(very) convenient'', ''mindful of users'', and ''helpful''.

9.4.3. Drawing results

All the participants were able to command the interface and successfully complete free-eye drawn graphics with relatively high fidelity. Remarkably, the drawing experts and amateurs created paintings of an artistic level (Fig. 18); these drawings were created with evident individual style. Additionally, one of the drawing experts commented that the system enabled the creation of various styles of illustrations, such as one-line drawings, scribbles, and abstract art.


Fig. 18. The graphics drawn by the participants.

9.5. Discussion

9.5.1. Usability of EyeCompass

EyeCompass represents a high-fidelity and effective free-eye drawing method for motor-disabled people. In the field study, the overall SUS score and user feedback indicated that the system has good usability and high acceptance among the participants. All the participants were able to create simple artistic drawings through the guidance of the Compass with unimodal gaze control. When compared to the drawings created via gaze control that were reported in the literature (Gips and Olivieri, 1996; Heikkilä, 2013a,b; Hornof et al., 2003; Meyer and Dittmar, 2009; Streichert et al., 2020; Van der Kamp and Sundstedt, 2011; Yeo and Chiu, 2006), it is apparent that our system was able to obtain results that more closely represented drawings made by hand. Specifically, previous free-eye drawing approaches (Gips and Olivieri, 1996; Streichert et al., 2020) have been unable to make a distinction between ocular observation and drawing gaze movements, leading to inaccuracies that reflect a lack of control over the brush (gaze). The other gaze-control drawing applications (Heikkilä, 2013a,b; Hornof et al., 2003) facilitate the reproduction of preset essential shapes (e.g., circle, ellipse, square, straight line, rectangle) rather than free-eye drawing, with the results featuring polylines and lacking detail and variety. In these studies, the drawings obtained and the user experiences uncovered issues including lack of brush control, a conflict between ocular observation and visuomotor transformation, inaccuracy of the gaze trajectories, and eye fatigue.

By integrating the brush-damping dynamics and the gaze-oriented mechanism, our system addressed these challenges (i.e., involuntary eye movements, the conflict between visuomotor transformation and ocular observation, and inherent eye-tracking errors) in gaze-control drawing, especially free-eye drawing, enhancing the control of gaze and the ability to create more intricate figures. The participants with a motor disability were thus able to achieve illustrations with a higher graphical fidelity using our system.

Furthermore, the gaze-based interface of EyeCompass provides straightforward and effective access to drawing functions and addresses the gaze-control problem in manipulating interfaces with pixel-level elements and multiple hierarchies, such as Photoshop. The user experiences also show that the customized options for the drawing mechanism parameters are efficient and mindful of different users' needs (Fig. 19). In particular, the users predominantly preferred drawing with the grid on. Our experiment validated the effectiveness of the gaze-oriented mechanism and proved that appropriate visual aids on the canvas that follow the stimulus-driven pattern for gaze control are helpful.

9.5.2. Generalization of EyeCompass mechanisms

Digital graphics generally depend on click-and-drag mouse interaction. Implementing the direct click-and-drag function via gaze control remains a challenge at present. The latest contribution to gaze-control click-and-drag is Gaze+Hold, which emulates click-and-drag by closing one eye (Ramirez Gomez et al., 2021). Specifically, Gaze+Hold can move an object on the screen in four steps: gazing at it, closing one eye to pick it up, gazing at a target location, and then opening the eye to drop the object. Unfortunately, Gaze+Hold is not feasible for digital graphics as it cannot control gaze trajectories in real time. To the best of our knowledge, there is scant research on a gaze interaction mode that can accurately control gaze trajectories during mouse dragging.

The high usability and good drawing performance achieved by EyeCompass point to a feasible solution for general graphics software that requires gaze trajectory control. The series of user studies verified the robustness of the novel gaze-control mechanisms, i.e., the brush-damping dynamics and the gaze-oriented mechanism, and indicated that a combination of these mechanisms could accomplish the click-and-drag mouse event with gaze accuracy. On the one hand, the damping dynamics stabilized the gaze trajectories and filtered the noise from gaze movements, thus enabling the user to control the gaze point (mouse) dynamics quantitatively and precisely with the damping ratio. On the other hand, the gaze-oriented Compass enabled the user to control the gaze naturally and accurately by using the eyes' physiological features, i.e., smooth pursuits and the stimulus-driven pattern. Various gaze trajectories could be obtained by adopting different visual stimuli as the cursor interface. The hybrid of these mechanisms is promising for use with creative design applications other than drawing, for example, computer-aided design (CAD) software.

9.5.3. Generalization of the unimodal gaze control

EyeCompass, with its unimodal gaze-control function, was demonstrated as being accessible for severely disabled participants in this field study. Other multimodal gaze-control applications for drawing and related creative programs have been created in the past. For example, Van der Kamp and Sundstedt (2011) introduced a multimodal drawing program managed via gaze and voice input: the voice commands started and stopped the drawing while the gaze commands determined the cursor position. Dziemian et al. (2016) presented a gaze-controlled writing and drawing robotic arm that could be controlled by tilting the head forward toward and backward from the canvas to indicate the start and end of the drawing task. Creed et al. (2020) proposed a novel creative design application called Sakura that used multimodal gaze interaction, within which the gaze control modality collaborated with a mechanical switch. These multimodal studies highlighted that there was an access barrier for severely motor-disabled users. The unimodal EyeCompass bridges this gap and provides better accessibility to people with motor disabilities. Furthermore, the unimodal gaze-control system can easily integrate with other control modes, such as voice input, foot paddles, and QuadSticks, to cater to various individual needs.

10. Limitations

10.1. Remaining problems in free-eye drawing

In this paper, we focused on addressing particular problems of free-eye drawing, including the lack of gaze-control accuracy and the unavailability of gaze-control drawing methods in practice. Although our method has improved gaze-control drawing significantly, some problems remain. One limitation in our study is the brush tailing observed during the drawing tests: ending the brush via dwell clicks would lead to redundant lines of varying degrees at the endpoint of the drawn line as the user waited for the end of the brush. This issue originates from the constant eye jitters in gaze data due to the nature of the eyes (Murata and Karwowski, 2018; Skovsgaard et al., 2010). Although the damping function relieved eye jitters, and the system also employed multiple other strategies (i.e., a progress bar for the dwell click, a


Fig. 19. An example of a drawing made using the customized options. The mechanisms were set to be the spiral cursor with a customized brush speed (0.5x) and the grid turned on.
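The brush-damping behavior that these customized options tune can be approximated by an exponentially weighted smoothing of the raw gaze samples (cf. the EWMA of Hunter, 1986): the brush closes only a small fraction of the gaze-brush gap per sample, so eye jitter is filtered and the damping ratio effectively sets the brush speed. The paper does not reproduce its exact update rule here, so the function below, the way the speed multiplier scales the damping ratio, and the jitter model are illustrative assumptions, not the system's implementation.

```python
# Illustrative sketch (not EyeCompass's actual code): damp the brush by
# moving it a fixed fraction of the remaining distance toward the raw
# gaze point at every sample, EWMA-style.

def damped_brush_path(gaze_samples, damping_ratio=1e-2, speed_multiplier=1.0):
    """Return smoothed brush positions for a sequence of (x, y) gaze samples.

    damping_ratio: fraction of the gaze-brush gap closed per sample
        (the user studies report 1e-2 as a good default).
    speed_multiplier: user-facing speed setting (e.g., the 0.5x option);
        scaling the ratio this way is an assumption for illustration.
    """
    alpha = min(1.0, damping_ratio * speed_multiplier)
    bx, by = gaze_samples[0]
    path = [(bx, by)]
    for gx, gy in gaze_samples[1:]:
        bx += alpha * (gx - bx)  # move a fixed fraction toward the gaze
        by += alpha * (gy - by)
        path.append((bx, by))
    return path

# A jittery fixation around (100, 100): raw gaze bounces by several
# pixels per sample, while the damped brush barely moves.
jitter = [(100 + (-1) ** i * 3, 100 + (-1) ** i * 2) for i in range(200)]
smooth = damped_brush_path(jitter, damping_ratio=1e-2)
```

With the 10⁻² damping ratio found optimal in the user studies, per-sample jitter of a few pixels is almost entirely absorbed, which is the stabilizing effect described in Section 9.5; a larger ratio (or speed multiplier) makes the brush track the gaze more tightly at the cost of more visible jitter.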

shortened dwell time); brush tailing could not be eliminated as it was rooted in the dwell click interaction.

Therefore, alternative approaches to ending the brush rapidly and robustly are necessary for improved gaze-control drawing. In our extended research, a trial version using eye closure to activate the drawing command was tested with a small group of people. Still, the approach proved to be problematic, as the brush point went downward with the closing of the eyelids, as usually occurs during eye closing. The consequence was a plethora of involuntary lines. Due to the time limits and the demands on the participants, a further user study was not conducted. However, it is imperative to determine a quicker and more precise approach to exiting drawing execution in the future.

10.2. System capability

Another limitation is that the system's capability is vulnerable to factors such as algorithmic issues arising from eye trackers, interaction environments, and perceptual problems. Drawing with the eyes is conducted in a more complex environment than drawing with the hands. Even a perfect free-eye drawing system will fail due to multiple impact factors, including algorithmic issues related to eye trackers, eye-tracking calibration, the user's gaze interaction skill, and the perceptual abilities of the user. For example, the gaze trajectories will not be correctly generated if the gaze data is not correctly captured by the devices.

Furthermore, the system involves different combinations of the Compass and brush speed settings capable of meeting different drawing needs. Mastering the tricks of conceiving what and how to draw via the Compass demands significant learning time and exposure to the system. This was evident when participants in the field study who were drawing novices were less perceptive of the tricks of the Compass. More sophisticated methods need to be explored to simplify the learning process of free-eye drawing.

Therefore, although the usability of free-eye drawing was improved significantly by EyeCompass, there remains a need to keep pace with the capabilities and conveniences of drawing by hand. Employing technologies such as machine learning and deep learning could potentially reduce the problematic impact factors mentioned above, pointing toward interesting directions for future research.

10.3. System function design

EyeCompass is a basic drawing system that focuses on solving the critical problems in current gaze-control drawing methods. The main goal of EyeCompass is to enable high-fidelity and efficient free-eye drawing via unimodal gaze control. As such, it has a limited set of drawing functions. When it is generalized to drawing applications in the future, additional editing functions (e.g., rotation, transformation, pixel selection, brush styles, multiple views) would be desirable, as would a greater amount of interaction design via gaze control. The extension of the solution's functionality requires rigorous and longitudinal user tests based on the principles of gaze-control drawing (Section 3) and gaze-based interface design (Hansen et al., 2018).

11. Conclusion and future work

Previous gaze-control drawing research cannot facilitate gaze-control drawing with high-fidelity graphics and good system accessibility. In terms of gaze-control drawing for motor-disabled people, there is currently no sophisticated application on the market, nor has any literature been produced on the matter in recent years. The research that does exist was published some time ago and is therefore outdated, due to its use of simplistic gaze-control drawing methods and superseded interfaces (Gips and Olivieri, 1996; Heikkilä, 2013a,b; Hornof et al., 2003; Meyer and Dittmar, 2009; Van der Kamp and Sundstedt, 2011). Moreover, the graphic effectiveness and system usability of these studies were rarely verified in the articles concerned.

Several recent studies have adopted multimodal approaches to developing gaze-control drawing and achieved a promising degree of usability. However, multimodal gaze control excludes some severely motor-disabled people from using the application; unimodal gaze control is the most accessible approach for all physical abilities, including individuals with motor disabilities.

This paper, therefore, proposed a free-eye drawing method that enables high-fidelity and efficient free-eye drawing via unimodal gaze control. The drawing system (EyeCompass) was developed based on cross-disciplinary bodies of work, including the physiological eye movement patterns involved, the nature of the eye in eye-hand drawing, and eye-tracking technologies. EyeCompass consists of two novel gaze-control mechanisms: brush-damping dynamics and gaze-oriented interfaces. This paper postulated two hypotheses to examine the system's usability and fidelity: (1) brush-damping dynamics enable users to draw intuitively, reduce involuntary eye movement error, and stabilize the brush speed; and (2) the gaze-oriented Compass aligns with the eyes' target-oriented perceptive nature and makes it feasible to draw various figures with high fidelity and efficiency.

The above hypotheses were tested via a series of user studies conducted in laboratories and fieldwork among motor-disabled people. User Study 1 verified that the brush-damping dynamics were able to significantly improve brush stability and controllability and to reduce noise from involuntary eye movements and eye jitters to a considerable degree. The user feedback indicated that the damping function conformed to the real-world drawing interaction (Gowen and Miall, 2006; Tchalenko, 2007; Tchalenko et al., 2014) by providing a time lapse for eye observation and cognition. Moreover, a damping ratio of 10⁻² was demonstrated to be the optimal value for both drawing performance and efficiency. User Study 2 demonstrated a significant improvement in drawing performance via the Compass. By gazing through the Compass' cursors, users could draw high-fidelity figures with greater flexibility and lower error rates than using the previous gaze-control drawing methods. Additionally, the extended interface design of the Compass could compensate for the inherent eye-tracking bias.

Finally, the field study investigated the system usability among motor-disabled people and the interface preferences of the users. It was found that EyeCompass has superior usability to previous gaze-control drawing approaches and produced optimistic reports of user


experiences among motor-disabled people. The drawing outputs in the field study also emphasized the capability of EyeCompass to produce flexible graphics via unimodal gaze control.

In conclusion, EyeCompass helps people to draw using only their eyes, which is of value to the motor disabled. The system developed by this study has filled the void in free-eye drawing research in recent years and addressed the main challenges of current gaze-control drawing, especially free-eye drawing, including involuntary eye movements, the conflict of the dual task (observation and drawing) for the eyes, and the effect of eye-tracking errors. The system achieved hand-sketch-level drawing with high fidelity and creativity, showing the technique's ability to complement real-world drawing tasks. The mechanisms behind the system, namely the gaze-oriented Compass and the brush-damping dynamics, also make a scientific contribution that advances understanding of the kind of gaze interaction that is comfortable for the eyes. Furthermore, the combination of the Compass and brush-damping dynamics is generalizable to the typical click-and-drag mouse event. Therefore, the gaze interaction of EyeCompass is of compelling value for future gaze-interaction research, not only for gaze-controlled drawing but also for other application scenarios, such as digital design software, gaze-controlled games, and desktop-control eye-tracking programs.

Moreover, this system has fulfilled the need for universal access via its unimodal gaze control, which is compatible with other multimodal devices, such as voice input, foot paddles, and QuadStick. As human–computer interaction becomes increasingly ubiquitous, interactions coupling multiple modalities will provide motor-disabled people with greater adaptation and accessibility.

There are multiple directions for further research: (1) the exploration of a substitute for dwell click for start-drawing and stop-drawing commands; (2) the exploration of artificial intelligence methods to assist in free-eye drawing, such as the automatic revision and generation of sketches; (3) the expansion of the drawing functions and connections to common drawing applications, such as Adobe Photoshop and Windows Paint; and (4) the further development and improvement of gaze interaction based on eye physiology, enabling more motor-disabled people to produce creative works.

CRediT authorship contribution statement

Lida Huang: Conceptualization, Methodology, Software, Investigation, Formal analysis, Writing – original draft. Thomas Westin: Proofreading. Mirjam Palosaari Eladhari: Writing – review & editing, Supervision. Sindri Magnússon: Supervision. Hao Chen: Data curation, Visualization, Investigation, Validation.

Declaration of competing interest

One or more of the authors of this paper have disclosed potential or pertinent conflicts of interest, which may include receipt of payment, either direct or indirect, institutional support, or association with an entity in the biomedical field which may be perceived to have a potential conflict of interest with this work. For full disclosure statements refer to https://doi.org/10.1016/j.ijhcs.2022.102966. The first author has a survey paper entitled "A Study of the Challenges of Eye Tracking Systems and Gaze Interaction for Individuals with Motor Disabilities" (cited as Huang, L., Xu, C., Westin, T., et al., in: International Conference on Human-Computer Interaction. Springer, Cham, 2022, pp. 396–411), published by the International Conference on Human-Computer Interaction, Springer. In that conference paper, the challenge of gaze-control drawing was identified among motor-disabled people. However, the conference paper does not conflict with the present paper: it provides background information and practical examples of gaze-control drawing and is cited by this journal paper. Therefore, we report the conference paper to avoid any possible misinterpretation.

Data availability

The data of the user studies and the field study are confidential as they involve personal information. The code of EyeCompass is available on request.

References

Aldridge, R., Davidoff, J., Ghanbari, M., Hands, D., et al., 1995. Measurement of scene-dependent quality variations in digitally coded television pictures. IEEE Proc.-Vision Image Signal Process. 142 (3), 149–154.
Bangor, A., Kortum, P., Miller, J., 2009. Determining what individual SUS scores mean: Adding an adjective rating scale. J. Usability Stud. 4 (3), 114–123.
Barasch, M., 2013. Theories of Art: 3. From Impressionism to Kandinsky. Routledge.
Boashash, B., 2015. Time-Frequency Signal Analysis and Processing: A Comprehensive Reference. Academic Press.
Bochner, S., Chandrasekharan, K., 1949. Fourier Transforms. Princeton University Press.
Bozomitu, R., Păsărică, A., Cehan, V., Rotariu, C., Costin, H., 2017. Methods of control improvement in an eye tracking based human-computer interface. In: 2017 IEEE 23rd International Symposium for Design and Technology in Electronic Packaging. SIITME, IEEE, pp. 300–303.
Brooke, J., 1996. SUS: A "quick and dirty" usability scale. In: Usability Evaluation in Industry. p. 189.
Cecotti, H., 2016. A multimodal gaze-controlled virtual keyboard. IEEE Trans. Hum.-Mach. Syst. 46 (4), 601–606.
Chin, C.A., Barreto, A., Cremades, J.G., Adjouadi, M., 2008. Integrated electromyogram and eye-gaze tracking cursor control system for computer users with motor disabilities. Electrical and Computer Engineering Faculty Publications.
Cranley, N., Perry, P., Murphy, L., 2006. User perception of adapting video quality. Int. J. Hum.-Comput. Stud. 64 (8), 637–647.
Creed, C., Frutos-Pascual, M., Williams, I., 2020. Multimodal gaze interaction for creative design. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. pp. 1–13.
Duchowski, A.T., 2017. Eye Tracking Methodology: Theory and Practice. Springer.
Dziemian, S., Abbott, W.W., Faisal, A.A., 2016. Gaze-based teleprosthetic enables intuitive continuous control of complex robot arm use: Writing & drawing. In: 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob). IEEE, pp. 1277–1282.
Feit, A.M., Williams, S., Toledo, A., Paradiso, A., Kulkarni, H., Kane, S., Morris, M.R., 2017. Toward everyday gaze input: Accuracy and precision of eye tracking and implications for design. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. pp. 1118–1130.
Gips, J., Olivieri, P., 1996. An eye control system for persons with disabilities. In: The Eleventh International Conference on Technology and Persons with Disabilities. Los Angeles, California.
Goodrich, D., 1954. Imaginative design: Abstract drawing projects for beginners in art. Design 55 (3), 120–142.
Gowen, E., Miall, R.C., 2006. Eye–hand interactions in tracing and drawing tasks. Hum. Mov. Sci. 25 (4–5), 568–585.
Hansen, D.W., Ji, Q., 2009. In the eye of the beholder: A survey of models for eyes and gaze. IEEE Trans. Pattern Anal. Mach. Intell. 32 (3), 478–500.
Hansen, J.P., Rajanna, V., MacKenzie, I.S., Bækgaard, P., 2018. A Fitts' law study of click and dwell interaction by gaze, head and mouse with a head-mounted display. In: Proceedings of the Workshop on Communication by Gaze Interaction. pp. 1–5.
Hassanieh, H., Indyk, P., Katabi, D., Price, E., 2012. Simple and practical algorithm for sparse Fourier transform. In: Proceedings of the Twenty-Third Annual ACM-SIAM Symposium on Discrete Algorithms. SIAM, pp. 1183–1194.
Hayhoe, M., 2000. Vision using routines: A functional account of vision. Vis. Cogn. 7 (1–3), 43–64.
Hayhoe, M., Ballard, D., 2005. Eye movements in natural behavior. Trends Cogn. Sci. 9 (4), 188–194.
Heikkilä, H., 2013a. EyeSketch: A drawing application for gaze control. In: Proceedings of the 2013 Conference on Eye Tracking South Africa. pp. 71–74.
Heikkilä, H., 2013b. Tools for a gaze-controlled drawing application – comparing gaze gestures against dwell buttons. In: IFIP Conference on Human-Computer Interaction. Springer, pp. 187–201.
Hillstrom, A.P., Yantis, S., 1994. Visual motion and attentional capture. Percept. Psychophys. 55 (4), 399–411.
Hornof, A., Cavender, A., Hoselton, R., 2003. EyeDraw: A system for drawing pictures with eye movements. ACM SIGACCESS Accessibility Comput. (77–78), 86–93.
Huang, L., Xu, C., Westin, T., Dupire, J., Le Lièvre, F., Shi, X., 2022. A study of the challenges of eye tracking systems and gaze interaction for individuals with motor disabilities. In: International Conference on Human-Computer Interaction. Springer, pp. 396–411.
Hunter, J.S., 1986. The exponentially weighted moving average. J. Qual. Technol. 18 (4), 203–210.


Istance, H., Bates, R., Hyrskykari, A., Vickers, S., 2008. Snap clutch, a moded approach to solving the Midas touch problem. In: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications. pp. 221–228.
Jacob, R.J., 1991. The use of eye movements in human-computer interaction techniques: What you look at is what you get. ACM Trans. Inf. Syst. (TOIS) 9 (2), 152–169.
Johansson, R.S., Westling, G., Bäckström, A., Flanagan, J.R., 2001. Eye–hand coordination in object manipulation. J. Neurosci. 21 (17), 6917–6932.
Kasprowski, P., Harezlak, K., Niezabitowski, M., 2016. Eye movement tracking as a new promising modality for human computer interaction. In: 2016 17th International Carpathian Control Conference. ICCC, IEEE, pp. 314–318.
Lander, C., Löchtefeld, M., Krüger, A., 2018. Heyebrid: A hybrid approach for mobile calibration-free gaze estimation. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1 (4), 1–29.
Lankford, C., 2000. Effective eye-gaze input into Windows. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications. pp. 23–27.
Lohr, D.J., Komogortsev, O.V., 2017. A comparison of smooth pursuit- and dwell-based selection at multiple levels of spatial accuracy. In: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems. pp. 2760–2766.
Majaranta, P., Bulling, A., 2014. Eye tracking and eye-based human–computer interaction. In: Advances in Physiological Computing. Springer, pp. 39–65.
Maxwell, S.E., Delaney, H.D., Kelley, K., 2017. Designing Experiments and Analyzing Data: A Model Comparison Perspective. Routledge.
Menges, R., Kumar, C., Staab, S., 2019. Improving user experience of eye tracking-based interaction: Introspecting and adapting interfaces. ACM Trans. Comput.-Hum. Interact. 26 (6), 1–46.
Meyer, A., Dittmar, M., 2009. Conception and development of an accessible application for producing images by gaze interaction – EyeArt.
Murata, A., Karwowski, W., 2018. Automatic lock of cursor movement: Implications for an efficient eye-gaze input method for drag and menu selection. IEEE Trans. Hum.-Mach. Syst. 49 (3), 259–267.
Nielsen, A.M., Petersen, A.L., Hansen, J.P., 2012. Gaming with gaze and losing with a smile. In: Proceedings of the Symposium on Eye Tracking Research and Applications. pp. 365–368.
Pantanowitz, A., Kim, K., Chewins, C., Rubin, D.M., 2021. Gaze tracking dataset for comparison of smooth and saccadic eye tracking. Data Brief 34, 106730.
Parisay, M., Poullis, C., Kersten-Oertel, M., 2021. EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques. Int. J. Hum.-Comput. Stud. 102676.
Ramirez Gomez, A.R., Clarke, C., Sidenmark, L., Gellersen, H., 2021. Gaze+Hold: Eyes-only direct manipulation with continuous gaze modulated by closure of one eye. In: ACM Symposium on Eye Tracking Research and Applications. pp. 1–12.
Ramirez Gomez, A., Gellersen, H., 2019. Looking outside the box: Reflecting on gaze interaction in gameplay. In: Proceedings of the Annual Symposium on Computer-Human Interaction in Play. pp. 625–637.
Robinson, D.A., 1965. The mechanics of human smooth pursuit eye movement. J. Physiol. 180 (3), 569–591.
Roelfsema, P.R., Lamme, V.A., Spekreijse, H., 2000. The implementation of visual routines. Vis. Res. 40 (10–12), 1385–1411.
Ruskin, J., 1876. The Elements of Drawing; in Three Letters to Beginners.
Skovsgaard, H., Mateo, J.C., Flach, J.M., Hansen, J.P., 2010. Small-target selection with gaze alone. In: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. pp. 145–148.
Slobodenyuk, N., 2016. Towards cognitively grounded gaze-controlled interfaces. Pers. Ubiquitous Comput. 20 (6), 1035–1047.
Smith, J.D., Graham, T.N., 2006. Use of eye movements for video game control. In: Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology. pp. 20–es.
Snowden, R.J., Thompson, P., Troscianko, T., 2012. Basic Vision: An Introduction to Visual Perception. Oxford University Press.
Soechting, J.F., Rao, H.M., Juveli, J.Z., 2010. Incorporating prediction in models for two-dimensional smooth pursuit. PLoS One 5 (9), e12574.
Stawicki, P., Gembler, F., Rezeika, A., Volosyak, I., 2017. A novel hybrid mental spelling application based on eye tracking and SSVEP-based BCI. Brain Sci. 7 (4), 35.
Streichert, A., Angerbauer, K., Schwarzl, M., Sedlmair, M., 2020. Comparing input modalities for shape drawing tasks. In: ACM Symposium on Eye Tracking Research and Applications. pp. 1–5.
Streijl, R.C., Winkler, S., Hands, D.S., 2016. Mean opinion score (MOS) revisited: Methods and applications, limitations and alternatives. Multimedia Syst. 22 (2), 213–227.
Sun, H.-M., Huang, C.-W., 2011. The effect of media richness factors on representativeness for video skim. Int. J. Hum.-Comput. Stud. 69 (11), 758–768.
Tchalenko, J., 2001. Free-eye drawing. Point: Art Des. Res. J. 11, 36–41.
Tchalenko, J., 2007. Eye movements in drawing simple lines. Perception 36 (8), 1152–1167.
Tchalenko, J., Nam, S.-H., Ladanga, M., Miall, R.C., 2014. The gaze-shift strategy in drawing. Psychol. Aesthet. Creativity Arts 8 (3), 330.
Ullman, S., 1987. Visual routines. In: Readings in Computer Vision. Elsevier, pp. 298–328.
Urbina, M.H., Huckauf, A., 2010. Alternatives to single character entry and dwell time selection on eye typing. In: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. pp. 315–322.
Van der Kamp, J., Sundstedt, V., 2011. Gaze and voice controlled drawing. In: Proceedings of the 1st Conference on Novel Gaze-Controlled Applications. pp. 1–8.
Vázquez, M., Steinfeld, A., 2014. An assisted photography framework to help visually impaired users properly aim a camera. ACM Trans. Comput.-Hum. Interact. 21 (5), 1–29.
Velichkovsky, B.B., Rumyantsev, M.A., Morozov, M.A., 2014. New solution to the Midas touch problem: Identification of visual commands via extraction of focal fixations. Procedia Comput. Sci. 39, 75–82.
Velloso, E., Carter, M., 2016. The emergence of eyeplay: A survey of eye interaction in games. In: Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play. pp. 171–185.
Velloso, E., Carter, M., Newn, J., Esteves, A., Clarke, C., Gellersen, H., 2017. Motion correlation: Selecting objects by matching their movement. ACM Trans. Comput.-Hum. Interact. 24 (3), 1–35.
Verghese, P., Beutter, B.R., 2002. Motion processing. In: Ramachandran, V. (Ed.), Encyclopedia of the Human Brain. Academic Press, New York, pp. 117–135.
Vidal, M., Bulling, A., Gellersen, H., 2015. Pursuits: Spontaneous eye-based interaction for dynamic interfaces. GetMobile: Mob. Comput. Commun. 18 (4), 8–10.
Yantis, S., 1998. Control of visual attention. Attention 1 (1), 223–256.
Yeo, A.W., Chiu, P.-C., 2006. Gaze estimation model for eye drawing. In: CHI'06 Extended Abstracts on Human Factors in Computing Systems. pp. 1559–1564.
Zhang, S., Tian, Y., Wang, C., Wei, K., 2020. Target selection by gaze pointing and manual confirmation: Performance improved by locking the gaze cursor. Ergonomics 63 (7), 884–895.

