Eye Gaze Movement Studies of Control Room Operators

Computers and Chemical Engineering 85 (2016) 43–57


Eye gaze movement studies of control room operators: A novel approach to improve process safety
Chandresh Sharma a, Punitkumar Bhavsar a, Babji Srinivasan a,b,∗, Rajagopalan Srinivasan b,∗∗

a Department of Electrical Engineering, Indian Institute of Technology Gandhinagar, VGEC Campus, Chandkheda, Ahmedabad 382424, India
b Department of Chemical Engineering, Indian Institute of Technology Gandhinagar, VGEC Campus, Chandkheda, Ahmedabad 382424, India

∗ Corresponding author at: Department of Chemical Engineering, Indian Institute of Technology Gandhinagar, VGEC Campus, Chandkheda, Ahmedabad 382424, India. Tel.: +91 7932210155; fax: +91 7923972586.
∗∗ Corresponding author.
E-mail addresses: babji.srinivasan@iitgn.ac.in (B. Srinivasan), raj@iitgn.ac.in (R. Srinivasan).

Article info

Article history:
Received 16 April 2015
Received in revised form 5 September 2015
Accepted 15 September 2015
Available online 23 October 2015

Keywords:
Cognitive engineering
Human error
Eye tracking
Process safety

Abstract

Process industries continue to suffer from accidents despite significant regulatory intervention since the mid-1980s. Human error is widely considered to be the major cause of most accidents today. Detailed analysis of various incidents indicates that reduced staffing levels in control rooms and inadequate operator training with complex automation strategies are common reasons for human errors. Therefore, there is a need to develop a deeper understanding of human errors as well as strategies to prevent them. However, as with hardware failures, human error has traditionally been quantified using likelihood approaches; this viewpoint abnegates the role of the cognitive abilities of the operators. Recent studies in other safety-critical domains (aviation, health care) show that the operator's level of situation awareness, as inferred by eye tracking, is a good online indicator of human error. In this work, a novel attempt is made to understand the behavior of the operator in a typical chemical plant control room using the information obtained from an eye tracker. Experimental studies conducted on 72 participants reveal that fixation patterns contain signatures of the operators' learning and awareness in various situations. Implications of these findings for human error in process plant operations are discussed.
© 2015 Elsevier Ltd. All rights reserved.

1. Introduction

Safety and security are of paramount importance in chemical plants, as highlighted by President Barack Obama recently labeling them as "stationary weapons of mass destruction" (Obama, 2006). This realization is now at least four decades old, and was originally brought out by the disasters at Flixborough, UK (1974), Bhopal, India (1984), and Piper Alpha (1988), among others. Since then, both governments and industry around the world have made numerous interventions to improve process safety. Stringent Process Safety Management regulations are now the norm in most industrialized countries. Today's plants use highly reliable equipment and state-of-the-art automation and control, and deploy sophisticated safety management regimes so as to make accidents rare. Despite these, a number of recent accidents, including the fires at oil storage facilities in Buncefield, UK (2005) and Jaipur, India (2009), BP's Macondo blowout (2010), the Chevron Richmond refinery fire (2012), and the explosion at a fertilizer storage and distribution facility at West, TX (2013), point to the continued need to improve process safety. On a statistical basis, of the 20 accidents that have led to the largest property damage losses in the hydrocarbon industry in the 40-year period from 1974 to 2013, 25% have occurred in the last 5 years from 2009 (Marsh, 2014). Thus process safety is at least as important today as it was in the 1970s and 80s.

A detailed analysis of incidents (Gupta, 2002; Paté-Cornell, 1993; Carson et al., 1992) reveals that human error is one of the principal causes of accidents in the process industries. Statistics show that about 70% of the accidents in process industries are caused by human errors (Mannan, 2004). Some of the reasons widely attributed to this include: (i) the larger scale and complexity of modern chemical plants with tight mass and energy integrations, (ii) the reduction in staffing levels in many control rooms as well as the increase in the proportion of relatively inexperienced operators as older operators retire, and (iii) the deployment of sophisticated automation and complex automation strategies without a concomitant increase in operations personnel's cognitive ability. Therefore, there is a need to develop a deeper understanding of human error in process safety (Mearns et al., 2002; Gordon et al., 2002; Flin et al., 2002) as well as strategies to prevent it.


The most common approach in practice today to tackle human error is to consider human failure during process hazard analysis. The role of humans in the process and the expected types of human failures are taken into account qualitatively in checklist analysis, HAZOP studies, etc. Quantitatively, human reliability is accounted for using Human Error Probabilities (Swain, 1990) in task-specific risk analysis (Munger et al., 1962; Noroozi et al., 2014). In recent years, there has also been increased effort in accounting for human factors that may impact human failures. Examples of this include designing ergonomic control rooms and user-centered design of human–machine interfaces, as exemplified by the work of the Abnormal Situation Management consortium (Cochran and Bullemer, 1996; Reising et al., 2005). However, less attention has been paid to understanding the cognitive behavior of operators under abnormal situations and to real-time interventions to prevent or reduce human error.

Cognitive engineering is a multidisciplinary research area that focuses on analyzing the basic cognitive tasks (such as perception, memory, and reasoning) of human operators to understand their mental workload, decision-making process, planning and situation awareness in industrial settings (Norman, 1986; Parasuraman et al., 2008). As defined by Wilson et al. (2013), "cognitive engineering is the application of cognitive psychology and related disciplines to the design and operation of human–machine systems. Cognitive engineering combines both detailed and close study of the human worker in the actual work context and the study of the worker in more controlled environments." Cognitive engineering studies in many high-risk industries show that human errors typically originate from failures of Situation Assessment (SA) – "the perception of the elements in the environment within a volume of space and time, the comprehension of their meaning and the projection of their status in the near future" (Endsley, 1988). In the context of process supervision, SA is the ability to perceive information from the process in order to detect if any abnormality exists and decide on future actions to return the process to a normal or safe state. When an operator is unable to perceive the necessary information required to perform a particular task, or if the information perceived is erroneous or incomplete, it impacts the decision making process and leads to unexpected acts on the part of the human operator, as exemplified by the incidents at Three Mile Island (1979), Esso's Longford refinery (1998), and BP's Texas City refinery (2005).

Recent technologies are making it possible to infer the operators' level of SA in real-time. The first step towards SA – perception – manifests itself in the operator's visual attention, i.e., the set of cognitive operations that mediate the selection of relevant information and the filtering out of irrelevant information (Binder et al., 2009). Human visual attention offers a direct and real-time assessment of SA. Consequently, researchers are now starting to use visual attention measurements such as eye tracking to measure and improve human performance in various spheres requiring time-critical decision making such as driving, aviation, surgery, and sports.

In this paper, we explore the potential of eye tracking to measure and understand the cognitive state of the control room operator during process disturbances. This complements the traditional focus of the Process Systems Engineering community, wherein real-time sensor measurements from the process are utilized to understand the state of the plant equipment. The rest of the paper is organized as follows. Section 2 presents an overview of eye tracking and its use for studying situation awareness in various domains. We have conducted observational studies of human subjects as they seek to control a simulated chemical process in the face of disturbances. The details of the experimental setup and methodology used in this research are described in Section 3. Results from the experiments reveal that loss of SA and the inability of a subject to adequately counter a process disturbance are accompanied by distinct patterns apparent in eye tracking. Evidence of this is presented in the form of illustrative examples in Section 4, and detailed results that lead to this conclusion in Section 5.

2. Eye tracking

The human eye lets light in through the pupil and projects the image onto the retina at the back of the eye. The retina contains light-sensitive cells that transduce the incoming light into electrical signals for further processing. The light-sensitive cells are not uniformly distributed throughout the retina; rather, there is a small area called the fovea where they are over-represented. In order to see a selected object sharply, the eye has to move so that the light from the object falls directly on the fovea. Eye tracking is the process of measuring the motion of the eye relative to the head and inferring the point of gaze – the location on the stimulus object that the subject is looking at.

Research has shown that the movement of the eye contains specific events (Duchowski, 2007; Majaranta and Räihä, 2002). Typically, eye movement is segmented into two distinct patterns – fixations and saccades. For example, when reading, the eye temporarily stops at a word and remains still for a period of time. This pause in eye movement is called a fixation and is necessary to stabilize the image of the word on the retina. Fixations typically last between tens of milliseconds and several seconds. The eye also rapidly moves from word to word during reading, i.e., from one fixation to another. Such a rapid movement is called a saccade. A saccade takes 30–80 ms and can be executed voluntarily or reflexively. Eye movement is measured in visual degrees, which can be translated into spatial coordinates (mm or pixels on a computer screen) of the stimulus object based on the viewing distance (Groot et al., 1994). An example of the set of fixations and saccades of a respondent viewing a DCS screen is shown in Fig. 1. In the figure, fixations are shown by dark rectangles (numbers indicating the fixation time in milliseconds) and saccades by straight lines connecting the fixations in the sequence indicated by the numbers on the lines. Besides fixations and saccades, other events in eye movement include smooth pursuit – tracking a slowly moving object in order to keep the image stable on the retina – and miniature movements such as tremors, drifts and microsaccades that occur during fixation and help prevent the fading away of stationary objects.

The history of eye trackers, devices for measuring eye movement, dates back to the late 1800s, when they were mostly custom-built, mechanical, and uncomfortable. In recent years, eye trackers have become commercially available, non-intrusive, and adequately accurate and robust for widespread adoption. A variety of eye trackers are available from companies such as Tobii, SR Research and SMI, which offer non-intrusive measurement of observers' gaze in a variety of situations (Hermens et al., 2013). The dominant method of eye tracking relies on video-based measurement of eye movements. A schematic of a typical setup is shown in Fig. 2. Here the eye is illuminated by a pattern of infrared light. The reflected light is captured by a camera and an image of the eye is obtained. This image is processed using proprietary image analysis algorithms to estimate a gaze vector. This technique of eye tracking is known as Pupil Center Corneal Reflection (Tobii Technology, 2010). The sampling frequency of the eye tracker, i.e., the number of images acquired and processed per second, is a rough measure of the level of detail that can be observed in the resulting data – the higher the frequency, the more detailed the eye movement events that can be detected. Typical frequencies today range from about 20 Hz at the low end to 2000 Hz. The interested reader is referred to Tobii Technology (2010) for a more detailed explanation of the underlying technology in a typical eye tracker.
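To make the conversion from visual degrees to on-screen coordinates mentioned above concrete, the following minimal Python sketch applies the standard viewing geometry. The function name, the viewing distance and the screen dimensions are illustrative assumptions, not values reported in the paper.

```python
import math

def degrees_to_pixels(visual_angle_deg, viewing_distance_mm, pixels_per_mm):
    """Convert a visual angle (degrees) into an on-screen distance in pixels.

    Assumes a flat screen viewed roughly head-on; viewing_distance_mm and
    pixels_per_mm (horizontal resolution / physical width) are measured values.
    """
    size_mm = 2.0 * viewing_distance_mm * math.tan(math.radians(visual_angle_deg) / 2.0)
    return size_mm * pixels_per_mm

# Example: 1 degree of visual angle at a 650 mm viewing distance on a display
# that is 1920 pixels and about 507 mm wide (illustrative numbers only).
print(round(degrees_to_pixels(1.0, 650.0, 1920.0 / 507.0), 1), "pixels")
```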

Fig. 1. Fixation and saccades on the schematic.

Fig. 2. Eye tracker setup and operation.

The set of gaze vectors collected from an experiment can be analyzed using various statistical and computational techniques. As an illustration, a fixation of 100 ms when measured at 200 Hz would result in a series of 20 gaze vector samples that are spatially clustered together in an Area Of Interest (AOI). So clustering can be utilized to identify fixations. If an AOI is large, there may be multiple clusters (i.e., fixations) within the same AOI. A dwell is defined as one visit in an AOI, from entry to exit. A variety of useful statistical measures can be computed from the eye tracking data, such as:

• Fixation duration: the period during which the eye is relatively still, usually measured for specific AOIs.
• Fixation rate: the number of fixations in an AOI per second.
• Dwell time: the summation of all fixation durations in the AOI. If the AOI is small then its dwell time is the same as the fixation duration.
• Saccadic duration: the time taken to move between two fixations.
• Peak saccadic velocity: the highest velocity reached during saccades.

At a higher level, the entire eye movement during a task or a portion of it can be segmented into a sequence of fixations and saccades. Correlation between sequences from different participants can be used to identify behavioral similarity. A detailed review of eye tracking methods and measures is given by Holmqvist et al. (2011).

Eye movements are considered to offer a real-time window into cognition based on the "eye-mind" hypothesis, that is, what a person is looking at at a point in time is indicative of her thought "on top of the stack" of cognitive processes (Just and Carpenter, 1976; Cooke, 2005). Thus eye movement is considered to provide a dynamic trace of the person's attention. The various metrics discussed above can then be interpreted as providing insight into the cognition process; for instance, fixation duration is a measure of the amount of processing being applied, fixation rate a measure of task difficulty, dwell time of uncertainty, etc. (Poole and Ball, 2005; Irwin, 2004; Kilingaru et al., 2013). A number of application domains have utilized eye tracking to derive insights into human behavior.

2.1. Applications

Research in areas such as surgery (Hermens et al., 2013; Ahmidi et al., 2010), driving safety (Pradhan et al., 2005; Palinko et al., 2010), visual search and information processing (Glöckner and Herbold, 2011; Toker et al., 2013), consumer search dynamics in supermarkets (Reutskaja et al., 2011), multimedia learning (van Gog and Scheiter, 2010), aviation (Kilingaru et al., 2013; Maier et al., 2014), and human–computer interaction (Drewes, 2010) has reported insights obtained using eye tracking.

Recently, several studies have used fixation analysis to monitor and assess the performance of airplane pilots (Kilingaru et al., 2013; Schulz et al., 2011; Hasse et al., 2012). As one example, Kilingaru et al. (2013) used fixations and glances to understand the behavior of expert and novice pilots. Based on the dwell time, they mapped the cognitive state of the operator into three categories: attention focusing, attention blurring, and misplaced attention.

Table 1
Variables in ethanol plant case study.

Measured variable Description Steady state value High alarm limit Low alarm limit

C101   Ethene concentration in CSTR (mmol/l)          1378.5   1555.6   955.6
F101   CSTR feed flowrate (l/h)                       700.7    950      550
F102   CSTR coolant flowrate (l/h)                    130.0    200      70
F105   Distillation column feed flowrate (l/h)        733.4    993.7    575.3
L101   CSTR level (m)                                 1.25     1.8      0
T101   Cooling water inlet temperature (°C)           20.0     40.0     0.0
T102   Cooling water outlet temperature (°C)          29.4     32.5     15.2
T103   Temperature inside CSTR (°C)                   30.5     33       29.5
T104   Distillation column tray 3 temperature (°C)    100      100.5    98.5
T105   Distillation column tray 5 temperature (°C)    87.4     89.5     86.5
T106   Distillation column tray 8 temperature (°C)    79.5     80.4     78.5

Attention focusing is characterized by a continuous dwell on a specific instrument over a period of time with a few glances at other instruments. On the other hand, a short dwell with many saccades between instruments indicates attention blurring. A short dwell on key instruments but with extended dwell outside the instrument panel is an indication of misplaced attention. This work demonstrated that information about pilots' cognitive states could be obtained by continuous monitoring of the fixation data.

Gaze distribution and gaze movement profiles have been used to distinguish between novice and expert surgeons (Law et al., 2004; Wilson et al., 2011; Tien et al., 2010; Eivazi et al., 2012). Studies using virtual laparoscopic environments concluded that novice surgeons concentrated on the surgical display while failing to notice the patient's vital signs even when the heart rate audibly changed and attention to the vital signs became critical (Tien et al., 2010). It was also found that training based on eye fixations was more effective compared to traditional methods based on tool movements (Wilson et al., 2011). A similar conclusion was reached by Zheng et al. (2011), who studied surgeons' vigilance in an operating room and reported that expert surgeons glanced more frequently at the anesthetic monitor in comparison to novices, who focused more on the surgical task.

Another study used eye-related measures to identify the differences between expert and first-time motorbike riders. The main objective of this study was to assess the effect of experience and training on risk behavior and mental workload in first-time motorcycle riders (Di Stasi et al., 2011). It was observed that, compared to experts, first-time riders before training had less fixation duration on the margins of the road; the majority of their focus was on irrelevant areas below and above the center of the road. However, after training, their fixation pattern matched that of experts, with large fixation duration on roadsides and wing mirrors. Driver inattention and distraction monitoring have also been studied using eye fixations (Fletcher and Zelinsky, 2009; Hammoud et al., 2008). Fletcher and Zelinsky (2009) considered the driver and the vehicle as a combined system and developed a driver assistance unit that is responsive not only to the road environment and the driver's actions but also correlates the driver's eye fixations with road events to determine the driver's observation. Through on-road trials, it was observed that driver inattentiveness (which often foreshadows an accident) can be detected by monitoring eye gaze, and can be used to alert the driver. Another study (Di Stasi et al., 2012) conducted experiments to identify markers of driver fatigue. Drivers were given a fixation task both before and after two hours of driving. Eye tracking data indicated that the peak saccadic velocity decreased significantly while performing the task after driving, with increased saccadic duration. This indicates that saccadic peak velocity and saccadic duration are indicators of fatigue (Di Stasi et al., 2013).

In addition to eye gaze data, modern-day eye trackers can also measure the pupil diameter in real-time. Seminal studies in the 1960s established that the pupils of human subjects dilate due to changes in intensity of light, arousal, emotion, and cognitive load. Experiments have revealed that during decision making, pupil dilation increases before making a decision. More recently, Jiang et al. (2013) reported that when participants performed a simulated surgical task with varying difficulty levels and different demands on mental workload, the pupil size increased significantly during the harder task and decreased rapidly while performing the easier task; changes in pupil size can therefore be used to measure the mental workload. Pupil size variations have also been used to assess the level of comprehension during e-learning. Porta et al. (2012) conducted experiments on the application of the Pythagoras theorem with exercises of different difficulty levels – easy, hard, and impossible with no solution. One of the important conclusions from this work was that specific peaks in pupil dilation could be identified that corresponded to the moment when the user recognized the right solution. Also, smaller pupil dilations towards the end of the task with less eye activity and interaction indicated signs of tiredness. Pupillometry and the rate of saccades have also been used to study the effect of emotions during a task. In a recent study (Tichon et al., 2014), participants were asked to perform four tasks with eye tracking in a pilot simulator. The Multiple Affect Adjective Checklist questionnaire was also used to assess their general emotional state. The results indicated that pupil size was unaffected while the saccadic rate increased for non-anxious participants (determined based on the checklist), while the pupil size increased and the saccade rate decreased for anxious ones. Pupil dilation is thus an indicator of stress (Partala and Surakka, 2003; Ren et al., 2014).

In summary, numerous measures from eye tracking data have been used in several disciplines to understand situation awareness, expertise level, learning ability and mental state of human subjects. However, to our knowledge, these opportunities are yet to be explored in the chemical process industries. We seek to address this gap in this work. Specifically, we explore the potential of eye gaze data in understanding control room operators' attention and situation awareness through experimental studies. A detailed description of the experiments is presented next.

3. Experimental setup and methodology

We used a simulated chemical process to study the cognitive behaviors of the participants. The process consists of a simulated ethanol plant comprising a Continuous Stirred Tank Reactor (CSTR) with a distillation column. Ethene reacts with water in the CSTR to produce ethanol:

C2H4 + H2O ⇌ CH3CH2OH,   ΔH = −45 kJ/mol   (1)

The unreacted feed is separated from the product ethanol in the distillation column. Since the reaction is exothermic, the reactor temperature is maintained by circulating cooling water in the jacket. The column contains 9 trays; the feed enters at the fifth tray and product ethanol is obtained in the distillate stream. A schematic of the process is shown in Fig. 3.
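The eleven measured variables and their alarm limits are listed in Table 1. As a concrete illustration of how such limits can be represented and checked, a minimal Python sketch is given below. The dictionary layout and function name are ours for illustration (the authors' simulator was implemented in Matlab); the numerical limits are taken from Table 1.

```python
# High/low alarm limits for a subset of the Table 1 variables (units as listed there).
ALARM_LIMITS = {
    "C101": (955.6, 1555.6),   # ethene concentration in CSTR (mmol/l)
    "F101": (550.0, 950.0),    # CSTR feed flow rate (l/h)
    "F105": (575.3, 993.7),    # distillation column feed flow rate (l/h)
    "T103": (29.5, 33.0),      # temperature inside CSTR (°C)
}

def alarm_state(tag, value):
    """Return None if `value` is within its limits, else 'PV low' or 'PV high'."""
    low, high = ALARM_LIMITS[tag]
    if value < low:
        return "PV low"
    if value > high:
        return "PV high"
    return None

# Scenario D5 drives F101 low and C101 high (cf. Section 3).
print(alarm_state("F101", 500.0), alarm_state("C101", 1600.0))   # PV low  PV high
```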

Fig. 3. Schematic of ethanol plant.

Table 2
Control scenarios in ethanol plant.

Scenario ID   Description
D1            Increase in feed flow to CSTR
D2            Decrease in cooling water flow to jacket of CSTR
D3            Reduction in flow from CSTR to distillation column
D4            Imbalance in reflux ratio
D5            Decrease in feed flow to CSTR
D6            Increase in cooling water flow to jacket of CSTR
F1            Leak in feed to CSTR
F2            Catalyst poisoning in CSTR
F3            Leak in cooling water inlet
F4            Leak in pipe connecting reactor to distillation column
F5            Incorrect reflux ratio
F6            Reboiler power failure

Table 3
Corrective action for disturbance scenarios.

Scenario ID   Required corrective action
D1            Decrease feed flow rate to CSTR using V102
D2            Increase coolant flow rate using V301
D3            Increase feed flow rate to distillation column using V201
D4            Adjust the reflux valve using V401
D5            Increase feed flow rate to CSTR using V102
D6            Decrease coolant flow rate using V301

The process does not have any automatic controllers and has to be monitored and controlled by the participants. The eleven variables listed in Table 1 are measured in real-time. All of them also have high and low alarms configured, so any deviation beyond the specified threshold will be flagged by alarms.

The main objective of this work is to understand the cognitive behavior of the operator during disturbances. To accomplish this, the participants were given ten separate tasks. A total of twelve control scenarios have been designed, as listed in Table 2. Different participants were given a different subset of these scenarios. Of the twelve, six scenarios relate to disturbances and the remaining six to failures. The current work focuses only on the six disturbance scenarios, D1–D6. When a disturbance occurs, one or more variables will be significantly perturbed from their steady states and alarms may be flagged. The participant is expected to return the plant to a normal condition by manipulating one or more of the four control valves in the process. These correspond to the CSTR feed flow V102, the cooling water flow V301, the flow from the CSTR to the distillation column V201, and the reflux flow in the distillation column V401. For instance, during scenario D5 a sudden decrease in the feed flow rate to the CSTR leads to a high concentration of ethene in the reactor and results in F101-low and C101-high alarms. The participant is expected to diagnose the situation in real-time and identify and implement the corrective action(s) necessary. In this particular case, the disturbance can be negated by increasing the feed flow to the CSTR using valve V102. Details of the corrective measures for all the disturbances are listed in Table 3.

3.1. Human–machine interface

A dynamic simulation of the ethanol process was implemented in Matlab. The process model equations and parameters are obtained from Ghosh et al. (2011). Matlab's graphical user interface design environment was also used to develop a Human–Machine Interface (HMI) for controlling the process that resembles a typical DCS interface. The entire experiment is interactive, and the participant can use the HMI to get process information and manipulate the control valves. Information about the real-time values of the eleven measured variables, the list of standing alarms, and the trend of any one variable (selected by the participant) can be seen in the snapshot of the HMI shown in Fig. 4. The HMI has been designed based on human factors principles and best practices (Nachreiner et al., 2006; Tharanathan et al., 2012). The schematic of the process, including the measured variables and their status (normal/alarm), is shown in the center pane of the interface. The participant can manipulate any of the four control valves in the process in real-time by moving their sliders. The alarm status of each variable is also shown in the schematic by the color of its value – a green color implies that the corresponding process variable is within its normal limits, while a red color depicts that the variable is in alarm state. The alarm summary pane on the lower part contains information about the alarms (if any) that are currently flagged, their time of occurrence, the nature of the alarm (PV high or PV low), and a description of the alarmed variable.

Fig. 4. HMI with the schematic and display units. (For interpretation of the references to color in text near the reference citation, the reader is referred to the web version of
this article.)

The participant can also call up a detailed historical trend of any one variable at a time by clicking on the corresponding tag in the schematic display (see Fig. 4). An emergency shutdown button enables the participant to end the scenario at any time.

3.2. Experimental protocol

Experimental studies are conducted in a controlled environment with participants interacting with the HMI to monitor and control the process. Junior-year (3rd year) students of the Indian Institute of Technology Gandhinagar with a background in control theory were asked to play the role of control room operators. Following the deceptive experimental technique common in cognitive engineering (Mitchell, 1986), the participants were not informed about the objective of the study in order to prevent biasing the behavior of the subjects and any strategic response to aid or ruin the experimental hypothesis (Adhitya et al., 2014). Hence, the experiments were titled "Control Room Studies".

Before performing the experiments, each participant was asked to study a 10-page technical handout to educate themselves about the HMI and the control tasks. The handout consisted of: (i) an overview section, explaining the role of the participant as an operator and their responsibilities, (ii) a technical section providing details of the ethanol process, and (iii) an HMI section explaining all the display units and ways to interact with the plant (to access trends, manipulate valves, etc.). This was followed by a short video that walked the participant through the HMI and an illustrative scenario. This entire training phase lasted between 10 and 15 min.

Before beginning each task, the participant was given a task-specific instruction. As shown in the example in Fig. 5, the instruction for each of the disturbances mentions that the participant should respond to the alarms arising from the disturbance and manipulate the control valves using the sliders to bring the plant back to normal condition. It may be noted that the specific control valve to be manipulated is explicitly mentioned in the instruction.

Fig. 5. Typical instruction for a task.

After reading the instruction, the participant can initiate the task by pressing the "Start" button in the schematic. Shortly thereafter (typically in 20–40 s), unknown to the participant, the disturbance is introduced in the process. The effect(s) of the disturbance become evident through variable(s) deviating beyond their normal limits, indicated by alarm(s). When an alarm is flagged, a beep is sounded, the corresponding variable value turns from green to red in the schematic pane, and the details of the alarm are listed in the alarm log pane. The entire experimental protocol is shown as a block diagram in Fig. 6.

During the task, the participants take various actions to assess the situation and annul the disturbance, such as searching for information and opening/closing control valves, as shown in Fig. 7. If successful, all the process variables will return to their normal ranges and the task is completed, as indicated by the "Scenario completion" message in the schematic pane. If the participant is unable to return the process to the normal state within 240 s, emergency shutdown is automatically triggered. The participant also has the option to activate a shutdown at any time by pressing the "Emergency shutdown" button in the schematic pane. Once a task is complete, a survey has to be answered by the participant before moving on to the next task.
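A minimal sketch of the task-termination logic described above is given below, assuming a simple representation of when all alarms cleared; the function name, argument names and return strings are illustrative, not part of the authors' implementation.

```python
TASK_TIMEOUT_S = 240.0   # automatic emergency shutdown threshold (Section 3.2)

def task_outcome(all_alarms_cleared_at, shutdown_pressed_at=None, timeout_s=TASK_TIMEOUT_S):
    """Decide how a task ends under the Section 3.2 protocol: the scenario is
    completed if every variable is back within limits before the timeout;
    otherwise an emergency shutdown ends it (manually or automatically).
    Times are in seconds from the Start button; None means 'never happened'."""
    if shutdown_pressed_at is not None:
        return "manual emergency shutdown"
    if all_alarms_cleared_at is not None and all_alarms_cleared_at < timeout_s:
        return "scenario completed"
    return "automatic emergency shutdown at 240 s"

print(task_outcome(all_alarms_cleared_at=48.0))   # the successful D3 task of Section 4.1
print(task_outcome(all_alarms_cleared_at=None))   # the failed D5 task of Section 4.2
```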

Fig. 6. Experimental protocol.

Fig. 7. Typical operator behavior during a task.

Fig. 8. Trend of flow rate F105 during scenario D3.


50 C. Sharma et al. / Computers and Chemical Engineering 85 (2016) 43–57

Fig. 9. Dwell on various AOIs during scenario D3. (For interpretation of the references to color in text near the reference citation, the reader is referred to the web version of
this article.)

The above experimental protocol (i) ensures that the participant has some basic training before the start of the experiments, (ii) improves participants' acquaintance with the process and HMI through repeated tasks, (iii) seeks to prevent participant fatigue by restricting the total experiment duration (to less than 30 min), (iv) tests participants' process knowledge comprehensively through scenarios of varying difficulty levels and originating in different sections of the process, (v) randomizes the allocation of scenarios to tasks to avoid any systematic bias, and (vi) repeats at least a few scenarios so as to reveal the effects of learning.
Fig. 10. Trend of process variables during scenario D5.



Fig. 11. Dwell on various AOIs during scenario D5.

3.3. Data collection

The various actions taken by the participant using the HMI, including mouse movements, mouse clicks, slider movements and responses to the survey questions, are recorded during the experiment. Eye movements of the participant are captured using a Tobii TX300 eye tracker with a sampling frequency of 120 Hz and further down-sampled to 60 Hz for analysis. Calibration of the eye tracker was performed for every individual at the beginning of the experiment. We followed the standard nine-point calibration approach provided with the Tobii eye tracker. During the calibration phase, several images of the eyes are collected and analyzed, and the information is integrated into the eye tracker's internal eye model (Tobii Technology, 2010). The gaze data acquired from the eye tracker is first converted from XY position values to pixels based on knowledge of the screen size and mapped to one of the 18 AOIs in the HMI. If the gaze does not correspond to any of these AOIs, it is mapped to the background. This data is further processed and validated as explained in Appendix A. The results from analyzing the data provide several insights into the cognitive behavior of operators, as illustrated in the next section.

4. Illustrative examples

Here we illustrate the insights on the cognitive tasks that can be inferred through eye gaze tracking. First, we study two different cases in detail.

4.1. Successful disturbance rejection

Let us consider a task where an operator (participant) faces disturbance Scenario D3. In this case, the flow from the CSTR to the distillation column, as measured by F105, drops unexpectedly. The operator has been instructed (as shown in Fig. 5) to manipulate the flow rate by using valve V201. The trend of F105 is shown in Fig. 8. As evident there, the disturbance is initiated at 25 s and the low F105 alarm occurs at 26 s. The operator makes the first control move at t = 39 s and increases V201 from 0.25 to 0.6, which leads to F105 increasing and stabilizing at a value of 440 l/h. This is below the low alarm limit of 575 l/h, hence at t = 44 s and t = 46 s the operator again increases the flow. At t = 48 s the flow returns to the normal limits and reaches steady state around 50 s. The process thus returns to the normal state with all variables within their normal limits, which signals the successful completion of the task.

Now, we seek to understand the cognitive processes of the operator during this task. Fig. 9 shows the operator's scan path, specifically the dwell time on various AOIs, during the task. Only the AOIs that the operator viewed during this task are labelled in the figure. Initially the operator gazes at F105 and then at T106, which is located immediately above it in the schematic. Around t = 5 s, the operator clicks the T106 tag (not shown in the figure) to view its trend. Subsequently, till t = 26.4 s while the process is normal, he views the trends of F105 (starting t = 10 s), T104 – which is located immediately below F105 (t = 15 s), T105 (t = 20 s), and F105 (t = 23 s). Thus, in the early stages of the task, based on the instruction, the operator appears to be comprehending the state of the process; his visual attention is focused on the primary variable for this task (F105) and other tags that are in its immediate vicinity. When the first alarm occurs at 26.4 s, the operator hears the beep (indicated by the inverted red triangle at the bottom of Fig. 9) and the color of F105's value changes from green to red. Soon thereafter (by t = 28 s) he gazes at the F105 tag, the alarm summary pane and F105's trend and thus becomes aware of the new state of the process in the face of the disturbance. After this orientation and then a diagnosis phase, the operator takes the first control action at t = 39 s with his eye gaze on slider V201. After taking the control action, his gaze returns to the trend pane. Subsequently, he manipulates V201 three more times as described above. During this period, his eye gaze switches back and forth between the V201 slider and the F105 trend till the alarm clears and the flow has stabilized. Finally, around t = 50 s, with the process at steady state, till the task is completed at t = 63 s, his gaze switches between T104, T106, and F105, similar to the behavior at the beginning of the task before the disturbance was introduced.
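The scan path discussed above is derived by mapping each gaze sample to an AOI and merging consecutive samples on the same AOI into dwells. A minimal sketch of that pipeline (in Python) is given below, in the spirit of Section 3.3 and Appendix A; the field names of the samples, the AOI naming and the screen resolution are illustrative assumptions, not the paper's actual data layout.

```python
def gaze_to_dwells(samples, aois, screen=(1920, 1080), max_code=1):
    """Turn raw eye tracker samples into a sequence of dwells on AOIs.
    `samples` are dicts with normalized gaze coordinates and validity codes
    (field names are illustrative); `aois` maps an AOI name to a pixel
    rectangle (x0, y0, x1, y1)."""
    def locate(x, y):
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return "background"            # gaze outside every AOI

    dwells = []
    for s in samples[::2]:              # 120 Hz -> 60 Hz down-sampling
        if s["val_l"] > max_code or s["val_r"] > max_code:
            continue                    # drop low-confidence samples
        x = 0.5 * (s["gx_l"] + s["gx_r"]) * screen[0]
        y = 0.5 * (s["gy_l"] + s["gy_r"]) * screen[1]
        aoi = locate(x, y)
        if dwells and dwells[-1]["aoi"] == aoi:
            dwells[-1]["end"] = s["t"]  # extend the current dwell
        else:
            dwells.append({"aoi": aoi, "start": s["t"], "end": s["t"]})
    return dwells
```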

Table 4
Classification of task outcomes based on slider usage.

Task status              Primary only    Secondary only    Primary & Secondary
Successful completion    455 (80.11%)    0                 62 (10.91%)
Failure                  5 (0.88%)       3 (0.53%)         43 (7.57%)

Fig. 12. Distribution of completion times: (a) Group I tasks, (b) Group II tasks, and (c) Group III tasks.

4.2. Failed disturbance rejection

In contrast to the above successful case, consider another task where a disturbance could not be controlled; this results in automatic shutdown of the process at t = 240 s. In this case (scenario D5), the flow to the CSTR, as measured by F101, decreases unexpectedly. The operator has been instructed beforehand to handle this scenario by manipulating the feed flow rate using valve V102. Due to the decrease in feed (water) to the CSTR, the concentration of unreacted ethene in the reactor, as measured by C101, increases; also the feed flow to the distillation column (F105) decreases. The trends of F101, F105 and C101 are shown in Fig. 10. As evident there, the disturbance is initiated at t = 26 s, resulting in a sequence of alarms – low F101 (at t = 27 s), low F105 (t = 27 s), and high C101 at 33 s. The operator initially selects the correct valve (as per the instruction) and increases V102 from 0.52 to 0.58 at t = 34 s. This results in the F101 and F105 alarms being cleared around t = 38 s, but their values stabilize near their low alarm limits. C101 is still in alarm state. It may be noted that increasing V102 to 0.66 or a higher value would have led to C101 also returning to its normal limits. However, the operator, perhaps puzzled by the standing C101 alarm, reverses the previous action and instead reduces V102 starting at t = 51 s back to 0.52; this results in the re-occurrence of the F101 and F105 alarms around t = 56 s. The operator then again increases V102 to 0.60 at t = 68 s, which leads to the same alarm pattern as before (high C101 standing, the other two cleared). In a sign of disorientation setting in, at t = 87 s he decreases V301 from 0.49 to 0.32, which is counter to the task instructions. Subsequently, after decreasing V102 around t = 133 s from 0.60 to 0.55, decreasing V201 from 1.0 to 0.93 and then back to 1.0 around t = 160 s, and increasing V301 around t = 180 s from 0.32 to 0.51 without success in clearing the alarms, the operator stops trying to control the process around t = 189 s; shortly thereafter, at t = 240 s, the process automatically shuts down.

Fig. 11 shows the operator's scan path during this task. As in the previous case, only the AOIs that the operator viewed during the task are labeled. In comparison to Fig. 9, it is immediately apparent that in this case the failed operator has gazed at many more tags, in contrast to the successful operator who focused predominantly on the primary tag and a few others in its immediate vicinity. Also, the failed operator mainly looks at the instructed tag F101 and slider V102 before the occurrence of the alarm, with a few short dwells on the nearby tags (T101, T103 and L101). However, he has not looked at the trend of any variable during this initial orientation period. After the alarm at t = 27 s, he gazes at F101, F105 and V102 as well as the trend of F101, similar to the successful operator; however, his dwell duration in the trend pane is comparatively lower. Finally, it can be seen that after being unable to return the process to the normal state, the operator has a large number of saccades with short dwells over various tags (unrelated to the disturbance) until the end of the task.

In the following section, we demonstrate that the behavioral patterns observed from a typical successful and a typical failed disturbance task generalize to the population level.

5. Results

The experiment described in Section 3 was conducted on 72 participants (in total 576 tasks) possessing basic knowledge of control theory. The resulting dataset is labeled as the CS2014A dataset. Interested readers can obtain the dataset by contacting the authors. Analysis of the dataset shows that the majority of the tasks (91.02% of 568 tasks – 8 removed due to data quality issues) are successfully completed. We seek to statistically establish significant cognitive differences between successful and unsuccessful task completion. For this purpose, we first categorize the variables involved in each scenario into primary and secondary ones. Primary variables are those that are either directly affected by the disturbance (similar to the controlled variable in a feedback control loop, e.g. F105 in Scenario 3) or those that can be used to directly control them (similar to the manipulated variable in a feedback control loop, e.g. V201 in Scenario 3). Other variables that are indirectly related to the disturbance, and may be affected by it due to mass and energy balances in the process, are called secondary variables here. With this classification, the Areas of Interest (AOIs) in the HMI can also be classified into five categories: (i) trend of primary variables, (ii) trend of secondary variables, (iii) tag/slider of primary variables in the schematic pane, (iv) tag/slider of secondary variables in the schematic pane, and (v) alarm summary pane. With this classification, the cognitive behaviours described in Section 4 can be evaluated statistically.
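A small Python sketch of this five-way AOI classification is given below. The AOI naming convention ('<TAG>_trend', '<TAG>_tag', '<TAG>_slider', 'alarm_summary') is an assumption for illustration and not the paper's actual AOI labels.

```python
def classify_aois(aoi_names, primary_vars):
    """Map each HMI AOI to one of the five classes used in Section 5.
    `primary_vars` holds the scenario's primary tags, e.g. {'F105', 'V201'}
    for scenario D3; everything else is treated as secondary."""
    classes = {}
    for name in aoi_names:
        if name == "alarm_summary":
            classes[name] = "alarm summary pane"
            continue
        tag, _, kind = name.rpartition("_")          # e.g. 'F105', 'trend'
        role = "primary" if tag in primary_vars else "secondary"
        if kind == "trend":
            classes[name] = f"trend of {role} variables"
        else:                                        # 'tag' or 'slider'
            classes[name] = f"tag/slider of {role} variables"
    return classes

# Scenario D3: primary variables are F105 (directly affected) and V201 (manipulated).
print(classify_aois(["F105_trend", "T104_tag", "V201_slider", "alarm_summary"],
                    {"F105", "V201"}))
```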

Fig. 13. Dwell time distribution during the entire task.

A detailed breakdown of the task outcomes based on the sliders used by the operator is shown in Table 4. It may be noted that the primary tag and primary slider have been pointed out to the operator in the pre-task instruction. Further, each of the disturbances can also be completely rejected and the process returned to the normal state using just the primary variables. Therefore, a classification of the operator's performance based on his behaviour (i.e., the slider(s) used) is an indicator of his cognitive state.

Fig. 14. Dwell time distribution before alarm.

Three distinct clusters of behavior are readily apparent from the table and account for about 99% of all the tasks:

1 Group I: Successful completion using the primary slider only (80% of tasks)



2 Group II: Successful completion, but utilizing both primary and secondary sliders (11% of tasks)
3 Group III: Failure to complete the task (8% of tasks)

Fig. 15. Dwell time distribution after alarm.

There are very few cases where the use of the primary slider alone leads to task failure; there are also very few cases where only the secondary slider has been used. These combinations are hence not considered further.

Group I is similar to the illustration in Section 4.1 and Group III to that in Section 4.2. Group II – where the operator was able to eventually complete the task, but used both the primary and secondary sliders, i.e., deviated from the pre-task instruction – is an additional category that is not evident solely from the task outcomes. Fig. 12 shows the distribution of the completion times for all the tasks. As expected, the mean completion time of Group I is shorter than that of Group II, which in turn is shorter than that of Group III. However, the distribution for Group II (after discounting the peak at 240 s that occurs because of automatic shutdown) is more similar to Group III than to Group I. In other words, when the operator uses both primary and secondary sliders, there is a 59% probability of successful completion and a 41% probability of failure, which is in stark contrast to the case when the operator uses only the primary slider. The emergence of Group II and its many behavioural similarities with Group III despite the successful task completion (i.e., the same outcome as Group I) is a paradox that brings out the need for understanding the cognitive behaviors that drive operators' performance.

5.1. Group-wise characteristic cognitive behaviors

Fig. 13 shows the average dwell time distribution for each of the three groups on the five classes of AOIs during the course of the entire task. It is clearly evident that the trend of the primary variable(s), with a mean dwell time of 35%, is the most important HMI element for Group I operators. The dwell time on the primary trend for Group II operators was 46% lower, while that for Group III operators was 81% lower. On the other hand, Group III operators had large dwell times on the trend view of secondary variables (34%), while Group I had only 10.3% of dwell on these variables. Overall, since all three groups used the trend pane extensively (from 41% to 45% of the time), the difference in outcomes cannot therefore be explained solely by the nature of the HMI element used for understanding the state of the process. The total dwell on HMI elements related to the primary variable(s) – trend, tag and slider – is 68% for Group I, 49% for Group II, and 32% for Group III. This seems to indicate that the extent of focus on primary variables is a good indicator of operator performance.

Next, we analyze the dwell time duration of the operators before the first alarm. As explained in Section 3.2, at the beginning of each task the process is at a normal steady state. Shortly (20–40 s) after the start, unknown to the operator, a disturbance is introduced into the process, the effects of which manifest as alarm(s). While not instructed as such explicitly, the operator is expected to orient himself to the state of the process and the expected process dynamics (as inferred from the pre-task instruction) during this period. The operator's eye gaze during this period is therefore expected to reveal indicators of the cognitive processes during this orientation phase. Fig. 14 presents the dwell time distribution from the beginning of the task till the first alarm is flagged. In line with the observation made from the illustrations in Section 4, Group III operators have a smaller average dwell duration (2%) on tags, while Group I have the largest (21%). This indicates that the trend pane provides much more information, especially for orientation, compared to the tags. Further, Group I has a much higher focus on the primary variables (10%) compared to Group II (4%) and Group III (2%), which is also in accordance with the distribution for the entire task.

Similar patterns are also observed in the dwell time distribution over the period ranging from the first alarm till the end of the task, as shown in Fig. 15. While all three groups of operators use the trend pane to a similar extent (45–52%), Group I operators focus predominantly (43% of the time) on the trends of the primary variable(s), while Group III dwell on primary trends only sparsely (7%). Across the various HMI elements, Group I operators seek information about primary variables 75% of the time, Group II 52% of the time, and Group III only 30%.

A recent ethnographic study (Yin, 2012) based on interviews with 17 expert refinery control room operators, whose work experience ranged from 14 to 18 years, reported that they often relied on the trend. Our study corroborates this finding quantitatively, but further indicates that the hallmark of an expert operator is probably reflected in their choice of the right variables to focus on, from the 100s or 1000s that they have access to through the DCS screens, rather than the form in which they consume information.
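As an illustration of how the group-wise dwell time shares reported above can be computed from the dwell sequences, a minimal Python sketch is given below. The data structures and names are illustrative assumptions; the actual analysis pipeline of the paper is not shown.

```python
from collections import defaultdict

def dwell_shares(fixations, aoi_class, first_alarm_t, period="before"):
    """Fraction of total dwell time on each AOI class, either before or after
    the first alarm. `fixations` are dicts with 'start', 'duration' and 'aoi';
    `aoi_class` maps an AOI name to one of the five classes of Section 5
    (or 'background')."""
    totals = defaultdict(float)
    for f in fixations:
        in_window = (f["start"] < first_alarm_t) if period == "before" \
                    else (f["start"] >= first_alarm_t)
        if in_window:
            totals[aoi_class.get(f["aoi"], "background")] += f["duration"]
    grand = sum(totals.values()) or 1.0
    return {cls: t / grand for cls, t in totals.items()}
```

Averaging these shares over the tasks of each group yields distributions of the kind shown in Figs. 13–15.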

Fig. 16. Effect of repeated tasks: time taken to perform the first control action.

6. Conclusions and discussion

Safety in the chemical industry is as important today as at any time in its history. However, unlike the near past, when low equipment and instrument reliability were the main root causes of accidents, today human error plays a predominant role. Developments in both the physical and information sciences have led to improved understanding of failure modes, which in turn has contributed to higher reliability of equipment as well as sophisticated monitoring and diagnosis techniques. Chemical processes are therefore commonly designed to automate away error-prone human actions, but many complex tasks still remain under the sole purview of human operators. There is hence a need to develop a fundamental understanding of human error. Recent developments in cognitive engineering offer new measurements and insights into the actions and behaviors of human decision makers, such as plant operators. Based on these measurements, and just like sensors that provide information about the state of plant equipment, it is possible to infer the cognitive state of the plant operators. These developments can engender new approaches to proactively obviate human errors. This paper reports the first development, to our knowledge, in this direction.

Here, we have sought to understand some key characteristics of mal-operation when an operator is controlling a chemical process. Our experiments studied the simple task of open-loop disturbance rejection in a simulated ethanol process. Junior-year undergraduate students played the role of a control room operator and were assigned various tasks where the operator's active intervention is essential to continue normal process operation. The cognitive behaviours of the operators were measured by an eye tracker which records their eye gaze location at high frequency (120 Hz). Although 91% of the tasks could be successfully controlled by the operators, patterns in the eye gaze behavior indicated that not all successful operations were alike; nearly 12% of the successful operators (called Group II) closely resembled operators who had failed to control the process (Group III). Both Group II and Group III were characterized by relatively lower attention on the primary variables, which indicates a lower understanding of the underlying process dynamics. Our research also confirmed quantitatively that the use of trend information is important, especially in the orientation phase of situation assessment. This finding was qualitatively reported previously by an ethnographic study of experienced refinery operators (Yin, 2012).

The results from this work can be extended in the future in a number of significant ways. For instance, these insights can be used during operator training to evaluate operators' expertise at a deeper cognitive level instead of through outcome-oriented metrics such as completion status (success or failure) or completion time. The approach can also engender customized, operator-specific training. Since the dwell time distribution indicates possible reasons behind an operator's actions, scenarios can be designed to address specific training needs. For example, an operator failing to utilize the trend information adequately could be trained in that. Also, key behaviours can be analyzed quantitatively. For instance, in our experiments it could be seen (see Fig. 16) that when an operator repeated the same scenario twice, the average time taken to initiate the control action decreased by 39% in the second repetition in comparison to the first. Therefore, during training, the operator's response can be benchmarked at an atomic level and training tasks repeated until a desirable level of performance is obtained.

Another direction for future research is the development of attention-aware systems that are capable of adapting to and supporting human attentional processes, especially in situations involving multi-tasking and in highly dynamic environments. As an example, the role of control room operators in many complex, highly automated plants is predominantly process monitoring and supervision. The eye tracking approach pioneered in this work can be used online to track the operator's situation awareness, especially during disturbances and abnormal situations. Patterns in eye gaze behaviour can be used to detect loss of situation awareness on the part of the operator (due to information overload, fatigue, etc.), and appropriate actions (such as seeking a relief operator, escalating to a supervisor, or providing increased decision support) can be triggered. Results from the current work indicate dwell duration as a reliable measure to infer the situation awareness and cognitive ability of the operators. Various other eye tracking measures such as saccadic duration, saccadic peak velocity and pupil dilation have also been proposed in the literature and can be studied for their usefulness as indicators of behaviors in SA. These measures could help develop a multivariate behavioral model of the operator. Finally, decision support systems related to fault detection and diagnosis have received a lot of attention from the academic community, but their adoption in industrial practice is still in its infancy. Eye tracking studies can be used to diagnose how operators utilize the information provided by these decision support systems and improve their value-add, taking into account plant operators' needs in real industrial contexts.

Acknowledgements

We would like to thank Dr. Krishna Prasad Miyapuram, Indian Institute of Technology Gandhinagar, for his constructive suggestions. Financial support from the Indian Institute of Technology Gandhinagar to pursue this work is also gratefully acknowledged.

Appendix A. Eye tracker data preprocessing and validation

The raw data from the eye tracker contains the following 16 attributes (shown in Table 5):

• Time stamps in both seconds and microseconds
• Horizontal and vertical gaze target position, i.e., gaze position related to calibration (normalized to 1 as per screen size in terms of pixels), for both eyes
• Horizontal and vertical position of both eyes as seen by the eye tracker
• Distance from the eye tracker (mm) for both eyes
• Pupil size (mm) of both eyes
• Left and right eye validity codes

The validity codes provide information about the usefulness of the data and are detailed in Table 6. During analysis, the data samples

Table 5
Description of attributes provided by the eye tracker.

Timestamp (s): Time at which the data sample was collected.
Time stamp (µs): Microsecond fraction of the time at which the gaze data was sampled.
Left eye horizontal gaze target position: Gaze position relative to the current calibration; increases towards the participant's right.
Left eye vertical gaze target position: Gaze position relative to the current calibration; increases downwards.
Left eye horizontal position: Eye position as registered by the eye tracker; 0 is leftmost and 1 is rightmost.
Left eye vertical position: Eye position as registered by the eye tracker; 0 is topmost and 1 is bottommost.
Left eye distance (mm): Distance between the participant's left eye and the eye tracker.
Left eye pupil size (mm): Length of the longest chord of the pupil ellipse.
Left eye validity code: Estimate of the validity of the left-eye data.
Right eye horizontal gaze target position: Gaze position relative to the current calibration; increases towards the participant's right.
Right eye vertical gaze target position: Gaze position relative to the current calibration; increases downwards.
Right eye horizontal position: Eye position as registered by the eye tracker; 0 is leftmost and 1 is rightmost.
Right eye vertical position: Eye position as registered by the eye tracker; 0 is topmost and 1 is bottommost.
Right eye distance (mm): Distance between the participant's right eye and the eye tracker.
Right eye pupil size (mm): Length of the longest chord of the pupil ellipse.
Right eye validity code: Estimate of the validity of the right-eye data.
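Because the gaze target positions in Table 5 are normalized with respect to the screen, a small conversion is needed before they can be compared with interface elements defined in pixels. The sketch below shows one way this could be done; the 1920 × 1080 resolution is an assumed example value, not one taken from the study.

```python
# Minimal sketch: convert the normalized gaze target positions of Table 5
# (0..1, increasing to the participant's right and downwards) into screen
# pixel coordinates. The screen resolution below is an assumed example value.
SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution in pixels

def to_pixels(gaze_x, gaze_y):
    """Map a normalized gaze target position to (column, row) pixel indices."""
    px = min(max(gaze_x, 0.0), 1.0) * (SCREEN_W - 1)
    py = min(max(gaze_y, 0.0), 1.0) * (SCREEN_H - 1)
    return round(px), round(py)

print(to_pixels(0.5, 0.25))   # (960, 270): roughly the upper middle of the screen
```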
Table 6
Description of validity codes from the Tobii TX300 eye tracker.

Left eye validity code   Right eye validity code   Eyes detected   Eye identification
0                        0                         Both            Correct
4                        0                         Right           Correct
0                        4                         Left            Correct
3                        1                         Right           Estimated as probable
1                        3                         Left            Estimated as probable
2                        2                         One eye         Uncertain
4                        4                         None            Uncertain
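A minimal sketch of the screening rule stated above: samples whose (left, right) validity-code pair matches the last two rows of Table 6 are discarded before any further analysis.

```python
# Minimal sketch of the validity screening described in the text: samples whose
# (left, right) validity-code pair falls in the last two rows of Table 6
# are discarded before any further analysis.
REJECTED_PAIRS = {(2, 2), (4, 4)}   # "one eye, uncertain" and "no eyes detected"

def keep_sample(left_validity, right_validity):
    return (left_validity, right_validity) not in REJECTED_PAIRS

samples = [(0, 0), (4, 0), (2, 2), (1, 3), (4, 4)]
print([pair for pair in samples if keep_sample(*pair)])   # [(0, 0), (4, 0), (1, 3)]
```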
In our experiments, to prevent the participants from guessing the objective, we did not inform them that their eyes were being tracked. This resulted in (i) participants not focusing on the calibration dots during the calibration step, and/or (ii) significant head movement during both calibration and task execution. The former can lead to large calibration errors and hence misinterpretation of the data, while the latter results in loss of data because no eyes are detected. To overcome these difficulties, we used the following two criteria to obtain good quality eye data:

• The total duration of task execution is derived from the mouse click data, and the expected number of data points is computed as the time spent in the particular task divided by the sampling time (1/60 s). Tasks with less than 60% of the expected number of data points are excluded from further analysis.
• It is assumed that when a participant clicks on a tag, they are looking at that particular AOI. Data points that do not satisfy this premise are neglected in further analysis.

The former criterion ensures that adequate data are available, while the latter acts as an extra validation step to obtain good quality eye tracker data. Application of these two criteria resulted in valid eye tracker data for 82 tasks, which are used in our work.
References

Adhitya A, Cheng SF, Lee Z, Srinivasan R. Quantifying the effectiveness of an alarm management system through human factors studies. Comput Chem Eng 2014;67:1–12.
Ahmidi N, Hager GD, Ishii L, Fichtinger G, Gallia GL, Ishii M. Surgical task and skill classification from eye tracking and tool motion in minimally invasive surgery. In: Medical image computing and computer-assisted intervention – MICCAI 2010. Springer; 2010. p. 295–302.
Binder MD, Hirokawa N, Windhorst U. Encyclopedia of neuroscience. Springer; 2009.
Carson RT, Mitchell RC, Hanemann WM, Kopp RJ, Presser S, Ruud PA. A contingent valuation study of lost passive use values resulting from the Exxon Valdez oil spill. Tech. rep. Germany: University Library of Munich; 1992.
Cochran E, Bullemer P. Abnormal situation management: not by new technology alone. In: Proceedings of AIChE; 1996.
Cooke L. Eye tracking: how it works and how it relates to usability. Tech Commun 2005;52(4):456–63.
Di Stasi LL, Contreras D, Cándido A, Cañas J, Catena A. Behavioral and eye-movement measures to track improvements in driving skills of vulnerable road users: first-time motorcycle riders. Transp Res Part F: Traffic Psychol Behav 2011;14(1):26–35.
Di Stasi LL, Renner R, Catena A, Cañas JJ, Velichkovsky BM, Pannasch S. Towards a driver fatigue test based on the saccadic main sequence: a partial validation by subjective report data. Transp Res Part C: Emerg Technol 2012;21(1):122–33.
Di Stasi LL, Catena A, Cañas JJ, Macknik SL, Martinez-Conde S. Saccadic velocity as an arousal index in naturalistic tasks. Neurosci Biobehav Rev 2013;37(5):968–75.
Drewes H. Eye gaze tracking for human–computer interaction. LMU; 2010, Ph.D. thesis.
Duchowski A. Eye tracking methodology: theory and practice, Vol. 373. Springer; 2007.
Eivazi S, Bednarik R, Tukiainen M, von und zu Fraunberg M, Leinonen V, Jääskeläinen JE. Gaze behaviour of expert and novice microneurosurgeons differs during observations of tumor removal recordings. In: Proceedings of the symposium on eye tracking research and applications. ACM; 2012. p. 377–80.
Endsley MR. Design and evaluation for situation awareness enhancement. In: Proceedings of the human factors and ergonomics society annual meeting. Vol. 32; 1988. p. 97–101.
Fletcher L, Zelinsky A. Driver inattention detection based on eye gaze–road event correlation. Int J Robot Res 2009;28(6):774–801.
Flin R, O'Connor P, Mearns K, Gordon R, Whitaker S. Factoring the human into safety: translating research into practice. Crew resource management training for offshore operations, vol. 3; 2002, Research.
Ghosh K, Natarajan S, Srinivasan R. Hierarchically distributed fault detection and identification through Dempster–Shafer evidence fusion. Ind Eng Chem Res 2011;50(15):9249–69.
Glöckner A, Herbold A-K. An eye-tracking study on information processing in risky decisions: evidence for compensatory strategies based on automatic processes. J Behav Decis Mak 2011;24(1):71–98.
Gordon R, Mearns K, Flin R, O'Connor P, Whitaker S. Factoring the human into safety: translating research into practice. The development and evaluation of a human factors accident and near miss reporting form for the offshore industry, vol. 2; 2002, Research.
Groot C, Ortega F, Beltran FS. Thumb rule of visual angle: a new confirmation. Percept Motor Skills 1994;78(1):232–4.
Gupta J. The Bhopal gas tragedy: could it have happened in a developed country? J Loss Prev Process Ind 2002;15(1):1–4.
Hammoud RI, Smith MR, Dufour R, Bakowski D, Witt G. Driver distraction monitoring and adaptive safety warning systems. Tech. rep., SAE Technical Paper; 2008.
Hasse C, Grasshoff D, Bruder C. How to measure monitoring performance of pilots and air traffic controllers. In: Proceedings of the symposium on eye tracking research and applications; 2012. p. 409–12.
Hermens F, Flin R, Ahmed I. Eye movements in surgery: a literature review. J Eye Mov Res 2013;6(4):1–11.
Holmqvist K, Nyström M, Andersson R, Dewhurst R, Jarodzka H, Van de Weijer J. Eye tracking: a comprehensive guide to methods and measures. Oxford University Press; 2011.
Irwin DE. Fixation location and fixation duration as indices of cognitive processing. In: The interface of language, vision, and action: eye movements and the visual world; 2004. p. 105–34.
Jiang X, Zheng B, Tien G, Atkins MS. Pupil response to precision in surgical task execution. In: MMVR; 2013. p. 210–4.
Just MA, Carpenter PA. Eye fixations and cognitive processes. Cogn Psychol 1976;8(4):441–80.
Kilingaru K, Tweedale JW, Thatcher S, Jain LC. Monitoring pilot situation awareness. J Intell Fuzzy Syst 2013;24(3):457–66.
Law B, Atkins MS, Kirkpatrick AE, Lomax AJ. Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment. In: Proceedings of the 2004 symposium on eye tracking research & applications; 2004. p. 41–8.
Maier AM, Baltsen N, Christoffersen H, Störrle H. Towards diagram understanding: a pilot-study measuring cognitive workload through eye-tracking. In: Proc. intl. conf. human behavior in design; 2014.
Majaranta P, Räihä K-J. Twenty years of eye typing: systems and design issues. In: Proceedings of the 2002 symposium on eye tracking research & applications; 2002. p. 15–22.
Mannan S. Lees' loss prevention in the process industries: hazard identification, assessment and control. Butterworth-Heinemann; 2004.
Marsh. The 100 largest losses 1974–2013. Marsh and McLennan Companies; 2014.
Mearns K, Whitaker S, Flin R, Gordon R, O'Connor P. Factoring the human into safety: translating research into practice. In: Benchmarking human and organisational factors in offshore safety; 2002, Research.
Mitchell RW. A framework for discussing deception. In: Deception: perspectives on human and nonhuman deceit; 1986. p. 3–40.
Munger S, Smith R, Payne W. An index of electronic equipment operability: data store. Tech. Rep. AIR-C43-1/62-RP(1). Pittsburgh: American Institute for Research; 1962.
Nachreiner F, Nickel P, Meyer I. Human factors in process control systems: the design of human–machine interfaces. Saf Sci 2006;44(1):5–26.
Norman DA. Cognitive engineering. User Centered Syst Des: New Perspect Hum–Comput Interact 1986;31–61.
Noroozi A, Khan F, Mackinnon S, Amyotte P, Deacon T. Determination of human error probabilities in maintenance procedures of a pump. Process Saf Environ Protect 2014;92(2):131–41.
Obama B. Improving chemical plant security (Obama podcast); 2006. http://obamaspeeches.com/059-Improving-Chemical-Plant-Security-Obama-Podcast.htm
Palinko O, Kun AL, Shyrokov A, Heeman P. Estimating cognitive load using remote eye tracking in a driving simulator. In: Proceedings of the 2010 symposium on eye-tracking research & applications; 2010. p. 141–4.
Parasuraman R, Sheridan TB, Wickens CD. Situation awareness, mental workload, and trust in automation: viable, empirically supported cognitive engineering constructs. J Cogn Eng Decis Mak 2008;2(2):140–60.
Partala T, Surakka V. Pupil size variation as an indication of affective processing. Int J Hum–Comput Stud 2003;59(1):185–98.
Paté-Cornell ME. Learning from the Piper Alpha accident: a postmortem analysis of technical and organizational factors. Risk Anal 1993;13(2):215–32.
Poole A, Ball LJ. Eye tracking in human–computer interaction and usability research: current status and future prospects. In: Ghaoui C, editor. Encyclopedia of human–computer interaction. Pennsylvania: Idea Group, Inc.; 2005.
Porta M, Ricotti S, Perez CJ. Emotional e-learning through eye tracking. In: Global engineering education conference (EDUCON); 2012. p. 1–6.
Pradhan AK, Hammel KR, DeRamus R, Pollatsek A, Noyce DA, Fisher DL. Using eye movements to evaluate effects of driver age on risk perception in a driving simulator. Hum Factors: J Hum Factors Ergon Soc 2005;47(4):840–52.
Reising DC, Errington J, Bullemer P, DeMaere T, Harris K. Establishing operator performance improvements and economic benefit for an ASM© operator interface. In: Paper presented at the NPRA plant automation & decision support conference; 2005, October.
Ren P, Barreto A, Huang J, Gao Y, Ortega FR, Adjouadi M. Off-line and on-line stress detection through processing of the pupil diameter signal. Ann Biomed Eng 2014;42(1):162–76.
Reutskaja E, Nagel R, Camerer CF, Rangel A. Search dynamics in consumer choice under time pressure: an eye-tracking study. Am Econ Rev 2011:900–26.
Schulz C, Schneider E, Fritz L, Vockeroth J, Hapfelmeier A, Wasmaier M, et al. Eye tracking for assessment of workload: a pilot study in an anaesthesia simulator environment. Br J Anaesth 2011;106(1):44–50.
Swain AD. Human reliability analysis: need, status, trends and limitations. Reliab Eng Syst Saf 1990;29(3):301–13.
Tharanathan A, Bullemer P, Laberge J, Reising DV, Mclain R. Impact of functional and schematic overview displays on console operators' situation awareness. J Cogn Eng Decis Mak 2012. http://dx.doi.org/10.1177/1555343412440694
Tichon JG, Mavin T, Wallis G, Visser TA, Riek S. Using pupillometry and electromyography to track positive and negative affect during flight simulation. Aviat Psychol Appl Hum Factors 2014;4(1):23.
Tien G, Atkins MS, Zheng B, Swindells C. Measuring situation awareness of surgeons in laparoscopic training. In: Proceedings of the 2010 symposium on eye-tracking research & applications; 2010. p. 149–52.
Tobii Technology. Tobii eye tracking: an introduction to eye tracking and Tobii eye trackers; 2010. http://www.tobii.com/eye-tracking-research/global/library/white-papers/tobii-eye-tracking-white-paper/
Toker D, Conati C, Steichen B, Carenini G. Individual user characteristics and information visualization: connecting the dots through eye tracking. In: Proceedings of the SIGCHI conference on human factors in computing systems; 2013. p. 295–304.
van Gog T, Scheiter K. Eye tracking as a tool to study and enhance multimedia learning. Learn Instr 2010;20(2):95–9.
Wilson MR, Vine SJ, Bright E, Masters RS, Defriend D, McGrath JS. Gaze training enhances laparoscopic technical skill acquisition and multi-tasking performance: a randomized, controlled study. Surg Endosc 2011;25(12):3731–9.
Wilson KM, Helton WS, Wiggins MW. Cognitive engineering. Wiley Interdiscip Rev: Cogn Sci 2013;4(1):17–31.
Yin S. Proactive monitoring in process control using predictive trend displays. Singapore: Nanyang Technological University; 2012, Ph.D. thesis.
Zheng B, Tien G, Atkins SM, Swindells C, Tanin H, Meneghetti A, et al. Surgeon's vigilance in the operating room. Am J Surg 2011;201(5):673–7.