Designing for Situation Awareness - Chapters 3 and 10

SA Demons: The Enemies of Situation Awareness

Building and maintaining situation awareness (SA) can be a difficult process for people in many different jobs and environments. Pilots report that the majority of their time is generally spent trying to ensure that their mental picture of what is happening is current and correct. The same can be said for people in many other domains where systems are complex, where there is a great deal of information to keep up with, where information changes rapidly, and where it is hard to obtain.

The reason why good SA is so challenging can be traced to features of the human information processing system and features of complex domains that interact to form what we will call SA demons. SA demons are factors that work to undermine SA in many systems and environments. By bringing these demons to light, we will take the first step toward building a foundation for SA-oriented design. We will discuss eight major SA demons:

• Attentional tunneling
• Requisite memory trap
• Workload, anxiety, fatigue, and other stressors (WAFOS)
• Data overload
• Misplaced salience
• Complexity creep
• Errant mental models
• Out-of-the-loop syndrome

3.1 ATTENTIONAL TUNNELING

SA within complex domains involves being aware of what is happening across many aspects of the environment. Pilots must instantaneously keep up with where they are in space, the status of the aircraft systems, the effect of turbulence on passenger comfort and safety, other traffic around them, and air traffic control directives and clearances, to name a few elements. Air traffic controllers must concurrently monitor separation between many different pairs of aircraft (there may be as many as 30 or 40 aircraft under their control at any one time), process the information required to manage aircraft flows and pilot requests, and keep up with aircraft seeking to enter or leave their sector. A stock car driver must monitor the engine status, the fuel status, the other race cars on the track, and the pit crew signals.

Successful SA is highly dependent upon constantly juggling different aspects of the environment. Sometimes multiple pieces of information are processed simultaneously in order to perform one or more tasks; for example, monitoring the road while driving and monitoring the radio for traffic information. This is called attention sharing. However, people face numerous bottlenecks in attention sharing, particularly within a single modality, like vision or sound, and thus it can only occur to a limited extent (Wickens, 1992).

Since people cannot access all the needed information simultaneously, they also set up systematic scans or information sampling strategies to ensure that they stay up to date in their knowledge of what is happening. A scan across needed information may occur over a period of seconds or minutes, as in the case of the pilots and the air traffic controllers discussed here, or it may take place over a period of hours, as in the case of a power plant operator required to check the status of hundreds of different systems several times over the course of the day. Between these scans, and for systems with any level of complexity, good SA is highly dependent on switching attention between different information sources.
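To make the idea of an information sampling strategy more concrete, the short sketch below simulates a simple "check the stalest source first" scan rule. It is purely illustrative and not from the book: the information sources, revisit intervals, and scan rule are invented assumptions.

```python
"""Toy illustration of a systematic scan / information sampling strategy.

Not from the book: the sources, revisit intervals, and scan rule are
invented to illustrate the idea that an operator who cannot attend to
everything at once revisits each information source often enough to
keep knowledge of it from going stale.
"""

# Hypothetical information sources and how long knowledge of each stays
# fresh before it should be re-checked (seconds). Fast-changing sources
# need short revisit intervals; slow-changing ones can wait.
REVISIT_INTERVAL = {
    "other_traffic": 10,
    "aircraft_systems": 30,
    "atc_clearances": 20,
    "fuel_state": 120,
}


def build_scan_schedule(duration_s: int, step_s: int = 5) -> list[tuple[int, str]]:
    """Return (time, source) pairs: at each step, look at the source whose
    knowledge is closest to (or past) going stale, a 'most urgent first' rule."""
    last_checked = {src: 0 for src in REVISIT_INTERVAL}
    schedule = []
    for t in range(0, duration_s, step_s):
        # Urgency = fraction of the allowed staleness already used up.
        urgency = {
            src: (t - last_checked[src]) / REVISIT_INTERVAL[src]
            for src in REVISIT_INTERVAL
        }
        target = max(urgency, key=urgency.get)
        schedule.append((t, target))
        last_checked[target] = t
    return schedule


if __name__ == "__main__":
    for t, src in build_scan_schedule(duration_s=60):
        print(f"t={t:3d}s  scan {src}")
```

Even this idealized scheduler leaves most sources unattended at any given moment, which is exactly the opening exploited by the first SA demon, attentional tunneling.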
Unfortunately, people can often get trapped in a phenomenon called attentional narrowing or tunneling (Baddeley, 1972; Bartlett, 1943; Broadbent, 1954). When succumbing to attentional tunneling, they lock in on certain aspects or features of the environment they are trying to process and will either intentionally or inadvertently drop their scanning behavior (Figure 3.1). In this case, their SA may be very good for the part of the environment they are concentrating on, but it will quickly become outdated for the parts of the environment they have stopped attending to.

[…]

Once a problem is detected, it can take considerable time for operators to reorient themselves to the situation well enough to understand the nature of the problem and what should be done about it. In many cases this loss of SA, and the resulting lag in taking over manual performance, can be deadly.

For example, in 1987, a Northwest Airlines MD-80 crashed during takeoff from Detroit Airport because the flaps and slats had been incorrectly configured, killing all but one passenger (National Transportation Safety Board, 1988). The investigation showed that a primary factor in the crash was the failure of an automated takeoff configuration warning system. The crew failed to properly configure the aircraft for takeoff because they were distracted by air traffic control communications. The automated system, designed to detect and prevent exactly this type of problem, also failed. The crew was unaware of both the state of the aircraft and the state of the automation.

In another example, in 1989, a US Air B-737 failed to successfully take off from LaGuardia Airport, coming to rest in a nearby river (National Transportation Safety Board, 1990). In this case, the autothrottle was accidentally disarmed without the captain or first officer being aware of it. The crew was out of the loop, unaware that the automation was not performing its job for them. As a result, they had insufficient speed for takeoff and ran off the end of the runway into the river.

Many examples of such problems with automation come from aviation, both because of the long history of automation implementation in this industry and because of the accident and incident investigation and recording procedures required by law. Similar problems arise, however, as other forms of automation become more prevalent. For example, a fratricide incident in the war in Afghanistan was found to be the result of the failure of a global positioning system (GPS) device (Loeb, 2002). The device was being used by a soldier to target opposition forces when its battery died. After the battery was changed, the coordinates shown by the device were passed on for the strike. The soldier did not realize, however, that when the battery was changed the device automatically reverted to reporting its own position; he did not understand what the system was telling him. The subsequent bombing killed 3 and injured 20 friendly forces rather than the intended target.

Another example comes from the marine industry. In 1996, a freighter crashed into a riverside shopping mall in New Orleans, injuring 116 people. It was reported that an oil pump failed, causing the backup pump to come on and an automatic reduction in engine speed. This speed reduction significantly eroded the steering control available to the pilot navigating the Mississippi River. As the ship lost engine power, it crashed into a crowded shopping area on the bank. The automation worked as designed, but it left the boat captain out of the loop and unable to take the control actions needed (Associated Press, 1996).

More recently, in 2003, a massive failure of the power grid caused a total blackout in the northeastern United States and parts of Canada that was estimated to cost between $4 billion and $10 billion (Electricity Consumers Resource Council, 2004).
Leading up to this outage, the operator overseeing that portion of the grid in Ohio did not realize that the automated system updating his screens had stopped working. This failure left him unable to understand both the status of his own system and the state of the power grid, leading him to fail to take appropriate actions to keep the grid working properly. A small problem eventually grew into a much larger problem that rapidly spread across the interconnected power grid (U.S.-Canada Power System Outage Task Force, 2004).

This problem seems to be much worse when people are simultaneously performing multiple tasks (multitasking). Ma and Kaber (2005) found that SA improved with adaptive cruise control in a driving task under normal conditions, but when it was used in conjunction with a cell phone, SA degraded significantly more than when cell phones were used without adaptive cruise control. Apparently the presence of automation allowed drivers to redirect attentional resources even further away from the driving task and toward the phone call, to the detriment of their SA of the world they were driving through.

The out-of-the-loop problem can be fairly prevalent. In one study of automation problems with aircraft systems, many of the problems reported during cruise were associated with failures in monitoring the system (Mosier et al., 1994). The out-of-the-loop syndrome has been shown to result directly from a loss of SA when people are monitors of automation (Endsley & Kiris, 1995). This loss of SA occurs through three primary mechanisms (Endsley & Kiris, 1995):

• Changes in vigilance and complacency associated with monitoring
• Assumption of a passive role instead of an active role in processing information for controlling the system
• Changes in the quality or form of feedback provided to the human operator

We will discuss each of these factors in more detail.

10.2.1 VIGILANCE, COMPLACENCY, AND MONITORING

Many people attribute the out-of-the-loop syndrome to complacency or over-reliance by the operator on the automation (Parasuraman et al., 1993, 1996a; Parasuraman & Riley, 1997). In general, people are not good at passively monitoring information for long periods of time. They tend to become less vigilant and alert to the signals they are monitoring for after periods as short as 30 minutes (Davies & Parasuraman, 1980; Mackworth, 1948, 1970). While vigilance problems have historically been associated with simple tasks (such as monitoring a radar screen for an infrequent signal), Parasuraman (1987) states that "vigilance effects can be found in complex monitoring and that operators may be poor passive monitors of an automated system, irrespective of the complexity of events being monitored."

It should be noted that there are also different forms of complacency to consider. Dixon and Wickens (2006) distinguish between reliance and compliance. Reliance refers to the degree to which an operator will trust the automation to perform its assigned tasks. Reliance has been found to be related to the degree to which the automation is subject to misses, that is, how well it is able to detect and handle those tasks assigned to it. If the system is prone to misses, then operators must be more vigilant in processing the raw data so as to compensate for the automation's behavior. Compliance, on the other hand, refers to the degree to which the operator will immediately respond to an automated alert and follow its direction. Compliance is most closely associated with the false alarm rate of the system. Reliance and compliance appear to be very different aspects of human-automation interaction that are affected by different factors.

A key factor mediating the degree to which people will become reliant on automation is their level of trust (Lee & See, 2004). By placing too much trust in the automation, they may become so confident in it that they shift attention to other tasks and become inefficient monitors (Parasuraman et al., 1993, 1994; Wiener, 1985).
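As a purely illustrative aside (not from the book), the sketch below turns the reliance/compliance distinction described above into numbers: it computes a hypothetical automated alerting system's miss rate, which the text links to operator reliance, and its false alarm rate, which the text links to operator compliance. All counts, names, and the summary labels are invented assumptions.

```python
"""Toy illustration of the reliance/compliance distinction described above.

Not from the book: the counts and summary labels are invented to make the
distinction concrete. Miss-prone automation tends to erode RELIANCE
(operators must keep monitoring the raw data themselves), while
false-alarm-prone automation tends to erode COMPLIANCE (operators stop
responding promptly to alerts).
"""
from dataclasses import dataclass


@dataclass
class AlertLog:
    hits: int                # real events the automation flagged
    misses: int              # real events the automation failed to flag
    false_alarms: int        # alerts with no real event behind them
    correct_rejections: int  # quiet periods correctly left unalerted

    @property
    def miss_rate(self) -> float:
        return self.misses / (self.hits + self.misses)

    @property
    def false_alarm_rate(self) -> float:
        return self.false_alarms / (self.false_alarms + self.correct_rejections)


def describe(log: AlertLog) -> str:
    return (
        f"miss rate        = {log.miss_rate:.2f}  (expected to drive operator RELIANCE)\n"
        f"false alarm rate = {log.false_alarm_rate:.2f}  (expected to drive operator COMPLIANCE)"
    )


if __name__ == "__main__":
    # Hypothetical logs for two automated alerting systems.
    miss_prone = AlertLog(hits=40, misses=10, false_alarms=2, correct_rejections=948)
    cry_wolf = AlertLog(hits=49, misses=1, false_alarms=150, correct_rejections=800)
    print("Miss-prone system:\n" + describe(miss_prone))
    print("\n'Cry wolf' system:\n" + describe(cry_wolf))
```

Running the example shows how a "cry wolf" system can have a near-zero miss rate yet still, through its false alarms, train operators not to act on its alerts.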
Some of the key factors influencing trust have been summarized as involving system reliability, validity, and understandability (Seong & Bisantz, 2002). Trust declines steadily with decreasing system reliability and can drop off quite quickly, particularly if the automation's failures are unpredictable (Lee & See, 2004). Trust is significantly more degraded by automation errors on tasks that appear easy to the operator than by errors on more difficult tasks (Madhavan et al., 2006).

Trust in automation is related both to operators' subjective confidence in the system's reliability (which is imperfectly correlated with actual system reliability) (Wiegmann et al., 2001) and to their comparative feelings of self-confidence in their ability to perform the tasks themselves (Prinzel & Pope, 2000). Wiegmann et al. found that people's reliance on automation exceeded their subjective confidence in it when the automated aid was viewed to be better than their own performance would be without it. Ironically, people are more likely to detect automation failures when they are less reliant on the automation. Presentation of system confidence information that is specific to the situation or recommendation being made is generally more useful than overall system reliability information, and can act to reduce inappropriate reliance on automation (McGuirl & Sarter, 2003). Training on the causes behind system reliability problems has also been found to improve the operator's calibration of how much trust to place in automation (Masalonis & Parasuraman, 2003).

In addition, it should be noted that some people are more trusting in general than others. People who are more prone to be complacent have been shown to have lower SA. Older people have also been found to be less reliant on automation (Kramer et al., 2007). People who are more trusting of automation in general, though, have been found to be able to adjust their trust to account for system reliability (Lee & Moray, 1994; Singh et al., 1993). More recently, trust in automation appears to be less a function of a person's predisposition and personality (e.g., extraversion) and more a function of the perceived characteristics of the automation (e.g., its reliability, dependability, and competence) (Merritt & Ilgen, 2008). Interestingly, people who are more predisposed to trust technology are the most negatively affected by poorly performing automation. Other factors influencing the degree to which people will trust in or rely on automation in any particular case include subjective workload, how much people trust in their own abilities, and the effort needed to engage with the automation (Lee & See, 2004).

This complacency problem is worse when other tasks are present that demand the operator's limited attention (Parasuraman et al., 1993). It has been found to be less a visual scanning problem (an inability to look at multiple input sources at the same time) than an attention problem. Superimposing information on the status of the automated task over that of concurrent manual tasks does not appear to solve it (Duley et al., 1997; Metzger & Parasuraman, 2001).

While complacency is often described as a human failure, an alternative view is that people actively allocate their attention across multiple tasks and systems based on the perceived likelihood that different information sources will change or fail. People may sometimes miss important cues that the automation has failed not because they have stopped monitoring various information sources or are behaving carelessly; rather, they miss detecting the problem because of their attention sampling strategy: when attention is limited, usually reliable sources need not be monitored as often. This is an effective coping strategy for dealing with excess demands. The result, however, can be a lack of SA on the part of the operator regarding the automated system and the system parameters it governs (Endsley, 1996).
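The following sketch is illustrative only; the failure rates, attention budget, and allocation rule are invented assumptions rather than anything from the book. It shows why this kind of sampling strategy can produce complacency-like behavior: when monitoring effort is spread in proportion to perceived failure likelihood, a highly reliable automated system is checked so rarely that its occasional failure has a good chance of going unnoticed.

```python
"""Toy model of attention allocation under limited monitoring capacity.

All probabilities and the allocation rule are illustrative assumptions,
not from the book.
"""

# Perceived chance, per hour, that each task or system develops a problem.
PERCEIVED_FAILURE_RATE = {
    "automated_system": 0.01,   # believed to be very reliable
    "manual_task_a": 0.20,
    "manual_task_b": 0.10,
}

CHECKS_PER_HOUR = 30  # total attention budget across everything


def allocate_checks(budget: int) -> dict[str, int]:
    """Split the monitoring budget in proportion to perceived failure rate."""
    total = sum(PERCEIVED_FAILURE_RATE.values())
    return {
        name: round(budget * rate / total)
        for name, rate in PERCEIVED_FAILURE_RATE.items()
    }


def chance_failure_goes_unseen(checks: int, p_detect_per_check: float = 0.5) -> float:
    """Probability that a failure slips past every one of `checks` looks,
    assuming each look independently catches it with p_detect_per_check."""
    return (1.0 - p_detect_per_check) ** checks


if __name__ == "__main__":
    for name, checks in allocate_checks(CHECKS_PER_HOUR).items():
        print(f"{name:17s} checks/hour={checks:2d}  "
              f"P(failure unnoticed)={chance_failure_goes_unseen(checks):.3f}")
```

With these example numbers the automated system receives roughly one check per hour, so a failure there has about a 50% chance of slipping past unseen, even though the operator's overall allocation of attention is reasonable.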
10.2.2 ACTIVE VERSUS PASSIVE PROCESSING

A second factor that underlies the out-of-the-loop problem is the inherent difficulty operators have in fully understanding what is going on when acting as a passive monitor of automation rather than as someone actively processing information to perform the same task manually. Cowan (1988) and Slamecka and Graf (1978) provide evidence suggesting that the very act of becoming passive in the processing of information may be inferior to active processing. Even while monitoring the relevant information, the operator may not fully process or update that information in working memory. This is analogous to riding in the car with someone who is driving to a party at a new place and then finding out at the end of the evening that you must drive home. Your memory trace of how you got there is frequently not as good as if you had actively driven there yourself, illustrating the poor SA rendered by passive processing.

It should also be pointed out that checking the output or performance of a system is often not possible without doing the task oneself. For example, something as simple as checking that a computer is adding correctly is almost impossible to do without performing the calculations oneself. To manually perform every task to check the system is usually not feasible or timely, however. Therefore, efforts at "checking" an automated system may often be only partial attempts at a manual performance of the same task. Layton et al. (1994), for example, found that pilots did not fully explore other options or do a complete evaluation of system-recommended solutions for route planning, but rather performed a more cursory evaluation. This makes sense, as it would hardly be worth the trouble and expense of having automated aids if the operator still had to perform everything manually in order to check the system.

Endsley and Kiris (1995) performed a study involving an automobile navigation task. They found much lower understanding (Level 2 SA) of what had happened in the navigation task when people performed the task with an automated aid, even though Level 1 SA (the ability to report on information in the navigation task) was not affected. Although people were aware of the basic information, they did not have a good understanding of what the data meant in relation to the navigation goals. In this study, complacency and potential differences in information presentation between automated and nonautomated conditions were ruled out as possible contributors to this loss of SA. Lower SA was found to be due to changes in the way information is processed and stored when monitoring the performance of the automation (or any other agent) as compared to doing the task oneself. In other studies, poor SA has been found under passive monitoring conditions with experienced air traffic controllers (Endsley et al., 1997; Endsley & Rodgers, 1998; Metzger & Parasuraman, 2001). The air traffic controllers had much poorer SA regarding aircraft they were just monitoring as compared to aircraft they were actively controlling.

Thus, the out-of-the-loop syndrome may not be due simply to concepts such as complacency or overreliance on automation, but to a fundamental difficulty in fully understanding what the system is doing when passively monitoring it. As such, automation may continue to pose a challenge for the SA of human operators, even when they are vigilant.
10.2.3 SYSTEM FEEDBACK QUALITY

Either intentionally or unintentionally, the design of many systems creates significant challenges to SA because it is inadequate for communicating key information to the operator (Norman, 1990). Feedback on system operation is either eliminated or changed in such a way as to make it difficult for a person to effectively monitor what is happening. "Without appropriate feedback people are indeed out of the loop. They may not know if their requests have been received, if the actions are being performed properly, or if problems are occurring" (Norman, 1990).

Sometimes designers will mistakenly remove critical cues in the implementation of automation without realizing it. For example, the development of electronic fly-by-wire flight controls in the F-16 aircraft initially caused problems for pilots in determining airspeed and maintaining proper flight control, because important cues from vibration normally felt in the flight stick were no longer available through that channel. Instead, the necessary information was provided only on a traditional visual display (Kuipers et al., 1990). It was much more difficult for the pilots to simultaneously determine this key information from visual displays when they were looking outside of the cockpit during landing. Without realizing it, designers had inadvertently deprived them of important cues they were using. To correct this problem, artificial stick shakers are now routinely added to fly-by-wire systems to provide pilots with this needed feedback (Kantowitz & Sorkin, 1989). Other research has identified the lack of hand movement feedback with the automation of physical tasks (Kessel & Wickens, 1982; Young, 1969), and the loss of factors such as vibration and sound in the automation of process control operations (Moray, 1986).

In other cases, designers have intentionally removed or hidden information on the state of systems that the automation should be taking care of, under the belief that operators no longer need to know this information. Billings (1991) found that in some cases automation has failed to tell pilots when it shuts down engines, directly depriving the pilot of the SA needed to avoid accidents. In another incident, an aircraft lost 9,500 ft of altitude over the mid-Atlantic when the captain attempted to take over control from the autopilot. It was found that the autopilot had effectively masked the approaching loss-of-control conditions, hiding the needed cues from the pilot (National Transportation Safety Board, 1986).

The use of a multitude of different computer windows or screens of information that can be called up through menus or other buttons can also contribute to the problem of poor feedback. In many automated systems, if the operator is busy viewing one option, key data on another, hidden window can be obscured. Compounding the problem, the operator cannot know that he needs to call up that other information. The more complex the situation in which a person works, the easier it is for that person to not notice displayed information and understand its significance.

As a cogent example, in 2001, an Air Transat Airbus 330 aircraft ran out of fuel over the Atlantic Ocean, en route from Toronto to Lisbon (Government of Portugal Ministério Das Obras Públicas Transportes e Comunicações, 2004). Due to quick thinking and excellent airmanship, the pilots were able to successfully land on the small island of Terceira in the Azores, the only landing spot within 500 miles and within range of gliding with no engine power. The problem leading up to this incident was caused by the installation of a wrong part during maintenance.
While in flight, the automation that balances fuel in the aircraft spent 25 minutes working to balance the tanks (in response to the fuel leak from one engine) before notifying the crew that there was a problem. Compounding this problem, the pilots were involved in responding to oil pressure and temperature warnings on the affected engine and in communicating with their maintenance center to troubleshoot the problem. Warnings of the fuel imbalance were not directly visible to the crew on their display until after they exited the information screen that they were using to troubleshoot the temperature and pressure problem. This is a good example of how the design of the interface to the automation can contribute significantly to poor feedback on what is happening with the system being controlled.

There may be a real problem with the salience (i.e., prominence) of key information in many systems. "The increased display complexity and computerized display format reduces the perceptual salience of information, even if it is present. In a complex environment with many sources of information, it is easy for operators to lose track of such information" (Endsley, 1996). In many aircraft systems, for example, only one letter or number may be different on a fairly cluttered display to indicate which mode the system is in. This poor salience can be directly linked to poor awareness of automation modes. Efforts to improve the salience of mode transitions through other modalities (e.g., tactile cues) have been successful in helping to reduce the SA problems associated with poor awareness of mode information on these displays (Sklar & Sarter, 1999).

In a 2003 study coauthored by Parasuraman, automation-related complacency was shown to be significantly reduced through the provision of a more integrated set of displays that depicted the intentions of the automation more clearly. Automation displays need to be made much more specific, in order to provide clearer indications of what the automation is doing or will be doing, and how confident it is in its recommendations. Goh et al. (2003), for example, found that direct cues that indicate where a potential problem is in a luggage screening task are significantly better than indirect cues that merely indicate a potential problem is present somewhere.

Of the three factors associated with the poor SA leading to the out-of-the-loop syndrome, poor feedback and information presentation are the most easily remedied, through the application of good human factors design principles and attention to the importance of information cues related to automation states.

10.3 AUTOMATION AND LEVEL OF UNDERSTANDING

Aside from the out-of-the-loop syndrome, people also can have significant difficulty understanding what the system is doing, even when it is functioning normally. That is, comprehension and projection (Levels 2 and 3 SA) can be seriously strained, inaccurate, or missing. Wiener and Curry (1980) first noted these problems in studying pilots working with automated systems. "What is it doing now?", "Why is it doing that?", and "Well, I've never seen that before" are widely echoed comments. Although understanding seems to improve with experience, problems can continue to exist even after years of working with a system (McClumpha & James, 1994).

A number of factors are behind this poor understanding of many automated systems, including the inherent complexity associated with many automated systems, poor interface design, and inadequate training.
As systems become more complex, people find it harder to develop a mental model of how the system works (see Chapter 8). Automated systems tend to incorporate complex logic so that the system can accomplish numerous functions and deal with different situation conditions. Often the system acts differently when in different modes, further complicating the logic structures that a person needs to understand. The more complex the system logic becomes, the harder it can be for a person to attain and maintain Level 2 SA (comprehension) and Level 3 SA (projection). The growing complexity of systems makes it more difficult for a person to develop a complete mental model of a system. Many branches within the system logic, as well as infrequent combinations of situations and events that evoke various rarely seen system states, add to the challenge of fully comprehending the system.

Vakil and Hansman (1998) point to the lack of a consistent global model to drive the design of automated systems in aviation as a major culprit in this state of affairs. As systems evolve over the years, different designers add new functions that may interact in ways that are not consistent with other functions or modes, leading to an "entropic growth of complexity." Mode proliferation reigns as new modes are added to deal with different requirements of different customers and to provide product differentiation. Complicating this problem, Vakil and Hansman found pilots' training to be simple and rule based, providing little information on causality or connection to the underlying system structure. With the lack of a clearly presented model of how the system works, pilots tend to adopt their own ad hoc models that are often inaccurate or incomplete representations of how the systems work.

Adding to this problem, the state of the automation and its current functioning are often not clearly presented through the system display. For example, when the system changes from one mode to another, the change is often so subtle that the person monitoring the system may be unaware of the shift, and thus becomes confused when the system functions differently. Without system understanding, the ability of the operator to project what the system will do (and to assess the compliance of those actions with current goals) is also lacking, making proactive decision making difficult and error prone. Summarizing much of the research in this area, Sarter (2008, p. 507) concludes that although the importance of cockpit automation displays is emphasized, "pilots frequently fail (a) to verify mode selections, (b) to notice automatic mode changes, or (c) to process mode annunciations in sufficient depth to understand the implications for aircraft behavior."

The display of projected system actions is often insufficient or missing, limiting system understanding and exacerbating this problem. For instance, while the horizontal flight path is displayed directly in automated FMSs, information about changes in the vertical flight path and the relationship of that path to terrain and key operational points is missing. When a vertical situation display is provided to pilots, a dramatic improvement in the avoidance of mode errors is observed (Vakil et al., 1996). Other research has also found that the ability of pilots to diagnose an automation problem and come to a decision is negatively impacted by time pressure and by how consistent the automation's information is. Conflicting information in general also means that more time is needed to cross-check information sources to reach a diagnosis (Mosier et al., 2007).
