
TASK ANALYSIS & DESIGN EVALUATION

Christopher Tan
PSY 340 | christopher.tan@help.edu.my
OVERVIEW

• Task Analysis
• Hierarchical Task Analysis (HTA)
• Design Evaluation
o Heuristic Evaluation (usability heuristics/principles)
o Usability Testing (data collection methods)
TASK ANALYSIS
Why do a Task Analysis?
• “Study of what an operator needs to do to achieve system goals” (Kirwan & Ainsworth, 1992)
• Examine tasks to be performed by humans to achieve certain goals
o Includes all necessary actions for task (includes cognitive processes)

• Compare system demands with operator capabilities; alter demands to reduce error
• Systematic description of human-system interaction to understand how to match system
demands to human capabilities (Lee et al., 2017)
• Task analyses help…
o Describe tasks in detail
o Understand system/task demands on operator
o Understand users and their goals and behaviours
o Identify potential obstacles/roadblocks – what problems do users run into?
o Understand how users work around roadblocks (workarounds) – may not be ideal/safe
TASK ANALYSIS
Why do a Task Analysis? – Roadblocks
Roadblocks: Obstacles that users face when trying to complete tasks; they interfere with achieving the goal
• May cause users to circumvent safety features (unsafe workarounds)
o “This will take too long”, “This is a hassle”
o May cause abandonment (e.g., the Expedia case study, where one confusing data field reportedly cost $12m)
• Lack understanding of available actions
o “What now?”, “I’m clicking this, but nothing happens”
• Irreversible actions / Unforgiving to errors
o Cannot ‘go back’
o Stuck on long road in wrong direction
o Clears entire form
• Poor feedback
o Lack certainty
o “What happened?”, “Did it work?”
TASK ANALYSIS
Why do a Task Analysis? – Workarounds
Workarounds: Using other (often less ideal or roundabout) methods to overcome roadblocks;
may involve bypassing safety restrictions

• Using many methods to do one thing (especially when there is a lack of a unified system)
o E.g., Post-its, whiteboard, app, calendar, paper journal

• Roundabout & unnecessary methods


o Clicking multiple links to get to the desired outcome rather than available shortcut links
o Analogy – Driving from A → B → C rather than A → C

• Turn off settings to get around restrictions


o Unadvisable and unsafe actions
o E.g., Disabling safety features/software when visiting certain websites
HIERARCHICAL TASK ANALYSIS

• HTA → One of the task analysis methods

• Identifies goals, sub-goals, operations, and plans of tasks

• Goal: What does the operator want to accomplish?


o Accomplished via completing tasks

• Sub-goal: Goals (highest-order) can be broken down into sub-goals

• Operations: Actions necessary to complete a goal/sub-goal


o Smallest individual steps (should not be broken down any further)

• Plans: Method of executing goals and operations
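A minimal sketch (not from the slides; all names are illustrative) of how these parts fit together: an HTA can be represented as a tree in which each node is either a goal/sub-goal with children, or a bottom-level operation without them, and each goal carries a plan describing how its children are executed.

```python
# Hedged sketch of an HTA as a tree (illustrative names only).
from dataclasses import dataclass, field

@dataclass
class HTANode:
    number: str                       # e.g., "0.0", "1.2"
    label: str                        # goal, sub-goal, or operation
    plan: str = ""                    # how children are executed (empty for operations)
    children: list["HTANode"] = field(default_factory=list)

    def is_operation(self) -> bool:
        # Operations sit at the bottom: they are not broken down further
        return not self.children

    def show(self, indent: int = 0) -> None:
        kind = "Operation" if self.is_operation() else "Goal"
        print(" " * indent + f"{self.number} {self.label} [{kind}]")
        if self.plan:
            print(" " * indent + f"  Plan: {self.plan}")
        for child in self.children:
            child.show(indent + 2)
```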


HIERARCHICAL TASK ANALYSIS
Example

Activity

• Goal: Send an email

• Sub-goals:

• Operations:

• Plans:
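One possible way to complete the activity (an assumption, not the model answer), reusing the HTANode sketch from the previous section:

```python
# Hypothetical decomposition of "Send an email" using the HTANode sketch above.
send_email = HTANode("0.0", "Send an email", plan="Do 1.0 - 4.0 in that order", children=[
    HTANode("1.0", "Open email client", plan="Do 1.1; if prompted, do 1.2", children=[
        HTANode("1.1", "Launch the app"),
        HTANode("1.2", "Log in"),
    ]),
    HTANode("2.0", "Compose message", plan="Do 2.1 first; then 2.2 - 2.4 in any order", children=[
        HTANode("2.1", "Click 'Compose'"),
        HTANode("2.2", "Enter recipient address"),
        HTANode("2.3", "Enter subject line"),
        HTANode("2.4", "Type message body"),
    ]),
    HTANode("3.0", "Attach files if needed"),
    HTANode("4.0", "Press 'Send'"),
])
send_email.show()  # prints the goal/sub-goal/operation tree with plans
```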
HIERARCHICAL TASK ANALYSIS
Benefits of HTA

• Structured and organized representation of tasks; documentation purposes

• Helps systematically identify sources of potential error (analyzing task processes)


o Redundancies, opportunities for errors, frustrations
o Often things don’t seem like problems until documented in a systematic manner
o Is the structure and order of activities logical?

• Helps compare between different approaches to same tasks


o No. of steps, redundancies, errors, etc.

• Useful for improving upon existing designs


HIERARCHICAL TASK ANALYSIS
Building the HTA

1. Determine overall goal of task


o Highest-order goal

2. Determine task sub-goals

3. Sub-goal decomposition
o Operations

4. Analysis of plans
HIERARCHICAL TASK ANALYSIS
Building the HTA – 1. Determine overall goal of task

• Overall goal of task that you are trying to analyze; highest-order goal

• What is the user trying to achieve?


o Clean the house
o Make an online purchase
o Cook a meal

• Level 0.0 in the HTA

• Note: Goals vs. Tasks


o Goals & sub-goals, tasks & sub-tasks
HIERARCHICAL TASK ANALYSIS
Building the HTA – 2. Determine task sub-goals

• Highest-order goals are accomplished via sub-goals

• Highest-order goals (0.0) are broken down into smaller steps


o If there are too many (e.g., more than 10), some likely belong at a deeper level; try regrouping
them under higher-level sub-goals
o Depth of breakdown depends on how critical and/or complex the sub-goals are

• Example: Buy a movie ticket on the GSC app


o Open GSC app
o Find movie schedule for _____ (movie title)
o Book ticket for selected movie time
o Pay for ticket
HIERARCHICAL TASK ANALYSIS
Building the HTA – 3. Sub-goal decomposition
Example:
• Sub-goals can be broken down further into sub-goals or operations
o Breaking down sub-goals can be arbitrary; stop when not useful to go deeper
• Operations → bottom level of HTA branches
o What actions need to be done?
o Accomplishes sub-goals
• Examples:
o Press ‘submit’
o Click on the desired movie time
o Turn ignition key
o Lift lever

[Example diagram: 0.0 Highest-order goal branches into 1.0 Sub-goal and 2.0 Sub-goal; 1.0 is decomposed into operations 1.1, 1.2, and 1.3]
HIERARCHICAL TASK ANALYSIS
Building the HTA – 4. Analysis of plans
Plans dictate how goals are achieved / tasks are executed
• Order of sub-goals/operations

Some variations with examples:


• Linear: Do 1.0 – 5.0 in that order
• Non-linear: Do 1.0 – 5.0 in any order
• Simultaneous: Do 1.1 and 1.2 simultaneously
• Cyclical: Repeat steps 1.0 – 3.0 as required
• Selection: Do either 1.0 or 2.0
• Other examples:
o Do 1.1, 1.2, and 1.4; when X occurs, do 1.3
o Do 1.1; if X occurs, do 1.1.1 – 1.1.3
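Plans can also be treated as testable rules. A minimal sketch (hypothetical function and data) that checks whether an observed sequence of steps follows a linear plan such as “Do 1.0 – 4.0 in that order”, ignoring extra steps that might be workarounds:

```python
# Hedged sketch: checking an observed step sequence against a linear plan.
def follows_linear_plan(observed: list[str], required: list[str]) -> bool:
    # True if the required steps appear in order within the observed
    # sequence; extra steps (possible workarounds) are ignored.
    it = iter(observed)
    return all(step in it for step in required)

# A participant who detoured through an extra menu but still did
# 1.0 - 4.0 in order counts as following the plan.
print(follows_linear_plan(["1.0", "2.0", "extra-menu", "3.0", "4.0"],
                          ["1.0", "2.0", "3.0", "4.0"]))  # True
print(follows_linear_plan(["2.0", "1.0", "3.0", "4.0"],
                          ["1.0", "2.0", "3.0", "4.0"]))  # False
```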
HIERARCHICAL TASK ANALYSIS
Building the HTA – Exercise
1. Call a friend (mobile phone)

2. Pump up the tires on a car


DESIGN EVALUATION

• Designs need to be evaluated for effectiveness and usability – does it help users achieve
their goals easily and quickly?

• Recall: Double Diamond Model & UCD Cycle


o Iterative approach to design

• Iterative design and continuous evaluation during prototyping stages help identify
deficiencies early in the product life cycle
o Fixing problems later is more costly

• Common methods of evaluating design:


o Heuristic Evaluation
o Usability Testing
DESIGN EVALUATION
Heuristic Evaluation

• Heuristics – Mental shortcuts; allows people to efficiently solve problems and make
judgements/decisions

• Usability heuristics can be used to explain most problems in machine interfaces (and
other designs in general)
o The 10 usability heuristics/principles – Nielsen (1995)

• Evaluators (usability experts) individually evaluate the design


o Judge how well interface adheres to usability heuristics
o Generate potential problems by principle
o Recommended 3-5 evaluators
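A hedged sketch of how individual evaluators’ findings might be collated by heuristic afterwards (all data hypothetical; the 0–4 severity scale follows common heuristic-evaluation practice):

```python
# Hypothetical collation of heuristic-evaluation findings.
findings = [  # (evaluator, heuristic violated, problem, severity 0-4)
    ("E1", "Feedback", "No confirmation after pressing 'Save'", 3),
    ("E2", "Feedback", "No confirmation after pressing 'Save'", 2),
    ("E2", "Consistency", "'OK' vs 'Done' on similar dialogs", 2),
    ("E3", "Clearly marked exits", "No way to cancel an upload", 4),
]

by_heuristic: dict[str, list[tuple[str, int]]] = {}
for evaluator, heuristic, problem, severity in findings:
    by_heuristic.setdefault(heuristic, []).append((problem, severity))

for heuristic, problems in by_heuristic.items():
    print(heuristic)
    for problem, severity in problems:
        print(f"  [severity {severity}] {problem}")
```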
DESIGN EVALUATION
Usability Heuristics/Principles (Nielsen, 1995)
• Simple & natural dialogue
o Interface simplicity – interfaces should match the task
o Only present relevant information – ask “does it help accomplish the task?”
• Speak the users’ language
o Avoid jargon; use familiar terminology
• Minimize the users’ memory load
o Allow user to rely on recognition over recall
o E.g., Select from list/category vs. Type in keyword
• Consistency
o Use same commands, symbols, and terminology throughout the interface
o Consistency with other interfaces – recall: conceptual models
• Feedback
o Inform users about how their input is interpreted – “what’s going on?”
DESIGN EVALUATION
Usability Heuristics/Principles (Nielsen, 1995)
• Clearly marked exits
o Allow users to easily exit the interface
o Avoid feeling “trapped”
• Shortcuts
o For frequent operations
o E.g., Ctrl + C / V; Alt + Tab
• Good error messages
o Inform of problems and suggest solutions
• Prevent errors
o Clear and intuitive design
• Help and documentation
o Provide help/additional info beyond what is present in the interface
o E.g., FAQs, “help”, “?”
DESIGN EVALUATION
Usability Testing
• Usability: Quality attribute; assess how easy user interfaces are to use (Nielsen, 2012)

• UT: Empirical way of measuring a product’s ease of use (Rubin, 1994)

• Measures of usability / task performance metrics:


o Task success/failure rate (i.e., rate of completion)
o Time taken to complete task
o No. of errors; No. of steps (vs. no. of required steps)
o Perception of ease, overall satisfaction – via scales (e.g., System Usability Scale; NASA-TLX,
etc.), observation, questioning
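A hedged sketch of how these metrics might be computed from usability-test records (all data hypothetical; the SUS scoring rule is the standard one: odd items contribute score − 1, even items 5 − score, and the sum is multiplied by 2.5):

```python
# Hypothetical task-performance metrics from a small usability test.
from statistics import mean

trials = [  # one record per participant
    {"success": True,  "time_s": 42.0, "errors": 1, "steps": 7},
    {"success": True,  "time_s": 55.5, "errors": 0, "steps": 6},
    {"success": False, "time_s": 90.0, "errors": 4, "steps": 12},
]
REQUIRED_STEPS = 6  # minimum steps the task needs per the HTA

completion_rate = mean(t["success"] for t in trials)   # success/failure rate
avg_time = mean(t["time_s"] for t in trials)           # time on task
avg_extra_steps = mean(t["steps"] - REQUIRED_STEPS for t in trials)

def sus_score(responses: list[int]) -> float:
    # Standard SUS scoring: items 1, 3, 5, 7, 9 -> score - 1;
    # items 2, 4, 6, 8, 10 -> 5 - score; sum * 2.5 gives 0-100.
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

print(completion_rate, avg_time, avg_extra_steps)
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```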
DESIGN EVALUATION
Usability Testing – Data Collection
• Typically involves testing a product with participants
o Facilitator instructs but gives no guidance – observe participants in a natural context
o Collect data (users’ experience) from the process
• Various forms of data (objective, subjective)

• Identify potential users


o Target user population – who’s likely to use your product/interface?
o Sample size – as few as 5 users may suffice (Nielsen, as cited in Norman, 2013); see the sketch after this list

• Common methods of collecting data


o Observation
o Think-Aloud Protocol
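The “5 users” heuristic is often justified with Nielsen & Landauer’s problem-discovery model, in which the proportion of problems found by n users is 1 − (1 − λ)^n. A quick sketch (λ = 0.31 is the commonly quoted average; treat it as an assumption for your own product):

```python
# Problem-discovery curve behind the "5 users" rule (Nielsen & Landauer).
def proportion_found(n_users: int, lam: float = 0.31) -> float:
    # Expected proportion of usability problems found by n_users
    return 1 - (1 - lam) ** n_users

for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
# With lam = 0.31, five users are expected to find roughly 84% of problems.
```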
DESIGN EVALUATION
Usability Testing – Data Collection
Observation

• Examine users’ behaviour/interaction with the system – natural setting

• Avoid interfering/prompting unless users are struggling severely (the struggle itself is data)
o If a prompt is needed, keep it neutral – e.g., “What do you notice in this area of the display?”;
“Do you think there’s anything of interest in this bar?”

• Observe non-verbal behaviours (video record if possible)


o Do they seem to understand the interface, or are they hazarding guesses?
o Are they ignoring certain features? – Salience/visibility issues?
DESIGN EVALUATION
Usability Testing – Data Collection
Think-Aloud Protocol

• Observation is rarely enough; important data is in the form of cognitive processes

• Why users struggle or behave in certain ways is revealed through their thoughts
o Especially if the task is primarily cognitive – little behavioural activity to observe

• Get users to “think aloud” during task


o Reveals thought structure, knowledge, expectations, past experiences
o Reveals their goals, strategies, and intentions
DESIGN EVALUATION
Usability Testing – Data Collection
Think-Aloud Protocol

• Concurrent protocol
o Spoken during task performance

• Retrospective protocol
o After task performance; while reviewing video or by memory

• Prospective protocol
o Think aloud while imagining performing a hypothetical task
DESIGN EVALUATION
Usability Testing – Data Collection
Think-Aloud Protocol

• Retrospective protocols may yield deeper insights into usability than concurrent protocols
(Bowers & Snyder, 1990; Ohnemus & Biers, 1993)

• Concurrent protocols → more procedural information


o “I’m trying to find the search bar”
o “I’m going to click this and see what happens”
o “That’s annoying, it doesn’t do anything”

• Retrospective protocols → more explanations


o “I was looking here because…”
o “I was confused because other websites usually…”
o “I didn’t turn that knob because it looked set in place; I didn’t know you could turn it”
DESIGN EVALUATION
Usability Testing – Data Collection
Think-Aloud Protocol

• Useful to combine both concurrent and retrospective protocols

• Concurrent → helps pinpoint exact roadblocks or good design


o May exclaim or voice frustration
o Or express happiness over doing something easily

• Retrospective → helps understand thought processes; reasons for behaviours


o Try to ask open-ended questions
o Probe on key parts of task
DESIGN EVALUATION
Usability Testing – Summarizing & Analysing Data
• Draw up task analysis – tables, charts, hierarchical diagrams, etc.
o To represent goals and required tasks/activities
o Documentation purposes
• Written notes/observations
o Important/interesting observations (verbal & non-verbal)
o Errors, roadblocks, workarounds
o Info from think-aloud protocols
• Analysing data - What patterns emerged from the task analysis data?
o Compare UT data with HTA – where do errors, roadblocks, & workarounds occur?; map onto
HTA steps
o E.g., “4 of 5 users experienced difficulties in steps 1.1.4 and 2.4.2”
• Reasons for difficulty? (elicited via observations, think-aloud protocols, or interviews)
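A hedged sketch of the tallying step (hypothetical data): count, per HTA step, how many participants hit an error, roadblock, or workaround there, then report the most frequent trouble spots.

```python
# Hypothetical mapping of observed difficulties onto HTA step numbers.
from collections import Counter

observations = {  # participant -> HTA steps where difficulties occurred
    "P1": ["1.1.4", "2.4.2"],
    "P2": ["1.1.4"],
    "P3": ["1.1.4", "2.4.2", "3.2.1"],
    "P4": ["1.1.4", "2.4.2"],
    "P5": ["2.4.2"],
}

counts = Counter(step for steps in observations.values() for step in steps)
n = len(observations)
for step, k in counts.most_common():
    print(f"{k} of {n} users experienced difficulties at step {step}")
# e.g., "4 of 5 users experienced difficulties at step 1.1.4"
```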
GROUP ASSIGNMENT

HTA
• Analysis of a specific task (e.g., making an e-purchase, uploading a video, planning a public transport route)
• What the system requires of its users (vs. how they actually perform the tasks)
• Some systems have multiple goals
o Analysing several tasks within a system; depending on complexity

Data collection
• No. of participants?
• Concurrent & retrospective think-aloud protocols; guiding, probing, follow-up questions
• Task performance measures; emerging patterns
o Where did most errors commonly occur? (e.g. 4/5 participants failed to notice…)
o Common frustrations
• Post-task ‘interview’
o Overall satisfaction? Impressions? 3 things they liked/didn’t like, etc.
• Video recording + notes
GROUP ASSIGNMENT

Prototyping
• Communicate & document designs
• Not meant to be time-consuming or overly detailed
• Demonstrate to viewers how the product will work
• Different ways to approach prototyping:
o Wireframes – Initial layout/sketch of where things should be
o Physical prototypes – Physical products; demonstrate how they will work
o ‘Wizard’ paper prototypes – Digital displays and interfaces

• Depends on the nature of what you are redesigning


o Website, app, digital camera, microwave, etc.
GROUP ASSIGNMENT
Prototyping – Examples
[Image slides: examples of wireframes, physical prototypes, and ‘Wizard’ prototypes]
RESOURCES

• Introduction to Human Factors: Applying Psychology to Design (Stone et al., 2017)

• Introduction to Human Factors Engineering (Wickens et al., 2003; Lee et al., 2017)
