
Human-Computer Interaction

Evaluation

Evaluation Through Expert Analysis

Shermeen Adnan
Software Development Process

[Diagram: the software development process flows Initiate → Design → Code → Test → Ship. Managers provide the mandate to designers (primary responsibility: ensure financial success), designers provide the spec to programmers (ensure customer satisfaction), programmers provide code to QA (ensure performance), and QA certifies the product for release (ensure reliability). Users provide input, usability feedback, and bug reports.]


Goal-Directed Design Model
Research – the user and the domain
Modeling – users and use context
Requirements – definition of user, business & technical needs
Framework – definition of design structure & flow
Refinement – of behavior, form & content

• A process with several phases, combined with market research, that
produces a solution meeting the requirements and goals of end users
• 6 phases
– Research (conduct interviews and observation)
– Modeling
– Requirements
– Framework
– Refinement
– Support
Goal-Directed Design Model
• Research
– Consists of field surveys conducted through interviews and
observation
– The interview report contains information about the people
involved in the research, competitive products, market research
reviews, technology, and brand strategy
– Helps to identify behavior patterns of various users
• Modeling
– The output of the research phase is converted into user models
– Includes information flow and workflow
Goal-Directed Design Model
• Requirement definition
– Uses a scenario-based design method with an
important innovation: focusing the scenarios
not on user tasks but first and foremost on meeting
the goals and needs of specific user personas
– Personas provide an understanding of which tasks
are truly important and why
– Provides connectivity between the user models and
the product framework
Goal-Directed Design Model
• Framework definition
– Provides the actual product design and a framework for
the system's behavior
– Designers create an overall product concept, defining the basic
framework for the product's behavior (what it will do), visual
design (how it will look) and, if applicable, physical form
• Refinement phase
– Similar to the framework definition phase
– Focuses on design details and implementation
• Support
– Fulfills future requirements of the user
Evaluation
• The process of systematically collecting data that informs us
about what it is like for a particular user or group of users to use
a product for a particular task in a certain type of environment
• Heart of HCI but rarely practiced
• Benefits
1. Problems are fixed before product is shipped, not after.
2. The team can concentrate on real problems, not imaginary ones.
3. Engineers code instead of debating.
4. Time to market is sharply reduced.
5. A good product design gives sales and marketing something real to demonstrate.
What kind of data are we collecting?
• Qualitative data
– Derived from qualitative research techniques
• Quantitative data
– Derived from quantitative research techniques
Qualitative research
• Provides detailed information, including knowledge about
the behavior, attitudes and aptitudes of users
• Helps to identify whether the same or a similar product
already exists
• Helps to develop ideas
• Its main aim is to provide depth of understanding
• The most popular research methods include
– Interviews
– Case studies
– Observation
– Group discussion
– Open-ended questionnaires
• These aren’t closed-ended questionnaires (yes / no)
• They require more thought, or more than a one-word answer
Quantitative Research
• Statistics of human activities that answer questions like
“how much” or “how many”
• Information in numbers
• This research helps determine how many copies of a
product should be delivered to a certain group of users
• It also helps decide how many end users need to be
asked the detailed questions used in qualitative research
• The most popular research methods include
– Closed-ended questionnaires
– Correlation
– Regression analysis
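As a hypothetical sketch of the quantitative methods above, the snippet below computes a Pearson correlation between two closed-ended (Likert 1–5) questionnaire items; the survey data and item names are invented for illustration.

```python
# Pearson correlation between two closed-ended survey items.
# Responses are on a 1-5 Likert scale; data is invented.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Six participants answered two questionnaire items
ease_of_use  = [4, 5, 3, 2, 4, 5]
satisfaction = [4, 4, 3, 2, 5, 5]

r = pearson(ease_of_use, satisfaction)
print(f"correlation: {r:.2f}")
```

A correlation near 1 would suggest the two items move together; regression analysis extends this by fitting a predictive line through the same data.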
Evaluation techniques
• Evaluation techniques fall into two
broad categories
– Expert analysis
• Cognitive Walkthrough
• Heuristic Evaluation
• Review-based evaluation
– User participation
• Empirical methods: experimental evaluation
• Observational techniques
• Query techniques
• Evaluation through monitoring physiological responses
Evaluation Through Expert Analysis
Cognitive Walkthrough
• Involves simulating a user’s problem-solving
process at each step in the human-computer
dialog, checking to see if the user’s goals and
memory for actions can be assumed to lead to
the next correct action
5 Steps of Cognitive walkthroughs
• The characteristics of typical users are identified
and documented, and sample tasks are
developed that focus on the aspects of the
design to be evaluated
• The designer and one or more expert evaluators
then come together to do the analysis
• The evaluators walk through the action sequences
for each task, placing them within the context of a
typical scenario
Questions: 5 Steps of Cognitive walkthroughs
• The steps of the cognitive walkthrough are followed to
answer the following questions
– Will the correct action be sufficiently evident to
the user?
– Will the user notice that the correct action is
available?
– Will the user associate and interpret the response
from the action correctly?
5 Steps of Cognitive walkthroughs
• As the walkthrough is being done, a record of
critical information is compiled
– Assumptions about what would cause problems,
and why, are recorded
– Notes about side issues and design changes are
made
– A summary of the results is compiled
• The design is revised to fix the problems
identified
Inputs of Cognitive Walkthrough
• To do a walkthrough, four things are needed:
– Prototype
– Task
– Sequence of actions to do the task in the
prototype
– User analysis
Example
• Task: Set the alarm clock to 8:30
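The four walkthrough inputs, applied to this task, can be sketched as a simple record. This is a hypothetical sketch: the prototype description, action sequence, and user analysis are invented for illustration.

```python
# Hypothetical walkthrough inputs for the alarm-clock task.
# Field names and contents are invented for illustration.
walkthrough = {
    "prototype": "Digital alarm clock, paper prototype v1",
    "task": "Set the alarm clock to 8:30",
    "action_sequence": [
        "Press and hold the ALARM button",
        "Press HOUR repeatedly until the display shows 8",
        "Press MINUTE repeatedly until the display shows 30",
        "Release the ALARM button to confirm",
    ],
    "user_analysis": "First-time users with no manual available",
}

# For each action, the evaluators answer the walkthrough questions
questions = [
    "Will the correct action be sufficiently evident to the user?",
    "Will the user notice that the correct action is available?",
    "Will the user associate and interpret the response correctly?",
]
for step in walkthrough["action_sequence"]:
    for q in questions:
        pass  # evaluators record an answer plus any assumptions here
```

Walking every action through every question is what makes the method systematic rather than an ad-hoc critique.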
Pluralistic Walkthrough
• Can be conducted by following this sequence of steps
– Scenarios are developed in the form of a series of hard-copy screens
representing a single path through the interface
– Scenarios are presented to a panel of evaluators, and the panel is
asked to write down the sequence of actions they would take to move
from one screen to another
– When the evaluators have written down their actions, the panelists
discuss the actions they have suggested for that round of the review
– Usually the representative users go first so that they are not
influenced by the other panel members and are not deterred from
speaking
– Usability experts present their findings, and finally the designers offer
their comments
– The panel moves on to the next round of screens. This process
continues until all the scenarios have been evaluated
Heuristic Evaluation
• Proposed by Nielsen and Molich.
• What is it?
A discount usability engineering method
- Easy (can be taught in ½ day seminar)
- Fast (about a day for most evaluations)
- Cheap
• How does it work?
– Evaluators use a checklist of basic usability heuristics
– Evaluators go through an interface twice
• 1st pass get a feel for the flow and general scope
• 2nd pass refer to checklist of usability heuristics and focus on individual elements
– The findings of evaluators are combined and assessed
Heuristic Evaluation
• Example heuristics
– system behaviour is predictable
– system behaviour is consistent
– feedback is provided
• Heuristic evaluation `debugs' design.
• One expert won’t do
• Need 3 - 5 evaluators
• Exact number needed depends on cost-benefit
analysis
Heuristic Evaluation
• Debriefing session
– Conducted in brain-storming mode
– Evaluators rate the severity of all problems
identified
– Use a 0 – 4, absolute scale
• 0 I don’t agree that this is a problem at all
• 1 Cosmetic problem only
• 2 Minor problem – low priority
• 3 Major problem – high priority
• 4 Usability catastrophe – imperative to fix
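The debriefing step above can be sketched in code: each problem gets a 0–4 rating from every evaluator, and the averages set the fix priority. The problem names and ratings below are invented for illustration.

```python
# Combining 0-4 severity ratings from three evaluators during
# the debriefing session. Problems and ratings are invented.
from statistics import mean

ratings = {
    "No 'Continue shopping' button": [3, 4, 3],
    "Red used for both help and error messages": [2, 1, 2],
    "'Check Out' button looks unlike other buttons": [1, 2, 1],
}

# Average each problem's ratings and sort highest severity first
prioritized = sorted(
    ((mean(r), problem) for problem, r in ratings.items()),
    reverse=True,
)
for severity, problem in prioritized:
    print(f"{severity:.1f}  {problem}")
```

Averaging across 3–5 evaluators smooths out individual bias, which is exactly why a single expert won’t do.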
Heuristic Evaluation
• What are the shortcomings of H.E.?
– Identifies usability problems without indicating
how they are to be fixed.
• “Ideas for appropriate redesigns have to appear
magically in the heads of designers on the basis of their
sheer creative powers.”
Nielsen: Revised 10 Usability Heuristics
(based on extensive empirical testing)
• *Visibility of system status (i.e. feedback)
• Match between system and the real world (speak the user’s language)
• *User control and freedom (undo, redo, clear exits)
• *Consistency and standards
• *Error prevention
• Recognition rather than recall (minimize memory load)
• *Flexibility and efficiency of use (includes shortcuts, macros)
• Aesthetic and minimalist design
• *Help users diagnose and recover from errors
• Help and documentation
Example
A few problems with the Interface
• Red is used both for help messages and for error
messages (consistency, match real world)
• “There is a problem with your order”, but no explanation
or suggestions for resolution (error reporting)
• No “Continue shopping" button (user control & freedom)
• Recalculate is very close to Clear Cart (error prevention)
• “Check Out” button doesn’t look like other buttons
(consistency, both internal & external)
• Must recall and type in cart title to load (recognition not
recall, error prevention, efficiency)
Writing Good Heuristic Evaluations
• Heuristic evaluations must communicate well to the developers
and managers
• Include positive comments as well as criticisms
– “Good: Toolbar icons are simple with good contrast and few colors
(minimalist design)”
• Be tactful
– NOT: “The menu organization is a complete mess”
– Better: “menus are not organized by function”
• Be specific
– Not: “Text is unreadable”
– Better: “Text is too small and has poor contrast (black text on
green background)”
Suggested Report Format
• What to include:
– Problem
– Heuristic
– Description
– Severity
– Recommendation (if any)
– Screenshot (if helpful)
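The suggested report format maps naturally onto a small record type. This is a hypothetical sketch: the class name, fields, and the sample finding are invented, though the fields mirror the list above and the finding echoes the shopping-cart example.

```python
# One heuristic-evaluation report entry following the suggested
# format. Class and field names are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Finding:
    problem: str
    heuristic: str
    description: str
    severity: int                       # 0-4 debriefing scale
    recommendation: Optional[str] = None
    screenshot: Optional[str] = None    # path to image, if helpful

finding = Finding(
    problem="Recalculate is very close to Clear Cart",
    heuristic="Error prevention",
    description="A slip of the mouse can destroy the user's cart.",
    severity=3,
    recommendation="Separate the buttons and add undo for Clear Cart.",
)
```

Keeping recommendation and screenshot optional matches the “(if any)” / “(if helpful)” qualifiers in the format.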
Review-based evaluation
• Results from the literature are used to support or
refute parts of a design.
• Care is needed to ensure the results are transferable
to the new design.

Model-based evaluation
• Cognitive models are used to filter design options,
e.g. GOMS prediction of user performance.
• Design rationale can also provide useful
evaluation information
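A GOMS Keystroke-Level Model (KLM) prediction can be sketched as a simple sum of operator times. The operator values below are the commonly cited KLM estimates in seconds; the task breakdown itself is invented for illustration.

```python
# Keystroke-Level Model (KLM) time prediction: sum the standard
# operator estimates for a task's action sequence.
KLM = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

# Invented task: move to the mouse, think, click a text field,
# return to the keyboard, then type a 5-character search term
operators = ["H", "M", "P", "K", "H"] + ["K"] * 5

predicted_time = sum(KLM[op] for op in operators)
print(f"Predicted task time: {predicted_time:.2f} s")
```

Comparing such predictions across alternative action sequences is how a cognitive model can filter design options before any user testing.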
Example
• An experiment testing the usability of a
particular style of help system using novice
participants may not provide an accurate
evaluation of a help system designed for
expert users.
• The review should therefore take account of
both the similarities and the differences
between the experimental context and the
design under consideration.
