
EVALUATION OF HCI

• Introduction to HCI evaluation

• Role and goals of Evaluation

• Evaluation techniques



User-Centered Design Approach

[Diagram: the iterative user-centered design cycle: identify needs/establish requirements, (re)design, build an interactive version, evaluate, looping back to redesign until the final product is reached]



What is Evaluation?
• Evaluation is concerned with gathering data about the
usability of a design or product
– by
• a specified group of users (with experience, age, gender,
psychological and physical characteristics)
– for
• the types of activities that the users will do (from tightly specified
tasks to tasks decided by users)
– within
• a specified environment (from a controlled laboratory situation to a
natural work setting)



Who? What?? Where??? When????
• Formative evaluation is done at different
stages of development to check that the
product meets users’ needs

• Summative evaluation assesses the quality of a finished product

When????
• Evaluation should not be thought of as a single phase in the
design process

• Evaluation should occur throughout the design life cycle, with the results of the evaluation feeding back into modifications to the design

• The star lifecycle model proposed by Hix and Hartson (1993) emphasizes the need to carry out evaluation at each stage of the system development lifecycle

Star Life Cycle

[Diagram: the star life cycle, with evaluation at the centre connected to task/functional analysis, requirements specification, conceptual/formal design, prototyping, and implementation]

• The artifacts being evaluated may be anything from a series of sketches to a working software prototype or a fully developed product

Evaluation in the Lifecycle
• Formative evaluation
– During the early design stages evaluations
tend to be done to:
• Predict the usability of the product or an
aspect of it

• Check the design team’s understanding of users’ requirements by seeing how an already existing system is being used in the field

• Test out ideas quickly and informally


Evaluation in the Lifecycle
• Summative evaluation
– Later on in the design process the
focus shifts to:
• Identifying user difficulties so that the product can be more finely tuned to meet their needs

• Improving an upgrade of the product

Star Life Cycle
• Advantages
– This has the advantage that problems can be ironed out before considerable effort and resources have been expended on the implementation itself
– It is much easier to change a design in the early stages of development than in the later stages
Role of Evaluation
• Evaluation is a central part of user-centered design
approach

• Without doing some form of evaluation it is impossible to know whether or not the design or system fulfills the needs of the users

• Evaluation tells how well a system fits the physical, social and organizational context in which it will be used

Goals of Evaluation
• Four main reasons (among others) for doing evaluation
1. Engineering towards a target
• Is it good enough?
– Design of the system should enable users to perform their
intended tasks more easily
– To assess the extent and accessibility of the system’s
functionality

2. Understanding the world


• How do users employ the technology in the workplace?
– To assess the user’s experience of the interaction and its impact upon them
Goals of Evaluation
• Four main reasons for doing evaluation
3. Comparing designs
• Which is the best?
– To compare two or more designs or design ideas
– With reference to the number of functionalities and the usability of the design

4. Checking conformance to a standard
• Does the product conform to the standard?
– It should satisfy some legal standard
– Safety measures

In the World of Evaluation

What is evaluation?

Why do evaluation?

When do we do evaluation?

How do we do evaluation?

Evaluation Techniques
• There are several evaluation techniques
• All the techniques can be broadly classified
into two categories
– Evaluation through expert analysis
– Evaluation through user participation

Taxonomy of Evaluation Techniques

• Evaluation through expert analysis


1. Cognitive walkthrough
2. Heuristic evaluation
3. Model-based evaluation
4. Review-based evaluation

Taxonomy of Evaluation Techniques
• Evaluation through user participation
1. Styles of evaluation
• Laboratory studies
• Field studies
2. Empirical methods
• Hypothesis testing
• Statistical methods
3. Observational techniques
• Think-aloud and cooperative evaluation
• Protocol analysis
4. Query techniques
• Interviews
• Questionnaires
5. Monitoring physiological responses

Sub Topics
• Evaluation through Expert Analysis
– Cognitive walkthrough
– Heuristic evaluation
– Model-based evaluation
– Cognitive dimensions of notations



Cognitive Walkthrough
• Walkthrough is a term that refers to the detailed review of a
sequence of actions
• A walkthrough is done by a group of experts (preferably excluding those who actually designed or implemented the artifact)
Example: Code walkthrough in Software Engineering
– To review a segment of program code, done by experts other than the programmer
– Sequence of actions
• Selecting a segment of program code
• Checking certain characteristics, such as coding style, variable naming conventions, function declarations, system-wide invariants, etc.



Cognitive Walkthrough
• Like the code walkthrough in Software Engineering, HCI uses the cognitive walkthrough (CW)

• CW involves one or more evaluators inspecting a user interface by going through a set of tasks and evaluating its understandability and ease of learning

• It assesses the usability of a system in situations where the user is not an expert and may be attempting a task that he or she has never done before



Exploratory Learning
• Exploratory learning (EL) is learning through exploration
• Many users prefer to learn how to use a system by exploring its functionality hands-on, rather than after extensive training or study of the user’s manual
• Carrying out a task through EL involves four basic steps (sketched in code after this list)
1. User sets a goal to be accomplished with the system
e.g. Spelling check
2. User searches the interface for currently available actions
e.g. Presence of menu items, buttons, availability of command-line inputs
3. User selects the action that seems likely to make progress toward the
goal
4. The user performs the selected action and evaluates the system’s
feedback for evidence that progress is being made toward the current
goal
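These four steps form a select-act-evaluate loop. The following minimal Python sketch makes the loop concrete; the word-overlap matching rule, the 'effect' table, and all function names are illustrative assumptions of this sketch, not part of any HCI model.

# Toy sketch of the four-step exploratory-learning loop
def shared_words(label: str, goal: str) -> int:
    # Crude stand-in for "seems likely to make progress toward the goal"
    return len(set(label.lower().split()) & set(goal.lower().split()))

def explore(goal: str, actions: list[str], effect: dict[str, str]) -> list[str]:
    trace = []
    while True:                                 # step 1: the goal is set by the user
        best = max(actions, key=lambda a: shared_words(a, goal))  # steps 2-3: scan, select
        trace.append(best)                      # step 4: perform the action...
        if effect.get(best) == goal:            # ...and evaluate the feedback
            return trace
        actions.remove(best)                    # no progress: try another action

# Example: the "spelling check" goal from the slide
print(explore("check spelling",
              ["Save file", "Tools: Spelling and grammar", "Insert table"],
              {"Tools: Spelling and grammar": "check spelling"}))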
CW: Evaluation Procedure
• To do a CW, we need the following things
– A specification or prototype of the system
• It does not have to be complete, but it should be fairly detailed

– A general description of the type of users who are expected to use the system, and the relevant knowledge that these users would be expected to have

– A description of one or more representative tasks to be used in the evaluation

– For each of the tasks, a list of the correct actions that should be performed in order to complete the task
CW: Evaluation Procedure
• The evaluators move through each of the tasks, considering the user interface at each step

• At each step, they examine the interface, critique the system, and tell a credible story about why the expected user would (or would not) take the correct action
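A simple way to keep these inputs and per-step critiques organized is as structured records. The Python sketch below is one hypothetical bookkeeping format; the field names and yes/no questions are this sketch's own, condensed from the problem categories on the following slides, and are not a prescribed part of the method.

from dataclasses import dataclass, field

@dataclass
class StepCritique:
    action: str          # one correct action from the task's action list
    forms_goal: bool     # will users know this is what they should be doing?
    finds_action: bool   # will users find the control that performs it?
    can_perform: bool    # can users perform it and see progress in the feedback?
    notes: str = ""

@dataclass
class Walkthrough:
    prototype: str       # specification or prototype being inspected
    user_profile: str    # expected users and their assumed knowledge
    task: str            # one representative task
    steps: list[StepCritique] = field(default_factory=list)

    def problems(self) -> list[StepCritique]:
        return [s for s in self.steps
                if not (s.forms_goal and s.finds_action and s.can_perform)]

wt = Walkthrough("Mail app mock-up", "first-time smartphone users",
                 "Reply to a message, quoting the original")
wt.steps.append(StepCritique("Open the received message", True, True, True))
wt.steps.append(StepCritique("Select 'Send'", False, True, True,
                             "'Send' is a poor match for the user's 'reply' goal"))
print([s.action for s in wt.problems()])   # -> ["Select 'Send'"]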



CW: What to Evaluate?

• Three things are to be evaluated


– Problems forming correct goals

– Problems identifying the actions

– Problems performing the actions



Problems Forming Correct Goals?
– Failure to add goals
• If the new design contains new goals, is there a clear indication to the user that these should be added?
Example: The user needs to add the goal “load Hindi dictionary” before checking a document in Hindi. The interface may contain no prompt or other information indicating that this step is required

– Failure to drop goals
• If the design contains goals that must be dropped, is there a clear indication to the user that these should be dropped?
Example: While editing a file, the older version is saved as a backup automatically. If the interface gives no indication of this activity, the user won’t drop the goal of manually creating a backup



Problems Forming Correct Goals?
– Addition of spurious goals
• Does the system suggest any extra or incorrect goals?
Example: The user tries to save a modified file and the system presents a ‘Save as…’ dialogue box. The user may add the goal of entering a file name, even though the next correct action is to click ‘Ok’, which saves the file under its current name

– No-progress impasse
• Does the system’s response indicate that progress has been made toward some higher goal?
Example: No noticeable response while opening a large file from disk, or an inappropriate response, such as loading the file but placing the cursor at its end, so that a blank screen suggests that no file, or the wrong file, was opened



Problems Identifying the Actions
– Correct action doesn’t match goals
• Is there a problem matching a current goal to the correct action?
Example: A mobile phone user has the goal “send a reply to a received message, with the original message appended”. The correct action is to select “Send”, which is a poor match to the user’s “reply” goal

– Incorrect actions match goals
• Are incorrect actions available that appear to match a user’s goal?
Example: A user working with MS Word wants to “change the width of a column of a table”. To perform this goal, the user has to place the pointer in a cell of the column and stretch a boundary of the cell accordingly



Problems Performing the Actions
– Physical disabilities
• Are there any difficulties in performing an action, such as pressing multiple keys simultaneously or finding a hidden control?
Example: The subscript command in MS Word is “Ctrl + =”. On a keyboard with a single control key on the far left side, a user with small hands may have difficulty touch-typing “Ctrl + =”

– Time-outs
• If the system has time-outs, might users have difficulty deciding on an action and performing it before the time-out expires?
Example: User response times can vary greatly; consider whether the time-outs are appropriate and whether there is an obvious way to recover from a time-out if one does occur



CW: Who Are the Evaluators?
• The evaluators may include

– Human factors engineer

– Software developer

– People from marketing

– People from documentation

etc.



Heuristic Evaluation
• The method is a cost-effective, fast, relatively simple, and flexible approach
– It can be performed on a design specification, that is, it can be used for evaluation at an early design stage

– It can also be used on prototypes, storyboards and fully functioning systems

HE: Basic Concept
• Several evaluators evaluate the interface and come up with
the potential usability problems

• It is important that the evaluation be done independently

• To aid the evaluators in discovering usability problems, Nielsen proposed 10 usability heuristics
– A number of these are recognizably derived from the principles of
Direct Manipulation by Ben Shneiderman, although they apply to a
wide range of different interaction styles

– They are called heuristic because they are more in the nature of rules
of thumb than specific usability guidelines

HE: 10 Usability Heuristics
1. Visibility of system status
– The system should always keep users informed about what is going
on, through appropriate feedback within reasonable time

2. Match between system and the real world
– The system should speak the users' language, with words, phrases and
concepts familiar to the user, rather than system-oriented terms.
Follow real-world conventions, making information appear in a
natural and logical order

3. User control and freedom
– Users often choose system functions by mistake and will need a
clearly marked "emergency exit" to leave the unwanted state without
having to go through an extended dialogue. Support undo and redo

HE: 10 Usability Heuristics
4. Consistency and standards
– Users should not have to wonder whether different words, situations,
or actions mean the same thing. Follow platform conventions

5. Error prevention
– Even better than good error messages is a careful design which
prevents a problem from occurring in the first place

6. Recognition rather than recall
– Make objects, actions, and options visible. The user should not have to
remember information from one part of the dialogue to another.
Instructions for use of the system should be visible or easily
retrievable whenever appropriate

HE: 10 Usability Heuristics
7. Flexibility and efficiency of use
– Accelerators -- unseen by the novice user -- may often speed up the
interaction for the expert user such that the system can cater to both
inexperienced and experienced users. Allow users to tailor frequent
actions

8. Aesthetic and minimalist design
– Dialogues should not contain information which is irrelevant or rarely
needed. Every extra unit of information in a dialogue competes with
the relevant units of information and diminishes their relative visibility

9. Help users recognize, diagnose, and recover from errors
– Error messages should be expressed in plain language (no codes),
precisely indicate the problem, and constructively suggest a solution

HE: 10 Usability Heuristics
10. Help and documentation
– Even though it is better if the system can be used without
documentation, it may be necessary to provide help and
documentation. Any such information should be easy to search,
focused on the user's task, list concrete steps to be carried out, and not
be too large

• Jakob Nielsen originally developed the heuristics for heuristic evaluation in collaboration with Rolf Molich in 1990. Nielsen later refined the heuristics based on a factor analysis of 249 usability problems to derive a set with maximum explanatory power, resulting in this revised set of heuristics

HE: Evaluation Procedure
• Each evaluator assesses the system and notes violations of any of these usability heuristics that would indicate a potential usability problem

• The evaluator also assesses the severity of each usability problem based on four factors
1. How common is the problem?
2. How easy is it for the user to overcome?
3. Will it be a one-off problem or a persistent problem?
4. How seriously will the problem be perceived?

HE: Evaluation Procedure
• All these factors can be combined into an overall severity rating on a scale of 0-4 (Nielsen)
0 = I don’t agree that this is a usability problem at all
1 = Cosmetic problem only; need not be fixed unless extra time is available on the project
2 = Minor usability problem; fixing this should be given low priority
3 = Major usability problem; important to fix, so should be given high priority
4 = Usability catastrophe; imperative to fix this before the product can be released
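Because the evaluators rate independently, their per-problem ratings must be combined to set fix priorities. The Python sketch below uses a simple mean-then-sort rule; averaging is a common convention, but the exact aggregation rule, like the example data, is an assumption of this sketch.

from statistics import mean

# problem description -> one 0-4 severity rating per evaluator
ratings = {
    "No feedback while a large file loads (heuristic 1)": [3, 4, 3],
    "Error dialog uses internal codes (heuristic 9)":     [2, 2, 3],
    "Splash screen slightly off-centre (heuristic 8)":    [0, 1, 0],
}

# Highest mean severity first: major problems get fixed before cosmetic ones
for problem, scores in sorted(ratings.items(),
                              key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(scores):.1f}  {problem}")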
Model-Based Expert Evaluation
• A third expert-based approach is the use of models.
Certain cognitive and design models provide a means of
combining design specification and evaluation into the
same framework.
• For example, the GOMS (goals, operators, methods and
selection) model predicts user performance with a
particular interface and can be used to filter particular
design options.
• Similarly, lower-level modeling techniques such as the keystroke-level model provide predictions of the time users will take to perform low-level physical tasks.
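The keystroke-level model's prediction is simply the sum of standard operator times over the action sequence. The sketch below uses the commonly cited Card, Moran and Newell operator values (in seconds); the example operator sequence is an illustrative assumption.

# Keystroke-level model: predicted time = sum of operator times
# K = keystroke (average typist), P = point with mouse, H = home hands
# between keyboard and mouse, M = mental preparation, B = mouse button press
KLM = {"K": 0.20, "P": 1.10, "H": 0.40, "M": 1.35, "B": 0.10}

def predict(ops: str) -> float:
    """Predicted execution time for an operator string such as 'MHPBB'."""
    return sum(KLM[op] for op in ops)

# Example: mentally prepare, home hand on mouse, point at an icon, double-click
print(f"{predict('MHPBB'):.2f} s")   # 1.35 + 0.40 + 1.10 + 0.10 + 0.10 = 3.05 s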

Expert Evaluation Using Previous Results
• Experimental psychology and human–computer interaction between
them possess a wealth of experimental results and empirical
evidence. Some of this is specific to a particular domain, but much
deals with more generic issues and applies in a variety of situations.
Examples of such issues are the usability of different menu types,
the recall of command names, and the choice of icons.
• A final approach to expert evaluation exploits this inheritance, using
previous results as evidence to support (or refute) aspects of the
design. It is expensive to repeat experiments continually and an
expert review of relevant literature can avoid the need to do so. It
should be noted that experimental results cannot be expected to
hold arbitrarily across contexts. The reviewer must therefore select
evidence carefully, noting the experimental design chosen, the
population of participants used, the analyses performed and the
assumptions made.
THE END

