MSE UCD Unit3 WS1718
Verena Seibert-Giller
Usability inspection
Usability tests
© 2014 Seibert-Giller
Today's contents
▪ Types of Evaluations
Usability Inspection Methods
1. Heuristic Evaluation
10 Heuristics by Jakob Nielsen (who originally invented the H.E.)
20 Heuristics by Susan Weinschenk and Dean Barker
2. Cognitive walkthroughs
By Wharton
By Spencer
Heuristic evaluation
▪ Usability professionals evaluate the user interface in order to
identify potential usability problems.
▪ Each evaluator goes through the interface several times,
inspects the views and dialogue elements, and evaluates
them against a defined list of heuristics
– E.g. consistency (detailed list later)
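In practice, each evaluator records every violation of a heuristic together with where it occurs and how severe it is, and the individual lists are merged afterwards. A minimal sketch of such a findings log (the `Finding` record, the example entries, and the 0–4 severity scale are illustrative choices, not a prescribed format):

```python
from dataclasses import dataclass

# Hypothetical record type for illustration -- in practice any
# spreadsheet or issue tracker serves the same purpose.
@dataclass
class Finding:
    location: str     # view or dialogue element where the problem appears
    heuristic: str    # which heuristic it violates, e.g. "Consistency"
    description: str
    severity: int     # 0 = no problem ... 4 = usability catastrophe

findings = [
    Finding("checkout page", "Consistency",
            "'Cancel' is left of 'OK' here, right of it elsewhere", 3),
    Finding("login dialog", "Visibility of system status",
            "no feedback while credentials are being checked", 2),
]

# Merge all evaluators' findings, then sort by severity to prioritise fixes.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description} ({f.location})")
```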
Number of evaluators (by Jakob Nielsen)
Recommended number of evaluators: 3-5
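The 3-5 recommendation comes from Nielsen and Landauer's model: a single evaluator finds only a fraction λ of the problems (about 31% on average in their studies), and i independent evaluators together find 1 − (1 − λ)^i. A quick sketch, assuming λ = 0.31 (the actual rate varies by project and evaluator expertise):

```python
# Proportion of usability problems found by i evaluators, per the
# Nielsen-Landauer model: found(i) = 1 - (1 - lam)**i.
# lam = 0.31 is the average from Nielsen's studies; yours may differ.
def proportion_found(i: int, lam: float = 0.31) -> float:
    return 1 - (1 - lam) ** i

for i in range(1, 8):
    print(f"{i} evaluator(s): {proportion_found(i):.0%}")
```

With 5 evaluators the model predicts roughly 84% of problems found; each additional evaluator adds little beyond that, which is why 3-5 is the sweet spot.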
Heuristic evaluation
▪ Potential problems:
Jakob Nielsen's Heuristics
1. Visibility of system status
– The system should always keep users informed about what is
going on, through appropriate feedback within reasonable
time.
Jakob Nielsen's Heuristics
3. User control and freedom
– Users often choose system functions by mistake and will
need a clearly marked "emergency exit" to leave the
unwanted state without having to go through an extended
dialogue. Support undo and redo.
Jakob Nielsen's Heuristics
5. Error prevention
– Even better than good error messages is a careful design
which prevents a problem from occurring in the first place.
Either eliminate error-prone conditions or check for them and
present users with a confirmation option before they commit
to the action.
Jakob Nielsen's Heuristics
9. Help users recognize, diagnose, and recover from errors
– Error messages should be expressed in plain language (no
codes), precisely indicate the problem, and constructively
suggest a solution.
Heuristics by Weinschenk and Barker
▪ They analysed the various available heuristics and principles
and consolidated them into the following 20 heuristics
Heuristics by Weinschenk and Barker
▪ 1. User Control: heuristics that check whether the user has enough
control of the interface.
▪ 2. Human Limitations: the design takes into account human limitations,
cognitive and sensory, to avoid overloading the user.
▪ 3. Modal Integrity: the interface uses the most suitable modality for
each task: auditory, visual, or motor/kinesthetic.
▪ 4. Accommodation: the design is adequate to fulfill the needs and
behavior of each targeted user group.
▪ 5. Linguistic Clarity: the language used to communicate is efficient and
adequate to the audience.
▪ 6. Aesthetic Integrity: the design is visually attractive and tailored to
appeal to the target population.
Heuristics by Weinschenk and Barker
▪ 7. Simplicity: the design will not use unnecessary complexity.
▪ 8. Predictability: users will be able to form a mental model of how the
system will behave in response to actions.
▪ 9. Interpretation: there are codified rules that try to guess the user's
intentions and anticipate the actions needed.
▪ 10. Accuracy: there are no errors, i.e. the results of user actions
correspond to their goals.
▪ 11. Technical Clarity: the concepts represented in the interface have
the highest possible correspondence to the domain they are modeling.
▪ 12. Flexibility: the design can be adjusted to the needs and behaviour
of each particular user.
▪ 13. Fulfillment: the user experience is adequate.
Heuristics by Weinschenk and Barker
▪ 14. Cultural Propriety: user's cultural and social expectations are met.
▪ 15. Suitable Tempo: the pace at which users work with the system is
adequate.
▪ 16. Consistency: different parts of the system have the same style, so
that there are no different ways to represent the same information or
behavior.
▪ 17. User Support: the design supports learning and provides the
assistance required for use.
▪ 18. Precision: the steps and results of a task will be what the user
wants.
▪ 19. Forgiveness: the user will be able to recover to an adequate state
after an error.
▪ 20. Responsiveness: the interface provides enough feedback
information about the system status and the task completion.
Cognitive Walkthrough
▪ First a task analysis is required as a basis for further
activities
– Each task has to be split up into its subtasks
▪ Then designers and developers of the product walk
through the tasks – step by step / subtask by subtask – as
a group.
▪ While doing so they keep asking themselves questions,
normally the questions shown on the following slides.
– Originally 4 questions by C. Wharton
– Streamlined to 2 questions by Spencer
Cognitive Walkthrough
▪ Potential problems:
– The evaluators do not necessarily know how to perform
the task themselves, so they stumble through the
interface trying to discover the correct sequence of
actions -- and then they evaluate the stumbling
process.
– No real users are tested on the system. The
walkthrough might identify more problems than you
would find in a single user test session.
Cognitive Walkthrough - Wharton
1. Will users be trying to produce whatever effect the action
has? (i.e. do they see the need for the respective subtask?)
2. Will users see the control (button, menu, switch, etc.) for the
action?
3. Once users find the control, will they recognize that it
produces the effect they want?
4. After the action is taken, will users understand the feedback
they get, so they can go on to the next action with
confidence?
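These four questions can be applied mechanically to every subtask from the task analysis. A minimal sketch of that loop (the example subtasks and the yes/no answers are made up for illustration; in a real walkthrough the group records a justification for each "no"):

```python
# Wharton's four questions, asked for every subtask of the walkthrough.
QUESTIONS = [
    "Will users be trying to produce the effect this action has?",
    "Will users see the control for the action?",
    "Will users recognize that the control produces the wanted effect?",
    "Will users understand the feedback after the action?",
]

def walkthrough(subtasks):
    """Yield (subtask, question) pairs wherever the group answered 'no'."""
    for subtask, answers in subtasks:
        for question, ok in zip(QUESTIONS, answers):
            if not ok:
                yield subtask, question

# Hypothetical example: booking a train ticket, with the group's answers
# to the four questions for each subtask.
subtasks = [
    ("select destination", (True, True, True, True)),
    ("choose a discount card", (True, False, True, True)),  # control hard to find
]
for subtask, question in walkthrough(subtasks):
    print(f"Problem at '{subtask}': {question}")
```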
Cognitive Walkthrough - Spencer
1. Will the user know what to do at this step?
2. If they do the right thing, will they know they did it?
Now it is your turn
▪ You will try out the methods and we will compare and
discuss your results.
▪ In 2 groups
– ½ of you takes the heuristics from Jakob Nielsen
– ½ of you takes Wharton's walkthrough questions
Today's contents
▪ Types of Evaluations
– These are perfectly fine for most systems in the end user market!
– These are sometimes required for safety critical systems
Lots of cameras and screens
People observing closely
Testing mobile appliances
Discussing testing variations
▪ Steve Krug shows a very reasonable approach to testing for
non-usability-professionals in his book "Rocket Surgery
Made Easy" (see next slides)
Big Honkin' Test vs. do-it-yourself testing
Organization of a Do It Yourself Test 1/3
Define what you want to learn about your system!
E.g. problems in understanding, time on task too long,
whether information/content presentation is OK, …
Developing tasks
▪ Each task has to
– target / check a concrete design question
– be like a real-life task
– have a clear result (e.g. not just the opening of the right page)
▪ E.g. ÖBB: "the ticket costs 17 euro", not "here I find the ticket
prices"
– be worded without using the terms/words shown in the
user interface
▪ E.g. You want to travel to Paris Disneyland with your family (kids aged 10 and
13) for the weekend of 31 Oct. You want to optimize your stay
by arriving Friday evening and leaving on Sunday night at the latest. What is
the cheapest flight you can get?
Organization of a Do It Yourself Test 3/3
▪ Invite / recruit your users
▪ Carefully schedule the users; consider how much time you
will need per user
▪ Schedule a project team meeting right after the tests are
finished
▪ Be prepared to moderate this meeting and step out of it
with clear ToDo’s for the User Interface
Documenting results
▪ Document what is necessary for the project team to later
check or get back to specific questions
– Briefly describe the users' problems and inputs
– Clear ToDos for the user interface
– Why the team decided on these issues the way it did
Avoiding Bias
▪ The problems your test persons run into are exactly what you are
looking for in order to optimize the system
– If you help, you get no insights
▪ So don‘t help and don‘t give hints!
▪ Watch out for your body language!
▪ Don‘t show that you are impatient!