
UI/UX Foundations:

Research & Analysis


Meg Kurdziolek and Karen Tang
Your Goals

What would you like to learn today?


Our Goals

We want you to…


talk confidently to UX researchers
critically evaluate research presented to you

conduct basic UX research on your own


have a basis to continue learning about UX research
Activity (setup)

Which of these problems do you feel strongly about?


Pittsburgh public transportation
Food delivery in Pittsburgh
Finding family-friendly activities in Pittsburgh
Pittsburgh public schools
Agenda
09:00 - 09:20 Breakfast and Introductions
09:20 - 09:30 User-Centered Design
09:30 - 10:10 Surveys, Diary Studies, Interviews
10:10 - 10:55 Usability Studies, Field Studies
10:55 - 11:25 A/B Testing, Log Analysis
11:25 - 12:00 Adapting Your Methods
12:00 - 12:30 LUNCH
12:30 - 01:00 Interpreting Your Data
01:00 - 01:40 Special Topics: Dark UX Patterns
01:40 - 01:50 Case Studies
01:50 - 02:00 Group reflections & wrap-up Q&A
User-centered design (n.) - a framework of processes
in which the needs, wants, and limitations of end
users of a product are given extensive attention at
each stage of the design process.

https://en.wikipedia.org/wiki/User-centered_design
Design Process

Learn → Build → Refine (and repeat)
UX Research & Design

Refine Build Learn


UX Research & Design

Data Collection, Validation, Evaluation

Refine Build Learn


UX Research & Design
A/B Testing Focus Groups Log Analysis
Contextual Inquiry Hallway Testing Remote Testing
Diary Studies Heuristic Evaluation Think-Aloud
Ethnography Interviews Surveys
Field Studies Lab Testing User Observation

Data Collection, Validation, Evaluation

Refine Build Learn


Need Finding
A/B Testing Focus Groups Log Analysis
Contextual Inquiry Hallway Testing Remote Testing
Diary Studies Heuristic Evaluation Think-Aloud
Ethnography Interviews Surveys
Field Studies Lab Testing User Observation

Data Collection, Validation, Evaluation

Refine Build Learn



Surveys

http://www.baerpm.com/blog/what-a-customer-survey-can-do-for-your-business/
Surveys are good for learning:

overall impressions
who your users are (demographics)
outstanding opinions
who might participate in further research
Common Survey Example

How likely is it that you’d recommend [brand] to a friend?

1 = Not at all Likely … Neutral … 10 = Very Likely

1 2 3 4 5 6 7 8 9 10
Net Promoter Score
NPS: “research has shown that your NPS® acts as a leading indicator of
growth. If your organization’s NPS is higher than those of your
competitors, you will likely outperform the market…”

https://www.netpromoter.com/know/
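For concreteness, NPS is conventionally computed as the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). A minimal Python sketch, assuming a plain list of 0–10 ratings (the numbers below are made up):

```python
# Minimal NPS sketch. Assumes `ratings` holds answers to the
# "how likely are you to recommend...?" question on a 0-10 scale.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)    # scores 9-10
    detractors = sum(1 for r in ratings if r <= 6)   # scores 0-6
    # NPS = % promoters - % detractors, on a -100..+100 scale
    return 100 * (promoters - detractors) / len(ratings)

print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9, 5, 10]))  # 20.0
```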
Surveys: Pros and Cons

Benefits:
Cheap (in $ and time)
Easy to recruit participants
May receive high response rate
Easy to analyze

Limitations:
Limited in type & scope of data
Question interpretation issues
Response bias
Easy to misinterpret or *over*-interpret results
Silly Survey Example

Do you like to eat lunch alone?


Yes
No
Diary Studies

http://hciresearch4.hcii.cs.cmu.edu/M-HCI/2011/BOA-PlanningTools/
Diary Studies are good for…

day-to-day habits & patterns
when, how, and why they use your product
reflections on real problems encountered and how they were solved
Diary Studies: Pros and Cons

Benefits:
A longitudinal scope of data
Get a look at the mundane, every-day interactions and behaviors

Limitations:
Costly (in $ and time)
Difficult to recruit participants (& high attrition)
Relies on self-report
Example Diary Study

Radar: Intellicast vs. Weather Underground
“Hot” Radar
Interviews
Interviews are good for…

user’s background
their use of technology
their goals and motivations
their pain points
what problems need to be addressed or solved
Interviews: Pros and Cons

Benefits:
Cheap (in $)
Can target specific users or be opportunistic
Can engage with users personally
Can get the answer to lots of “why” questions

Limitations:
Takes a moderate amount of time
Results indicate what people *say* they do (rather than actual behavior)
Interview Tips
Start broad, then narrow in.
example: “Overall, how do you think Pittsburgh public
transit compares to other cities?”

Ask clarifying questions, and use their words.


example: “You said the bus system is hard to predict,
could you explain that to me?”

It’s okay to play dumb. (But be honest.)

example: “I’ve never used public transit here. Can you tell me how you would find out the schedules and figure out how to get downtown from here?”
More Interview Tips
Avoid “Yes/No” questions.

Avoid asking about feelings. Ask about behaviors instead.

Don’t number your questions. Organize by topics you


want to cover. Be prepared to skip around.

Always be prepared to go off-script.

Ask the question, then pause. Don’t rush to fill silence.


Activity - Part 1
Partner with someone who is interested in a different topic than you. You
will interview them on their chosen topic.

It’s your job to explore what the needs are and uncover the main issues,
feelings, thoughts, and pain-points.

Round 1: (5 minutes) Develop your Script


Goal: Individually, develop a rough script that you will use to interview
your partner. Start broad to gather overall impressions, then narrow in
on specific topic areas. Remember, you are trying to understand overall
impressions and the biggest pain-points.

Round 2: (20 minutes) Interview (10 minutes each)


Goal: Take turns interviewing each other. Be sure to keep notes.
How do you know when you are
done conducting interviews?

Saturation (n.) - when the same topics (or themes)


keep emerging in your interviews, and conducting
more interviews results in no new themes.

Rule of thumb - 12 interviews for saturation
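One informal way to watch for saturation is to tally how many new themes each additional interview adds; when that count flattens to zero, you are likely close. A minimal sketch, assuming hypothetical hand-coded themes per interview:

```python
# Hypothetical coded themes per interview; in practice these come from
# your own note-coding / affinity-mapping pass after each session.
interviews = [
    {"schedules unclear", "buses late", "app crashes"},
    {"buses late", "no real-time map"},
    {"schedules unclear", "no real-time map"},
    {"buses late"},  # nothing new: a hint you may be nearing saturation
]

seen = set()
for i, themes in enumerate(interviews, start=1):
    new_themes = themes - seen
    seen |= themes
    print(f"interview {i}: {len(new_themes)} new theme(s)")
```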


Example of one thing you can do with
interviews: build robust personas
Need Finding
A/B Testing Focus Groups Log Analysis
Contextual Inquiry Hallway Testing Remote Testing
Diary Studies Heuristic Evaluation Think-Aloud
Ethnography Interviews Surveys
Field Studies Lab Testing User Observation

Data Collection, Validation, Evaluation

Refine Build Learn


Validation & Evaluation
A/B Testing Focus Groups Log Analysis
Contextual Inquiry Hallway Testing Remote Testing
Diary Studies Heuristic Evaluation Think-Aloud
Ethnography Interviews Surveys
Field Studies Lab Studies User Observation

Data Collection, Validation, Evaluation

Build Learn
Types of Usability Testing
Formative Summative
A/B Testing ✔
Field Studies ✔
Hallway Testing ✔ ✔
Heuristic Evaluation ✔ ✔
Interviews ✔ ✔
Log Analysis ✔
Remote Testing ✔
Surveys ✔ ✔
Think-Aloud ✔ ✔
Wizard of Oz ✔
Lab Usability Studies
Formative Summative
A/B Testing ✔
Field Studies ✔
Hallway Testing ✔ ✔
Heuristic Evaluation ✔ ✔
Hypothesis Testing ✔
Interviews ✔ ✔
Log Analysis ✔
Remote Testing ✔
Surveys ✔ ✔
Think-Aloud ✔ ✔
Wizard of Oz ✔
Lab Usability Testing

http://trydevkit.com/blog-post/a-beginner-s-guide-to-usability-testing/81da0af5-fb17-fd8e-016b-536948e32ced
http://usabilitygeek.com/an-introduction-to-website-usability-testing/
Usability Studies are good for…

learning how easy or difficult it is for users to learn and use your interface
if language and iconography are intuitive
how users encounter and recover from errors
Lab Usability Studies

Benefits:
Cheap (in $)
Observe user behavior as they encounter a design for the first time
See the consequences of design decisions first-hand

Limitations:
Usually takes a moderate amount of time and set-up
Can sometimes feel staged, or unauthentic
Running a Usability Study
Planning: create test plan, recruit participants

Pilot: practice with internal users, resolve any technical or


logistical issues

Test session: run test plan, be present (formative) or simply


observe (summative)

Debrief: short Q&A with participants, discuss observations


with other study observers

Analysis
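For the analysis step, a common summative summary is task success rate and time on task. A minimal sketch, assuming a hypothetical session log of (participant, task, completed, seconds):

```python
from statistics import mean, median

# Hypothetical session log: (participant, task, completed?, seconds).
sessions = [
    ("P1", "find schedule", True, 48),
    ("P2", "find schedule", True, 95),
    ("P3", "find schedule", False, 180),
    ("P1", "buy ticket", True, 62),
    ("P2", "buy ticket", True, 70),
    ("P3", "buy ticket", True, 55),
]

by_task = {}
for _, task, completed, seconds in sessions:
    by_task.setdefault(task, []).append((completed, seconds))

for task, results in by_task.items():
    success_rate = mean(1 if ok else 0 for ok, _ in results)
    times = [s for ok, s in results if ok]  # time on task, successful runs only
    print(f"{task}: {success_rate:.0%} success, median time {median(times)}s")
```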
Example Usability Study
Contextual Usability Studies
Formative Summative
A/B Testing ✔
Field Studies ✔
Hallway Testing ✔ ✔
Heuristic Evaluation ✔ ✔
Hypothesis Testing ✔
Interviews ✔ ✔
Log Analysis ✔
Remote Testing ✔
Surveys ✔ ✔
Think-Aloud ✔ ✔
Wizard of Oz ✔
Field Studies

http://www.oracle.com/webfolder/ux/applications/getInvolved/customerFeedback.html
Field Studies are good for…

learning how customers actually use your product in day-to-day life
Field Studies

Benefits:
Allows you to observe authentic, contextual user behavior
Can observe the day-to-day experience users have with your product, across a longer period of time

Limitations:
Significant cost ($)
Takes more time to run
Field Study Example
Field Study Example
Classroom setups: computer lab; one laptop + projector; laptop carts
Field Study Example
[00:24:19.08] Boy says to the girl on his right: “you cheating”

[00:24:21.19] Girl to the left: “what? Its fun. ::mumble:: the simulation. Look.”

[00:24:25.21] The boy looks to the girl on his left, then back to the girl on the right, then down to his workbook in front of him. He puts his head on the table.
Contextual Usability Studies
Formative Summative
A/B Testing ✔
Field Studies ✔
Hallway Testing ✔ ✔
Heuristic Evaluation ✔ ✔
Hypothesis Testing ✔
Interviews ✔ ✔
Log Analysis ✔
Remote Testing ✔
Surveys ✔ ✔
Think-Aloud ✔ ✔
Wizard of Oz ✔
A/B Testing

http://www.smashingmagazine.com/2010/06/the-ultimate-guide-to-a-b-testing/
A/B Testing is good for…

asking “how much”, “how many”, “which one is better”
sampling from actual users
testing live apps/services
A/B Testing

Benefits:
Low maintenance: release and wait for data
Can measure very specific questions
Live testing, measures actual vs. self-reported user behavior

Limitations:
Missing context of why users take an action
May have no results, not guaranteed to be conclusive
Only measures certain user interactions
Running an A/B Test
Make sure to:
test conditions simultaneously (fewer confounding factors)
be consistent, keep track of which users see which version
deploy tests cautiously; user research can help inform

Things to watch out for:


don’t jump to conclusions, wait for statistical significance
waiting too long could cost you potential conversions
you might be interfering with user habits
https://medium.com/@adlon/threats-of-a-b-tests-and-ux-research-adoption-time-and-incrementalism-991c0c3c61b6
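“Wait for statistical significance” typically means comparing the two conversion rates with something like a two-proportion z-test. A minimal sketch with hypothetical conversion counts (a simplified illustration, not a prescribed procedure):

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical results: conversions / visitors for each variant.
conv_a, n_a = 120, 2400   # variant A
conv_b, n_b = 150, 2380   # variant B

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value

print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p_value:.3f}")
# Only treat the result as significant once p falls below the threshold you
# chose up front, and only after reaching your planned sample size.
```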
A/B Testing Tools

Optimizely

Visual Website Optimizer

Unbounce*
Log Analysis

Log Analysis is good for…

seeing page views, entry/exit, platforms, engagement
sampling from actual users
testing live apps/services
Log Analysis

Benefits:
Low maintenance: release and wait for data
Flexibility, can measure a wide range of data
Live testing, measures actual vs. self-reported user behavior

Limitations:
Missing context of why users take an action
Often requires initial development overhead
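A minimal log-analysis sketch, assuming a toy access log where each line carries a timestamp, a platform, and a path; real logs (or analytics events) are messier, but the counting step is the same idea:

```python
from collections import Counter

# Hypothetical access-log lines: "<timestamp> <platform> <path>".
log_lines = [
    "2024-05-01T09:12:03 Mobile /home",
    "2024-05-01T09:12:41 Mobile /schedule",
    "2024-05-01T09:13:10 Desktop /home",
    "2024-05-01T09:14:02 Desktop /tickets",
    "2024-05-01T09:15:30 Mobile /home",
]

views_by_platform = Counter()
views_by_page = Counter()
for line in log_lines:
    _, platform, path = line.split()
    views_by_platform[platform] += 1
    views_by_page[path] += 1

print("views by platform:", dict(views_by_platform))
print("views by page:", dict(views_by_page))
```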
Example: Google Analytics
Question: how many mobile users does my app have?

http://www.smallbox.com/blog/quick-and-dirty-ux-3-things-google-analytics-can-tell-you-about-your-users
Example: Google Analytics
Question: what paths do users take on my site/app?

http://www.smallbox.com/blog/quick-and-dirty-ux-3-things-google-analytics-can-tell-you-about-your-users
Example: Google Analytics
Question: how long are users spending on my site?

http://www.smallbox.com/blog/quick-and-dirty-ux-3-things-google-analytics-can-tell-you-about-your-users
Example: Google Analytics
Custom logging: track any event you want (links, performance, etc.)

http://www.sitepoint.com/5-ways-use-google-analytics-ux-research/
Usability Studies
Formative Summative
A/B Testing ✔
Field Studies ✔
Hallway Testing ✔ ✔
Heuristic Evaluation ✔ ✔
Hypothesis Testing ✔
Interviews ✔ ✔
Log Analysis ✔
Remote Testing ✔
Surveys ✔ ✔
Think-Aloud ✔ ✔
Wizard of Oz ✔
How to Choose a Usability Study?

Triangulation (n.) - using two or more methods to


discover and validate a finding.
Blind Men and the Elephant Parable
Consider Tradeoffs and Select
Methods that Meet Your Needs
Usability Studies
Formative Summative
A/B Testing ✔
Field Studies ✔
Hallway Testing ✔ ✔
Heuristic Evaluation ✔ ✔
Hypothesis Testing ✔
Interviews ✔ ✔
Log Analysis ✔
Remote Testing ✔
Surveys ✔ ✔
Think-Aloud ✔ ✔
Wizard of Oz ✔
Usability Studies
what are you testing? what kind of results do you want?

Formative Summative Quantitative Qualitative


A/B Testing ✔ ✔
Field Studies ✔ ✔
Hallway Testing ✔ ✔ ✔
Heuristic Evaluation ✔ ✔ ✔
Hypothesis Testing ✔ ✔
Interviews ✔ ✔ ✔
Log Analysis ✔ ✔
Remote Testing ✔ ✔ ✔
Surveys ✔ ✔ ✔
Think-Aloud ✔ ✔ ✔
Wizard of Oz ✔ ✔
Potential Pitfalls of Quantitative Research

Easy to make mistakes:


phantom correlations
findings may not generalize (participant selection)
requires sound experimental design

But it’s a great supplement to qualitative research


Butterfly Ballot
Activity - Part 2
Reflect on the one interview you conducted.

How has your knowledge grown? What do you still


need to learn about?

What do you need to get there? (What do you need
to do to complete a full persona?)

How would you build a research plan for your topic?


Usability Studies
what are you testing? what kind of results do you want?

Formative Summative Quantitative Qualitative


A/B Testing ✔ ✔
Field Studies ✔ ✔
Hallway Testing ✔ ✔ ✔
Heuristic Evaluation ✔ ✔ ✔
Hypothesis Testing ✔ ✔
Interviews ✔ ✔ ✔
Log Analysis ✔ ✔
Remote Testing ✔ ✔ ✔
Surveys ✔ ✔ ✔
Think-Aloud ✔ ✔ ✔
Wizard of Oz ✔ ✔
Validation & Evaluation
A/B Testing Focus Groups Log Analysis
Contextual Inquiry Hallway Testing Remote Testing
Diary Studies Heuristic Evaluation Think-Aloud
Ethnography Interviews Surveys
Field Studies Lab Studies User Observation

Data Collection, Validation, Evaluation

Learn
Understanding Your Data

http://anotheruxguy.com/2015/07/06/analysis-is-cool/
So Why Doesn’t My UI Work?

Seven Stages of Action

Mental Models

Psychological Biases

Dark UX Patterns
Seven Stages of Action

Establish goal

EXECUTION (Gulf of Execution: “How do I do it?”) — the USER acts on the SYSTEM:
Form intention
Specify action sequence
Execute action

EVALUATION (Gulf of Evaluation: “What does it mean?”) — the USER interprets the SYSTEM’s response:
Perceive system state
Interpret system state
Evaluate system state (against the goal)
The Gulfs
Gulf of Execution
Does your app have good mappings? Can users easily
figure out how to execute on their desired goal?

Gulf of Evaluation
Does your app provide good feedback and visual
cues? Can users easily interpret the data the app is
conveying to them?
Mental Models

The designer’s mental model vs. the users’ mental models
Experimental Biases

Selection Bias

Confirmation Bias

Diagnosis Bias

Regression Towards the Mean


confirmation bias (n) - the tendency to search for or
interpret information in a way that confirms one’s
preconceptions

possible pitfalls:
you focus your questioning on behaviors that you
expected to see, that confirm or validate your design

you discount negative comments about your design


diagnosis bias (n) - the tendency to label things
based on initial impressions, and the difficulty or
inability to change minds after the initial impression

possible pitfalls:
discounting a participant’s responses based on their
initial responses to selected questions
regression towards the mean (n) - when a sample is
selected because it is extreme, later measurements of that
sample tend to fall closer to the average

possible pitfalls:
thinking your intervention caused an improvement,
when it was simply due to sampling
Cognitive Biases
Anchoring

Framing

Change Blindness

Illusion of Control

Loss Aversion
Anchoring
Anchor (n) - something that serves as a reference point
Framing
Frame (n) - the way we present a decision may
highlight different attributes

A pound of meat that is 90% lean


or
A pound of meat that is 10% fat
Framing
Frame (n) - the way we present a decision may
highlight different attributes

This treatment has a 90% chance of saving your life


or
This treatment has a 10% chance of failure, resulting in death
Change Blindness
change blindness (n) - the tendency to overlook
alterations, especially when they appear immediately
after a visual interruption
Illusion of Control
Loss Aversion
loss aversion (n) - the tendency for losses to be felt more
acutely than equivalent gains
Dark UX Patterns
Dark UX Patterns

Privacy: Should it be opt-in or opt-out?


UX is a holistic approach,
driven by process & iterations
Case-Study: Anemia in Cambodia

Iron deficiency is a global problem

In the US: affects 3.5 million Americans each year

In Cambodia: affects 68% of children, 50% of adults


Solution
Uh, gross.
Solution 2
Image borrowed from: http://www.bustle.com/articles/84173-the-lucky-iron-fish-helps-fix-iron-deficiencies-just-by-boiling-it-with-food-and-it
UX Research & Design
A/B Testing Focus Groups Log Analysis
Contextual Inquiry Hallway Testing Remote Testing
Diary Studies Heuristic Evaluation Think-Aloud
Ethnography Interviews Surveys
Field Studies Lab Studies User Observation

Data Collection, Validation, Evaluation

Refine Build Learn


Feedback & QA

Questions? Comments?

Are there topics you wished we spent more time on?

How do you see some of these topics applying to


your current work?
Thank you!

Meg Kurdziolek
meg.kurdziolek@gmail.com
www.megkurdziolek.com

Karen Tang
karen@kptang.com
www.kptang.com
Resources
Surveys:

http://uxmastery.com/better-user-research-through-surveys/

Interviews:

http://theuxreview.co.uk/user-interviews-the-beginners-guide/

http://www.nngroup.com/articles/interviewing-users/

https://whitneyhess.com/blog/2010/07/07/my-best-advice-for-conducting-user-interviews/

Usability Studies:

http://www.usability.gov/how-to-and-tools/methods/usability-testing.html
Resources

Lucky iron fish TED Talk

https://www.youtube.com/watch?v=0Lf6glgKt3Q
Great UX Research Books

Just Enough Research by Erika Hall

Usability Testing Essentials by Carol M. Barnum

Observing the User Experience by Elizabeth Goodman
