CampusAccess

Design Evaluation Methods (I543)


Professor Youn-Kyung Lim
Usability Report 9/2007

Project Team:
Kshitiz Anand
Jason de Runa
Qian Huang
Adam Williams
TABLE OF CONTENTS

Executive Summary

Introduction

Methodology

Evaluation Method

Problems and Recommendations

Appendices

EXECUTIVE SUMMARY
This report documents the findings of a usability evaluation of the CampusAccess website (Figure 1) carried out on September 15, 2007. The goal was to improve the usability of the system for all registered students at Indiana University.

Figure 1: Indiana University's CampusAccess Website, http://www.cacard.indiana.edu.

The top three successes of the system:

1. Standardized layout, with two obvious places of interaction. All test participants commented positively on the standardized layout. The site is a simple two-column layout with a left menu box.

2. The site's main navigation was easy to understand. All test participants commented positively on the main navigation. The users thought the main navigation system gave a clue as to what page of the site they are currently on and where they can go next.

3. Various learning media for inexperienced CampusAccess card users. Users not only had the website as a resource for information, but they specifically liked the multimedia learning video and the CampusAccess Information Brochure.

The top four problems with the website:

1. No use of clear and simple language that promotes effective communication. Users became confused when depositing money to their accounts online; the terms "Express Deposit" and "On-line" were ambiguous.

2. Visited links were not clearly distinguished. Users became confused and moved in circles because the website used the same link color for visited and new destinations.

3. The site had no search feature and no site map. As users drilled down the navigation hierarchy, it wasn't clear where they were in the site's information architecture. Users often looked for a search function or a site map to find their information.

4. The process to report a lost card online is convoluted. If a user clicks on the "Lost Card" link in the main navigation, it gives them a description of what to do by phone. Users eventually clicked the 'Manage your account' link, but from there it becomes unclear where to go next.

Recommendations
Based on the usability problems identified in this evaluation, our team recommends that the University Information Technology Services group should:

1. Clarify terminology used on the website. On the Manage Your Account page, 'On-line' is very vague; all aspects and functions of the site are obviously done online.

2. Add an easy method of getting back to the top of pages. Users were frustrated scrolling up on lengthy pages just to get back to the main navigation. Either add "Return to top" links in each section or add text-based navigation in the footer.

3. Add a search feature. Users are accustomed to finding their information through searches. Though the site had no search feature, all users utilized their browser's built-in search feature to find keywords.

INTRODUCTION

System Description
The CampusAccess website (www.cacard.indiana.edu) is a resource for all Indiana University students, faculty, and staff who own a CampusAccess card. Not only is the card used for a variety of purposes on campus, but it also provides other benefits outside the campus. For instance, it serves as the official university photo ID, Residential and Program Services meal card, library card, universal bus pass, access key, and discount card at some merchants.

The CampusAccess website is not only an information resource, but also a financial management system where you can view your transactions and account balances, make deposits, and obtain your monthly statements.

During this evaluation, we tested to see if our participants were able to accomplish the tasks as expected, then studied whether their goals were achieved. In summary, our team evaluated how well the website served its intended purposes based on the overall design.

Test Objectives
The test objectives in this testing are:

♦ Information architecture of the site: is the information organization of the website reasonable and easy to browse
♦ How quickly and efficiently the user can find their information
♦ Whether the website information assisted the user in making better use of their CampusAccess card

After the qualitative and quantitative evaluation of the test data, our team will identify the usability problems and provide recommendations.

Summary of User Profiles
There were several parameters for our target group. Our target user group is undergraduate students who were not familiar with, or had never used, the CampusAccess website. Our ideal user group is not technically savvy, since our usability goal is to make all users, from novices to experts, comfortable with the website. Our team thought more usability problems would surface by selecting the user group that would most likely encounter difficulties with the website.

METHODOLOGY

Testing Logistics
Our usability testing of the CampusAccess website was carried out on September 15, 2007 in Informatics Conference Room 001.

The participants for our usability test were recruited from a wide range of subjects: Indiana University students, faculty, and staff members. Each recruited person filled out pre-test questionnaire part I.

Based on the questionnaire results, our team filtered our recruits to closely resemble our target user group. In total we selected four participants.

The test was 30 minutes total for each tester: 5 minutes for pre-test questionnaire part II, 15 minutes for performance of tasks, and 10 minutes for the post-questionnaire and debriefing questions.

Equipment
The usability test was conducted on an IBM ThinkPad laptop, and the data was captured using the following equipment:

♦ A Logitech QuickCam Orbit MP was used to capture user facial expressions and behavior while performing the tasks.
♦ Camtasia software was used to capture the users' mouse clicks, mouse navigation, and "thinking aloud".
♦ A stopwatch measuring minutes, seconds, and milliseconds was used to time the overall usability test.

Test preparation: Prior to each session, we placed the following items by the participant's computer:

♦ Task-based scenario list
♦ Folder containing pre-test questionnaires
♦ Scratch paper and pen
♦ Debriefing and post-test interview questionnaires
♦ Bottled water, offered to the participant at the beginning of the session

Think Aloud Protocol
The think-aloud protocol was used for testing the CampusAccess website. The user's interaction with the website is videotaped and analyzed to improve the product.

Users are instructed to think aloud as they perform a set of specified tasks. We used the think-aloud method because it provides the facilitator valuable insight into the cognition and emotions of a user while he or she performs a task or solves a problem.

The user is instructed to articulate what he or she thinks and feels while working with the website. We probe subjects as infrequently as possible because subjects are easily distracted during problem-solving activities.

Data Collection
Both the session facilitator and two observers collected data during the evaluation session.

The test room contained five people: the participant and four usability team members conducting the test. The team consisted of one facilitator, two observers, and one time recorder. The observers split the tasks between themselves. One observer was responsible for logging the data collected by the time recorder. The other observer was responsible for documenting the user's actions and noting the user's comfort level, facial expressions, unusual body movements, and the speech uttered by the participant during the think-aloud process.

The facilitator informs the participants that the entire set of tasks is supposed to take a maximum of fifteen minutes. This procedure was carried out for all the tasks.

Task-Based Scenarios
Our goals were to ensure the site navigation was clear and the information was easily found.

We arranged our tasks around four primary groups of user-centered scenarios: tasks relating to lost cards, off-campus use, auxiliary functions, and basic functions. Auxiliary functions related to bus-pass usage, admittance to the SRSC, etc. Basic functions included how to put money on the card, how to report the card lost, etc. While some of the tasks included in the "lost card" scenario could be universal, we felt that grouping the tasks this way would test the real-world functionality of the site to a greater degree. This allowed us to test the site in both a general and a specific sense.

Scenario 1
You cannot find your card and are afraid it might have been stolen. Review your purchases, report your card lost, and find out how to get a new one.

♦ Task 1: Report your lost card online.
♦ Task 2: Review transactions made with your card.
♦ Task 3: Find out where to get a new card.

Scenario 2
You want to go off campus for dinner, but do not have cash. Find out what restaurants accept CampusAccess, which ones offer discounts, and whether you can use RPS meal points.

♦ Task 4: Find out whether you can use the campus card to receive discounts anywhere.
♦ Task 5: If you were a local merchant wanting to offer CampusAccess as a form of payment, where would you find this information?
♦ Task 6: Can you use meal points to pay for food at local restaurants?

Scenario 3
You live far from the library and need to check out a book. Will you need your CampusAccess card for this? What if you want to go to the SRSC afterwards?

♦ Task 7: Can you check out library books or get into the SRSC?
♦ Task 8: What buses can you board with your CampusAccess card?

Scenario 4
You want to put money on your card, but it has expired. Find out what to do and how to put money on it.

♦ Task 9: Add money to your card.
♦ Task 10: Find out what to do if your card expires.

EVALUATION METHOD

Usability Goals
The usability goals are to make users more efficient and less frustrated with the site navigation and with finding information.

Quantitative Results
The quantitative data was analyzed according to the following method. Based on the individual times associated with the tasks and the number of assists required to complete a task, we defined some parameters.

We define the Task Completion Rate (TCR) by this formula:

Task Completion Rate (TCR) = [100 - 100(n/3)] %

where n is the number of assists and can take the values 0, 1, 2, or 3.

Thus, for a task that required no assistance, the value of n is 0 and the task completion rate is 100%. For a task that required one assist, we say that a 33.3% hint was given to the user, and the task completion rate is 66.7%. For two assists, the hint given is 66.7%, and the task completion rate is 33.3%. Finally, for three assists, the task completion rate is 0%; the third assist meant that the user could not perform the task.

We also noted the time taken for each of the tasks, and we calculated the mean time and the mean Task Completion Rate. These were defined as follows.

The Task Efficiency Value (EV) was defined as the ratio of the Task Completion Rate (TCR) to the time (t) taken by the user to perform a task:

Task Efficiency Value: EV(n, i) = TCR(n, i) / t(n, i)

where t is the time, n is the task number, and i is the user number.

Mean time: Tm = (Σ ti) / 4

where i = 1, 2, 3, 4 indexes the user performing that task.

Mean TCR: TCRm = (Σ TCRi) / 4

where i = 1, 2, 3, 4 indexes the user performing that task.

The Mean Task Efficiency Value (EVm) is then defined as:

EVm = (Σ EVi) / 4

where i = 1, 2, 3, 4 indexes the user performing that task.
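As a concrete illustration of these definitions, the short Python sketch below computes TCR, EV, and the three means for one task performed by four users. It is a minimal sketch of the formulas above; the assist counts and task times are hypothetical examples, not data from our sessions.

```python
# Minimal sketch of the metric definitions above; the observations are
# hypothetical, not actual session data.

def task_completion_rate(assists):
    """TCR = [100 - 100(n/3)] %, where n is the number of assists (0-3)."""
    assert 0 <= assists <= 3
    return 100 - 100 * assists / 3

def efficiency_value(tcr, seconds):
    """EV = TCR / t: completion rate per second spent on the task."""
    return tcr / seconds

# One task, four users: (number of assists, time in seconds) per user.
observations = [(0, 45.0), (1, 80.0), (0, 60.0), (2, 150.0)]

tcrs = [task_completion_rate(n) for n, _ in observations]
evs = [efficiency_value(tcr, t) for tcr, (_, t) in zip(tcrs, observations)]

mean_time = sum(t for _, t in observations) / len(observations)  # Tm
mean_tcr = sum(tcrs) / len(tcrs)                                 # TCRm
mean_ev = sum(evs) / len(evs)                                    # EVm

print(f"Tm = {mean_time:.1f} s, TCRm = {mean_tcr:.1f} %, EVm = {mean_ev:.3f}")
```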

After computing these values, we plotted a graph (Figure 2) of the individual task efficiency values versus the individual tasks for each of the four users. A user whose task efficiency value for a specific task is higher than the mean efficiency value for that task was able to perform the task very easily. If most users are close to the mean efficiency value, then the users performed the task fairly easily. Wherever there was a deviation from the mean efficiency value, we identified a problem and made a recommendation.

Figure 2: Efficiency value graph.

Qualitative Results
We gathered our qualitative data through a variety of means: video recordings and screen captures taken during the user tests, a post-test questionnaire, and written observations of user responses, facial expressions, body language, etc. taken during the user tests. We then reviewed our notes as a group to discover where the problems might lie for individual tasks and for the site as a whole.

Also, to help us understand the general problems with the site, we tabulated the data from the post-test questionnaire and produced a graph to visualize the results (Figure 3). Each question is listed across the X-axis, and the Y-axis represents the user responses. For the negatively worded questions (8-10), the response values were inverted for visualization. The upper and lower bounds are represented, and the mark represents the average response.

Figure 3: Post-questionnaire data. Each question is listed across the X-axis and the Y-axis represents the user responses. The upper and lower bounds are represented and the mark represents the average response.
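For illustration, the sketch below reproduces the Figure 3 treatment: it inverts the negatively worded questions and plots, for each question, the lower and upper bounds of the responses with a mark at the average. The response matrix is hypothetical, and we assume the same 1-to-5 rating scale used in the pre-test questionnaire.

```python
import matplotlib.pyplot as plt

# Hypothetical 1-5 responses: one row per user, one column per question (1-10).
responses = [
    [4, 5, 3, 4, 2, 5, 4, 2, 1, 2],
    [3, 4, 4, 5, 3, 4, 5, 3, 2, 1],
    [5, 5, 2, 4, 4, 4, 3, 1, 2, 3],
    [4, 3, 3, 5, 3, 5, 4, 2, 2, 2],
]
NEGATIVE = {8, 9, 10}  # negatively worded questions, inverted for visualization

def normalize(question, value, scale_max=5):
    """Flip negatively worded questions so a higher value is always better."""
    return scale_max + 1 - value if question in NEGATIVE else value

questions = list(range(1, 11))
columns = [[normalize(q, row[q - 1]) for row in responses] for q in questions]

lows = [min(col) for col in columns]    # lower bound per question
highs = [max(col) for col in columns]   # upper bound per question
means = [sum(col) / len(col) for col in columns]

plt.vlines(questions, lows, highs, colors="gray")  # bound range per question
plt.plot(questions, means, "ks", label="average response")
plt.xticks(questions)
plt.xlabel("Question")
plt.ylabel("Response (1-5, negatives inverted)")
plt.legend()
plt.show()
```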

PROBLEMS AND RECOMMENDATIONS

Problems fall into two classifications: major usability problems and minor usability problems. The 'Type' field distinguishes the two: a major usability problem is marked 'A', and a minor usability problem is marked 'B'.

We only listed tasks that we identified as problematic after analyzing our quantitative and qualitative results.

Task Performed: Review transactions made with your card.
Type: B
Key Problem: Unclear path of where to go.
Key Use Patterns: Users clicked on 'Lost Card' in the main navigation, then clicked on 'Manage Your Account'. Users became confused because it was not visually clear which link to click for lost cards.
Recommendation: On the 'Lost Card' screen, add a link called 'Report a Lost Card' that directs users to the account management main screen. (Figure 4)

Task Performed: Add money to your card.
Type: B
Key Problem: Two different methods of adding money online confused users.
Key Use Patterns: Users were confused about where to deposit money online. The 'On-line' link is ambiguous, and 'Express Deposit' is not intuitive.
Recommendation: Offer only one method of making deposits online. Remove 'Express Deposit' and rename the 'On-line' link to something more descriptive, such as 'Account Access' or 'Account Login'. (Figure 5)

Type: B
Key Problem: No clear guideline on the usage of bold type.
Key Use Patterns: Not all headers and subheaders were bold, and the word 'CampusAccess' was bolded inconsistently throughout the site. This inconsistency would decrease users' reading speed, taking them longer to find their information.
Recommendation: Use bold fonts consistently throughout the site to increase the contrast between headlines or headers and body text. Use bold for emphasis, to highlight important points.

Type: B
Key Problem: No change in color for visited links.
Key Use Patterns: Users easily got lost and moved in circles because the website uses the same link color for visited and new destinations.
Recommendation: To reduce navigational confusion, select different colors for the two types of links. This frees users from unintentionally revisiting the same pages over and over again.

Task Performed: What buses can you board with your CampusAccess card? / Can you check out library books or get into the SRSC?
Type: B
Key Problem: No search feature.
Key Use Patterns: Users tend to find their information through searches. Though the site had no search feature, users utilized their browser's built-in search feature to find keywords. (Figure 6)
Recommendation: Add a search bar at the top right of the website to increase search efficiency.

Type: B
Key Problem: Web page design not consistent.
Key Use Patterns: Test participants felt they had been redirected to an outside site. Many thought they had clicked something incorrectly. (Figure 7)
Recommendation: Redesign all pages to conform to a consistent layout.

Type: A
Key Problem: Difficult to get to the main navigation quickly.
Key Use Patterns: Users found it annoying getting to the main navigation on pages that contained a lot of information. We noticed that users were constantly scrolling up and down the pages.
Recommendation: Add text-based navigation at the page footer.

Figure 4: Lost Card screen.

Figure 5: Manage Your Account contains unclear terms, often misinterpreted by the user.

Figure 6: Content was found using browser search.

Figure 7: Design not consistent.

APPENDICES
User Consent Form
The Human Computer Interaction Design class at the School of Informatics is doing a usability test for
the different UITS applications used across the campus. The class is divided into many teams.

Our team is doing a 'Usability Evaluation' of the "Campus Access Card" website (http://cacard.indiana.edu). The goal of this exercise is to look into the effectiveness of the site, identify its problems, and identify the scope of improvements based on the usability tests.

We would like to invite you to take part in a usability test for this project. You will be asked to perform a series of tasks on the site's interface at the computer lab in the Informatics department. All the tasks are simple in nature and do not require knowledge of anything apart from what is mentioned on the website.

All your feedback will be kept confidential and not shared with anyone else. We will, however, take video recordings of your interactions and interviews for further analysis at a later stage. Your interactions and feedback will be documented for educational purposes only.

You will be informed of the time and the exact location of the test in advance, and you will be requested to be present accordingly. After the test, you will be asked to take part in a short interview about the usability test in general.

You will later be informed about the test results.

If you have any questions, please contact any of the following team members:
Kshitiz Anand (kshanand@indiana.edu)
Jason de Runa (jderuna@indiana.edu)
Qian Huang (qiahuang@indiana.edu)
Adam Williams (adjwilli@indiana.edu)

Please print your name and sign, if you agree to participate.

Print Name: _______________________________ Date: _________

Signature: ________________________________

Pre-Test Questionnaire - Part I


The Human Computer Interaction Design class at the School of Informatics is doing a usability test for the different UITS applications used across the campus. The class is divided into many teams. Our team is doing a 'Usability Evaluation' of the "Campus Access Card" website (http://cacard.indiana.edu). The goal of this exercise is to look into the effectiveness of the site, identify its problems, and identify the scope of improvements based on the usability tests.

We would like to invite you to take part in a usability test for this project. We are carrying out a survey to select users for this test. The test will be held next week, and the dates will be decided based on the information from this survey. We will inform you of the exact dates.

Please fill out this form and return it in person, or mail it to either of the email addresses mentioned below.

Name: _______________________________ Gender: a) Male b) Female

Email: _______________________________ Phone Number: _______________________________

Contact Preference:
a) Email b) Phone

How many years have you been at Indiana University?


a) 0-1 b) 1-2 c) 2-3 d) 3-4 e) greater than 4

Do you possess a Campus Access Card?


a) Yes b) No

In what ways do you use your Campus Access Card? (Circle all that apply)
a) Official University Photo ID Card e) Recreational Sports Pass
b) RPS Meal Card f) Discount Card
c) Indiana University Library Card g) Make payments at stores
d) Universal Bus Pass h) Door Key

Have you ever participated in a usability test before?


a) Yes b) No

Availability?
a) Saturdays 9:00am-12:00pm d) Sundays 9:00am-12:00pm
b) Saturdays 12:00pm-3:00pm e) Sundays 12:00pm-3:00pm
c) Saturdays 3:00pm-6:00pm f) Sundays 3:00pm-6:00pm

Pre-Test Questionnaire - Part II

Name: ___________________________

You have been short-listed, based on the previous questionnaire, to be a part of the usability test that we are conducting for the "Campus Access Card" website. In order to further support our results, we need this information from you. It will identify your comfort level with computers and the Internet.

All the questions are simple, and your honest answers will be highly appreciated. If you have any questions, please feel free to contact any of the team members mentioned below.

Please answer the following questions:

Have you ever used a computer before coming to IU? a) Yes b) No

Have you ever used the Internet before coming to IU? a) Yes b) No

How do you classify yourself as an Internet user? a) Novice b) Intermediate c) Expert

Have you visited the CampusAccess site before? a) Yes b) No

Rate yourself on a scale of 1 to 5 on these questions. Please circle the ratings that apply.
(1 the lowest and 5 the highest.)

Your ability to find a required website on the Internet? 1 2 3 4 5

Your ability to look for required information on a website? 1 2 3 4 5

Your ability to distinguish between the different sections? 1 2 3 4 5

What are your top three activities when using the Internet?
a) Emails f) Social Networking
b) Photo-sharing g) Gaming
c) Reading content h) Video
d) Music i) Movies
e) Chat j) Telephone

Plan for Test Set-up
Usability testing will be conducted on Saturday, September 15, 2007, from 12:00 pm to 2:30 pm in the Informatics Building room 001 (conference room).

Informatics Building
901 E. 10th St.
Bloomington, IN 47408-3912

There will be a facilitator, two data loggers, and a time recorder in the testing process (Figure 8).

Figure 8: Organizational arrangement of the evaluation room.

Roles:
Facilitator - Jason de Runa
Data Logger 1 - Adam Williams
Data Logger 2 - Qian Huang
Time Recorder - Kshitiz Anand

Materials:
The following materials are required for the usability testing. Our team will use this document to ensure that the testing room is properly set up before the users arrive. It will serve as a checklist of necessary material and equipment for the test, as well as the preamble that will be given to the participants.

Documents:
Schedule for test
Script for tests
List of tasks and scenarios
Debriefing questions and post-questionnaire
Observation sheets
Set-up sheet - plan for the test set-up

Equipment:
Desktop running Windows OS
Laptop running Windows OS
Camtasia software
Web cam - Logitech QuickCam Orbit MP

Miscellaneous:
Bottled water
$5 gift cards
Notebook
Pen/pencil

Schedule for Test with Users

Test Location:
Informatics Building Room 001
901 E. 10th St.
Bloomington, IN 47408-3912

Time and date of all tests:
Saturday, September 15, 2007, 12:00 pm - 2:30 pm.

The team members are to arrive at 11:00 am to set up the room.

General Instructions
♦ Please arrive at the test location 15 minutes prior to the test. You will be given an introduction to the test before you start.

♦ Please switch off your mobile phone during the test.

Schedule for Individual Users
Participant 1 --- Saturday, September 15, 2007 at 12:30 pm. User to report by 12:15 pm.
Participant 2 --- Saturday, September 15, 2007 at 1:00 pm. User to report by 12:45 pm.
Participant 3 --- Saturday, September 15, 2007 at 1:30 pm. User to report by 1:15 pm.
Participant 4 --- Saturday, September 15, 2007 at 2:00 pm. User to report by 1:45 pm.

Individual emails are sent to the users regarding their tests. Each email contains the details about the location, the travel directions, and the time when the user is required to be present.
