Module-2: Managing Design Processes
Introduction, Organizational Design to support Usability, The Four Pillars of Design, Development
methodologies: Ethnographic Observation, Participatory Design, Scenario Development, Social
Impact statement for Early Design Review, Legal Issues
3.1 Introduction
Fig 3.1
3.3.1 User Interface requirements
Soliciting and clearly specifying user requirements is a major key to success in any
development activity
Laying out the user-interface requirements is part of the overall requirements
development and management process
User interface requirements describe system behavior
Performance requirements:
“The website shall give users the ability to update their user profiles, e.g., name,
mailing address, e-mail, phone”
“The system shall permit the ATM customer 15 seconds to make a selection. The
customer shall be warned that the session will be ended if no selection is made”
“The mobile devices shall be able to save draft text messages when out of the
service area.”
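Timed requirements like the ATM example above can be expressed as directly testable code. The sketch below is illustrative only and not part of the source; the callback names `get_input` and `warn_user`, and the polling approach, are assumptions.

```python
import time

SELECTION_TIMEOUT = 15  # seconds the customer is allowed to make a selection

def await_selection(get_input, warn_user, timeout=SELECTION_TIMEOUT):
    """Poll for a selection; warn the user and end the session on timeout.

    get_input: callable returning the selection, or None until the user acts.
    warn_user: callable that displays a warning message.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        choice = get_input()
        if choice is not None:
            return choice
        time.sleep(0.1)  # avoid busy-waiting
    warn_user("No selection made; the session will be ended.")
    return None
```

Writing the requirement this way makes the 15-second window something a test suite can verify rather than a sentence open to interpretation.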
Functional requirements
“The system shall ensure that the PIN entered matches the one on file”
“The web site shall provide other, related purchase options based on past visits to
the web site”
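As a small illustration (not from the source), the PIN-matching functional requirement above reduces to a single testable check; the function name is hypothetical, and a constant-time comparison is used so the check does not leak information through timing.

```python
import hmac

def pin_matches(entered_pin: str, pin_on_file: str) -> bool:
    # Constant-time comparison of the entered PIN against the one on file.
    return hmac.compare_digest(entered_pin, pin_on_file)
```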
Interface Requirements
Each project has different needs, but guidelines should be considered for:
Words, icons, and graphics
Terminology (objects and actions), abbreviations, and capitalization
Character set, fonts, font sizes, and styles (bold, italic, underline)
Icons, graphics, and line thickness
Use of color, backgrounds, highlighting, and blinking
Screen-layout issues
Menu selection, form fill-in, and dialog-box formats
Wording of prompts, feedback, and error messages
Justification, white space, and margins
Data entry and display formats for items and lists
Use and contents of headers and footers
Action sequences
Direct-manipulation clicking, dragging, dropping, and gestures
Command syntax, semantics, and sequences
Programmed function keys
Error handling and recovery procedures
Training
Online help and tutorials
Training and reference materials
1. Contextual Inquiry
Plan for, prepare and then conduct field interviews to observe and
understand the work tasks being performed. Review business practices.
2. Interpretation Sessions and work modeling
Hold team discussions to draw conclusions based on the contextual
inquiry, including gaining an understanding of the workflow processes in
the organization as well as cultural and policy impacts.
3. Model Consolidation and affinity diagram building
Present the data gathered to date from users and the interpretation and
work modeling to a larger, targeted population to gain insight and
concurrence.
4. Persona development
Develop personas to represent the different user types within a targeted
demographic that might use a site or product
5. Visioning
Review and “walk” the consolidated data, sharing the personas created.
The visioning session helps define how the system will streamline and
transform the work of the users.
6. Storyboarding
The vision guides the detailed redesign of user tasks using pictures and
graphs to describe the initial user-interface concepts, business rules, and
automation and assumptions. Storyboarding defines and illustrates the “to
be built” assumptions.
7. User environment design
The single, coherent representation of the users and the work to be
performed is expressed in the user environment design.
8. Interviews and evaluations with paper prototypes and mock-ups
Conduct interviews and tests with actual users, beginning with paper
prototypes and then moving on to higher-fidelity prototypes. Capturing the
results of the interviews helps ensure that the system will meet end-user
requirements.
Fig: Druin’s model of the four levels of user participation. The blue areas (Informant and Design
partner) represent stages of participatory design.
Users: The taxonomy describes the roles that users such as children or older adults can
play in developing the interfaces intended for them.
Testers: are merely observed as they try out novel designs.
Informants: comment to designers through interviews and focus groups.
Design Partners: are active members of the design team, which in the case of children's
software will naturally involve participants of many ages (the intergenerational team).
4.1 Introduction
Designers can become so entranced with their creations that they may fail to evaluate
them adequately.
Experienced designers have attained the wisdom and humility to know that extensive
testing is a necessity.
The determinants of the evaluation plan include:
o Stage of design (early, middle, late)
o Novelty of project (well defined vs. exploratory)
o Number of expected users
o Criticality of the interface (life-critical medical system vs. museum exhibit
support)
o Costs of product and finances allocated for testing
o Time available
o Experience of the design and evaluation team
The range of evaluation plans might run from an ambitious two-year test down to a test
of a few days.
The range of costs might be from 10% of a project down to 1%.
Expert reviews can be scheduled at several points in the development process, when
experts are available and when the design team is ready for feedback.
The number of expert reviews depends on the magnitude of the project and on
the amount of resources allocated.
After choosing a review method you need to select the right approach/means and/or
reporting style to implement it. Some examples:
o Ranked Recommendation (assign priorities)
o Bird's-Eye View (study printed screens from a distance)
o Use of Software Tools (speed up the review process)
o General challenge: Experts may lack an understanding of the task domain
and/or user community, or may be biased. Hence it is crucial to choose
knowledgeable experts who are familiar with the project and organization.
An effective technique during usability testing is to invite users to think aloud about
what they are doing as they perform the task.
The designer or tester should be supportive of the participants, not taking over or
giving instructions, but prompting and listening for clues about how they are using the
interface.
For example, they may hear comments such as “This web page text is too small… so
I’m looking for something on the menus to make the text bigger…”
o Paper mockups. Early usability studies can be conducted using paper mock-
ups of screen displays to assess user reactions to wording, layout, and
sequencing. A test administrator plays the role of the computer by flipping the
pages while asking a participant user to carry out typical tasks. This informal
testing is inexpensive, rapid, and usually productive.
o Universal usability testing. This approach tests interfaces with highly diverse
users, hardware, software platforms, and networks. When a wide range of
international users is anticipated, such as for consumer electronics products,
web-based information services, or e-government services, ambitious testing is
necessary to clean up problems and thereby help ensure success. Trials with
small and large displays, slow and fast networks, and a range of operating
systems or Internet browsers will do much to raise the rate of customer
success.
o Field tests and portable labs. This testing method puts new interfaces to work
in realistic environments for a fixed trial period. Field tests can be made more
fruitful if logging software is used to capture error, command, and help
frequencies, as well as productivity measures. Portable usability laboratories
with videotaping and logging facilities have been developed to support more
thorough field testing.
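The logging idea described above can be sketched very simply: count error, command, and help events during a trial so frequencies can be reported afterwards. This is a minimal illustrative sketch, not part of the source; the class and event names are assumptions.

```python
import time
from collections import Counter

class UsageLog:
    """Minimal field-test logger: tallies event kinds and keeps a trace."""

    def __init__(self):
        self.counts = Counter()
        self.events = []  # (timestamp, kind, detail) for later analysis

    def record(self, kind, detail=""):
        # kind might be "error", "command", or "help", per the text above.
        self.counts[kind] += 1
        self.events.append((time.time(), kind, detail))

    def frequencies(self):
        return dict(self.counts)
```

During a field trial the interface would call, for example, `log.record("error", "invalid date format")`, and the tester would inspect `log.frequencies()` at the end of the trial period.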
Written user surveys are a familiar, inexpensive, and generally acceptable companion
for usability tests and expert reviews.
The keys to successful surveys are clear goals in advance and then development of
focused items that help attain the goals.
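One common way to keep survey items focused on stated goals is to score each item on a Likert scale and summarize per item. The helper below is an illustrative sketch only; the item names and the 1-to-5 scale are assumptions, not from the source.

```python
from statistics import mean

def summarize(responses):
    """responses: {item: [scores 1-5]} -> {item: mean score, 2 decimals}."""
    return {item: round(mean(scores), 2) for item, scores in responses.items()}
```

Each surveyed item (e.g. "the text was easy to read") then maps back to one usability goal, and the per-item means show where the goals are or are not being met.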