Human Computer Interaction Reviewer


Chapter 1 What is interaction design?

Interactive Products
● Smartphone
● Tablet
● Computer
● Remote control
● Coffee machine
● ATM
● Ticket machine
● Printer
● iPod
● GPS
● e-reader
● TV
● Radio
● Games console

Good and poor design - A good design lets users carry out their tasks effectively, efficiently, and enjoyably; a poor design makes the same tasks slow, error-prone, and frustrating.
Interaction Design - designing interactive products to support the way people communicate and interact in
their everyday and working lives.
The Components of Interaction Design
● Academic Disciplines
● Human–Computer Interaction (HCI)
● Information Systems
● Film Industry
● Industrial Design
● Artist-Design
● Product Design
● Graphic Design
● Engineering
● Human Factors (HF)
● Computer Science

User Experience - The user experience (UX) is central to interaction design. By this it is meant how a product
behaves and is used by people in the real world.
The iPod phenomenon - Apple's classic generations of iPods have been a phenomenal success.
McCarthy and Wright propose four core threads that make up our holistic experiences:
1. The sensual thread - This is concerned with our sensory engagement with a situation and is like the visceral
level of Norman's model.
2. The emotional thread - Common examples of emotions that spring to mind are sorrow, anger, joy, and
happiness.
3. The compositional thread - This is concerned with the narrative part of an experience, as it unfolds, and the
way a person makes sense of it.
4. The spatio-temporal thread - This refers to the space and time in which our experiences take place and
their effect upon those experiences.

The Process of Interaction Design


The process of interaction design involves four basic activities:
1. Establishing requirements
2. Designing alternatives
3. Prototyping
4. Evaluating.
Usability Goals - Usability refers to ensuring that interactive products are easy to learn, effective to use, and
enjoyable from the user's perspective
Usability is broken down into the following goals:
1. effective to use (effectiveness)
2. efficient to use (efficiency)
3. safe to use (safety)
4. having good utility (utility)
5. easy to learn (learnability)
6. easy to remember how to use (memorability).

Design Principles
Design principles are used by interaction designers to aid their thinking when designing for the user
experience.

1. Visibility.
The importance of visibility is exemplified by our contrasting examples at the beginning of the chapter. The
voice mail system made the presence and number of waiting messages invisible, while the answer machine
made both aspects highly visible.
2. Feedback.
Related to the concept of visibility is feedback. This is best illustrated by an analogy to what everyday life
would be like without it.
3. Constraints.
The design concept of constraining refers to determining ways of restricting the kinds of user interaction that can take place at a given moment (a short code sketch after this list illustrates constraints together with feedback).
4. Consistency.
This refers to designing interfaces to have similar operations and use similar elements for achieving similar
tasks. A consistent interface is one that follows rules, such as using the same operation to select all objects.
5. Affordance.
This is a term used to refer to an attribute of an object that allows people to know how to use it.
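
To make the constraints and feedback principles concrete, here is a minimal TypeScript/DOM sketch. It is only an illustration: the element ids and the save behavior are hypothetical, not taken from the textbook. A Save button stays disabled (constrained) until there is something to save, and each click produces an immediate, visible response (feedback).

```typescript
// Minimal DOM sketch of the constraints and feedback principles.
// Element ids ("title-input", "save-button", "status") are hypothetical.
const input = document.querySelector<HTMLInputElement>("#title-input")!;
const saveButton = document.querySelector<HTMLButtonElement>("#save-button")!;
const status = document.querySelector<HTMLElement>("#status")!;

// Constraint: saving is impossible while the field is empty,
// so the button is greyed out and cannot be clicked.
saveButton.disabled = true;
input.addEventListener("input", () => {
  saveButton.disabled = input.value.trim().length === 0;
});

// Feedback: every action produces an immediate, visible response.
saveButton.addEventListener("click", () => {
  status.textContent = "Saving…";
  setTimeout(() => {
    status.textContent = `Saved "${input.value}"`;
  }, 300); // pretend the save takes a moment
});
```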

Chapter 2 Understanding and Conceptualizing Interaction

Understanding the problem space


What do you want to create?
What are your assumptions?
What are your claims?
Problem space - Usability and user experience goals must be understood before you can understand the
problem space. Write down the various assumptions and claims and try to defend them.
Conceptual Model
It is not a description of the user interface but a structure outlining the concepts and the relationships
between them.
Design Concept
comprises scenarios, images, mood boards, or text-based documents.
INTERFACE METAPHOR - An interface metaphor is one that is instantiated in some way as part of the user
interface.
Example
*The desktop metaphor. Another well-known example is the search engine.
Material Metaphors - An interface metaphor that has become pervasive in the last few years is the card.
Many of the social media apps, such as Facebook, Twitter, and Pinterest, started presenting their content on
cards.
Interaction Types - The way of conceptualizing the design space is in terms of the interaction types that will
underlie the user experience. Essentially, these are the ways a person interacts with a product or application.
There are four main types:
*Instructing – where users issue instructions to a system.
*Conversing – where users have a dialog with a system.
*Manipulating – where users interact with objects in a virtual or physical space by manipulating
them.
*Exploring – where users move through a virtual environment or a physical space.
INSTRUCTING
This type of interaction describes how users carry out their tasks by telling the system what to do.
Example
- Include giving instructions to a system to perform operations such as telling the time, printing a file, and
reminding the user of an appointment.
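As a rough illustration of the instructing type, the sketch below (TypeScript; the command names and replies are made up) maps typed instructions such as print or remind onto operations the system carries out.

```typescript
// Sketch of instruction-style interaction: the user tells the system
// what to do, and the system performs the operation.
type Command = (args: string[]) => string;

const commands: Record<string, Command> = {
  // Hypothetical operations matching the examples above.
  time: () => `The time is ${new Date().toLocaleTimeString()}`,
  print: (args) => `Sending ${args[0] ?? "document"} to the printer`,
  remind: (args) => `Reminder set: ${args.join(" ")}`,
};

function instruct(line: string): string {
  const [name, ...args] = line.trim().split(/\s+/);
  const command = commands[name];
  return command ? command(args) : `Unknown command: ${name}`;
}

// e.g. instruct("print report.pdf") -> "Sending report.pdf to the printer"
console.log(instruct("remind dentist at 3pm"));
```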
CONVERSING
It is based on the idea of a person having a conversation with a system, where the system acts as a
dialog partner.
Example :
- Include advisory systems, help facilities, and search engines.
MANIPULATING
This form of interaction involves manipulating objects and capitalizes on users’ knowledge of how they do
so in the physical world.
For example:
- Objects can be manipulated by moving, selecting, opening, and closing.
EXPLORING
This mode of interaction involves users moving through virtual or physical environments.
For example:
- Users can explore aspects of a virtual 3D environment, such as the interior of a building.
Paradigms
A paradigm is a frame of reference or theory that affects how we see and experience a situation. By
definition, it represents a "group of ideas about how something should be done or thought about."
Visions
A current vision that is driving much future technology development is the Internet of Things (IoT).
Future visions provide concrete scenarios of how society can use the next generation of imagined technologies
to make their lives more safe, comfortable, informative, and efficient
THEORIES
Numerous theories have been imported into human–computer interaction, providing a means of
analyzing and predicting the performance of users carrying out tasks for specific kinds of computer interfaces
and systems (Rogers, 2012).
MODELS
A model is typically abstracted from a theory coming from a contributing discipline, like psychology, that can
be directly applied to interaction design.
FRAMEWORK
A framework helps designers constrain and scope the user experience for which they are designing. Frameworks, like
models, have traditionally been based on theories of human behavior, but they are increasingly being
developed from the experiences of actual design practice and the findings arising from user studies.

The framework comprises three interacting components:


1. The designer's model – the model the designer has of how the system should work.
2. The system image – how the system actually works is portrayed to the user through the interface,
manuals, help facilities, and so on.
3. The user's model – how the user understands how the system works.

Chapter 3 Cognitive Aspects


Cognition - the mental action or process of acquiring knowledge and understanding through thought,
experience, and the senses.
Different kinds of Cognition
1. Thinking
2. Remembering
3. Learning
4. Day Dreaming
5. Seeing
6. Decision Making
7. Reading
8. Writing
9. Talking
Norman (1993) distinguishes between two general modes: experiential and reflective cognition.
Kahneman (2011) describes them in terms of fast and slow thinking.

Experiential - a state of mind in which we perceive, act, and react to events around us intuitively and
effortlessly. It requires reaching a certain level of expertise and engagement.
Reflective - involves mental effort, attention, judgment, and decision making. This kind of cognition is what
leads to new ideas and creativity.
Attention - This is the process of selecting things to concentrate on, at a point in time, from the range of
possibilities available. Attention involves our auditory and/or visual senses.
Information Presentation - The way information is displayed can also greatly influence how easy or difficult it
is to attend to appropriate pieces of information.
Multitasking Attention - Many of us now spend a large proportion of our time staring at a screen, be it a
smartphone, laptop, TV, or tablet.
Perception - refers to how information is acquired from the environment via the different sense organs – eyes,
ears, fingers – and transformed into experiences of objects, events, sounds, and tastes (Roth, 1986).
Memory - involves recalling various kinds of knowledge that allow us to act appropriately.
Learning - GUIs and direct manipulation interfaces are good environments for supporting this kind of active
learning.
Reading, speaking, and listening - are three forms of language processing that have similar and different
properties.
Problem solving, planning, reasoning, and decision making - are processes involving reflective cognition. They
include thinking about what to do, what the options are, and what the consequences might be of carrying out
a given action.

Cognitive Framework - three early internal frameworks that focus primarily on mental processes together
with three more recent external ones that explain how humans interact and use technologies in the context in
which they occur.
1. INTERNAL - Mental Models, Gulfs of Execution and Evaluation, & Information Processing

2. EXTERNAL - Distributed Cognition, External Cognition, & Embodied Cognition

Chapter 4 Social Interaction

1. Being Social - A fundamental aspect of everyday life is being social – interacting with each other. We
continuously update each other about news, changes, and developments on a given project, activity, person,
or event.
Two types of conversation
1. Face-to-face conversations - when two or more people interact and communicate while visible to
one another.
2. Remote conversations - a way of communicating with others online.
IMPLICIT OR EXPLICIT CUES
1. IMPLICIT - the speaker signals indirectly to the other participants that they want the conversation to draw to a close.
2. EXPLICIT - direct cues; these types of cues are specific and clear.

Three basic rules


Rule 1 - the current speaker chooses the next speaker by asking a question, inviting an opinion, or making a
request.
Rule 2 - another person decides to start speaking.
Rule 3 - the current speaker continues talking.

1. Telepresence - allows real-time, two-way collaboration between people who are not in the same location
2. Co-presence - A communication dimension that refers to participants in a communication being located in
the same physical setting.
3. Physical Coordination - When people are working closely together, they talk to each other, issuing
commands and letting others know how they are progressing.
4. Awareness - Involves knowing who is around, what is happening, and who is talking with whom.
4.1 Peripheral awareness - Keeping an eye on things happening in the periphery of vision
4.2 Overhearing and overseeing – allows tracking of what others are doing without explicit cues.
5. Shareable Interfaces - Several studies have been carried out investigating whether different arrangements
of shared technologies can help co-located people work together better.

-----------------------Instructor's Report-----------------------
Gestalt Principles define some basic laws that help us understand how the human mind perceives visual
stimuli.
The fundamental principle of perceptual grouping is the law of Prägnanz (also known as the law of good Gestalt).
The law of Prägnanz says that we tend to experience things as:
1. regular
2. orderly
3. symmetrical
4. simple
"Of several geometrically possible organizations that one will occur which possesses the best, simplest and
most stable shape." — Kurt Koffka
The principles were based on similarity, proximity, and continuity.
1. Proximity - affirms that we perceive elements that are closer to each other to belong in the same group
2. Similarity - explores the fact that similar elements are perceived as being part of the same group and
having the same function.
3. Continuity - elements positioned in a line (or curve) are perceived as a continuation, a sequence of facts
arranged in an order, or a follow-up of the previous element.
4. Closure - The human brain automatically fills in gaps, so we perceive complete objects even when parts are
missing. This Gestalt principle states that we use memory to convert complex objects into simpler or known shapes.
5. Figure-ground - our perception instinctively perceives objects as either being in the foreground or the
background.
6. Common region - The common region principle is related to proximity. This principle states that when
objects are positioned within the same closed region, they are perceived as part of the same group
7. Focal points - The focal point law states that any element that stands out visually captures and holds the
viewer’s attention.

Chapter 5 Emotional Interaction

1. Emotion and the user experience


Emotional interaction is concerned with how we feel and react when interacting with technology.
Ortony et al.'s (2005) model of emotional design showing three levels:
● Visceral - design refers to making products look, feel, and sound good.
● Behavioral - design is about use and equates with the traditional values of usability.
● Reflective - design is about considering the meaning and personal value of a product in a particular
culture.

2. Expressive Interfaces
Expressive forms like emoticons, sounds, icons, and virtual agents have been used at the interface
to convey emotional states and/or elicit certain kinds of emotional responses in users, such as feeling
at ease, comfort, and happiness.
Other ways of conveying the status of a system are through the use of:

Dynamic icons (e.g., a recycle bin expanding when a file is placed in it and paper disappearing in a puff when
emptied).

Animations (e.g., a beach ball whirling to say the computer is busy).

Spoken messages, using various kinds of voices, telling the user what needs to be done (e.g., GPS navigation
system instructing you politely where to go after having taken a wrong turn).

Various sonifications indicating actions and events (e.g., whoosh for window closing, schlook for a file being
dragged, ding for new email arriving).

Vibrotactile feedback, such as distinct smartphone buzzes that specifically represent special messages from
friends and family.
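
These non-visual forms of feedback can be sketched with standard web APIs. The TypeScript fragment below is only an illustration: the sound file, the new-mail scenario, and the buzz patterns are invented, and navigator.vibrate is not supported on every browser or device, so the call is guarded.

```typescript
// Sketch of expressive, non-visual feedback in a web app.
// "new-mail.mp3" and the buzz patterns are hypothetical.
function announceNewMail(fromFamily: boolean): void {
  // Sonification: a short "ding" when new email arrives.
  const ding = new Audio("new-mail.mp3");
  void ding.play();

  // Vibrotactile feedback: a distinct buzz pattern for family messages.
  if ("vibrate" in navigator) {
    navigator.vibrate(fromFamily ? [100, 50, 100] : 200);
  }
}

announceNewMail(true);
```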

3. Annoying Interfaces
Computer interfaces may inadvertently elicit negative emotional responses such as anger and disgust.

Interfaces, if designed poorly, can make people look stupid, or feel insulted or threatened. The effect
can be to make them annoyed to the point of losing their temper.

4. Detecting Emotions and Emotional Technology


The approach called affective computing develops computer-based systems that try to recognize and
express emotions in the same way humans do.

5. Persuasive Technologies and Behavioral Change


A diversity of technologies is increasingly being used to draw people's attention to certain kinds of
information to change what they do or think. Pop-up ads, warning messages, reminders, prompts, personalized
messages, and recommendations are some of the methods that are being deployed on computer screens.

6. Anthropomorphism and Zoomorphism


Anthropomorphism is the propensity people have to attribute human qualities to animals and objects, while
zoomorphism is the shaping of an object or design in animal form.

Chapter 6 Interfaces

Interface type

1. Command-based
2. WIMP and GUI
3. Multimedia
4. Virtual reality
5. Information visualization and dashboards
6. Web
7. Consumer electronics and appliances
8. Mobile
9. Speech
10. Pen
11. Touch
12. Air-based gesture
13. Haptic
14. Multimodal
15. Shareable
16. Tangible
17. Augmented and mixed reality
18. Wearable
19. Robots and drones
20. Brain–computer interaction (BCI)

Interface Types

Numerous adjectives have been used to describe the different kinds of interfaces that have been developed,
including graphical, command, speech, multimodal, invisible, ambient, affective, mobile, intelligent, adaptive,
smart, tangible, touchless, and natural.

1. Command-Based
Early interfaces required the user to type in commands that were typically abbreviations at the prompt symbol
appearing on the computer display, which the system responded to. Another way of issuing commands is
through pressing certain combinations of keys (e.g., Shift+Alt+Ctrl)
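
A small, hypothetical TypeScript/DOM sketch of the second style, where a command is issued through a key combination rather than a typed command line (the Ctrl+Shift+S binding and the action are made up for illustration):

```typescript
// Sketch of a command issued through a key combination rather than
// a typed command line. The Ctrl+Shift+S binding is hypothetical.
document.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.ctrlKey && event.shiftKey && event.key.toLowerCase() === "s") {
    event.preventDefault(); // keep the browser from handling the keys itself
    console.log("Command received: save all open documents");
  }
});
```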

2. WIMP and GUI


The Xerox Star interface (described in Chapter 2) led to the birth of the WIMP and subsequently the GUI,
opening new possibilities for users to interact with a system and for information to be presented and
represented at the interface.
The original WIMP comprises:
Windows (that could be scrolled, stretched, overlapped, opened, closed, and moved around the screen using
the mouse).
Icons (to represent applications, objects, commands, and tools that were opened or activated when clicked on).
Menus (offering lists of options that could be scrolled through and selected in the way a menu is used in a
restaurant).
Pointing device (a mouse controlling the cursor as a point of entry to the windows, menus, and icons on the
screen).

Window design. Windows were invented to overcome the physical constraints of a computer display, enabling
more information to be viewed and more tasks to be performed on the same screen.
Menu design. Just like restaurant menus, interface menus offer users a structured way of choosing from the
available set of options. Headings are used as part of the menu to make it easier for the user to scan through
them and find what they want.
Interface menu designs have employed similar methods of categorizing and illustrating options available that
have been adapted to the medium of the GUI. A difference is that interface menus are typically ordered across
the top row or down the side of a screen using category headers as part of a menu bar.
Icons can be designed to represent objects and operations at the interface using concrete objects and/or
abstract symbols.

3. Multimedia - as the name implies, combines different media within a single interface, namely, graphics, text, video,
sound, and animations, and links them with various forms of interactivity.

4. Virtual Reality - Virtual reality (VR) uses computer-generated graphical simulations to create “the illusion of
participation in a synthetic environment rather than external observation of such an environment” (Gigante, 1993,
p. 3). VR is a generic term that refers to the experience of interacting with an artificial environment, which makes it
feel virtually real.

5. Web - Early websites were largely text-based, providing hyperlinks to different places or pages of text. Much of the
design effort was concerned with how best to structure information at the interface to enable users to navigate and
access it easily and quickly.

6. Mobile - Mobile devices have become pervasive, with people increasingly using them in all aspects of their everyday
and working lives.

7. Speech - A speech or voice user interface is where a person talks with a system that has a spoken language
application, like a train timetable, a travel planner, or a phone service.
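
The output half of such a voice interface can be approximated in a browser with the Web Speech API. The sketch below is illustrative only: the spoken prompt is invented, and speech input would need the separate SpeechRecognition API, which is not shown here.

```typescript
// Sketch of spoken output for a voice user interface, using the
// Web Speech API (speechSynthesis) available in most modern browsers.
function speak(prompt: string): void {
  const utterance = new SpeechSynthesisUtterance(prompt);
  utterance.rate = 1.0; // normal speaking speed
  window.speechSynthesis.speak(utterance);
}

// Hypothetical travel-planner style prompt.
speak("The next train to the city centre leaves at 10:42 from platform 2.");
```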

8. Pen - Pen-based devices enable people to write, draw, select, and move objects at an interface using lightpens or
styluses that capitalize on the well-honed drawing and writing skills that are developed from childhood.

9. Touch screens - such as walk-up kiosks (e.g., ticket machines, museum guides), ATMs, and till machines (e.g.,
restaurants), have been around for some time.

10. Air-Based Gestures - Camera capture, sensor, and computer vision techniques have advanced such that it is now
possible to accurately recognize people's body, arm, and hand gestures in a room.

11. Haptic interfaces - provide tactile feedback, by applying vibration and forces to the person, using actuators that are
embedded in their clothing or a device they are carrying, such as a smartphone or smartwatch.

12. Multimodal interfaces - are intended to provide enriched and complex user experiences by multiplying the way
information is experienced and controlled at the interface through using different modalities, i.e., touch, sight,
sound, speech.

13. Shareable interfaces - are designed for more than one person to use. Unlike PCs, laptops, and mobile devices – that
are aimed at single users – they typically provide multiple inputs and sometimes allow simultaneous input by
collocated groups.

14. Tangible interfaces - use sensor-based interaction, where physical objects, e.g., bricks, balls, and cubes, are coupled
with digital representations.

15. Augmented and Mixed Reality - Other ways that the physical and digital worlds have been bridged include
augmented reality, where virtual representations are superimposed on physical devices and objects, and mixed
reality, where views of the real world are combined with views of a virtual environment.

16. Wearables - Imagine being at a party and being able to access the Facebook of a person whom you have just met,
while or after talking to her, to find out more about her.

17. Robots and Drones - Robots have been with us for some time, most notably as characters in science fiction movies,
but also playing an important role as part of manufacturing assembly lines, as remote investigators of hazardous
locations (e.g., nuclear power stations and bomb disposal), and as search and rescue helpers in disasters (e.g., fires)
or far-away places (e.g., Mars).

18. Brain–Computer Interfaces - Brain–computer interfaces (BCI) provide a communication pathway between a person's
brain waves and an external device, such as a cursor on a screen or a tangible puck that moves via airflow.

Natural User Interface (NUI) is a system for human-computer interaction that the user operates through intuitive
actions related to natural, everyday human behavior. 

Chapter 11 Design Prototyping and Construction

A prototype is a small-scale model.

Kinds of Prototyping

a. Low Fidelity
b. High Fidelity

Low Fidelity

• Uses a medium, for example, paper or cardboard

• Is quick, cheap, and easily changed

• Examples:

▪ Sketches of screens, task sequences, and so on

▪ ‘Post-it’ notes

▪ Storyboards

Storyboard

• It is a series of sketches showing how a user might progress through a task using the product

• Often used with scenarios, bringing in more detail and a chance to role play

Sketching

• Low fidelity prototyping often relies on sketching

• Don’t be inhibited about drawing ability — Practice simple symbols

High Fidelity

• Uses materials that you would expect to be in the final product

• Prototype looks more like the final system than a low fidelity version

• High-fidelity prototypes can be developed by integrating existing hardware and software components

Two common types of compromise in prototypes:

Horizontal: Provides a wide range of functions, but with little detail

Vertical: Provides a lot of detail for only a few functions

Compromises in prototypes must not be ignored.

Conceptual model - an outline of what people can do with a product and what concepts are needed to understand and
interact with it

Approaches to developing a conceptual model:


1. Interface metaphors - combine familiar knowledge with new knowledge in a way that will help the user
understand the product.
2. Interaction Styles
3. Interface Styles

Generating Prototypes

Generate a storyboard from a scenario

Break down scenario into steps

Create a scene for each step

Sketching out a storyboard prompts designers to think about design issues

Two common representations of an experience map

1. Wheel
2. Timeline

Construction: Physical computing

Build and code prototypes using electronics

Toolkits available include

1. Arduino
2. LilyPad (for fabrics)
3. Sense board

Software Development Kits (SDKs) – programming tools and components to develop for a specific platform, for example
iOS. They make development much easier.

Includes:

IDE, documentation, drivers, sample code, and Application Programming Interfaces (APIs)

Example:

Amazon’s Alexa Skills Kit for voice-based services (see the sketch after this list)

Apple’s ARKit for augmented reality

Microsoft Kinect SDK for motion tracking
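
To give a feel for what developing against such an SDK looks like, here is a hedged sketch of a minimal handler written with the Alexa Skills Kit SDK for Node.js (ask-sdk-core); the intent name "HelloIntent" and the spoken reply are made up for illustration.

```typescript
// Sketch of a minimal Alexa skill handler built with the Alexa Skills Kit
// SDK for Node.js (ask-sdk-core). "HelloIntent" and the reply are made up.
import * as Alexa from "ask-sdk-core";

const HelloIntentHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return (
      Alexa.getRequestType(handlerInput.requestEnvelope) === "IntentRequest" &&
      Alexa.getIntentName(handlerInput.requestEnvelope) === "HelloIntent"
    );
  },
  handle(handlerInput) {
    // The SDK's response builder assembles the spoken reply for the device.
    return handlerInput.responseBuilder
      .speak("Hello from this prototype skill.")
      .getResponse();
  },
};

// Entry point that AWS Lambda calls for each request to the skill.
export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(HelloIntentHandler)
  .lambda();
```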
