HCI - Lec 4

Human Computer Interaction

Lecture # 04
M. Bilal Baber

Interface Metaphors

Interface metaphors provide users with familiar concepts, simplifying the understanding of complex systems. They use symbols, images, or interactions that users already know from the physical world to represent and navigate digital interfaces.

Example:
Icons representing a trash bin for deleting files leverage
the real-world metaphor of discarding unwanted items.
Metaphors in Conceptual Models

Metaphors are like tools that help us understand complex ideas. They use familiar things to explain new concepts.

An interface metaphor, such as the desktop metaphor, is a way to organize and make sense of things on a computer screen.

It's like a blueprint that helps users recognize and interact with the digital world.
Metaphors in Conceptual Models

Example:
The desktop metaphor takes the physical desktop we have
in offices and translates it into a digital space on the
computer screen.

So, the next time you see icons on your computer screen
organized like items on a desk, that's an example of an
interface metaphor making the digital world feel more
familiar and understandable.
Search Engine Metaphor

The search engine metaphor, coined in the 1990s, likens the process of finding information to the workings of a mechanical engine. It involves indexing, retrieving files, and using algorithms, inviting comparisons to the action of searching in different places.

Example:
Google's search engine mirrors the process of searching
for information in a library but in a more efficient and
algorithmic manner.
Interface Metaphors’ Purpose

Interface metaphors aim to offer familiar entities, facilitating user understanding of the underlying conceptual model. However, they may challenge expectations, as seen with the placement of the recycle bin on the desktop, contrary to logical and cultural norms.

Example:
The desktop recycle bin challenges physical-world
expectations, yet users accept it once they understand its
placement as part of the interface.
Card Interface Metaphor

The card metaphor, seen in apps like Twitter and Pinterest, draws from physical cards' familiarity. Cards provide an intuitive way to organize limited content in a "card-sized" format, enabling easy sorting and browsing.

Example:
Twitter's timeline and Pinterest boards use card-based
interfaces, making it simple for users to navigate and
engage with content.
Everyday Use of Metaphors

Everyday conversations about technology are full of metaphors, making it tricky to discuss experiences without using them.

For example, parents might talk about 'screen time' when discussing how much time kids spend on electronic devices.

These metaphors have become so common that we use them without even thinking, like when we describe our experiences with social media platforms.
Everyday Use of Metaphors

Example:

Think about the idea of 'inbox zero.' When people talk about achieving 'inbox zero,' they're using a metaphor to express a sense of accomplishment in managing and organizing emails. It's like reaching a clean and organized digital workspace, even though the inbox isn't physically empty.

Introduction to Interaction Models

Interactive systems aim to be clear and consistent, like 'what you see is what you get' (WYSIWYG). But how can we be sure?

For instance, if a word processor claims to be WYSIWYG, how can we test it?

This is where interaction models come in. They are like blueprints for how interactive systems should behave.
Introduction to Interaction Models

Interaction models are like blueprints for how interactive systems should behave.

Instead of focusing on specific systems, they help us establish formal principles that apply to various situations.

Imagine a new social media platform, "ConnectX," claiming to have a user-friendly interface. Using an interaction model, we can objectively assess whether it meets the expected standards for a positive user experience.
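
To make the word-processor question above concrete, here is a minimal sketch in Python of how a WYSIWYG claim could be phrased as a testable property. The Document class and its two render methods are invented for illustration, not a real API; the point is only that the principle "what appears on screen equals what is produced on output" can be written down and checked.

class Document:
    """Hypothetical document model, invented for illustration only."""

    def __init__(self, text, bold_ranges=None):
        self.text = text
        self.bold_ranges = bold_ranges or []   # list of (start, end) pairs

    def render_screen(self):
        # Simplified "screen" view: the text plus its formatting.
        return {"text": self.text, "bold": sorted(self.bold_ranges)}

    def render_print(self):
        # A WYSIWYG editor should produce the same content and formatting here.
        return {"text": self.text, "bold": sorted(self.bold_ranges)}


def is_wysiwyg(doc):
    # The formal principle: screen rendering equals printed rendering.
    return doc.render_screen() == doc.render_print()


assert is_wysiwyg(Document("Hello, world", bold_ranges=[(0, 5)]))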
Value of Interaction Models

The true value of interaction models lies in the insights gained by examining properties of interaction. They provide a background for approaching new areas, whether or not a formal approach is taken. These insights, though obtained through formal analysis, can be abstracted into informal principles that stand on their own.

Example:
Understanding the concept of 'undo' in the context of
interaction models helps designers create systems with
effective error recovery, enhancing user experience.
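
As a rough illustration of that 'undo' insight, the sketch below keeps a stack of previous states so any action can be reversed. The TextEditor class is hypothetical, made up for this example rather than taken from the lecture.

class TextEditor:
    """Hypothetical editor used to illustrate undo as error recovery."""

    def __init__(self):
        self.text = ""
        self._undo_stack = []                 # snapshots of earlier states

    def insert(self, s):
        self._undo_stack.append(self.text)    # remember the state before acting
        self.text += s

    def undo(self):
        if self._undo_stack:                  # undoing with no history is a no-op
            self.text = self._undo_stack.pop()


editor = TextEditor()
editor.insert("Hello")
editor.insert(" world")
editor.undo()
print(editor.text)                            # -> Hello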
Understanding Mental Models

Mental models play a crucial role in how people interact with computers. A mental model is an individual's internal representation of how a system works.

Mental models are the ideas and expectations users form about a system's functionality. They are influenced by users' experiences, beliefs, and prior knowledge. In HCI, designers aim to align the user's mental model with the system's actual behavior to enhance usability.
Understanding Mental Models

Example:

Imagine a new smartphone user expecting the home button to function similarly to a physical button. If the virtual home button responds as expected, it aligns with the user's mental model.

Importance of Mental Models

Creating a seamless interaction requires designers to consider users' mental models. When a system aligns with users' expectations, it reduces cognitive load and improves the overall user experience. Users can predict system behavior more accurately, leading to increased satisfaction and efficiency.

Example:
A website with an intuitive navigation menu positioned where users expect it, based on common web conventions, enhances the alignment of the system with users' mental models.


User-Centered Design and Mental Models

User-centered design involves actively considering users' mental models during the design process. Techniques such as user testing, feedback collection, and iterative design help designers refine interfaces to better match users' mental models.

Example:
During user testing of a new e-commerce website, designers discover that users expect the shopping cart icon to be located in the top right corner, leading to a design adjustment.
Norman’s Interaction Model

Norman's Interaction Model, proposed by Don Norman, is a framework that explains how users interact with and perceive a system.

This model focuses on two aspects: the user's understanding of the system's status (system image) and the system's actual state (system's reality).

System Image vs System Reality

Norman's model divides the user's interaction into two levels. The System Image represents the user's mental model or understanding of how the system works. The System's Reality is the actual state or behavior of the system.

Example:
In a music streaming app, the System Image may include
the user's expectation that pressing 'play' starts the song
instantly. If there's a delay due to buffering (System's
Reality), it can lead to a mismatch.
Affordances and Signifiers

Norman introduces the concepts of Affordances and Signifiers. Affordances are the perceived actions that an object or interface allows. Signifiers are indicators that communicate these affordances.

Example:
In a touchscreen device, the button's appearance
(Signifier) communicates the affordance of being pressed.
Users intuitively know they can interact with it.
Mapping in Norman’s Model

Mapping in Norman's model refers to the relationship between controls and their effects. Good mapping ensures that users can easily understand how to achieve their goals based on the layout and organization of controls.

Example:
In a car, the spatial arrangement of dashboard controls
(mapping) should align with the driver's mental model of
how these controls affect the car's functions.
Feedback and Conceptual Models

Feedback is crucial in Norman's model, providing information to users about the results of their actions. Conceptual Models are the mental models users form about how the system works based on feedback.

Example:
When typing on a smartphone, the visual feedback of
characters appearing on the screen contributes to the
user's conceptual model of the keyboard.
Constraints and Execution

Constraints limit the actions users can take. Execution refers to how actions are performed. Designers use constraints to guide users and execution to make interactions intuitive.

Example:
In software, disabling a button until all required fields are
filled (constraint) guides users to complete the necessary
information before proceeding.
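
A minimal sketch of that constraint in Python (the field names and the can_submit helper are assumptions made for illustration): the submit action is simply withheld until every required field has a value.

REQUIRED_FIELDS = ["name", "email", "password"]   # hypothetical form fields

def can_submit(form):
    # Constraint: every required field must contain a non-empty value.
    return all(form.get(field, "").strip() for field in REQUIRED_FIELDS)

form = {"name": "Ada", "email": "", "password": "secret"}
print(can_submit(form))    # False -> the submit button stays disabled

form["email"] = "ada@example.com"
print(can_submit(form))    # True  -> the button becomes clickable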
Visibility and Feedback Loops

Visibility ensures users can see the possible actions and their outcomes. Feedback Loops provide continuous information about the system's state, helping users adjust their actions.

Example:
In a file-sharing application, a progress bar (visibility) and
real-time notifications (feedback loop) inform users about
the status of file uploads.
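
The progress bar and notifications can be sketched as a simple feedback loop. The upload function and its callback below are invented for illustration, not a real API: after every chunk the system reports its state, which is what keeps the transfer status visible to the user.

def upload(total_bytes, chunk_size, on_progress):
    """Simulated upload that reports progress after each chunk (feedback loop)."""
    sent = 0
    while sent < total_bytes:
        sent = min(sent + chunk_size, total_bytes)   # pretend a chunk was sent
        on_progress(sent, total_bytes)               # feedback after every step

def show_progress(sent, total):
    # Visibility: turn the reported state into something the user can see.
    print(f"Uploading... {100 * sent // total}% ({sent}/{total} bytes)")

upload(total_bytes=1000, chunk_size=300, on_progress=show_progress)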
Applying Norman’s Interaction Model

To apply Norman's Interaction Model, designers should focus on aligning the user's System Image with the System's Reality. This involves considering affordances, signifiers, mapping, feedback, constraints, execution, visibility, and feedback loops.

Example:
When designing a new mobile app, the designer ensures that the visual elements and interactions match users' expectations, creating a seamless and user-friendly experience.
Abowd and Beale’s Framework

Abowd and Beale’s Framework is like a new way of thinking about how computers and people interact.

It's different because it pays special attention to the surroundings and activities of users, understanding that these things really influence how people use computers.

It's like seeing the big picture, going beyond just the
buttons on a screen, to make the whole experience better
for users.
Abowd and Beale’s Framework

Abowd and Beale’s Framework is a cool way of making computers understand people better.

Imagine you're using a fitness app, and it knows when you're at the gym or taking a jog. Instead of just showing generic workouts, it suggests exercises based on where you are and what you're doing.

It's like having a smart helper that pays attention to your surroundings and activities to make your fitness experience personalized and way more helpful!
Key Concepts of the Framework

The framework introduces two pivotal concepts:

Context Awareness: This involves the system's ability to recognize and understand the current context in which the user is operating. This includes factors such as location, time, and the user's current activity.

Context Representation: Here, the emphasis is on formalizing the information about the user's context, making it interpretable for computing systems. This step is crucial for effective context-aware computing.
Levels of Context
Abowd and Beale identify three levels of context:

User Context: This encompasses personal information about the user, including preferences, habits, and historical interactions.

Machine Context: Details about the system's state, including its current processes, functionalities, and any ongoing computations.

Physical Context: Information about the user's environment, such as location, lighting conditions, and surrounding objects.
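
One way to make the three levels concrete is to write them down as plain data structures. The sketch below uses Python dataclasses; every field name is an illustrative assumption rather than something prescribed by Abowd and Beale.

from dataclasses import dataclass, field

@dataclass
class UserContext:
    preferences: dict = field(default_factory=dict)    # e.g. {"units": "metric"}
    recent_actions: list = field(default_factory=list)

@dataclass
class MachineContext:
    battery_level: float = 1.0                          # fraction of charge left
    running_tasks: list = field(default_factory=list)

@dataclass
class PhysicalContext:
    location: str = "unknown"                           # e.g. "gym", "home"
    time_of_day: str = "morning"
    nearby_devices: list = field(default_factory=list)

@dataclass
class Context:
    """The three levels of context bundled together for the system to use."""
    user: UserContext
    machine: MachineContext
    physical: PhysicalContext

ctx = Context(UserContext(), MachineContext(), PhysicalContext(location="gym"))
print(ctx.physical.location)   # -> gym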
Context-Aware Applications

The framework's strength lies in its application to context-aware computing. By dynamically adapting to users' changing circumstances, context-aware applications offer a more personalized and adaptive user experience.

This is particularly evident in applications that adjust their behavior based on the user's context, leading to a more intuitive and responsive interaction.

Context-Aware Applications

Example:

Consider a fitness app that tailors workout recommendations based on the user's location, time of day, and real-time weather conditions.

Picture a smart grocery app guiding you through the store, suggesting recipes and aisle locations based on your location. This personalized experience, inspired by Abowd and Beale’s Framework, showcases how context-aware applications transform routine tasks into efficient and tailored interactions.
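
A rough sketch of how such a fitness app might combine those signals into a suggestion. The rules, thresholds, and parameter names are all assumptions made for illustration, not part of the framework itself.

def recommend_workout(location, hour, weather):
    # Simple context-aware rules: the same request yields different
    # suggestions depending on where the user is, the time, and the weather.
    if location == "gym":
        return "strength training circuit"
    if weather == "rain":
        return "indoor bodyweight session"
    if 6 <= hour < 10:
        return "morning run"
    return "evening walk"

print(recommend_workout(location="park", hour=7, weather="clear"))   # morning run
print(recommend_workout(location="home", hour=20, weather="rain"))   # indoor bodyweight session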
Context Representation
Context representation involves the formal modeling of
context information. This process ensures that the system
can accurately interpret and respond to user context,
ultimately enhancing the overall interaction. Designers use
various data structures and algorithms to represent and
manipulate contextual information effectively.

Example:
In the context of a transportation app, the representation may involve variables such as the user's location, mode of transportation, historical travel patterns, and current traffic conditions.
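
A minimal sketch of how those variables might be represented and then interpreted into a state the rest of the app can act on. The field names, values, and the interpretation rule are assumptions made for illustration.

travel_context = {
    "location": (33.684, 73.048),               # current GPS coordinates
    "mode": "bus",                              # walking, bus, car, ...
    "usual_commute_hours": [(8, 10), (17, 19)],
    "traffic_level": "heavy",                   # light, moderate, heavy
    "hour": 18,
}

def interpret(ctx):
    # Map raw context variables to a discrete label the app can reason about.
    commuting = any(start <= ctx["hour"] < end
                    for start, end in ctx["usual_commute_hours"])
    if commuting and ctx["traffic_level"] == "heavy":
        return "commute_delayed"                # e.g. suggest leaving earlier
    if commuting:
        return "commute_normal"
    return "off_peak"

print(interpret(travel_context))                # -> commute_delayed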
Challenges and Considerations

While context-aware computing offers substantial benefits, it also presents challenges that designers must navigate. Privacy concerns, the need for real-time and accurate context recognition, and the inherent complexity of managing dynamic contexts are critical considerations in the design process.

Designing for Context Awareness

Designers employing Abowd and Beale’s Framework focus on creating interfaces that seamlessly integrate with the user's context. This involves designing interactions that are not only responsive to immediate user input but also take into account the broader context, providing users with more relevant and intuitive experiences.

Example:
Imagine a smart home control app that adjusts suggested actions based on the time of day, the user's historical preferences, and current conditions in the home environment.
Applications and Future Trends

The principles of Abowd and Beale’s Framework have found applications in diverse domains, including healthcare, education, and entertainment.

As technology continues to advance, the significance of context-aware computing is expected to grow, opening up new possibilities for enhancing user experiences across various industries.

Applications and Future Trends

Example:

In healthcare, a context-aware monitoring system could adapt alerts based on a patient's current health status, recent activities, and the surrounding environment, contributing to more personalized and effective care.

Activity
•Select an existing digital interface.

•Evaluate its design against key principles of Norman’s Interaction Model

•System Image vs. System's Reality
•Affordances
•Signifiers
•Mapping
•Feedback
•Constraints and Execution
•Visibility and Feedback Loops