
Using Personal Objects as Tangible Interfaces for Memory Recollection and Sharing
Elena Mugellini, University of Applied Sciences of Western Switzerland, Fribourg, Bd de Pérolles 80, 1705 Fribourg, Switzerland, elena.mugellini@hefr.ch
Elisa Rubegni, University of Siena, Via dei Termini 6, 53100 Siena, Italy, rubegni@media.unisi.it
Sandro Gerardi, University of Applied Sciences of Western Switzerland, Fribourg, Bd de Pérolles 80, 1705 Fribourg, Switzerland, sandro.gerardi@hefr.ch
Omar Abou Khaled, University of Applied Sciences of Western Switzerland, Fribourg, Bd de Pérolles 80, 1705 Fribourg, Switzerland, omar.aboukhaled@hefr.ch
ABSTRACT
Tangible User Interfaces (TUIs) are emerging as a new paradigm of interaction with the digital world, aiming to make that interaction easier than traditional GUI-based interaction. Interaction with TUIs relies on users' existing skills of interaction with the real world [9], thereby offering the promise of interfaces that are quicker to learn and easier to use. It has recently been shown [1] that using personal objects as tangible interfaces is even more straightforward, since users already have a mental model associated with these physical objects, which makes it easier to understand how they can be used. However, TUIs are currently very challenging to build, and this limits their widespread diffusion and exploitation. To address this issue we propose a user-oriented framework, called the Memodules Framework, which allows the easy creation and management of personal TUIs and provides end users with the ability to dynamically configure and reconfigure their TUIs. The framework is based on a model, called MemoML (Memodules Markup Language), which guarantees framework flexibility, extensibility and evolution over time.

Author Keywords
HCI, Tangible User Interface, personal objects, user-oriented framework, model, markup language.

ACM Classification Keywords
H.5.2 User Interfaces; H.5.m Information interfaces and presentation (e.g., HCI): Miscellaneous.

INTRODUCTION
With the constant progress in information retrieval and storage, the amount of information a person owns and handles never stops increasing. Worse, information is dematerializing in our daily life, and people often experience a "lost in infospace" effect. What we often miss in our daily life are tangible shortcuts (reminders) to our information, as books on our shelves used to be [12, 13].

TUIs are emerging as a new paradigm that facilitates user interaction with the digital world by providing a means of integrating the physical and digital worlds. The term was first introduced by Ullmer and Ishii [12], who define Tangible User Interfaces as follows: "TUIs couple physical representations (e.g. spatially manipulable physical objects) with digital representations (e.g. graphics or audio), yielding user interfaces that are computationally mediated but not generally identifiable as 'computers' per se".

It has recently been shown that personal objects and souvenirs can be used as tangible interfaces and ambient intelligent objects [1]. Users have a mental model of the links between their personal physical objects (souvenirs) and the related digital information; as a consequence, those objects can be used as TUIs instead of developing new ones that users would have to learn. In other words, personal objects can act as stimuli connected to events, helping people to evoke particular circumstances and supporting the reconstruction of memory. In this case the stimulus is a tangible cue that enables the retrieval of information from memory.

Accordingly, this paper presents the Memodules Framework, which enables personal objects to act as TUIs for memory recollection and sharing. The framework supports the rapid composition of Memodules (augmented personal objects) with existing devices in the home environment (TV, Hi-Fi, PDA, etc.).
MEMODULES PROJECT
The Memodules project (http://www.memodules.ch) has the objective of developing, experimenting with and evaluating the concept of tangible shortcuts (reminders), which facilitate (a) the control of devices in everyday life and (b) the categorization of information in order to improve information access and retrieval.
The project aims at facilitating user interaction with multimedia information by supporting the creation and management of tangible links to digital contents. The process of "remembering" usually consists in associating something with a sensory cue [4]. For example, we may see a picture of a place visited in our childhood, and the image recalls memories associated with that time. It appears that humans easily access and retrieve information when it is linked to other related information or objects [18]. Tangible interfaces can be useful for establishing links between our memory and information, in order to root abstract information in the real world through tangible reminders. In particular, as Hoven argues extensively [1], personal objects can be used as memory cues supporting memory recall. In fact, souvenirs are connected to the memory of particular events, which can easily be evoked by physically manipulating the object as a carrier of meaning in interaction.

Based on this associative process of remembering, the main objective of our research is to investigate the opportunity for a more complex, multi-sensorial combination of, and interaction between, personal objects and multimedia information (e.g. photos, videos, audio files, text documents, web resources, etc.).

The project is based on a methodological approach which integrates two different perspectives: user-centred design and technology-driven development. User-centred design aims to understand how people collect objects and information, and investigates how personal objects and souvenirs are used for recalling memories, storytelling and sharing experiences. The technology-driven perspective investigates how to augment those objects, which we call Memodules, with embedded sensors and technology in order to enhance their capabilities and support users in recollecting and sharing memories. This paper focuses on and presents the results of the technology-driven perspective.

Memodules Approach
According to the concept of associative memory [18], objects can act as physical reminders of certain memories. Using those physical reminders it is then possible to define scenarios which materialize the user memories evoked by personal objects. Accordingly, the approach followed in the Memodules project to support the process of memory recall is based on three main phases (Figure 1): Phase 1 - Tangible object initialization, Phase 2 - Usage scenario description, and Phase 3 - Action materialization.

Figure 1 Memodules Approach (associative memory: physical reminder, usage scenario and action materialization across Phases 1-3, built on the Memodules architecture)

In order to explore the Memodules approach we decided to focus on travelling as a privileged application domain, since this activity is strongly related to the collection of souvenirs and memories.

The technology-driven perspective has been guided by the following scenario. Sabrina created a folder full of pictures from her last vacation in Greece. She glues an RFID (Radio Frequency Identification) tag on a nice seashell she brought back from Greece and asks the system to associate it with the picture folder. One week later she wants to show her friends a slideshow of her vacation. She places the seashell in a little hole next to the plasma screen, which activates the show. At the same time, the CD of traditional Greek music she bought in Athens is played on the CD player (see Figure 2). Nicola, who enjoyed the evening and the pictures, gives his business card to Sabrina. The following day, Sabrina places Nicola's business card along with the seashell on top of the communicator and makes a brief gesture to send everything.

Figure 2 Memodules usage scenario

In order to support this type of interaction, and to enable people to use Memodules for organizing, manipulating and consuming information and memories, it is necessary to provide users with an easy way to create, assemble and manage them. (In the rest of the paper the terms "user", "end user", "non-expert programmer" and "normal user" are used interchangeably to refer to any type of user, including people without any PC experience.) This means providing a user-oriented framework which allows Memodules objects to be configured and their usage scenarios to be described in a simple and intuitive way. Moreover, the framework has to support easy reconfiguration of Memodules usage scenarios, since personal memories associated with objects and events are not "absolute" but rather change over time [2].

PERSONAL OBJECTS AS TANGIBLE CUES TO MEMORIES
The innovative works of Wellner's DigitalDesk [14] and Ishii and Ullmer's Tangible Bits [7] pioneered the exploration of post-WIMP (Windows, Icons, Menus, Pointers) tangible interfaces. Ubiquitous computing applications have motivated these novel physical interfaces even further. However, tangible interfaces are difficult and time-consuming to build, requiring a specific technological expertise that limits their widespread adoption. Several research efforts have been made in recent years to develop tools for prototyping tangible user interfaces and designing ubiquitous computing architectures. Examples of such projects are Smart-Its [10], Phidgets [11], iStuff [7] and iCAP [6]. These projects were mainly focused on providing rich prototyping toolkits or programming languages for developing tangible user interfaces and ubiquitous devices from a developer's point of view, rather than focusing on end users. Other interesting projects, focusing more on supporting users who are not expert programmers in managing personal ubiquitous environments, are the Accord [3, 4], Oxygen [8] and GAS [14] projects. All three have focused on giving end users the ability to construct or modify ubiquitous computing environments.

Our work has been inspired by the results achieved within the framework of those projects. However, some fundamental conceptual differences between those projects and our work exist: (1) our focus on personal objects and souvenirs as tangible interfaces instead of generic domestic objects and devices, and (2) our focus on memory recollection and sharing instead of the management of generic everyday activities in the home environment. While those projects refer to the concept of smart objects and their combination to facilitate everyday activities, we concentrate more on the concept of reminiscence and on the association between personal objects and multimedia resources in order to enhance the experience of memory recall.

Our work is intended to support any type of user in using and combining augmented personal objects as a means for memory recollection and experience sharing, and to support the dynamic re-combination of those objects, since personal memories associated with objects and events can change over time [2]. For this reason users should be provided with a mechanism for easily associating and re-associating multi-sensorial information with their personal objects. To achieve this, we developed a generally applicable model describing the objects and devices within the environment and the interactions occurring between them. This model, presented in the following, provides a common, technology-independent description of the whole system, which can be implemented using different technologies.

Finally, it is worth noting that even though this work was derived by investigating the travelling application domain, the framework presented in the next paragraphs is flexible enough to be applied in different contexts.

MEMOML MODEL
MemoML (Memodules Markup Language) has been developed in response to the need for a universally applicable model which is technology-independent and which, as a consequence, can be implemented on different platforms. MemoML is an XML-based modeling language which formalizes the Memodules approach presented above.

MemoML is based on the Document Engineering approach [15], which is emerging as a new discipline for specifying, designing and implementing systems belonging to very different business domains by using XML technologies. The choice of a declarative modeling approach allows us to describe what the framework is like, rather than how to create it, enabling system flexibility, reusability and portability across different implementation technologies. (Declarative programming is an approach to computer programming that involves the creation of a set of conditions describing a solution space, while leaving the interpretation of the specific steps needed to arrive at that solution up to an unspecified interpreter. For more details on declarative programming see http://www.csc.liv.ac.uk/~frans/OldLectures/2CS24/declarative.html#detail)

More specifically, MemoML describes the components and the structure of a Memodules environment. The components are the building blocks of the environment (e.g. Memodules objects, devices, communication protocols, actions, etc.) which can be combined in order to define Memodules usage scenarios. The structure defines what a usage scenario is and the rules it has to obey, such as the constraints associated with actions, which restrict the types of information an action can operate upon (e.g. the action "play music" cannot operate on photos).

The two most important elements in the model are the concepts of the Memodules environment (the MemoEnvironment element in Figure 3) and the Memodules usage scenario (the MemoScenario element in Figure 3); these names correspond to the actual element names in the model. The first comprises the whole collection of available Memodule objects, devices and resources which can be assembled in order to define Memodules usage scenarios. The second refers to Memodules usage scenarios and describes which information is associated with a Memodule and how this information is rendered to the user.

Since Memodules are physical objects that activate actions, a usage scenario describes how a Memodule (or a combination of Memodules) interacts with the surrounding environment. A MemoScenario is composed of three main parts: the Memodule, which is in turn associated with the information the object reminds the user of according to their mental model (e.g. the seashell souvenir reminds Sabrina of the pictures of some Greek holidays); the input device, which is responsible for identifying the Memodule and starting the scenario (the InputDevice element in Figure 3); and the Operation to perform once the Memodule has been identified by the input device.

Figure 3 MemoML model
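Expressed in XML, such a scenario can be pictured roughly as the skeleton below; the element names (MemoScenario, Memodule, InputDevice, Operation) are those of the model, while the attributes and identifiers are illustrative assumptions rather than the actual MemoML syntax.

  <!-- Sketch of a MemoScenario skeleton; attribute names and identifiers are assumed for illustration -->
  <MemoScenario name="greek_holidays">
    <Memodule ref="seashell"/>          <!-- the personal object carrying the memory -->
    <InputDevice ref="rfid_reader_1"/>  <!-- identifies the Memodule and starts the scenario -->
    <Operation>...</Operation>          <!-- what to do once the Memodule is recognized -->
  </MemoScenario>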

The collection of information (a set of pictures, a playlist, etc.) a Memodule is associated with is called a MemoCluster (see the next paragraph).

Every Memodule can either belong to a generic Memodule category (e.g. the generic business card category) or be a single specific object. If it belongs to a generic category, it inherits all the properties and pre-defined usage scenarios defined for that category. Suppose the "send_email" usage scenario has been defined for the generic "business card" category as follows: every time a business card is approached to the PC, the system sends an e-mail message to the address shown on the business card. When Sabrina adds the business card of her best friend to the system, this Memodule is automatically associated with the pre-defined send_email scenario, and Sabrina is able to send e-mails to her best friend without doing anything else.

Figure 4 Memodules element

MemoClusters are virtual folders defined by the user containing collections of information (a set of pictures, a collection of videos, a compilation of songs, etc.). Information in a cluster is constrained to be of the same type (e.g. the "picture" type includes ".jpg", ".gif" and ".bmp" files, the "music" type includes ".mp3" and ".wav" files) in order to guarantee the consistency of the actions operated upon a MemoCluster. It is also possible to associate multiple MemoClusters with a single Memodule and perform a different action upon each of them.

Figure 5 MemoResources element
An InputDevice is a device equipped with an RFID reader which can identify a Memodule and make the link between the Memodule object and the output device where the action associated with the Memodule is to be executed. The InputDevice starts the execution of a scenario upon identification of the corresponding Memodule.

The combination of a Memodule and an input device uniquely identifies a scenario, i.e. it is not possible to associate two different scenarios with the same Memodule-InputDevice pair. On the other hand, if multiple Memodules are combined, different scenarios can be created using the same input device:
• seashell Memodule + business card Memodule + input device 1 = scenario 1
• seashell Memodule + book Memodule + input device 1 = scenario 2

The Operation element describes the action the user wants to perform when interacting with the Memodule object in order to render the associated memories and events. Each Operation is identified by (Figure 6):
• the Action to perform,
• the Application used to materialize the action (since the same action could be carried out by different applications – e.g. the "play music" action could be carried out by the "windows media player" or the "java music player" application),
• the OutputDevice where the action is to be executed,
• the MemoCluster of information the action operates upon,
• the Mode of action execution (parallel or sequential) if multiple operations are defined within the same scenario.

Figure 6 Operation element
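Put together, a single Operation might look like the following sketch, mirroring the five parts just listed; the concrete values and the attribute syntax are assumptions rather than the published schema.

  <!-- Sketch of an Operation; concrete values and attribute syntax are illustrative -->
  <Operation mode="parallel">                       <!-- parallel or sequential, if several operations exist -->
    <Action>play_music</Action>
    <Application>windows_media_player</Application> <!-- the same action could also use another player -->
    <OutputDevice>hifi_stereo</OutputDevice>
    <MemoCluster ref="greek_music"/>                <!-- the information the action operates upon -->
  </Operation>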
The MemoEnvironment, as shown in Figure 3, is composed of:
• the collection of Memodule objects,
• the collection of all the MemoClusters, called MemoResources (see Figure 5),
• the set of available devices and applications which can be used within a scenario to materialize actions, described within the MemoConfig element (see Figure 7).

Figure 7 MemoConfig element
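A complete environment description might therefore be organized roughly as below; whether scenarios sit next to or inside the MemoEnvironment element, as well as the exact attribute names, are assumptions made for the sake of the sketch.

  <!-- Sketch of a MemoEnvironment skeleton; the nesting and attribute names are assumed -->
  <MemoEnvironment>
    <Memodules>
      <Memodule id="seashell" category="souvenir"/>
      <Memodule id="nicola_card" category="business_card"/>  <!-- inherits the category's pre-defined scenarios -->
    </Memodules>
    <MemoResources>
      <MemoCluster name="greece_2006_photos" type="picture"/>
    </MemoResources>
    <MemoConfig>
      <InputDevice id="rfid_reader_1"/>
      <OutputDevice id="pc_screen"/>
      <Application id="slideshow_viewer"/>
    </MemoConfig>
  </MemoEnvironment>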

As presented in the next paragraph, MemoML is the basis upon which the framework is built. More specifically, the Action Builder visual editor, presented in the following section, implements the concept of the MemoScenario, supporting user creation of Memodules usage scenarios.

MEMODULES FRAMEWORK
The Memodules framework is a user-oriented framework enabling end users to create their own Memodules environment, and it is built upon the previously described MemoML model. Configuring Memodules means linking them to multimedia information and describing how they interact with the surrounding environment (i.e. how and when the information associated with them is rendered). The framework provides a one-stop platform, i.e. a single platform that allows users to add new Memodules objects to the existing Memodules environment, to describe their usage scenarios and to manage the execution of those scenarios. (The term one-stop platform originates from the e-government research domain, where it indicates a single access point to all public services, providing users with integrated services.) The Memodules framework allows people to create and manage their own Memodules environment by dynamically arranging heterogeneous Memodules and device components with digital information. Moreover, it provides users with the ability to easily configure and reconfigure Memodules usage scenarios. The Memodules framework, which supports the three-phase approach described earlier, is composed of the following components:
• Memodules Lay&Play, which handles Phase 1, corresponding to tangible object initialization
• Memodules Action Builder, which supports Phase 2, corresponding to usage scenario description
• Memodules Console, which handles Phase 3, corresponding to action materialization

Phase 1 - Tangible Object Initialization
This phase is handled by Memodules Lay&Play. As stated before, Memodules are augmented personal objects used to access information and to help control daily-life devices. Memodules Lay&Play allows Memodules to be easily "created" and added to the Memodules environment. It is composed of a webcam and an RFID reader, which make it possible to simultaneously take a picture of the object lying on the Lay&Play system and read the ID of the RFID tag attached to the object. The digital counterpart of the physical object is created (Memodule creation) and added to the list of available Memodules objects displayed in the Memodules Action Builder editor.

Figure 8 Memodules Lay&Play (webcam and RFID reader)
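The digital counterpart produced in this phase can be pictured as a new Memodule entry of the kind sketched below, pairing the tag identifier read by the RFID reader with the snapshot taken by the webcam; the attribute names and file path are illustrative assumptions.

  <!-- Sketch of the entry Lay&Play might add for a newly tagged object; attribute names and paths are assumed -->
  <Memodule id="04A1B2C3" name="seashell" picture="memodules/seashell.jpg"/>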
Phase 2 - Usage Scenario Description
This phase is supported by the Memodules Action Builder, a visual editor based on the puzzle metaphor which handles the creation of usage scenarios for every Memodule. The choice of the puzzle metaphor was inspired by the results achieved within the framework of the Accord project [17]. That project designed a visual editor based on the puzzle metaphor to enable end users to easily configure their ubiquitous home environment. The evaluation activity they carried out with real users extensively confirmed the suitability of their approach [5]. However, although the puzzle metaphor was easily grasped, poor design choices for the icon graphics made it difficult to couple the icons with their physical counterparts [5]. The Memodules Action Builder uses the puzzle metaphor proposed by Accord and extends it in order to support more complex puzzle configurations combining parallel and sequential actions. Moreover, thanks to the Memodules Lay&Play component, the icons representing the digital counterparts of personal objects in our visual editor are created using pictures of the objects themselves, thus facilitating the coupling between a physical object and its digital representation.

The Memodules Action Builder allows users to create interaction scenarios which describe how Memodules interact with the surrounding environment and which devices are involved in the scenario. Scenarios are created using the puzzle metaphor: the user connects components through a series of left-to-right couplings of puzzle pieces, providing an easy-to-understand mechanism for connecting the different pieces. Moreover, constraining puzzle connections to be left-to-right also gives the illusion of a pipeline of information flow, which creates a cause-effect relationship between the different components.

The puzzle editor is composed of a number of different panels (Figure 9). The control panel (at the top of Figure 9) contains the list of available puzzle pieces grouped into six different categories (see below for more details about the categories). Puzzle pieces can be dragged and dropped into the editing panel (or workspace) and assembled in order to create scenarios. When a puzzle piece is dragged onto the workspace it clones itself and becomes a symbolic link. Useless pieces in the workspace can be thrown into the trash (at the bottom-right of the workspace). In order to connect puzzle pieces it is necessary to drag a piece into the vicinity of a fitting target piece. If the two pieces match, visual feedback is provided to the user (the borders of the puzzle pieces become red-colored); conversely, if the user tries to drag a non-compatible piece the system will not allow the assembly. When a Memodule puzzle piece is selected, the MemoCluster panel (on the right) displays the information clusters associated with that Memodule which have been created and used in previous scenarios. Scenarios can be edited starting from puzzle pieces belonging to any category.

Figure 9 Memodules Action Builder (control panel, puzzle piece categories, available components, working space for creating scenarios, MemoCluster panel)

Figure 10 Puzzle piece categories (Memodules, Input Devices, Actions, MemoClusters, Output Devices, Connectors)

Puzzle pieces are grouped into six different categories (see Figure 10) and displayed with different colors: the blue pieces represent Memodules objects; the green ones regroup Input Devices (i.e. devices that can identify Memodules objects and start the corresponding actions); the pink ones represent Actions to carry out (e.g. play music, show pictures, send an e-mail, etc.); the yellow ones stand for information that can be associated with Memodules objects (e.g. photos, music, etc.). As described in the MemoML model, information is grouped into "logical" clusters called MemoClusters. This means that no physical clusters are created; instead, information is browsed from anywhere on the PC and grouped into logical collections.
Black-colored pieces identify Output Devices, i.e. devices where the action is to be carried out (a TV for showing photos, a Hi-Fi stereo for playing music, etc.). Finally, red-colored pieces refer to Connectors, which allow complex scenarios to be created. To date two types of connectors have been implemented: the "AND" connector and the "TIMER" connector. The first is used to combine (1) multiple Memodules (for instance, when Sabrina approaches the seashell Memodule and the business card Memodule together, a specific action has to be performed) and/or (2) multiple actions (e.g. when Sabrina approaches the seashell, both the photos and the music have to be played). The second, the "TIMER" connector, adds a time constraint to the execution of the scenario (e.g. when Sabrina approaches the seashell Memodule to the system, the system has to show the pictures and, 1 minute after the slideshow has finished, play the music).

Finally, the newly created scenario needs to be named and saved. Once this has been done, the user can "play" the scenario and test it to ensure that everything works exactly as intended.

Scenario examples
In the following, two examples of Memodules interaction scenarios that can be created using the Memodules Action Builder are presented.

Figure 11 Scenario examples

The first scenario (Figure 11a) describes the following action:
• When Sabrina approaches the seashell (a souvenir of some Greek holidays) to the Memodules Console (see the next paragraph), the photos of that vacation are displayed on the PC screen and, after 5 minutes, when the slideshow has terminated, a video of the Parthenon is played.
The second scenario (Figure 11b) describes the following action:
• When Sabrina approaches the seashell (a souvenir of some Greek holidays) and the business card of her friend to the Memodules Console, the photos of that vacation are sent to the e-mail address of her friend via the PC.
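To tie these examples back to the MemoML elements described earlier, the first scenario could be written roughly as the sketch below. The element names come from the model (MemoScenario, Memodule, InputDevice, Operation, Action, Application, OutputDevice, MemoCluster, Mode); the identifiers, the attribute syntax and the encoding of the 5-minute delay are illustrative assumptions rather than the published schema.

  <!-- Illustrative MemoML rendering of the first example scenario; identifiers and the delay encoding are assumed -->
  <MemoScenario name="greek_slideshow_then_video">
    <Memodule ref="seashell"/>
    <InputDevice ref="console_reader_1"/>       <!-- one of the Console's three RFID readers -->
    <Operation mode="sequential">
      <Action>show_pictures</Action>
      <Application>slideshow_viewer</Application>
      <OutputDevice>pc_screen</OutputDevice>
      <MemoCluster ref="greece_2006_photos"/>
    </Operation>
    <Operation mode="sequential" delay="PT5M">  <!-- hypothetical TIMER-style constraint (5 minutes) -->
      <Action>play_video</Action>
      <Application>media_player</Application>
      <OutputDevice>pc_screen</OutputDevice>
      <MemoCluster ref="parthenon_video"/>
    </Operation>
  </MemoScenario>

The second example would differ only in combining two Memodule references (the seashell and the business card) and replacing the operations with a single send_email action.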

Phase 3 - Action Materialization
This phase is handled by the Memodules Console. The Memodules Console is based on the Phidgets toolkit [4], and allows the user to start a scenario and to manage subsequent interactions with digital contents. When a Memodule is approached to the Console, the system identifies the Memodule object and retrieves the usage scenario associated with it. Since the Console integrates three different input devices (corresponding to the three RFID readers in Figure 12), three different scenarios can be created for each Memodule, as described in the MemoML model. In order to provide visual feedback to the user, once the Memodule has been identified the green LED associated with the RFID reader (see Figure 12) lights up.

Figure 12 Memodules Console (LCD screen, touch sensors, LEDs, infrared sensor, RFID readers)

Once the scenario has been started, the Console allows the execution of actions to be managed. The Console is equipped with an infrared sensor, activated by the user's hand, which allows the sound volume to be controlled during playback. At the same time, luminous LEDs give visual feedback on the user's hand movement while the volume is being adjusted (Figure 12). Several touch sensors allow the user to interact with the multimedia content while it is playing (play next, previous, stop, etc.). The LCD screen shows metadata about the multimedia content being played, while the circular touch sensor (at the top left) allows the user to speed up or slow down the forward and backward motion of a video or music track.

Scenarios can be executed just after they have been created with the visual editor, giving the user the possibility to test whether they work correctly.
CONCLUSION AND FUTURE WORK
This paper has presented a user-oriented framework, called the Memodules Framework, enabling the use of personal objects as tangible user interfaces that facilitate memory recollection. The framework is intended to support any type of user in using and combining Memodules (augmented personal objects) as a means for memory recollection and experience sharing, and to support the dynamic re-combination of those objects, as the associated memories can change over time. The Memodules Framework is based on a formal model, called MemoML, which describes the concepts of the Memodules environment and the Memodules usage scenario, providing a common, technology-independent description of the whole system that can be implemented using different technologies. The framework is composed of three main parts: Memodules Lay&Play, which allows Memodules to be created easily; Memodules Action Builder, a visual editor based on the puzzle metaphor for describing Memodules usage scenarios; and Memodules Console, which materializes actions and executes scenarios.

As a next step, a thorough evaluation of the framework will be performed in order to assess how acceptable it is for end users. The evaluation will be centered on two main aspects: the comprehensibility of the proposed framework and its underlying concepts, and the willingness to use such technology [14]. Moreover, in the near future we will explore how the proposed framework can be enhanced by adding multimodal interfaces such as speech and gesture recognition systems. This further improvement will allow the design of additional interaction modalities, enhancing user manipulation of and interaction with Memodules towards a richer multi-sensorial experience.

ACKNOWLEDGMENTS
This research has been supported by the Hasler Foundation within the framework of the Memodules project. Particular thanks to Prof. Rolf Ingold, Dr. Denis Lalanne and all the other people involved in the project for their valuable contributions.

REFERENCES
1. Hoven, E. van den, and Eggen, B. Personal souvenirs as Ambient Intelligent objects. Proceedings of the 2005 Joint Conference on Smart Objects and Ambient Intelligence: Innovative Context-Aware Services, Usages and Technologies (sOc-EUSAI '05), Grenoble, France, October 12-14, 2005, vol. 121, pp. 123-128. ACM Press, New York, USA.
2. Hoven, E. van den, and Eggen, B. Design Recommendations for Augmented Memory Systems. Designing for Collective Remembering Workshop at CHI 2006, Montréal, Québec, Canada.
3. Humble, J., Crabtree, A., Hemmings, T., Åkesson, K-P., Koleva, B., Rodden, T., Hansson, P. Playing with the Bits - User-configuration of Ubiquitous Domestic Environments. Proceedings of the Fifth Annual Conference on Ubiquitous Computing (UbiComp 2003), Seattle, Washington, USA, 12-15 October 2003.
4. Rodden, T., Crabtree, A., Hemmings, T., Koleva, B., Humble, J., Åkesson, K-P. and Hansson, P. Between the dazzle of a new building and its eventual corpse: assembling the ubiquitous home. Proceedings of the 2004 ACM Symposium on Designing Interactive Systems, August 1-4, Cambridge, Massachusetts. ACM Press.
5. Hansson, P., Humble, J. and Koleva, B. Understanding and Using the Tangible Toolbox. Technical Report, September 2002.
6. Sohn, T. and Dey, A.K. iCAP: An Informal Tool for Interactive Prototyping of Context-Aware Applications. Extended Abstracts of CHI 2003, 974-975.
7. Ballagas, R., Ringel, M., Stone, M., Borchers, J. iStuff: A Physical User Interface Toolkit for Ubiquitous Computing Environments. CHI 2003.
8. Oxygen project website: http://www.oxygen.lcs.mit.edu/
9. Ishii, H., and Ullmer, B. Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. Conference on Human Factors in Computing Systems (CHI 1997), March 1997.
10. Smart-Its project website: http://www.smart-its.org/
11. Phidgets project website: http://www.phidgets.com/
12. Ullmer, B. and Ishii, H. Emerging frameworks for tangible user interfaces. IBM Systems Journal, 39, 915-931, 2000.
13. Shaer, O., Leland, N., Calvillo-Gamez, E. H. and Jacob, R. J. K. The TAC Paradigm: Specifying Tangible User Interfaces. Personal and Ubiquitous Computing Journal, 2004.
14. Mavrommati, I., Kameas, A., and Markopoulos, P. An editing tool that manages device associations in an in-home environment. Personal and Ubiquitous Computing 8(3-4), ACM Press/Springer-Verlag, London, U.K., 2004, 255-263.
15. Glushko, R. J. and McGrath, T. Document Engineering: Analyzing and Designing Documents for Business Informatics and Web Services. MIT Press, 2005.
16. Mugellini, E., Gerardi, S., Baechler, D. and Abou Khaled, O. MyMemodules: a Graphical Toolkit for the Easy Creation of Personal TUI. 19th ACM Symposium on User Interface Software and Technology (UIST 2006), Montreux, Switzerland, October 15-18, 2006.
17. Accord project website: http://www.sics.se/accord/
18. Hoven, E. van den. Graspable Cues for Everyday Recollecting. PhD Thesis, Department of Industrial Design, Eindhoven University of Technology, The Netherlands, 2004, ISBN 90-386-1958-8.
