CMM Collaborative Brief
Original brief
Personalised Skills Prompt
Aim - users' reflection on their own skills and experiences gives them confidence and prompts them in new situations.
Functionality - An app which invites users to upload and tag around 20? photos of themselves in emotionally charged situations. The user tags these
photos with the skills shown, such as resilience, commitment, achievement or fun-loving. These photos can be summoned as needed later on, i.e. the
user taps a tag word and a photo appears.
This should be mobile only (the vision is for a smart watch) and can be restricted to Android, although iOS would be nice to have. It doesn't need to be
robust on older versions.
The tool may go out on the BBC's Taster platform, so it will need to be technically compliant, secure, and have a beautiful UX.
Background
How will an app help reinforce self-confidence, and help prompt with good ideas in new or challenging situations?
Investigation
In setting out the requirements for this app, we have made a number of assumptions. The most significant has been to model the experience of
dealing with a 'stressful' situation and how we might fit an app companion into this process.
First off, we're assuming that a stressful situation is such because we've never seen it before, or because a previous similar situation ended with a
bad experience. However, there are positively stressful situations too, which we count as exhilarating, or even hilarious.
For the app, all these sorts of situations are ones which we need to help the user assimilate. We start with modelling a feedback loop something like
this:
We are involved in an event; we react to the event; we record the event - and how it went - in our memories; we assimilate these memories into our
own experience. Sometimes, if we're aware of self-improvement, we'll want to make some sort of resolution about how we want to behave in a similar
situation in the future.
We want the app to be able to assist us at all points in this loop; especially when the event occurs the next time.
Here's how we see the app helping out:
Reflect
Once the event has happened and we have finished reacting to it and its immediate consequences, we are into a much more reflective phase, where
we are trying to make some sense of the experiences. In our memory, we can revisit them, but they are often altered by how we feel. Our memories
can become corrupted even as we try to make sense of them. We want to be able to revisit the memory as often as we like; mull it over; change our
opinion of it, without changing the facts of what happened. We want to be able to start to understand why we acted as we did - good or bad - so we
can either change for the better in future, or reinforce something which worked well.
Resolve
Obviously, our period of introspection can't last for too long. Once we've figured out all we need to, we need to distil what we have learned; capture it
and signpost it. The resolution to the event is very important; it's what we will use to reassure ourselves when we come this way again in the future. A
short encouraging message is needed, along with a reminder of the positive attributes of our character we have used, or could use.
Browse. Be Proud.
Long-term use of the app could yield many benefits, but being able to move through a narrative of personal development, visiting the most important
times in our lives, is perhaps the most compelling. It adds a great deal of value to an activity we already find ourselves doing in unguarded moments:
browsing. Instead of a photo album, though, we have a self-curated story with a journey and an ending. Care would need to be taken to ensure that
the resolution of the story is always done positively, so that revisiting it is always encouraging. Ultimately, the message given by the app is that all
experience is good experience.
A note on privacy
This app is one which deals with the most intimate details of people's lives. All information is to be kept privately within the app, although its data may
be backed up, for transfer to another device. Data from the device is not to be shared.
Conclusions
We have ascertained the following important functions of the app:
Capturing an important event. Tagging and archiving.
Revisiting the event. Browsing and searching.
Capturing reflection. Annotating the event.
Building on experience. Linking similar events.
Providing an encouraging, satisfying resolution.
Finding wisdom: delving into a library of experience.
Further we have identified the following drivers:
Interaction: the more the app is used, the better.
Content: the more content types supported, the more expressive the experience.
Convenience: should be as easy as browsing a photo album.
Longevity: the more events captured, the more experience is available to draw on.
App Description
From the above investigation, here's a description of how a mobile app might work. We will use this to give an idea of UI design, components and an
estimate to get to a minimum viable product.
Common UI Themes
We will uphold the following themes right throughout the app:
Minimal UI: a UI which gets out of the way when it isn't needed
Default Settings: Reduce fiddling at crucial points
Minimal Typing: Concentrate on choosing and browsing, rather than slow input of information. Use dictation / transcribing where possible for
text entry.
Use of tags for expression and search.
Use of audio transcription to enable audio diaries to be searchable. Free text as a search criterion.
Reuse of components in different contexts to get the maximum functionality from minimal development effort.
Event Capture
The nature of any event worth capturing is that we are not going to want to spend time or undue attention on capturing it. If it does occur to us to
capture it, we want to spend as little time as possible doing that and then get back to the important stuff. We need the device to offer a set of capture
methods which are practically automatic, like a set of buttons on the homescreen, which capture quickly and then get out of the way. Wherever
possible, we want to use persistent settings on the device to hold default values, so that we don't have to fiddle at the crucial point of capture.
Picture
Button on device homescreen. Shows camera + view finder, set up for stills, with preferred settings. Takes picture, disappears.
Video
Button on device homescreen. Shows camera + view finder, set up for video, with preferred settings. Takes video. On 'stop', disappears.
Audio
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears.
Audio Transcribe
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears. Background processing
occurs on the audio. An unadorned text document is generated. Transcription is very useful, but limited and inaccurate. It could be used for things like
searching and tagging, if not as actual prima facie data.
Buffered Audio
Button on device homescreen. Shows custom audio capture. On start, moves to the background, and is available through the 'notifications' bar. While
in the background, records the last x minutes / seconds of audio, where x can be set in the settings.
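The "last x minutes" behaviour above amounts to a rolling buffer. The sketch below illustrates the idea only; the class and method names are assumptions for illustration, not part of the brief, and a real implementation would sit behind Android's audio recorder.

```java
import java.util.ArrayDeque;

// Sketch of the rolling-buffer idea behind Buffered Audio: keep only the
// most recent window of samples, discarding the oldest as new data arrives.
// Names here are illustrative, not taken from the brief.
class RollingAudioBuffer {
    private final ArrayDeque<Short> samples = new ArrayDeque<>();
    private final int capacitySamples; // derived from the 'x minutes' setting

    RollingAudioBuffer(int sampleRateHz, int windowSeconds) {
        this.capacitySamples = sampleRateHz * windowSeconds;
    }

    // Called for each chunk delivered by the recorder while in the background.
    void write(short[] chunk) {
        for (short s : chunk) {
            if (samples.size() == capacitySamples) {
                samples.pollFirst(); // drop the oldest sample
            }
            samples.addLast(s);
        }
    }

    int size() {
        return samples.size();
    }
}
```

On 'save', the buffer's contents would be written out as an ordinary audio item, so the rest of the app can treat it like any other capture.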
Location
We can set the option to tag event data with the location the device was in at the time, although primary data is not explicitly tagged. For instance, the
EXIF data in a picture file will never be written to. Instead, the location of newly captured event data will be associated with the data.
The Event
To the App, an event is made up of many pieces of prima facie data. For example, we could take several photos, one after the other, then some
audio describing how we feel. It's important that these bits of data are associated with the same event, so we need a persistent setting that provides
some control over this.
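One possible form that persistent setting could take, offered here as an assumption rather than anything specified in the brief, is a time window: a capture joins the current Event if it arrives within a configurable number of seconds of the previous capture, otherwise a new Event starts.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical grouping heuristic: captures close together in time belong to
// the same Event. The window would come from the app's persistent settings.
class EventGrouper {
    private final long windowSeconds;
    private long lastCaptureEpoch = Long.MIN_VALUE;
    private final List<List<String>> events = new ArrayList<>();

    EventGrouper(long windowSeconds) {
        this.windowSeconds = windowSeconds;
    }

    void capture(String item, long epochSeconds) {
        if (events.isEmpty() || epochSeconds - lastCaptureEpoch > windowSeconds) {
            events.add(new ArrayList<>()); // start a new Event
        }
        events.get(events.size() - 1).add(item);
        lastCaptureEpoch = epochSeconds;
    }

    int eventCount() {
        return events.size();
    }
}
```

With a five-minute window, two photos taken a minute apart join one Event, while an audio note recorded much later starts a new one.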
Capturing Reflection
We want to revisit an event so that we can re-live it and understand it. The App should give us the tools to do this, so that it's both easy to add the
annotations that we want to, AND easy to consume them when we come back to the Event and want to refresh our understanding. It should be as
browsable as a scrap-book.
Resolution
Now we've done our thinking, we have made our comments and understood our experience. It's time to resolve our thoughts.
From the Curation wizard, we can choose the option to Archive the event to the Library. This will start the Archive Wizard.
Archive Wizard
The Archive Wizard will allow us to view the Event in the same ways as the Curation Wizard, browsing both the prima facie and Developing items, so
we can use them as inspiration. It also allows us to:
Write a short message to our future self, who will be needing our wisdom
Build a set of positive statements about ourselves, in the light of this experience.
Add the best bits of 'treasure' from our Developing and prima facie galleries.
Link to similar events from the Library
Archive the Event to the Library
The statement builder is perhaps the most important part of the Wizard: it guides us to expressing four 'I am...' statements, in the context of the Event.
We can choose only from a list of adjectives curated for positivity.
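The statement builder rule can be sketched in a few lines. The curated adjective list below is a placeholder, loosely echoing the brief's tag examples; the real list would be curated editorially.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the statement builder: exactly four "I am..." statements, each
// built from an adjective on a curated, positive-only list. The sample
// adjectives are placeholders, not an agreed list.
class StatementBuilder {
    private static final List<String> CURATED =
            Arrays.asList("resilient", "committed", "fun-loving", "brave", "calm");

    static List<String> build(List<String> adjectives) {
        if (adjectives.size() != 4) {
            throw new IllegalArgumentException("exactly four statements are required");
        }
        List<String> statements = new ArrayList<>();
        for (String a : adjectives) {
            if (!CURATED.contains(a)) {
                throw new IllegalArgumentException("not a curated adjective: " + a);
            }
            statements.add("I am " + a + ".");
        }
        return statements;
    }
}
```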
Once an Event is archived, it is listed in the 'Library'.
The Library has exactly the same format as the InBox - we can easily switch between the two of them.
The Library
The Library is searchable by tag and free text in exactly the same way as the InBox. It helps us to find experiences with a resolution; ones which we
have worked on to a conclusion and which we value. Items are listed by the name given to them during curation, and have a thumbnail showing the
first piece of treasure.
On finding an item, we can do the following:
View
Move to the Inbox
Viewing the item shows the Resolution only. If we want to go into the gory details, we must move the item back to the InBox!
Recent Events
Recent Events is a list of events which have recently been accessed, sorted by time of access, most recent first. This area is intended to be an easy
way to access the Event we are currently curating after leaving the app, or after looking at another Event for reference. It should be accessible
conveniently in the main area of the app, and is complementary to the InBox and the Library.
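The access-ordered list above is a small, well-understood structure. A minimal sketch, with illustrative names only:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the Recent Events list: re-accessing an Event moves it to the
// front, so the list stays ordered by time of access, most recent first.
class RecentEvents {
    private final List<String> order = new ArrayList<>();

    void accessed(String eventId) {
        order.remove(eventId); // drop any earlier position
        order.add(0, eventId); // most recent goes to the front
    }

    List<String> list() {
        return new ArrayList<>(order);
    }
}
```

In the real app the list would be persisted and probably capped at a small fixed length, but the ordering rule is the essential part.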
Proposal
We propose an Android app, utilising the following UI components, and configuration:
Home Screen
The device home screen will be populated with several icons:
Main App
Capture Audio
Capture Still
Capture Video
Capture Buffered Audio
Settings
Main Screen
UI: Tabbed UI, featuring InBox, Library and Recent allowing the user to move between them easily. When working, the Recent list allows the user to
come out of an event, use the Paste Bin from another event, and come back.
Capture Apps
UI + Action: Each capture app will be distributed with the application, but available globally, and launchable from the Homescreen.
Picture
Shows camera + view finder, set up for stills, with preferred settings. Takes picture, disappears.
Video
Shows camera + view finder, set up for video, with preferred settings. Takes video. On 'stop', disappears.
Audio
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears.
Audio Transcribe
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears. Background processing
occurs on the audio. An unadorned text document is generated.
Buffered Audio
Button on device homescreen. Shows custom audio capture. On start, moves to the background, and is available through the 'notifications' bar. While
in the background, records the last x minutes / seconds of audio, where x can be set in the settings.
Settings
UI: Addition to the standard Android Settings UI, providing settings for capture and app behaviour.
Search Component
UI: A bar-type component which allows the user to define tags and free text as search criteria. Provides a 'Search' button to start the search. Free
text is a simple text input view.
Tag Search
UI: Tags are defined using a simple text view with a set of drop-down suggestions. Suggestions are compiled from the tags already used in the App. A
tag's definition can be queried from the UI via an online dictionary, and supplied in the Word Definition View.
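The drop-down suggestion behaviour described above is essentially a prefix match over previously used tags. A minimal sketch, assuming tags are normalised to lower case; names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;
import java.util.TreeSet;

// Sketch of tag suggestion: suggest previously used tags that start with
// whatever the user has typed so far.
class TagSuggester {
    private final TreeSet<String> usedTags = new TreeSet<>();

    void recordTag(String tag) {
        usedTags.add(tag.toLowerCase(Locale.ROOT));
    }

    List<String> suggest(String prefix) {
        String p = prefix.toLowerCase(Locale.ROOT);
        List<String> out = new ArrayList<>();
        for (String t : usedTags.tailSet(p)) {
            if (!t.startsWith(p)) break; // sorted set, so we can stop early
            out.add(t);
        }
        return out;
    }
}
```

A sorted set keeps the scan cheap: matching tags are contiguous, so the loop stops at the first non-match.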
Confirmation Component
UI: Last part of the wizard, showing name and tag cloud.
Gallery
A UI component, able to show the following media types, browsable by swipe:
Audio
Audio Transcribed
Video
Still
Event
Internet Link
Local Link
Text
Where possible, the gallery will show a representative image, or text indicating the media available from each item. On selection, the gallery will play
the item using a suitable player. The gallery may show UI to enable selection and deletion of an item, and copying a reference to the item to the Paste
Bin (see later).
Audio Player
Audio Transcribed Player
Video Player
Still Player
Event Player (Inbox Curator, or Library Viewer)
Link Player (no external linkage, content only)
Text Player
Gallery
Reused: As above.
Confirmation Component
UI: Last part of the wizard, showing name and tag cloud. With a 'finish' or 'cancel' option.
Gallery
Reused: As above.
Confirmation Component
UI: Last part of the wizard, showing name and tag cloud. With a 'finish' or 'cancel' option.
Data Components
The App will do a great deal of data archiving, searching and indexing. For this reason, it will need to utilise an SQL DB, within an Android Service,
providing a queuing mechanism and thread safety. Marshalling and unmarshalling of data will be done using JSON, conforming to a schema.
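To make the marshalling idea concrete, here is a hand-rolled sketch of what an Event record might look like as JSON. The field names are assumptions; the real schema would be agreed during development, and a proper JSON serialiser would be used rather than string-building (this version does not even escape quotes).

```java
// Illustrative sketch only: marshal a minimal Event to JSON before writing
// it to the database service. Field names are hypothetical.
class EventMarshaller {
    static String toJson(String name, String[] tags, long createdEpoch) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"name\":\"").append(name).append("\",");
        sb.append("\"created\":").append(createdEpoch).append(",");
        sb.append("\"tags\":[");
        for (int i = 0; i < tags.length; i++) {
            if (i > 0) sb.append(",");
            sb.append("\"").append(tags[i]).append("\"");
        }
        return sb.append("]}").toString();
    }
}
```

Validating each record against the schema on unmarshalling would catch corruption before it reaches the UI.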
Location Components
The App will need access to Location Based Services, and will use open source options.
.... Estimate Removed....