Report
Table of Contents
Introduction
Problem Statement
Scenario Map
Ideation
Digital Prototype
Conclusion
References
Introduction
It is impossible to overestimate the significance of user interface (UI) design in the continuously evolving landscape of mobile technology. Interfaces play a crucial role in boosting user experiences as mobile devices become an essential part of our daily lives. CSE2UI Assignment One is presented by the Department of Computer Science and Information Technology (Rustambek, 2023).
Problem Statement
How to make gesture keyboards on touchscreen mobile devices more functional is the main issue this
assignment attempts to answer. Users can enter text using gesture keyboards, a cutting-edge mode of
input, by tracing a path across the required letters on a virtual keyboard. For example, to spell the word "method," one might touch the 'm' key, slide through 'e', 't', 'h', 'o', and 'd', then lift their finger. The computer reads this gesture and types the most likely word, in this case "method."
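The word-selection step described above can be sketched in code. The following is a minimal illustration, not the decoder of any real gesture keyboard: it assumes the traced path has already been reduced to the ordered sequence of keys the finger crossed, and it simply picks the dictionary word best covered by that sequence.

```python
# A minimal sketch of gesture-to-word matching. Assumption: the traced path
# has already been reduced to the ordered sequence of keys the finger crossed
# (here the string "methqod"); real decoders use far richer spatial models.

def score(trace: str, word: str) -> int:
    """Count how many letters of `word` appear, in order, within `trace`."""
    it = iter(trace)
    return sum(1 for ch in word if ch in it)

def best_word(trace: str, dictionary: list[str]) -> str:
    # Prefer words fully covered by the trace; break ties by word length,
    # so longer (more specific) candidates win.
    return max(dictionary, key=lambda w: (score(trace, w) == len(w), len(w)))

words = ["method", "met", "mood", "mother"]
print(best_word("methqod", words))  # prints "method"
```

Real systems also weight candidates by language-model frequency and by how closely the path hugs each key, but the subsequence test above captures the core idea of recovering a word from an imprecise trace.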
Users routinely take actions beyond text input in typical settings, such as modifying text (e.g., copy, paste, delete) or launching application features (e.g., "save"). Traditionally, menus have made these tasks easier to perform. Implementing menus on touchscreen mobile devices, though, presents difficulties, because users must tap menu items with their fingers, which is awkward on small screens and takes up valuable screen space.
The main issue at hand is how to improve the gesture keyboard paradigm such that text entry and
command activation are smoothly accommodated inside a single system on touchscreen mobile
devices. The investigation of original solutions to this intriguing challenge is the focus of this project.
As we work through this project, we will build a scenario, come up with creative design concepts, create a digital prototype, record a video tour, and write a thorough report. Each of these tasks helps students develop a comprehensive understanding of UI design principles and how to apply them practically to real-world problems.
Through this project, we hope to provide students with the knowledge and abilities necessary to address challenging UI design issues, encouraging creative solutions that improve the touchscreen experience.
Scenario Map
Figure 1: Scenario Map
Map?type=whiteboard&node-id=0%3A1&t=AKB9zoJ6YVQ2wb5q-1
1. User Action: The user starts the gesture keyboard on their touchscreen mobile device to enter
text.
Comment: At this point, the user is getting ready to enter text using the gesture keyboard.
Question: How can we distinguish between command input gestures and standard text input gestures?
2. Text Entry: By using gesture input, the user selects individual letters to begin entering text.
Comment: It's crucial to provide an easy-to-use text entry method.
Question: Are there any particular difficulties that users can encounter while typing lengthier
words or phrases?
3. Text Editing: The user encounters a scenario where they need to perform text editing tasks, such as copying, pasting, or deleting text.
Comment: Text editing is a common task, but it might require distinct gestures or actions to perform.
Question: How can we incorporate text editing commands without disrupting the text entry flow?
4. Command Activation: A command, such as "save," must be activated by the user while using the
gesture keyboard.
Comment: Activating commands ought to be simple and clearly distinct from entering text.
Question: What gestures would make command activation intuitive without being mistaken for text input?
5. Feedback: By identifying gestures and carrying out the appropriate tasks, the system delivers
feedback.
Comment: Feedback is essential to let the user know that their gestures have been recognised.
Question: How can we make sure that the user receives rapid, unambiguous feedback on their gestures?
6. Command Execution: The selected command, in this case, "save," is executed by the system.
Comment: It's important that the system accurately executes the intended command.
Question: How can the system reliably identify the intended command from a range of gestures?
7. User Experience: Throughout the scenario, maintaining a positive and efficient user experience is
paramount.
Question: What steps can be taken to optimize the overall user experience when using the gesture keyboard?
The scenario map graphically depicts the user's journey through the gesture keyboard interface, emphasising the significant points where command activation and text input intersect. As stated in the assignment's objectives, this map serves as a starting point for our investigation of creative ways to combine command input with text entry.
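The scenario's seven steps can be summarised as a simple dispatch-and-feedback loop: each incoming gesture is classified as either text entry or a command, executed, and acknowledged with feedback. The gesture encoding and the swipe-to-command table below are illustrative assumptions, not part of the assignment brief.

```python
# Sketch of the scenario's dispatch-and-feedback loop. The gesture encoding
# (dicts with a "kind" field) and the swipe-to-command table are illustrative
# assumptions, not taken from the assignment brief.

def handle(gesture: dict) -> str:
    """Classify a gesture as text entry or a command, then report feedback."""
    if gesture["kind"] == "trace":                 # path across letters -> text
        return f"typed '{gesture['word']}'"
    if gesture["kind"] == "swipe":                 # dedicated swipe -> command
        command = {"up": "save", "down": "delete"}.get(gesture["dir"])
        return f"executed {command}" if command else "unrecognised gesture"
    return "unrecognised gesture"

# Feedback shown to the user after each gesture in the scenario:
for g in [{"kind": "trace", "word": "method"}, {"kind": "swipe", "dir": "up"}]:
    print(handle(g))  # prints "typed 'method'" then "executed save"
```

The returned string stands in for the visual or haptic feedback of step 5; the key design question the scenario raises is precisely how the first branch (trace versus swipe) is decided without user error.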
Ideation:
In the pursuit of addressing the central question of this assignment, which revolves around enabling
command input on the gesture keyboard while maintaining the integrity of text input, it becomes
essential to explore and articulate innovative ideas and concepts. The following section outlines potential solutions and concepts aimed at distinguishing between text input and command input on the gesture keyboard.
1. Gesture Differentiation:
Concept: Introduce a distinct gesture or motion pattern that signifies the initiation of
command input. For instance, a double tap or a circular motion could indicate the
transition from text input mode to command input mode (Mackamul, 2023).
Advantages: This method reduces the possibility of user confusion between text input and
command input by relying on natural movements that users can easily adapt to.
2. Gesture Triggers:
Concept: Assign dedicated swipe gestures to frequently used commands. For example, a swipe upward could mean "save," while a swipe downward means "delete."
Advantages: Users can easily activate commands without having to explicitly switch modes.
3. Contextual Gestures:
Concept: Use context-aware gestures that consider the user's present activity or application. Depending on whether the user is in a text editor, email client, or web browser, the same gesture could trigger a different, context-appropriate command.
Advantages: This approach enhances the versatility of gesture commands, allowing users to rely on commands suited to their current context.
4. Gesture Combinations:
Concept: Combine simple gestures into sequences that map to commands. For example, a double tap followed by a letter gesture could trigger a "save as" function (Reed, 2023).
Advantages: By using combinations, users can access a wide range of commands without memorising a large set of unique gestures.
5. Visual Feedback:
Concept: Display visual cues on the screen to indicate the mode the gesture keyboard is
in. For instance, when in text input mode, the virtual keyboard layout is visible, while in command input mode a distinct overlay or highlight is shown.
Advantages: Visual feedback provides users with clear indications of the current mode, reducing the risk of mode errors.
6. Voice Commands:
Concept: Incorporate voice recognition alongside gesture input. Users can trigger
command input by speaking specific keywords or phrases, enhancing the versatility of the input system.
7. User Customization:
Concept: Allow users to customize gestures and their associated commands, tailoring the interface to their own preferences and workflow.
Each of these concepts addresses the difficulty of differentiating between text input and command input within the gesture keyboard paradigm in a different way. The choice of which concept to implement may change depending on elements like user preferences, context, and the complexity of the mobile application. These ideas serve as a basis for the later creation of a digital prototype, where one of them will be chosen and further developed to show how it might be used in practice.
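As a sketch of how concept 1 (gesture differentiation) could combine with concept 7 (user customization), the following toy class toggles between text and command modes on a double tap and lets the command bindings be remapped. All gesture names and default bindings here are assumptions for illustration, not part of the assignment.

```python
# Toy sketch combining concept 1 (a distinct double tap switches modes) with
# concept 7 (user-remappable command bindings). Gesture names such as
# "double_tap" and "swipe_up", and the default bindings, are assumptions.

class GestureKeyboard:
    def __init__(self, bindings=None):
        self.mode = "text"
        # Users may pass their own gesture -> command table (concept 7).
        self.bindings = bindings or {"swipe_up": "save", "swipe_down": "delete"}

    def on_gesture(self, gesture: str) -> str:
        if gesture == "double_tap":        # distinct gesture toggles the mode
            self.mode = "command" if self.mode == "text" else "text"
            return f"mode -> {self.mode}"  # hook for visual feedback (concept 5)
        if self.mode == "command":
            return self.bindings.get(gesture, "unknown command")
        return f"typed: {gesture}"         # in text mode, gestures enter text

kb = GestureKeyboard()
kb.on_gesture("double_tap")        # switch to command mode
print(kb.on_gesture("swipe_up"))   # prints "save"
```

Keeping the mode explicit in one place is what makes the visual-feedback concept cheap to add: whatever the UI layer renders, it only has to observe `kb.mode`.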
Digital Prototype:
We demonstrate one of the suggested ideas from the ideation stage in this section with a digital
prototype created using the Figma design platform. The concept of distinguishing between text input
and command input on the gesture keyboard for touchscreen mobile devices is demonstrated by the
prototype.
https://www.figma.com/file/KiqYeZZp8WD7tQERK4wIhC/Keyboards-(Community)?
type=design&node-id=0%3A1&mode=design&t=y3gGOsMcwRr7GVpw-1
Figure 2: Prototype
Our approach to the problem of seamlessly combining command input with text input in the gesture keyboard paradigm is made tangible via the Figma digital prototype. It is made to look and feel like a real gesture keyboard in use, demonstrating the following:
1. Text Entry: Using the conventional text input mode, users can see how the gesture keyboard works. By dragging their finger around the virtual keyboard layout, they may enter text.
2. Command Activation: The prototype illustrates how specific gestures or groups of gestures can activate different commands. For instance, the "save" command is activated by an upward swipe gesture, as proposed in the ideation section.
3. Visual Feedback: Users can better grasp which mode they are in (text input or command input) when there are obvious visual cues on the screen, as mentioned in the ideation section.
4. User Interaction: People can engage with the digital prototype to see how the suggested solution behaves in practice.
The Figma digital prototype is enhanced by the video walkthrough, which offers an interactive display
of its behaviour and capabilities. Users can follow the step-by-step instructions to learn how to switch
between text input and command input modes and how to utilise gestures to carry out different
commands.
Together, the digital prototype and video walkthrough practically demonstrate our creative response to the problem posed by this task. Users can learn more about how the proposed concept
distinguishes between text and command input on gesture keyboards through these resources, which
will ultimately improve the usability and effectiveness of touchscreen mobile devices.
The video walkthrough and Figma prototype are crucial in helping to visualise the real-world use of our design and its potential advantages for users. They offer a hands-on experience that supplements the ideation and the scenario map and shows how our design concept works in practice.
Conclusion:
The central problem of this assignment has been to extend the gesture keyboard paradigm to smoothly accommodate both text entry and command activation on touchscreen mobile devices. We have attempted to offer creative answers to this complex challenge through a structured approach that includes scenario mapping, ideation, digital prototyping, and a video walkthrough.
This assignment was motivated by the increased demand for user interface optimisation on touchscreen mobile devices. The difficult part was enabling users to switch between text entry and command activation without sacrificing the effectiveness and usability of the gesture keyboard.
We put forth ideas spanning gesture differentiation, gesture triggers, contextual gestures, gesture combinations, visual feedback, voice commands, and user customization, drawing from a variety of innovative thoughts. These ideas sought to establish a distinct and understandable line between text input and command input.
We visualised the user journey using our scenario map, identifying crucial touchpoints where text and command input intersect. The ideation segment expanded on these touchpoints by showcasing fresh
ideas that might make gesture keyboards on touchscreen mobile devices more useful.
Our Figma-hosted digital prototype gave a concrete example of one of our suggested solutions, and the accompanying walkthrough video served as its companion. This prototype demonstrated how users could easily transition between text input and command activation modes using gestures.
The investigation of UI design ideas in this project has been grounded in the current mobile technology landscape. It emphasised the significance of user-centric design, where the effectiveness and experience of the user are crucial. We believe our suggested solutions are a step towards helping people use touchscreen mobile devices more smoothly and with greater control.
References
33(6), p. e6051.
Commands In Ear. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.
Leno, V., Polyvyanyy, A., Dumas, M., La Rosa, M. and Maggi, F.M., 2021. Robotic process mining: vision and challenges. Business & Information Systems Engineering.
Reed, R., 2023. Seniors Guide to iPhone and Apple Watch: 2 in 1: The Definitive and Intuitive Step-by-Step Manual to Master your New iPhone and Apple Watch with Tips and Tricks for Seniors.