
Smart House: The Holistic Smart Hardware Regulator

Gino de Mello, Arie Klaver


1 LIACS, Leiden University, Niels Bohrweg 1, Leiden, The Netherlands

S3239888@vuw.leidenuniv.nl, s3313581@vuw.leidenuniv.nl

Supervised by Nikki Rademaker and course admin F.J. Verbeek.


nikki.rademaker@hotmail.com &
f.j.verbeek@liacs.leidenuniv.nl.

This paper introduces a 3D Smart Home application interface, challenging the prevalent
2D norm. By visualizing devices within their spatial context, we aim to make it easier
for users to locate and regulate them. Through participative design and two user
evaluations, the 3D version proved more task-efficient than its 2D counterpart.
Although user preferences vary, the 3D visualization's efficiency positions it as a
promising advancement in smart home applications.

1 Introduction

Human-Computer Interaction emerged as a field of study in the 1980s and took off with
the creation of the World Wide Web in the 1990s. This led to the rise of information
architecture, which became an important area of study. Eventually, with the invention
of mobile phones, laptops, color displays and better screens, designers and software
engineers collaborated to create exciting experiences for these devices (Benyon, 2013).
Their goal was to enhance the user experience, which is what this project aims to do
as well. There are
numerous applications that allow the user to manipulate their smart devices around the
house in a 2D visualization. However, this might not be the optimal way to regulate
one's smart hardware. As the devices are located in a 3D environment, i.e. the user’s
house, it might be beneficial to make use of this spatial context when displaying the
devices, by rendering them within a 3D representation of the user's house. This would
potentially allow the user to use the mental model of their house and the spatial
metaphor of the 3D application to more easily locate a specific device, compared to
scrolling through a 2D list of devices. The application created for this project will
offer a new user experience while staying simplistic, ensuring it can be used by a wide
range of users, something which, in the researchers' opinion, other 3D visualizations
lack. To enable a comparison with the 3D version, a simple 2D version has also been
made. The 2D version of the application is implemented using a combination of
HTML and CSS; for the 3D version, the Godot 4.1.1 game engine is used.
The research question this paper will try to answer is ‘What is the user friendliness of
3D versus 2D visualizations for regulating smart hardware around the house?’
This paper continues by describing the problem and the user group, followed by how
this project addresses that problem. After this, we explain the results of the tests
conducted with the participants to improve the project, including the changes made
during development and how these impacted the project as a whole. The limitations of
the project are discussed as well. Lastly, this paper summarizes the findings and
results of the project, followed by recommendations for future development.
2 Application background and users

Our project is aimed at offering a new 3D visualization of a Smart Home application
that allows the user to manipulate their smart devices such as lamps, televisions, heating
etc. Traditionally, these kinds of applications are almost entirely offered in a 2D
visualization, but as mentioned before, this does not make use of the spatial context of
the devices and the mental model the user has of their house. After doing some
research we found that there are already some existing 3D visualizations, such as the
Smart Home 3D webapp by Wassim (2024). However, unlike our project, these are not
entirely user-friendly for all ages, as the amount of detail can be confusing. Our
project aims to make the 3D visualization accessible to everyone who wants to control
the smart devices in their house. Therefore, our application can be seen as
relevant to society because numerous people can benefit from it. This has mainly been
done by using a participative design approach, which emphasizes that the researchers
are not the people they are designing for: the intended users will most likely have
different requirements. How we implemented this participative design will be
discussed later in this paper through the user evaluations. We could also have chosen
an interview approach; however, this would have been too time-consuming within the
limited time available for this project.

User analysis
Since our application is aimed at a wide range of users, it was very important for this
project to do a critical user analysis. We want to develop an application that is
accessible to people between the ages of 20 and 65, who use smart devices in their
houses. Therefore, we concluded that our application needs to be simple without too
many details and fancy designs. This is because it can be confusing for older users who
might already have difficulties with using smartphones and tablets. We brought this
detailed user analysis into our user evaluations as well. We tested our project twice,
in user evaluation 1 and user evaluation 2, each with five participants spanning a wide
range of ages. The first user evaluation had participants aged 23, 36, 48, 57 and 71;
the second had participants aged 20, 22, 57, 67 and 73. This allowed us to receive
different views on our application that we would not have gotten if, for example, all
participants had been below 40 years of age, which made it easier to address the
problem stated earlier. The user evaluations will be extensively discussed in the
results and evaluations section of this paper.

3 Simple is already hard enough

As described in the sections above, a 3D visualization of a Smart Home application
did not yet exist in the way we envisioned: the existing versions were not aimed at a
wide audience. A specific usability requirement we set at the beginning of the project
is that both a 20-year-old and a 65-year-old should be able to use our application.
Therefore, simplicity was at the forefront of our design choices for
the 3D application. In contrast to other 3D visualizations, this project does not use a lot
of detail when portraying the 3D environment. Therefore, it is less cluttered and only
the important things such as devices are displayed. In addition, we explain all the
functions with a legend and small explanation at the beginning (Figure 1.1). This is
part of the principles of universal design which has ‘simple, intuitive use’ as one of its
pillars. This argues that the ‘use of the design is easy to understand, regardless of the
user’s experience, knowledge, language skills, or current concentration level.’(Benyon,
2013, p.78). These elements return in our 3D application: it is simple to understand
and could in theory be operated with basic language skills. It can also be used at low
concentration levels, since interacting with the devices is neither difficult nor
mentally demanding (Figure 1.2). The following section will review how well
we have realized our usability requirements and if the 3D version performed better than
the 2D version.

Figure 1.1: 3D Interface when you start the application for the first time.

Figure 1.2: 3D Interface when interacting with devices.


4 Results and Evaluation
This section will discuss the results and evaluation of the prototype we created and used
during the first and second user evaluations. This is part of the participative design we
are using. This means that we use the user evaluations to find the requirements people
need for the application we are building. Participants are actively involved in the design
process (Benyon, 2013). During the first user evaluation, we did not let the users get
familiar with the 2D and 3D versions of our application beforehand. This led to faster
times with the 2D version, since it was easier to recognize and read what the devices
were; in contrast, participants took significantly longer with the 3D version because
they were not yet familiar with the layout of the 3D house. This was solved in the
second user evaluation by giving the users some time to familiarize themselves with
the applications before performing the tasks. After all, if you used the 3D version in
your own house you
would know where your devices are located as well. Another point that was raised
during the first user evaluation is that the task to read the temperature of the central
heating system was not fair. In the 2D version, it was displayed clearly on the screen
and you did not have to click anything. In contrast, in the 3D version, you first had to
click on a radiator to display the temperature. To solve this issue, we added a tab to
the 3D version in the second prototype that displays the temperature on all floors,
making the task fairer during the second user evaluation. We also needed to improve
the 2D version based on the feedback we received. Firstly, the letters should be bigger
to make the text easier to read. Secondly, we should implement a spin box to switch
channels on the televisions.

After implementing the changes to our prototype and altering our approach for the
experiment, the results from the second user evaluation were quite different from the
first, as can be seen in tables 2.1 and 2.2. Seven out of the ten tasks were performed
faster in the 3D version and the average time it took to perform a single task was 0,5
seconds faster for the 3D version than for the 2D version. During the first user
evaluation only three out of ten tasks were performed faster in the 3D version and the
time difference was less than a tenth of a second, so this is a significant improvement.
Allowing the users to familiarize themselves with the layout of the house seems to
have been a good decision, as it made the participants more aware of the locations of
the different devices. This helped the users find the devices quicker in the 3D version
than in the 2D version. Seeing the devices in their spatial context made them easier to
locate and recognize compared to seeing them in the list format used in the 2D version.
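To make the comparison concrete, the per-task averages can be recomputed from the raw first-trial times in Table 2.1. A minimal sketch (the times are transcribed by hand from the table, so treat the literals as illustrative):

```python
# First-trial times per task from Table 2.1 (second evaluation),
# transcribed as (2D seconds, 3D seconds) pairs.
times = [
    (4.4, 2.2), (7.0, 4.2), (9.6, 7.8), (4.6, 2.0), (7.4, 8.4),
    (3.4, 2.4), (7.0, 8.0), (3.0, 2.2), (5.2, 9.0), (12.0, 12.0),
]

# Average completion time per task for each version.
avg_2d = sum(t2d for t2d, _ in times) / len(times)
avg_3d = sum(t3d for _, t3d in times) / len(times)

# Number of tasks where the 3D version was strictly faster.
faster_3d = sum(1 for t2d, t3d in times if t3d < t2d)

print(f"average 2D: {avg_2d:.2f} s, average 3D: {avg_3d:.2f} s")
print(f"3D strictly faster on {faster_3d} of {len(times)} tasks")
```

Counted strictly, the 3D version is faster on six tasks, with task 10 a tie; counting the tie with the faster tasks gives the seven-out-of-ten figure, and the difference in averages comes out at roughly half a second per task.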

After each user evaluation, we let the participants fill in a Post-Test Survey in which
they rated both applications. As can be seen in Table 1.1, the average score for the
2D version is higher than for the 3D version, but the difference is minimal.
SUS Post-Test Survey scores, evaluation 2

2D Version                       3D Version
85 points                        57,5 points
85 points                        90 points
87,5 points                      92,5 points
92,5 points                      92,5 points
92,5 points                      100 points
Total: 442,5                     Total: 432,5
Average points per person: 88,5  Average points per person: 86,5

Table 1.1: SUS Post-Test Survey results user evaluation 2.
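The scores in Tables 1.1 and 1.2 follow the standard System Usability Scale (SUS), in which each participant rates ten statements from 1 (strongly disagree) to 5 (strongly agree), and the responses are converted to a 0-100 score. A minimal sketch of that conversion (the example responses are hypothetical, not a participant's actual answers):

```python
def sus_score(responses):
    """Convert ten SUS item responses (1-5 Likert) to a 0-100 score.

    Odd-numbered items are positively worded and contribute (response - 1);
    even-numbered items are negatively worded and contribute (5 - response).
    The summed contributions are scaled by 2.5 to reach the 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical participant: agrees with positive items, disagrees with negative ones.
example = [5, 1, 5, 1, 5, 2, 4, 1, 5, 1]
print(sus_score(example))  # prints 95.0
```

Because each response contributes a whole number of points before the final scaling, every SUS score is a multiple of 2,5, which matches the granularity of the values in the tables above.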

After we implemented the aforementioned changes we continued with the second user
evaluation, for which we again had a diverse range of participant ages. This time we
did not get any suggestions to improve our prototype. The scores of the Post-Test
Survey were, however, quite different from the first round. On average, the score for
the 2D version was 88,5 and for the 3D version 86,5. Unlike during the first user
evaluation, 4 out of 5 participants now scored the 3D version at least as high as the
2D version. The 3D version still has a lower average due to a single participant, who
gave the 2D version a score of 85 points and the 3D version a score of 57,5 points.
This makes it a little more difficult for us to draw a decisive conclusion. Based on
the average scores, the 2D version comes out on top, but when assigning points based
on individual comparisons between the scores given to the 2D and 3D versions, the 3D
version would be the clear favorite. The participant who rated the 3D version
relatively poorly indicated that they simply did not like the 3D version and
preferred the 2D version.
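The two aggregation methods, mean score versus per-participant comparison, can indeed point in opposite directions, as a short sketch over the evaluation-2 scores transcribed from Table 1.1 shows:

```python
# (2D, 3D) SUS scores per participant, from Table 1.1 (second evaluation).
scores = [(85.0, 57.5), (85.0, 90.0), (87.5, 92.5), (92.5, 92.5), (92.5, 100.0)]

# Aggregation 1: mean score per version.
mean_2d = sum(s2d for s2d, _ in scores) / len(scores)
mean_3d = sum(s3d for _, s3d in scores) / len(scores)

# Aggregation 2: head-to-head wins per participant (equal scores count as a tie).
wins_3d = sum(1 for s2d, s3d in scores if s3d > s2d)
wins_2d = sum(1 for s2d, s3d in scores if s2d > s3d)

print(f"mean 2D: {mean_2d}, mean 3D: {mean_3d}")      # mean favours 2D
print(f"3D preferred by {wins_3d}, 2D by {wins_2d}")  # head-to-head favours 3D
```

Dropping the single low score of 57,5 would also flip the means in favour of the 3D version, which is why the mean-based ranking should be read with caution here.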
SUS Post-Test Survey scores, evaluation 1

2D Version                     3D Version
70 points                      67,5 points
72,5 points                    72,5 points
87,5 points                    72,5 points
87,5 points                    87,5 points
97,5 points                    95 points
Total: 415                     Total: 395
Average points per person: 83  Average points per person: 79

Table 1.2: SUS Post-Test Survey results user evaluation 1.

During the mid-term presentations, we also got some feedback to switch the floors
around so that floor 3 is at the top when you click the floors button in the 2D version.
However, we decided against this idea: when you enter your house you always start on
the first floor, so it is likely that you also use the devices on the first floor most
often.

There are also some limitations to this research that should be noted. Firstly, the
participants know the researchers, so there could be some bias in the scoring of the
prototypes. We accepted this because of time constraints and because, with such a
diverse age group, it would have been difficult to recruit random participants.
Secondly, we created the 2D and 3D prototypes as desktop apps to test and modify
them, whereas the apps would be built for a phone or tablet if the application were
launched to the public. We believe this did not affect our testing, but it could be a
limitation, since everything is easier to see on a desktop screen than on a smartphone.
Lastly, the initial designs of the 2D application differed from the final product: one
researcher did not have any HTML and CSS development experience, so we used an
open-source template, which we changed to our liking. The following section will
conclude this paper and discuss the main findings.

5 Conclusion and Discussion

The aim of this paper was to answer the research question ‘What is the
user-friendliness of 3D versus 2D visualizations for regulating smart hardware around
the house?’. The results of the two user evaluations show that the 2D version scores
best on the SUS Post-Test Survey, although one outlier gave the 3D version a much
lower score than the other participants did. With more participants, the 3D version
might well have scored higher than the 2D version. Another relevant insight is that
during the second user evaluation, 7 out of 10 tasks were performed faster in the 3D
version, and tasks were on average completed 0,5 seconds faster in the 3D version.
The 3D version was therefore clearly the more efficient of the two, and we expect its
efficiency to increase further as people get more used to the 3D version.
6 Future Work

Our prototype is just a proof of concept. If it were turned into a real app, there are
some things that would still need to be implemented. The house used in our prototype is just
an arbitrary house filled with devices, but in a real use case this would of course be the
house of the user. It would not be practical for the user to build the house in the same
manner as we did for our prototype, so we would need to find a better way. A possible
solution could be to use the camera and LiDAR scanner on your smartphone to create
a 3D model of your house, which the app would convert to the same simplistic style as
the house in our prototype. You could then mark certain parts of the interior as a smart
device and connect its controls. Besides this, the app would need a login function. The
login information along with the data of the house would need to be stored in a secure
database.

Right now the prototype is a Windows executable, as this allowed us to test it more
efficiently, but ideally it would be an app for a smartphone or tablet. During our
design process we took this into account, so all controls could easily be converted to
touch controls, but the actual conversion would still need to be done.
References

Benyon, D. (2013). Designing Interactive Systems: A Comprehensive Guide to HCI,
UX and Interaction Design.

Creating a smart Home Controlling UI CSS HTML Javascript. (2023, October 15).
Retrieved from https://prosepond.blogspot.com/2023/08/creating-smart-home-controlling-ui-css.html

Smart Home 3D webapp by Wassim. (2024, January 4).
Retrieved from https://www.homesmartmesh.com/docs/applications/home3d/
Appendix

Time                  Task  Measured             Current 2D/3D   Worst  Planned  Best
Initial performance   1     Time on first trial  4,4 s / 2,2 s   10 s   5 s      2 s
Initial performance   2     Time on first trial  7 s / 4,2 s     20 s   10 s     3 s
Initial performance   3     Time on first trial  9,6 s / 7,8 s   20 s   10 s     3 s
Initial performance   4     Time on first trial  4,6 s / 2 s     8 s    4 s      2 s
Initial performance   5     Time on first trial  7,4 s / 8,4 s   25 s   15 s     5 s
Initial performance   6     Time on first trial  3,4 s / 2,4 s   8 s    4 s      2 s
Initial performance   7     Time on first trial  7 s / 8 s       20 s   10 s     3 s
Initial performance   8     Time on first trial  3 s / 2,2 s     8 s    4 s      2 s
Initial performance   9     Time on first trial  5,2 s / 9 s     20 s   10 s     3 s
Initial performance   10    Time on first trial  12 s / 12 s     20 s   10 s     3 s
Table 2.1: Results of testing round 2 (time measured in seconds).
Time                  Task  Measured             Current 2D/3D   Worst  Planned  Best
Initial performance   1     Time on first trial  10,5 s / 3,3 s  10 s   5 s      2 s
Initial performance   2     Time on first trial  5,4 s / 13,5 s  20 s   10 s     3 s
Initial performance   3     Time on first trial  10,1 s / 7,1 s  20 s   10 s     3 s
Initial performance   4     Time on first trial  7,1 s / 12,4 s  20 s   10 s     3 s
Initial performance   5     Time on first trial  12,3 s / 5,2 s  8 s    4 s      2 s
Initial performance   6     Time on first trial  14,9 s / 16,6 s 25 s   15 s     5 s
Initial performance   7     Time on first trial  4,8 s / 4,2 s   8 s    4 s      2 s
Initial performance   8     Time on first trial  9,8 s / 8,4 s   20 s   10 s     3 s
Initial performance   9     Time on first trial  2,6 s / 3,3 s   8 s    4 s      2 s
Initial performance   10    Time on first trial  5,6 s / 9,9 s   20 s   10 s     3 s
Table 2.2: Results of testing round 1 (time measured in seconds).
