
Assignment 2

Group 26

Rejhan Kavazbašić 0391174

Sander Breshamer 5034973

Introduction
When designing the plant refresher product we focused on making it user-friendly and widely
applicable in day-to-day use, even though it is a standalone product. This started with
creating a physical housing that accommodates all the internals while remaining portable.
For this, we chose a multi-piece, modular 3D-printed housing, which kept us flexible across
iterations and reduced printing-time overhead.

Afterwards, we created the program itself and the watering mechanism. This was followed by
a seamless and easy-to-use gesture system to show the plant's data and manually trigger the
watering mechanism. Our testing yielded positive feedback, along with criticism of the base
concept.

System details
The plant watering system is integrated into a housing we 3D printed ourselves. We used the
housing cited in the appendix as a template and integrated our components into it, but we
heavily modified the internals to make sure heavy movement could not cause shorts. The
servo is housed in the tubing from the bottle.

Furthermore, we created our own gesture mount in the shape of a ring that the user wears. It
is wired to the Arduino with long jumper cables. The One Ring.
The system can currently read and display the amount of light present, soil moisture levels,
temperature and ambient pressure. This is shown on the OLED display integrated into the
housing and also communicated via MQTT to the application on our phones. We used the
Adafruit_SSD1306 library to flip the OLED output, because the housing we used requires a
flipped screen to fit properly.

In the code, the servo's range is set to -80 to 180 degrees instead of the default 0 to 180,
giving us a larger total range of travel. Manual mode is activated via the app: when a user
triggers the watering feature the device automatically switches to manual mode, and while
watering the screen displays the message “watering, wait” as an indication of imminent
watering.
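The mode-switch behaviour described above can be sketched as a tiny state machine. The type and member names below are ours (the report includes no code); only the behaviour and the “watering, wait” message come from the text:

```cpp
#include <string>

// Hypothetical sketch of the mode logic; names are illustrative,
// not taken from the actual firmware.
enum class Mode { Automatic, Manual };

struct PlantController {
    Mode mode = Mode::Automatic;
    std::string display;

    // A watering request from the app always forces manual mode first,
    // then shows the notice announcing imminent watering.
    void triggerWatering() {
        mode = Mode::Manual;
        display = "watering, wait";
    }
};
```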

We use the moisture sensor to measure the moisture level in the soil of the plant. The
soldered terminals are covered with automotive-grade heat shrink to prevent shorts caused by
the watering mechanism, which could otherwise make the device destroy itself.

An LDR sensor sticks out of the top of the housing to measure the light level outside; this is
subsequently shown on the display. Furthermore, an LED is installed internally for diagnostic
purposes, to see whether manual mode is triggered. For the user, the display states which
mode the device is in, which is more user-friendly; this is also viewable in the app. There is
also a button on the inside of the casing for triggering manual mode, again for diagnostic
purposes, but this can also be triggered from the app on the users' phones, which we found
more user-friendly than the button.

Libraries used

ESP8266WiFi: network connection
PubSubClient: MQTT broker connection
Servo: servo control
Adafruit_SSD1306: OLED screen flip & general screen control
Wire: I2C communication
Adafruit_BMP280: BMP280 sensor control

Automatic mode
In automatic mode, the plant watering system activates the watering cycle when the
soil-moisture reading crosses a threshold of 20% of the sensor's 1024-step range, i.e. when
the raw reading exceeds 205. When this happens, a servo lowers a transparent, water-filled
tube and waters the plant. The mechanism relies on gravity and on the bottle being filled to
a minimum level.
During automatic mode the system cycles through multiple screens. The screens contain the
previously mentioned sensor data (i.e. light, soil moisture, temperature, ambient pressure
and current mode). In addition to these sensor values, it also cycles to a screen that
displays how much time has elapsed since the previous 'watering session'. The cycle
interval is 10 seconds.
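The threshold logic described above can be sketched as follows; the function and constant names are our own (the report contains no code), and only the ~205 / 20% threshold and the comparison direction come from the text:

```cpp
// Sketch of the automatic-mode trigger, assuming a 10-bit ADC reading.
constexpr int kAdcRange = 1024;
constexpr int kThreshold = 205;  // ~20% of the 1024-step range

// True when the raw soil-moisture reading crosses the threshold
// (205 < reading, per the report) and a watering cycle should start.
bool shouldWater(int rawMoisture) {
    return rawMoisture > kThreshold;
}

// Conversion used for the user-facing percentage display.
int toPercent(int raw) {
    return raw * 100 / kAdcRange;
}
```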

MQTT
For the communication with the app on our phones (as seen in Figure 3) we used the MQTT
protocol. We used the UU MQTT broker for the actual implementation, and a local Mosquitto
broker for initial testing. We used MQTT Explorer to subscribe to the broadcasts. Subscribers
are updated every 10000 milliseconds. We used the default QoS level of 0: we did not find it
necessary to spend extra overhead on guaranteed delivery, since all computing is done on the
device itself and the sensor data viewed over MQTT is not vital.
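The 10000 ms update cadence is presumably implemented with the usual non-blocking millis() pattern; a minimal sketch with our own class name, taking the timestamp as a parameter so it can be tested off-device:

```cpp
#include <cstdint>

// Hypothetical sketch of the publish cadence; on the ESP8266 the caller
// would pass millis() and publish all sensor topics when tick() is true.
class IntervalPublisher {
public:
    explicit IntervalPublisher(uint32_t intervalMs) : intervalMs_(intervalMs) {}

    // Returns true when a publish is due at time `nowMs`.
    bool tick(uint32_t nowMs) {
        if (nowMs - lastMs_ >= intervalMs_) {
            lastMs_ = nowMs;
            return true;
        }
        return false;
    }

private:
    uint32_t intervalMs_;
    uint32_t lastMs_ = 0;
};
```

Unsigned subtraction keeps the comparison correct even when the millisecond counter wraps around.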

MQTT topics

Name              Address

MQTT_topic        infobb3it/049/light
motor_topic       infobb3it/049/motorswitch
mode_topic        infob3it/049/modeSwitch
moisture_topic    infobb3it/049/moisture
pressure_topic    infobb3it/049/pressure
temp_topic        infobb3it/049/temp
mode_topic        infobb3it/049/mode
modeswitch_topic  infobb3it/049/modeswitch

The motor_topic is used to trigger the motor via the app, and the mode_topic switches the
device between manual and automatic mode. When the device is in automatic mode and the user
signals via the app that it should spray, it automatically switches to manual mode and
sprays. The other topics are self-descriptive: the light topic carries the light level as a
percentage, to make it relative; moisture is the moisture level in percent; pressure is the
air pressure in hPa; and temp is the temperature in degrees Celsius.
Payloads contain the actual sensor values, converted so that they make sense to users. The
switch between manual and automatic mode sends a 'manual' message to the device, which flips
its current state to the opposite.

Node-Red

The trigger of the gestures from the Arduino board is transmitted to the ESP8266 using
Node-RED. The following flow was created to transmit the serial output of the Arduino and
publish it to the MQTT network under the topic name 'gesture' (INFOB3IT/049/gesture). It
transmits either a Y or an X from the Arduino device, with X representing the screen gesture
and Y representing the watering gesture. We added a limit function to our flow to prevent
unwanted queueing of the serial messages sent to MQTT, which would otherwise trigger the
gesture multiple times unintentionally. Also added to the Node-RED environment is the
Node-RED online dashboard, which shows the current sensor data and is updated at regular
intervals.
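The limit function in the flow can be understood as a simple rate limiter (Node-RED's delay node offers rate limiting as a built-in mode). A sketch of the equivalent logic, with our own names, that suppresses repeat triggers inside a cooldown window:

```cpp
#include <cstdint>

// Hypothetical rate limiter; the cooldown value is an assumption.
class GestureLimiter {
public:
    explicit GestureLimiter(uint32_t cooldownMs) : cooldownMs_(cooldownMs) {}

    // Returns true if a gesture message at time `nowMs` may pass through;
    // repeats inside the cooldown window are dropped instead of queued.
    bool allow(uint32_t nowMs) {
        if (armed_ && nowMs - lastMs_ < cooldownMs_) return false;
        armed_ = true;
        lastMs_ = nowMs;
        return true;
    }

private:
    uint32_t cooldownMs_;
    uint32_t lastMs_ = 0;
    bool armed_ = false;  // first message always passes
};
```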

The future
In the end, we would have wanted to implement more, but sadly we were constrained by time.
As we are genuinely interested in using this device in our day-to-day lives, we wanted to
attempt to create a system tailored to our wishes. For this, we would have liked to
implement a battery-powered system instead of USB Micro-B; if a battery was out of scope, we
would also have preferred USB Type-C.

Also, as mentioned on the Herlaar website, in previous years students received a humidity
sensor, which this year was not handed out due to shortages. This sensor would have been
interesting to implement, to test the room conditions for a specific plant. We are also
interested in hosting the current setup locally, but in its current state it uses the UU
MQTT broker.

Gesture Elicitation
When we started to design our gesture control we first wanted a streamlined method for users
to interact with the gesture system. For this, we developed a ring with the MPU6050 mounted
on it. While coding for this application, however, we had to account for the fact that ring
fit and mounting location on the finger may differ from person to person. This is something
we wanted to test in the real world in the upcoming study. In addition, we wanted to test
accessibility issues that could arise, such as difficulty with hands, spasms, and the
recognizability of gestures (training/muscle memory).

For the recruitment process, we needed to find a group that was willing and able to partake
in the short test. We invited people who at least partly matched the aforementioned
criteria. We ended up with one person with hand problems (swollen fingers and a lack of
nerve sensation in the hand used in the testing scenario), two people wearing other rings
(so other fingers were used), and two 'normal' users without any of the optional
prerequisites. Finding users came with slight logistical issues, but for the most part
recruitment went smoothly.

Users
Female 56 - chronic hand issues
Male 49 - Work (Metal industry)
Male 22 - Student
Male 23 - Student
Male 21 - Student

Scenarios
When studying our users we presented them with the following scenarios during the tests:

- The plant watering device has a screen which displays data. You, the customer, are
intrigued by how your new plant is doing with its watering schedule and you want to see the
humidity statistics. The device is equipped with a gesture-control item: the ring.
- You realise that your plant has been sitting near a heating element and that you have been
heating your house relentlessly the past week. Your plant has completely dried up because
the water reservoir ran empty sooner than you expected. You have refilled the water, but
want the mechanism to be activated immediately because of the dried-up plant's situation.

These scenarios were presented in this order to each participant, as we wanted to compare
the 'handicapped' users to the normal users and keep the results consistent.

We presented the users with a plant in a vase, the bullet-shaped water device and the water
attachment, and asked them to wear the ring with the mounted MPU-6050 (without wires). We
then asked them to create a suitable gesture to trigger the water mechanism, and another to
switch the data on the screen.

Observation
We noticed that our older users (2 of 5) deliberately chose very easy-to-use gestures. One
shaped an open palm facing downwards; in another instance one made a fist and performed the
same motion, but thrusting upwards. They explained that they tried to replicate rain coming
down (palm, downwards) and life being created (fist, upwards). One of our users chose a
'pouring a glass of water' gesture to trigger the water mechanism. One of the users who was
wearing other rings (decorative, not relationship-based) wore our ring on their ring finger
and middle finger; both fingers were interchangeable for this use case. This user made a
gesture as if holding a watering can, and on their second attempt tried swiping "like you do
on Tinder" with their index finger pointed forward.

Some users did an open-palm 'shove' to the left, while others imitated a 'swipe', as on a
touchscreen, on their second attempt. One participant used a twist of the wrist to switch
between screens and to water the plant (opposite directions). The final participant, with a
chronic hand condition, did not have any accessibility issues with the device; they did,
however, avoid fists or 'pressuring' gestures, as they did not prefer these. This person
also did an open-palm swipe, once to the left and once to the right. Other gestures were
performed, but all of those mentioned above were executed successfully and recurred from
user to user, making them the most logical candidates.

Classification criteria
For our classification criteria we created the following table, based on our findings, to
compare them. We labelled each combination of hand direction and hand action with a
category.
Hand direction   Swipe    Thrust       Pouring    Turn/Rotate

Up               Iconic   Pantomimic   Symbolic   Iconic
Right            Iconic   Pantomimic   Symbolic   Iconic
Left             Iconic   Pantomimic   Symbolic   Iconic
Down             Iconic   Pantomimic   Symbolic   Iconic
Consensus
When we tested our users we quickly found that the leads were quite short for testing and
general use. Furthermore, actions would often be triggered accidentally when taking the ring
off, and because of its plastic nature the ring could rotate on its own axis if not fitted
properly. In general the gestures were sufficiently controllable, but there was often
insufficient feedback on whether the action had actually completed, other than the triggered
mechanism itself. The delay of the action was, however, acceptable for the usability of the
gesture.

The users most often used open-palm, turning or pouring motions as gestures. We recognize
these as good candidates for implementation in our device. We determined an agreement score
based on our observations.

Watering gesture

ID   Name of gesture          Subgroups with consensus   Agreement score
G1   Pouring a can of water   2                          0.16
G2   Turn hand                1                          0.04
G3   Open palm rain           2                          0.16

Agreement score: 0.36
Agreement rate: 0.2

Screen gesture

ID   Name of gesture     Subgroups with consensus   Agreement score
G1   Open palm swipe     2                          0.16
G2   Turn hand           1                          0.04
G3   Fist upwards        1                          0.04
G4   Swipe with finger   1                          0.04

Agreement score: 0.28
Agreement rate: 0.1


Gesture Implementation
As mentioned in our user testing, we made a ring with a mount for the MPU6050 chip. This was
to be the base of our gesture system. We connected it to our Stickuino via wires and
transmitted the signal to the ESP8266. When we set out to design the gestures we wanted them
to be simple. For this, we looked at our previously conducted study for good candidates. The
plant watering system itself is also simple and the device is very mundane in general use,
so overcomplicating the gesture control would be out of place and unnecessary, as you would
only use it rarely. We decided to use the Y axis to control the two functions: manually
triggering the water system and showing plant data. We settled on the turn-hand gesture, as
its implementation and repeatability seemed good in our study. The gesture resembles a
seesaw: you tilt your hand right or left depending on which function you want to use (see
Figure 1 and Figure 2).

The constraints here were: matching the simplicity of the rest of the device, creating a
gesture with relatively high usability, making the gesture easy to replicate multiple times,
and working with the ring we created as a body mount. We think we have succeeded in this
implementation.
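The seesaw gesture can be sketched as a simple threshold classifier on the Y axis. The threshold value and all names below are assumptions; the report only states that the Y axis and the two tilt directions are used:

```cpp
enum class Gesture { None, Water, ShowData };

// Hypothetical tilt threshold, expressed as a fraction of 1 g on the
// MPU6050's Y axis; the real firmware's value is not given in the report.
constexpr float kTiltThreshold = 0.6f;

// Tilt one way triggers watering, the other way cycles the plant data,
// mirroring the seesaw gesture described above.
Gesture classifyTilt(float accelY) {
    if (accelY > kTiltThreshold)  return Gesture::Water;
    if (accelY < -kTiltThreshold) return Gesture::ShowData;
    return Gesture::None;
}
```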

To connect the ring to the Arduino PCB we extended the jumper cables significantly and used
some older Intel stock cooler fan headers to connect them to the MPU6050, to make it
modular.

Usability testing

Methodology
For the usability testing, we watered our plant for a week and additionally asked our
previous users from the gesture study to take part in a qualitative observation. We focused
on in-vivo style content analysis and thematic analysis. Due to time constraints, as noted
on the Herlaar website, we limited the plant testing to one week. In the earlier
gesture-control testing, the users gave us a wide palette of gestures; we eventually settled
on the turn-hand gesture, which is evaluated in this usability test.

First, we gave each user time to use the device with minimal guidance. We gave them the app,
the current system and the gesture control. Because we were strained for time we reused our
candidates from the gesture testing. This introduced slight bias into our testing, as some
of these users had provided the gesture used in the final product.

During the initial time with the device, we asked the five users to score the categories
listed below, following their brief experience with the entire system and the gestures. We
asked the users to give a score from 1-5 for these categories:
- Usability: how useful and usable is the gesture for controlling the device, and is the
device usable in general?
- Accessibility: how content are you with the ring as the interaction mount? Is it an
obstacle in an otherwise functioning program? Is the app satisfactory?
- Creativity: do you enjoy the novelty of the interaction with the device? Do you like the
design?

The following scores were given:

Category        Average score   Standard deviation

Usability       2.6             0.89
Accessibility   4.4             0.89
Creativity      3.4             0.55

Analysis

Gesture
From this and the comments we can conclude that, while users did not find the gesture very
usable in this context, they were content with the context in which it was presented: the
ring was not found to be an obstacle to the interaction, and the issue lay solely with the
implementation or the base concept. When we asked users to clarify, they stated the
following:

“I don't see the benefit of building this into a product. It's cool though.”
“It looks a bit like a gimmick at the moment.” (translated from Dutch)

When we observed the users interacting with the system we noticed a number of things.
Initially, while getting used to the gesture, users would hold it for a longer period. This
caused messages to 'queue' and, in turn, the gesture to trigger the event for extended
periods of time. We rectified this by adding a limit function in our Node-RED environment,
as previously mentioned in the system details. Once users got acquainted with the gesture
system they would push its limits, at times causing the gyro to 'overshoot' the target
designated in the code. We rectified this by readjusting the range in which the gesture
triggers. These problems were shared by multiple users. Furthermore, some users managed to
pull the jumper cables out of the ring itself. We changed the design by including some fan
headers, which provided more rigidity in the connection.

The device
In general, users were fond of the system as is. The watering system works as intended and
the sensor data is displayed correctly and clearly. Users did note that some language was
ambiguous, but this was due to the limited space on the 0.96” OLED screen (the word
“temperature” would not fit, so we replaced it with a synonym). One user also pointed out
that, to finish off the design, nicely heat-shrunk wires for the sensor in the plant pot
would be a nice-to-have; but because we would need to disassemble the housing and desolder
some sensors to fit this, we omitted it in the end. One more IT-oriented student pointed out
the Micro-B connection instead of a Type-C connection; we address this in the future plans
of the system details.

One slight detail of the watering system is that the bottle has no maximum fill level and,
due to the design, if filled too high the water seeps out slowly on its own. This is easily
remedied by replacing the bottle with a smaller one or by drawing a maximum-level line on
the current bottle.

App
One of the major critiques from users was the delay of the manual mode switch. When you
press the button it takes quite a while to trigger, and the feedback is poor. This is due to
the polling rate of our device. We realise that this is an issue for some, but the low
polling rate saves power and bandwidth, which we consider essential to make the product
viable. Multiple users, mostly the older participants, noted that the app was nice and were
quite impressed.

Appendix
Thingiverse.com. (n.d.). NodeMCU ESP8266 Modular Case - You Made It! by vutana.
Thingiverse. https://www.thingiverse.com/thing:2627220

Submission – Interaction technology (INFOB3IT). (n.d.).
https://herlaar.net/b3it/iot-submission/
