

AutoHome Inc

02.02.2022

Ali Sefo and Emir Halilović

Richmond Park International School of Sarajevo


Contents:

Abstract

Introduction

Methods

Discussion

Results

Conclusion

Acknowledgements

References
Abstract

We can all agree that people can hardly live without modern technology, so we decided to "flow with the trend" and try to make our lives easier. We made this project because we wanted to share our idea and help shape the future of houses and their inner systems. The problem we set our eyes on is that many people do not know how to turn on things like air conditioning, washing machines and other home appliances that use electricity. We went through every device that uses electricity, learned how it works, and figured out that a computer-generated signal would be enough to turn on a light bulb or maybe even a sink (if it uses buttons and commands like modern sinks do). The result is fascinating: people no longer have to walk to a switch to turn something on; instead, we can build in a module that reacts to signals sent by a computer and carries out its given task.

Our conclusion is that an automated home is a good feature for the world; even though it sounds lazy, it is quite useful and practical. We used PictoBlox to create these commands and give them a purpose, and we used STEMpedia for instructions. One could object that people won't have to work anymore, which would make us even lazier, and with obesity being such a problem today, that could make it worse. But as people say, "For every problem there is a solution", and that is exactly what we tried to address, and possibly could, with our idea.

P.S. This project was planned as a "collab", or a vol. 2, of last year's BOSEPO project "Can't Touch This" (the intruder alert that one of us made), which is why we include it in our Materials and Methods part and perhaps in other chapters.
Introduction:

It is estimated that there are about 1 billion (15%) disabled people in the world, and a bit less than 300 million of them have limited mobility, which is why we are here to help them. The project will be able to benefit them and save them a lot of suffering. The project is a code script that can also be built in real life (a physical version is still in the making). It is like a canvas painted with the most beautiful colors, but as beautiful as it is, it still needs a foundation; taken in that context, this matches our situation, even though it is a program. If this idea gets invested in, and someone builds a module around our program and code, then all the people who have trouble moving will be helped. The fact that it is voice controlled makes it even better. We could have made it remote controlled, but we decided to "put in the work and put in the hours" and we successfully made the voice-controlled module.

Now, we have already spoken about people with obesity problems. We are trying to give a person with such a condition as few reasons as possible to misuse our project (which, of course, everyone can use). One of our solutions could be a usage limit for the module: for example, each person would be able to use the module about 40 to 60 times per day, so that after reaching the limit people still try to move and do their "chores" by themselves.
Methods:

We made this project virtually, meaning that we used a program. In our case, the methods are explained further in the text.

We used a program called PictoBlox (a program in which code is formed into blocks). It is based on Python and, as said, stores commands in blocks, and the blocks can then be put in order; together they form a command, and that is its basic concept. We used both options, because we thought it would be more interesting to make something with PictoBlox blocks and Python at the same time.

This project is suitable for slightly better-than-average coders. There were almost no costs, except for the real model we are planning to build.

Parts you will need and the ones we used:

● A laptop or a computer with a camera
● The latest version of PictoBlox
● A good Internet connection
Step 1: Setting Up The Project

Begin by adding the Artificial Intelligence and Text to Speech extensions. Follow the steps below to add them:

-Open PictoBlox

-Click on Board and select evive

-Next, click on the Add extension button. You need to keep two things in mind while working with this extension:

a) You need to be logged in to your PictoBlox account to use it.

b) Your computer needs to be connected to the internet.

-Select the Artificial Intelligence Extension

-Next, add the Text-to-speech extension. This extension lets the program read a text message out loud, converting it into audio. Click on the Add extension button again and scroll down until you find the Text-to-speech extension. A plain-Python sketch of the same idea follows after the note below.

NOTE: The following steps are skipped because they only cover styling your AI assistant and writing your scripts.
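
To make the role of the Text-to-speech extension concrete, here is a minimal sketch of text-to-speech in plain Python using the pyttsx3 package. This is not the PictoBlox extension itself, only an illustration of the same idea under that assumption.

CODE (Python sketch):

# Minimal text-to-speech sketch. This is NOT the PictoBlox extension,
# just the same idea using the pyttsx3 package (pip install pyttsx3).
import pyttsx3

engine = pyttsx3.init()               # start the offline text-to-speech engine
engine.say("The lights are now on")   # queue a sentence to be spoken
engine.runAndWait()                   # speak everything that was queued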
Step 2: Coding and Programming:

The first thing we have to do is add our sprites, or characters. We chose the following:

1. Naruto sprite

CODE:

What happens here is that when we click start, our face-recognition camera opens and analyzes the face. If the face corresponds to a known person, you will be able to control the AI (in this case represented by Naruto); if the face isn't recognized, the security protocol runs and the intruder alarm is played. A rough sketch of this control flow in plain Python is shown below.
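
Since the actual code here is a screenshot of PictoBlox blocks, the following is only a sketch of the same control flow in plain Python; recognize_face, listen_for_commands and run_intruder_alarm are hypothetical stand-ins for the block scripts.

CODE (Python sketch):

# Control-flow sketch of the face-recognition gate described above.
# The three helper functions are hypothetical stand-ins for PictoBlox blocks.

KNOWN_PERSON = "Ali"

def recognize_face():
    # In the real project, PictoBlox's camera and face-recognition blocks do this.
    return "Ali"

def listen_for_commands():
    print("Access granted: listening for voice commands")

def run_intruder_alarm():
    print("Unknown face detected: intruder alarm!")

def on_start():
    person = recognize_face()        # open the camera and analyze the face
    if person == KNOWN_PERSON:
        listen_for_commands()        # the recognized person may control the AI
    else:
        run_intruder_alarm()         # unknown face: run the security protocol

on_start()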
Here, when our "special key" (which is hidden) is pressed, the program looks at the face, which has to be centered inside the bounding box, and remembers it as that person, in our case Ali. This is how you teach an AI to remember a face; a sketch of the same idea in plain Python follows below.
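
PictoBlox handles face training with its own blocks (and Teachable Machine is listed in the references), so the snippet below is only an illustration of the "remember a face, then compare" idea using the open-source face_recognition package; the file names are made up.

CODE (Python sketch):

# Illustration only: PictoBlox has its own face-recognition blocks.
# This uses the open-source face_recognition package (pip install face_recognition).
import face_recognition

# "Teach" the program a face from a saved photo (hypothetical file name).
known_image = face_recognition.load_image_file("ali.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Later, compare a new camera frame against the remembered face.
frame = face_recognition.load_image_file("camera_frame.jpg")
frame_encodings = face_recognition.face_encodings(frame)

if frame_encodings and face_recognition.compare_faces([known_encoding], frame_encodings[0])[0]:
    print("Face recognized: Ali")
else:
    print("Face not recognized")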

Now, the recognized person can access the controllable parts of the house. For example, if a sentence the person says contains "lights on", the AI will turn the lights on. The same approach works for the radio and the other appliances; a plain-Python sketch of this command check is shown below.
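
As an illustration of the "does the sentence contain a command?" check, here is a small sketch in plain Python; the command list and the printed actions are assumptions standing in for the block scripts.

CODE (Python sketch):

# Sketch of matching a spoken sentence against known commands.
# The phrases and actions are illustrative placeholders.
def handle_sentence(sentence):
    commands = {
        "lights on": "lights",
        "radio on": "radio",
        "fan on": "fan",
    }
    sentence = sentence.lower()
    for phrase, device in commands.items():
        if phrase in sentence:           # the sentence only has to contain the phrase
            print("Turning on the " + device)
            return device
    print("No known command in:", sentence)
    return None

handle_sentence("Hey, could you turn the lights on please?")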
2. Radio

This part is the code for the weather station. We made our AI tell us what to wear depending on the temperature. If the temperature is, for example, less than 0 degrees, it tells you to wear winter clothes like a jacket, a scarf and so on. A sketch of this rule follows below.
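
The rule itself can be sketched in a few lines of plain Python; only the "below 0 degrees means winter clothes" threshold comes from our description, the other thresholds are assumptions.

CODE (Python sketch):

# Temperature-based clothing advice, as described above.
# Only the "< 0 degrees = winter clothes" rule is from the project;
# the other thresholds are illustrative assumptions.
def clothing_advice(temperature_celsius):
    if temperature_celsius < 0:
        return "Wear winter clothes: a jacket, a scarf and gloves."
    elif temperature_celsius < 15:
        return "Wear something warm, like a sweater."
    else:
        return "Light clothes are fine."

print(clothing_advice(-3))   # winter clothes
print(clothing_advice(22))   # light clothes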

This part of the code is responsible for speech recognition. If the AI hears the command "Radio on", it turns the radio on and plays "Never Gonna Give You Up". The same approach works for the fan, the lamp and the other elements of the house; a plain-Python sketch of the listening step is shown below.
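
PictoBlox does the listening with its own speech-recognition blocks, so the snippet below is only an illustration of the same step using the SpeechRecognition package (with PyAudio for the microphone); the playback line is a placeholder.

CODE (Python sketch):

# Illustration only: PictoBlox has its own speech-recognition blocks.
# This uses the SpeechRecognition package (pip install SpeechRecognition pyaudio).
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    print("Listening...")
    audio = recognizer.listen(source)

try:
    sentence = recognizer.recognize_google(audio).lower()  # online recognizer
except sr.UnknownValueError:
    sentence = ""

if "radio on" in sentence:
    print("Playing: Never Gonna Give You Up")  # placeholder for the actual song
elif "fan on" in sentence:
    print("Turning the fan on")
else:
    print("No known command heard")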
3. Fan

4. Lamp
Discussion:

Problems we had: For us, it was most certainly the coding itself. We faced errors and some problems we could not solve on our own, but thanks to our mentor Tarik Sulić, we managed to finish everything.

How we solved them: As we said, we had our mentor, who along the way also taught us new and very helpful code that we may use again in the future.

What we can improve: We are very proud of our project, and we have been thinking about this for a while. We got suggestions from our friends, parents and teachers, and all of them agreed that we should maximize the capability of our project and perhaps add new features that would make it the first project to have them. We also want to build our voice-controlled house in real life, at the size of a real house. It would be great if Bosnia had a reason to be taken seriously in the field of engineering and IT. Then we will make a radio which, when asked "what's the weather", tells you the weather outside and recommends what to wear.
Results:

All the people we showed our project to were indeed very fascinated, especially because something like this is rare to have here in our country. Making this project was a very fun process; we loved it, and we hope that you will too. On the day of the competition we also want to show the judges some pictures and videos of the making to make it more interesting.
Conclusion:
We made this project because we were inspired by future technology and its limits. We decided to take a step and make something that may be useful to people who have difficulties moving. Our results showed that we have the capability to do this and that the people around us really liked the idea, which encouraged us to continue. My colleague and I can't wait to show this project to the judges, and one day to the whole world.
Acknowledgements:
We would like to express our deep gratitude to Tarik Sulić, our mentor and class teacher, for his encouragement and useful critiques. We would also like to extend our thanks to our school, Richmond Park Schools, for offering us the resources to make our project possible. Finally, we wish to thank our parents for their support and encouragement throughout our study.
References:
● https://thestempedia.com/project/artificial-intelligence-based-home-automation/
● https://thestempedia.com/product/pictoblox/
● https://teachablemachine.withgoogle.com/
