
VISVESVARAYA TECHNOLOGICAL UNIVERSITY
Jnana Sangama, Belagavi-590018

MINI PROJECT REPORT ON
HOME AUTOMATION AND CONTROL USING COMPUTER VISION

Submitted in partial fulfillment of the requirements for the 6th Semester

Bachelor of Engineering
In
Electronics and Communication Engineering

by

Mohith Kumar G 1KS19EC048


Srinivasan M 1KS19EC088

Under the Guidance of
Dr. Rekha N
Associate Professor, KSIT

DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING


K.S. INSTITUTE OF TECHNOLOGY, BENGALURU-560109
2021-22
K. S. INSTITUTE OF TECHNOLOGY
No. 14, Raghuvanahalli, Kanakapura Road, Bengaluru-560109
DEPARTMENT OF ELECTRONICS AND COMMUNICATION
ENGINEERING

CERTIFICATE

This is to certify that the mini project work entitled HOME AUTOMATION AND
CONTROL USING COMPUTER VISION, carried out by Mohith Kumar G (1KS19EC048)
and Srinivasan M (1KS19EC088), who are bona fide students of KSIT, is in partial
fulfillment for the award of Bachelor of Engineering in Electronics and Communication
Engineering under Visvesvaraya Technological University, Belagavi, during the year
2021-2022. It is certified that all corrections/suggestions indicated for Internal Assessment
have been incorporated in the report deposited in the departmental library. The project
report has been approved as it satisfies the academic requirements in respect of project
work prescribed for the said degree.

Dr. Rekha N                      Dr. P N Sudha                    Dr. Dilip Kumar K
Associate Professor,             Professor and HOD,               Principal and Director,
Department of ECE, KSIT          Department of ECE, KSIT          KSIT

External Viva
Name of the Examiners Signature with Date

ACKNOWLEDGEMENT

The success of this project depended largely on the encouragement and guidance of
many individuals. We take this opportunity to express our gratitude to those who have
been instrumental in the successful completion of this project.

We wish to express our deep sense of gratitude to our institution, K.S. Institute of
Technology for its ideals and inspirations.

We express our sincere thanks to Dr. Dilip Kumar K, Principal and Director,
KSIT, for his encouragement in completing this project and for facilitating academic
excellence in the college.

We take immense pleasure in thanking Dr. P N Sudha, Professor and Head,
Department of Electronics and Communication Engineering, KSIT, for having given us
this opportunity and support to carry out our project work.

We wish to express our deep sense of gratitude to our internal guide Dr. Rekha N,
Associate Professor, Department of Electronics and Communication Engineering, KSIT,
for her able and timely guidance.

We also thank our Project Coordinator Dr. Rekha N, Associate Professor,
Department of Electronics and Communication Engineering, KSIT, who has helped and
motivated us at every stage to complete this project successfully.
ABSTRACT

Computer vision gives us the ability to detect and analyze human
actions from images and videos. Artificial intelligence combined with
computer vision can be used to perform tasks based on the information
detected in a video or an image. This report introduces a home automation
framework using computer vision and image processing. The work derives
meaningful gestures of a person from a video and uses them to control
home appliances: based on the hand gesture detected, the corresponding
appliance is controlled or automated. The aim is to develop a framework
on which appliances can be automated based on parameters desired by
the user.
TABLE OF CONTENTS

CHAPTERS

CHAPTER 1: INTRODUCTION
CHAPTER 2: LITERATURE SURVEY
CHAPTER 3: HARDWARE AND SOFTWARE REQUIREMENTS
3.1 HARDWARE COMPONENTS
3.8 SOFTWARE REQUIREMENTS
CHAPTER 4: IMPLEMENTATION AND METHODOLOGY
4.1 WORKING PROCEDURE
CHAPTER 5: RESULTS
CHAPTER 6: CONCLUSION AND FUTURE WORK
6.1 CONCLUSION
6.2 FUTURE WORK
REFERENCES

LIST OF TABLES

TABLE NO.    DESCRIPTION

Table 3.1    Hardware Components Used
Table 3.2    Software Tools Used
Table 3.3.2  Specifications of Arduino Uno
Table 3.6    Specifications of LED

LIST OF FIGURES

FIGURE NO.   DESCRIPTION

Figure 3.1   Arduino Uno
Figure 3.41  L298N Motor Driver
Figure 3.42  H-Bridge
Figure 3.43  L298N Motor Driver Connection
Figure 3.5   LED
Figure 3.7   Servo Motor
Figure 3.81  Arduino IDE
Figure 4.1   Block Diagram
Figure 4.2   Hand Node Points
Figure 4.3   Flowchart
Figure 5.1   Initial Step
Figure 5.2   Operation Phase 1
Figure 5.3   Operation Phase 2
Figure 5.4   Operation Phase 3

CHAPTER 1

INTRODUCTION
Home automation is building automation for a home, called a smart home or smart
house. A home automation system will monitor and control home attributes such as
lighting, climate, entertainment systems, and appliances.
Computer vision is a field of artificial intelligence (AI) that enables computers and
systems to derive meaningful information from digital images, videos and other visual
inputs — and take actions or make recommendations based on that information.
Automation is the key to progress in any industry. The ability to control and automate
tasks reduces time and effort and helps us focus on our primary objectives. One such
application is home automation: the control of home appliances can be automated
according to our needs. Usually, appliances are manually controlled by a switch; to
automate their switching, solutions based on Bluetooth, the Internet of Things and
smartphone applications already exist. Although all these methods can automate a home,
we propose an alternative solution. In our proposed work, the system is designed so that
anybody can control the home appliances hassle-free using only hand gestures. We are
developing an automation framework upon which any appliance or feature can be added.
Through computer vision we detect the hand gestures; upon detection, the gestures must
be processed to extract meaningful information, and the corresponding appliance is then
controlled based on the gesture detected. An interface connects the computer vision
module to the microcontroller that controls the appliances: the information derived by
computer vision is communicated to the microcontroller, which in turn controls the
appliances. This method provides an ecosystem inside a house in which anything can be
automated as the user desires.

1.1 OBJECTIVES:
● To derive meaningful gestures of a person from a video and use them to control
home appliances.
● To control or automate the corresponding appliance based on the hand gesture
detected.
● To automate everything in a house, from the main door to lights, fans, air
conditioners, etc.

1.2 APPLICATIONS AND ADVANTAGES:


● This system can be used for industrial automation and healthcare automation.
● This system can be used along with an energy monitoring system.
● It can be applied to the Internet of Things as an integrated system.
● The response time of the system is good.
● The system detects the gestures defined by the user and can control or automate
an appliance based on the given parameters.
CHAPTER 2

LITERATURE SURVEY

The authors of [1] use segmented image processing to automate a home; an HTTP server
transfers data between devices and controls them. A Raspberry Pi is used as the processor
to receive and process all the data, and a dedicated circuit detects wastage of electricity
[1]. In [2], artificial intelligence is applied to the imaging process through computer
vision. The paper introduces methods for extracting frames from a webcam video and
processing them as required, and discusses the calculation methods and approaches used
to derive information from images [2]. In [3], the automation of home appliances is
developed using Internet of Things protocols. Different scenarios are considered and
analyzed, and the automation approach is refined based on the analyzed data. Control of
the state of appliances such as bulbs and fans is developed, the security aspect of the
Internet of Things is addressed, and possible ways to improve the security of the IoT
protocol and application layers are presented [3]. The work in [4] presents methods to
vary the brightness of home lighting. A potentiometer controls the intensity of the light:
the microcontroller's analog-to-digital converter reads the potentiometer's analog value,
which is then mapped to a range of 0 to 255 so that, using pulse width modulation, the
value can drive a driver circuit and thereby vary the intensity of the light. The paper also
discusses different approaches to reducing electricity wastage [4]. The system proposed
in [5] detects human activity and controls bulbs accordingly using the Internet of Things.
Human motion is detected and the data is sent to an HTTP server over the internet; the
received data is then used to control the on/off state of a bulb or any other appliance. A
mobile application can also control the automation of the appliances. This approach suits
scenarios where human motion is the parameter used to automate a device or control its
state [5]. Finally, [6] discusses hand gesture detection based on shape parameters of the
hand. The location of each finger is tracked, along with the orientation of the hand and
whether each finger is raised or folded; the gesture is determined from these attributes.
Implementation of the algorithm under different lighting conditions is also explored, and
the algorithm is optimized for different skin textures and tones [6].
CHAPTER 3
HARDWARE AND SOFTWARE
REQUIREMENTS

3.1 HARDWARE COMPONENTS REQUIRED:

Table 3.1: Hardware Components Used

Sl. No.  Component             Quantity
1        Arduino Uno           1
2        L298N motor driver    1
3        LED                   3
4        DC motor              1
5        Servo motor           1
6        Smartphone as webcam  1
7        USB cable             1

3.2 SOFTWARE TOOLS REQUIRED:

Table 3.2: Software Tools Used

Sl. No.  Tool
1        Python
2        Embedded C
3        OpenCV library
4        cvzone library
5        PyCharm IDE
6        Arduino IDE

3.3 HARDWARE COMPONENTS


3.3.1 ARDUINO UNO
Figure 3.1: Arduino Uno

The Arduino Uno is an open-source microcontroller board based on the Microchip


ATmega328P microcontroller and developed by Arduino.cc. The board is equipped
with sets of digital and analog input/output (I/O) pins that may be interfaced to various
expansion boards (shields) and other circuits. The board has 14 digital I/O pins (six
capable of PWM output), 6 analog I/O pins, and is programmable with the Arduino IDE
(Integrated Development Environment), via a type B USB cable. It can be powered by
the USB cable or by an external 9-volt battery, though it accepts voltages between 7 and
20 volts.
"Uno" means one in Italian and was chosen to mark the release of Arduino Software
(IDE) 1.0. The Uno board and version 1.0 of Arduino Software (IDE) were the
reference versions of Arduino, now evolved to newer releases. The Uno board is the
first in a series of USB Arduino boards, and the reference model for the Arduino
platform; for an extensive list of current, past or outdated boards see the Arduino index
of boards.

The 14 digital input/output pins can be used as input or output pins using the
pinMode(), digitalRead() and digitalWrite() functions in Arduino programming. Each
pin operates at 5 V, can source or sink a maximum of 40 mA, and has an internal
pull-up resistor of 20-50 kOhm which is disconnected by default. Out of these 14 pins,
some have specific functions, as listed below (a short example sketch follows the list):

1. Serial pins 0 (RX) and 1 (TX): used to receive and transmit TTL serial data. They are
connected to the corresponding pins of the ATmega16U2 USB-to-TTL serial chip.
2. External interrupt pins 2 and 3: these pins can be configured to trigger an interrupt
on a low value, a rising or falling edge, or a change in value.
3. PWM pins 3, 5, 6, 9, 10 and 11: these pins provide an 8-bit PWM output via the
analogWrite() function.
4. SPI pins 10 (SS), 11 (MOSI), 12 (MISO) and 13 (SCK): these pins are used for SPI
communication.
5. Built-in LED on pin 13: when pin 13 is HIGH the LED is on; when pin 13 is LOW it
is off.
6. Along with the 14 digital pins, there are 6 analog input pins, each providing 10 bits
of resolution, i.e. 1024 different values. They measure from 0 to 5 volts, but the upper
limit can be changed by using the AREF pin with the analogReference() function.
7. Analog pin 4 (SDA) and analog pin 5 (SCL) are also used for TWI (I2C)
communication using the Wire library.
The Arduino Uno has a couple of other pins, as explained below:
1. AREF: provides the reference voltage for analog inputs, used with the
analogReference() function.
2. Reset pin: pulling this pin LOW resets the microcontroller.
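As a brief illustration of these functions, the following sketch (not part of the original
report; it assumes an LED on pin 13 and a push button wired between pin 2 and ground)
uses pinMode(), digitalRead() and digitalWrite() together with the internal pull-up
resistor:

// Minimal sketch: lights the built-in LED while an assumed push
// button on pin 2 is pressed.
const int ledPin = 13;     // built-in LED
const int buttonPin = 2;   // button to ground, internal pull-up used

void setup() {
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT_PULLUP);  // enables the 20-50 kOhm pull-up
}

void loop() {
  // The pull-up makes the pin read LOW only when the button is pressed.
  if (digitalRead(buttonPin) == LOW) {
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
  }
}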

3.3.2 SPECIFICATIONS OF ARDUINO UNO

Table 3.3.2: Specifications of Arduino Uno

Microcontroller               ATmega328P
Operating voltage             5 V
Input voltage (recommended)   7-12 V
Input voltage (limits)        6-20 V
Digital I/O pins              14 (of which 6 provide PWM output)
Analog input pins             6
DC current per I/O pin        40 mA
DC current for 3.3 V pin      50 mA
Flash memory                  32 KB (ATmega328P), of which 0.5 KB is used by the bootloader
SRAM                          2 KB (ATmega328P)
EEPROM                        1 KB (ATmega328P)
Clock speed                   16 MHz


3.4 L298N MOTOR DRIVER

Figure 3.41: L298N Motor Driver


PWM, or pulse width modulation, is a technique that allows us to adjust the average
value of the voltage delivered to an electronic device by switching the power on and off
at a fast rate. The average voltage depends on the duty cycle, i.e. the amount of time the
signal is ON versus the amount of time it is OFF within a single period.
So, depending on the size of the motor, we can simply connect an Arduino PWM output
to the base of a transistor or the gate of a MOSFET and control the speed of the motor
by varying the PWM duty cycle. The low-power Arduino PWM signal switches the gate
of the MOSFET on and off, and through the MOSFET the high-power motor is driven.

On the other hand, to control the rotation direction we just need to reverse the
direction of current flow through the motor, and the most common way of doing that is
with an H-bridge. An H-bridge circuit contains four switching elements, transistors or
MOSFETs, with the motor at the centre forming an H-like configuration.
Figure 3.42: H-Bridge
By activating two particular switches at the same time we can change the direction of
current flow, and thus the rotation direction of the motor.

Now consider the pinout of the L298N module and how it works. The module has two
screw terminal blocks for motors A and B, and another screw terminal block for the
ground pin, the VCC for the motors, and a 5 V pin which can be either an input or an
output.
Figure 3.43: L298N Motor Driver Connection

Whether the 5 V pin acts as an input or an output depends on the voltage applied at the
motor VCC. The module has an onboard 5 V regulator which is either enabled or
disabled using a jumper. If the motor supply voltage is up to 12 V, we can enable the
5 V regulator, and the 5 V pin can then be used as an output, for example to power the
Arduino board. But if the motor voltage is greater than 12 V, we must disconnect the
jumper, because those voltages would damage the onboard 5 V regulator. In this case
the 5 V pin is used as an input: we need to connect it to a 5 V power supply in order for
the IC to work properly.
Next are the logic control inputs. The Enable A and Enable B pins are used for enabling
the motors and controlling their speed. If a jumper is present on this pin, the motor is
enabled and runs at maximum speed; if we remove the jumper, we can connect a PWM
input to this pin and thereby control the speed of the motor. If we connect this pin to
ground, the motor is disabled.
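The following sketch illustrates this control scheme. The wiring is an assumption for
illustration, not from the report: ENA on pin 9 with its jumper removed, IN1 on pin 7
and IN2 on pin 6. It drives motor A forward and backward at roughly half speed:

// Speed via PWM on ENA, direction via the IN1/IN2 H-bridge inputs.
const int enaPin = 9;  // PWM-capable pin for speed control
const int in1Pin = 7;
const int in2Pin = 6;

void setup() {
  pinMode(enaPin, OUTPUT);
  pinMode(in1Pin, OUTPUT);
  pinMode(in2Pin, OUTPUT);
}

void loop() {
  // Forward at ~50% duty cycle for two seconds.
  digitalWrite(in1Pin, HIGH);
  digitalWrite(in2Pin, LOW);
  analogWrite(enaPin, 128);
  delay(2000);

  // Reverse the current flow through the H-bridge: motor runs backwards.
  digitalWrite(in1Pin, LOW);
  digitalWrite(in2Pin, HIGH);
  analogWrite(enaPin, 128);
  delay(2000);
}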

3.5 LIGHT EMITTING DIODE


Figure 3.5 LED
A light-emitting diode (LED) is a two-lead semiconductor light source. It is a p-n
junction diode that emits light when activated; the long terminal is positive and the
short terminal is negative. When a suitable current is applied to the leads, electrons
recombine with electron holes within the device, releasing energy in the form of
photons. This effect is called electroluminescence, and the color of the light
(corresponding to the energy of the photon) is determined by the energy band gap of the
semiconductor. LEDs are typically small (less than 1 mm²), and integrated optical
components may be used to shape the radiation pattern. LEDs are versatile
semiconductors with a number of attributes that make them well suited to most
applications. Their features include:
● Long life: LEDs can last over 100,000 hours (10+ years) if used at rated current.
● No annoying flicker as experienced with fluorescent lamps.
● High tolerance to cold, shock and vibration.
● No breakable glass.
● Solid-state construction, highly shock and vibration resistant.
● Extremely fast turn-on/off times.
● Low power consumption, which puts less load on electrical systems and increases
battery life.
Here we have used the most common 5 mm white LED. White LEDs are ideal
replacements for inefficient incandescent bulbs in night lights and path lights.

Different colors

Inside the semiconductor material of the LED, the electrons and holes are contained
within energy bands. The separation of the bands (i.e. the bandgap) determines the
energy of the photons (light particles) that are emitted by the LED.

The photon energy determines the wavelength of the emitted light, and hence its color.
Different semiconductor materials with different bandgaps produce different colors of
light. The precise wavelength (color) can be tuned by altering the composition of the
light-emitting, or active, region.

LEDs are composed of compound semiconductor materials, which are made up of


elements from group III and group V of the periodic table (these are known as III-V
materials). Examples of III-V materials commonly used to make LEDs are gallium
arsenide (GaAs) and gallium phosphide (GaP).

Until the mid-1990s, LEDs had a limited range of colors; in particular, commercial blue
and white LEDs did not exist. The development of LEDs based on the gallium nitride
(GaN) material system completed the palette of colors and opened up many new
applications.

Main LED materials

The main semiconductor materials used to manufacture LEDs are:

● Indium gallium nitride (InGaN): blue, green and ultraviolet high-brightness LEDs
● Aluminum gallium indium phosphide (AlGaInP): yellow, orange and red high-brightness LEDs
● Aluminum gallium arsenide (AlGaAs): red and infrared LEDs
● Gallium phosphide (GaP): yellow and green LEDs

3.6 SPECIFICATIONS OF LED:

Table 3.6: Specifications of LED

Intensity        28,500 mcd
Color            x = 31, y = 32
Viewing angle    48º
Lens             water clear
Forward voltage  3.0-3.3 V (typical 3.1 V)
Current          20 mA
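As a worked example of using these ratings (the 5 V supply is an assumption matching
an Arduino output pin): driving this LED at the typical forward voltage of 3.1 V and
20 mA calls for a series resistor of about R = (5 - 3.1) V / 0.020 A = 95 Ohm, so a
standard 100 Ohm resistor is a safe choice.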

3.7 MICRO SERVO MOTOR


Figure 3.7 Servo motor
A servomotor (or servo motor) is a rotary or linear actuator that allows precise control
of angular or linear position, velocity and acceleration. It consists of a suitable motor
coupled to a sensor for position feedback. It also requires a relatively sophisticated
controller, often a dedicated module designed specifically for use with servomotors.
Servomotors are not a specific class of motor, although the term is often used to refer
to a motor suitable for use in a closed-loop control system.
Servomotors are used in applications such as robotics, CNC machinery and automated
manufacturing.
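A minimal sketch of hobby-servo control with the standard Arduino Servo library is
shown below; the signal pin choice is an assumption, since the report does not specify
the servo wiring used for the door:

// Sweeps a micro servo between its two end positions once per second.
#include <Servo.h>

Servo doorServo;

void setup() {
  doorServo.attach(9);   // assumed: servo signal wire on pin 9
}

void loop() {
  doorServo.write(0);    // move to 0 degrees
  delay(1000);
  doorServo.write(180);  // move to 180 degrees
  delay(1000);
}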

3.8 SOFTWARE REQUIREMENTS:

3.81 ARDUINO IDE

Figure 3.81: Arduino IDE
The Arduino Integrated Development Environment (IDE) is a cross-platform application
(for Windows, macOS and Linux) used to write and upload programs, written in C and
C++, to Arduino compatible boards and also, with the help of third-party cores, to other
vendors' development boards.

The source code for the IDE is released under the GNU General Public License, version 2. The
Arduino IDE supports the languages C and C++ using special rules of code structuring. The
Arduino IDE supplies a software library from the Wiring project, which provides many
common input and output procedures. User-written code only requires two basic functions, for
starting the sketch and the main program loop, that are compiled and linked with a program
stub main() into an executable cyclic executive program with the GNU toolchain, also included
with the IDE distribution. The Arduino IDE employs the program avrdude to convert the
executable code into a text file in hexadecimal encoding that is loaded into the Arduino board
by a loader program in the board's firmware. By default, avrdude is used as the uploading tool
to flash the user code onto official Arduino boards.
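As a concrete illustration of the two required functions mentioned above, the smallest
valid sketch looks like this (nothing here is project-specific):

// setup() runs once at power-up; loop() then repeats forever,
// forming the cyclic executive described above.
void setup() {
  // one-time initialization goes here
}

void loop() {
  // code here runs repeatedly
}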

The Arduino IDE 1.x is a derivative of the Processing IDE; as of version 2.0, it has been
replaced by an IDE based on the Eclipse Theia framework, which in turn builds on
Visual Studio Code technology. With the rising popularity of Arduino as a software
platform, other vendors started to implement custom open-source compilers and tools
(cores) that can build and upload sketches to microcontrollers not supported by
Arduino's official line.

The Arduino Integrated Development Environment - or Arduino Software (IDE) - contains a
text editor for writing code, a message area, a text console, a toolbar with buttons for common
functions and a series of menus. It connects to the Arduino hardware to upload programs and
communicate with it.

Writing Sketches

Programs written using Arduino Software (IDE) are called sketches. These sketches are
written in the text editor and are saved with the file extension .ino. The editor has features for
cutting/pasting and for searching/replacing text. The message area gives feedback while saving
and exporting and also displays errors. The console displays text output by the Arduino
Software (IDE), including complete error messages and other information. The bottom right
hand corner of the window displays the configured board and serial port. The toolbar buttons
allow you to verify and upload programs, create, open, and save sketches, and open the serial
monitor.

CHAPTER 4

IMPLEMENTATION AND METHODOLOGY


4.1 WORKING PROCEDURE:
The proposed concept is based on computer vision and Arduino. The user shows hand
gestures to the camera to control the appliances. To recognize and detect the gestures we
use the Python OpenCV and mediapipe libraries, which help us detect the hand gesture.
Commands are then sent from Python to the Arduino through USB serial
communication. The Arduino microcontroller, based on the command received over
serial, controls the corresponding appliance. The state of the appliance can be controlled,
and any analog parameter can also be varied: in the case of a bulb its brightness can be
varied, and for fans and air conditioners the speed or temperature can be set.
Figure 4.1: Block Diagram

The system can be divided into two parts, software and hardware. Using the Python
OpenCV and mediapipe libraries we have developed a hand gesture detection module.
A smartphone camera is used instead of a webcam for better video quality. The video
captured by the smartphone is processed by the hand gesture detection module. The
processing of the video is based on hand node points: node points are marked on the
key movable parts of a hand, with each movable joint of each finger marked as a node
point. For the brightness level of the bulb, the distance between the index finger and the
thumb is calculated. That distance is mapped to the analog value of the appliance and
sent to the Arduino. The commands and parameters are sent to the Arduino through
USB serial communication: the Python program detects the hand gesture and sends the
commands over the serial link.
Based on the command received by the Arduino microcontroller, the bulb, fan or other
home appliance is controlled. The Arduino uses its PWM pins to vary the brightness
level. DC appliances can be driven directly by the Arduino, while AC appliances require
a driver or a triac.
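A sketch of the Arduino side of this scheme is given below. The two-byte protocol (a
command character followed by an 8-bit parameter) is an assumption for illustration;
the report does not specify the exact command format sent from Python:

// Receives e.g. 'B' followed by a brightness byte (0-255) over the
// USB serial link and applies it as a PWM duty cycle.
const int bulbPin = 9;   // PWM pin driving the LED/bulb driver

void setup() {
  Serial.begin(9600);    // must match the baud rate used on the Python side
  pinMode(bulbPin, OUTPUT);
}

void loop() {
  if (Serial.available() >= 2) {
    char command = Serial.read();
    int value = Serial.read();      // 0-255 parameter byte
    if (command == 'B') {
      analogWrite(bulbPin, value);  // PWM sets the brightness
    }
  }
}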

In this project we derive meaningful gestures of a person from a video and use them to
control home appliances. The project has two sections: software, i.e. gesture detection
using computer vision, and hardware, i.e. automation and control using the Arduino
microcontroller board. Python and the Arduino communicate over a serial link.
Figure 4.2 Hand Node Points

4.2 FLOWCHART

Figure 4.3: Flowchart

Initially, the hand is detected; if it is not detected, the system waits until it recognizes a
hand. The hand gesture is then detected, and the detected gesture value is mapped to
the parameter of the home appliance that we wish to control. Once the value is mapped,
it is sent to the Arduino through USB serial communication. The Arduino receives the
command and the parameter to be varied and, upon receiving the values, controls or
automates the corresponding appliance. The Arduino microcontroller uses pulse width
modulation to control the intensity of a light or vary the speed of a fan.

CHAPTER 5
RESULTS

1) The computer vision module developed in the project detects the presence of a hand.
It draws a bounding box around the hand as an indication that the gesture is to be
detected within the box area. It also marks the nodal points on the hand, that is, on
the fingers and the important motion joints.
2) The Python module then tracks the node points and identifies the gesture shown by
the user. In this gesture the distance between the index finger and the thumb is
calculated. This distance can be used to set the brightness of the home lights or to
vary the speed of the fan.
3) In this gesture we can observe that the distance between the index finger and the
thumb is reduced, indicating that the brightness or intensity of the bulb should be
reduced.
4) The Python module then sends the commands through serial communication to the
Arduino. As seen in the image, the gesture is the same closed gesture as above,
indicating that the bulb should be turned off; the LEDs are not glowing.
5) When we increase the distance between the index finger and the thumb, the value is
mapped to the corresponding brightness range (0 to 255) and pulse width modulated
to vary the brightness. Thereby the intensity of the LED varies in accordance with
the distance between the index finger and the thumb.
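To make the mapping in step 5 concrete, the sketch below reproduces the arithmetic in
Arduino terms. The 20-220 pixel finger-distance range is an assumed value for
illustration; in the actual system this mapping is performed on the Python side before
transmission:

// Illustrates the distance-to-brightness mapping of step 5.
const int ledPin = 9;          // PWM pin driving the LED

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int distancePx = 120;        // example finger distance in pixels
  int level = map(distancePx, 20, 220, 0, 255);  // scale to 0-255
  level = constrain(level, 0, 255);              // clamp to the valid range
  analogWrite(ledPin, level);  // 120 px -> 127 -> ~50% duty cycle
}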

CHAPTER 6

CONCLUSION AND FUTURE WORK

6.1 CONCLUSION:
The framework developed is able to automate and control home appliances. The system
detects the gestures defined by the user and can control or automate an appliance based
on the given parameters.

6.2 FUTURE WORK:

The efficiency of gesture detection and appliance automation can be improved, and the
response time of hand gesture detection and automation can be reduced. An energy
monitoring system with control can also be implemented alongside.

REFERENCES

[1] Agarwal, Mohammad Hasnain, Rishab S., Mayank P., Swapnil G., "Smart Home
Automation using Computer Vision and Segmented Image Processing," 2019
International Conference on Communication and Signal Processing (ICCSP), IEEE,
2019.

[2] Xin Li, Yiliang Shi, "Computer Vision Imaging Based on Artificial Intelligence,"
2018 International Conference on Virtual Reality and Intelligent Systems (ICVRIS),
IEEE, doi: 10.1109/ICVRIS.2018.00014.

[3] Faris Alsuhaym, Tawfik Al-Hadrami, Kenny Awuson-David, "Toward Home
Automation: An IoT Based Home Automation System Control and Security," 2021
International Congress of Advanced Technology and Engineering (ICOTEN), IEEE,
doi: 10.1109/ICOTEN52080.2021.9493464.

[4] G. V. Nagesh Kumar, A. Bhavya, J. Balaji, S. Dhanunjay, P. Vikasitha, V. Rafi,
"Smart Home Light Intensity Control using Potentiometer method for Energy
Conservation," 2021 International Conference on Recent Trends on Electronics,
Information, Communication & Technology (RTEICT), IEEE,
doi: 10.1109/RTEICT52294.2021.9573521.

[5] Nursyazwani Adnan, Noorfazila Kamal, Kalaivani Chellappan, "An IoT Based
Smart Lighting System Based on Human Activity," 2019 IEEE 14th Malaysia
International Conference on Communication (MICC), IEEE,
doi: 10.1109/ICAC347590.2019.9036777.

[6] Meenakshi Panwar, "Hand Gesture Recognition Based on Shape Parameters," 2012
International Conference on Computing, Communication and Applications (ICCCA),
IEEE, doi: 10.1109/ICCCA.2012.6179213.
