SMART AGRICULTURE SYSTEM

by

(ECE)
PRAYANSH S JOSHI (18BEC1229)
SWATI TIWARI (18BEC1080)
KSHITIJ JAIN (18BEC1145)
SHIKHAR KUMAR PADHY (18BEC1078)
ASHISH KUMAR (18BEC1294)
MONICA SINGH (18BEC1205)

A project report submitted to

Dr. VETRIVELAN. P

SCHOOL OF ELECTRONICS ENGINEERING

in partial fulfilment of the requirements for the course of

ECE3999
TECHNICAL ANSWERS FOR REAL WORLD PROBLEMS
in
B.Tech. ELECTRONICS AND COMMUNICATION ENGINEERING

Vandalur – Kelambakkam Road


Chennai – 600127
JUNE 2021
BONAFIDE CERTIFICATE

Certified that this project report entitled “Smart Agriculture System” is a bonafide work of
Prayansh S Joshi (18BEC1229), Swati Tiwari (18BEC1080), Kshitij Jain (18BEC1145),
Shikhar Kumar Padhy (18BEC1078), Ashish Kumar (18BEC1294) and Monica Singh
(18BEC1205) who carried out the project work under my supervision and guidance for
ECE3999 - Technical Answers for Real World Problems.

Dr. VETRIVELAN.P

Associate Professor Senior & HoD-B.Tech(ECE)

School of Electronics Engineering (SENSE),

VIT University, Chennai

Chennai – 600 127.


ABSTRACT

Agriculture is the broadest economic sector and plays an important role in the overall
economic development of a nation. Technological advancement in agriculture can markedly
increase the efficiency of farming activities.
In this project we propose a methodology for smart farming that links a smart sensing
system and a smart irrigation system through wireless communication technology. Our
system focuses on the measurement of physical parameters such as water level, air quality,
fire detection and leaf disease detection, all of which play a vital role in farming activities.
Based on these essential physical and chemical parameters, we can help farmers plan their
farming strategies.
ACKNOWLEDGEMENT

We wish to express our sincere thanks and deep sense of gratitude to our project guide, Dr.
Vetrivelan. P, Associate Professor Senior, School of Electronics Engineering, for his
consistent encouragement and valuable guidance offered to us in a pleasant manner
throughout the course of the project work.

We are extremely grateful to Dr. Sivasubramanian. A, Dean of the School of Electronics
Engineering, VIT Chennai, for extending the facilities of the School towards our project and
for her unstinting support.

We express our thanks to our Head of the Department Dr. Vetrivelan. P for his
support throughout the course of this project.

We also take this opportunity to thank all the faculty of the School for their support
and their wisdom imparted to us throughout the course.

We thank our parents, family, and friends for bearing with us throughout the course of
our project and for the opportunity they provided us in undergoing this course in such a
prestigious institution.

SWATI PRAYANSH KSHITIJ SHIKHAR MONICA ASHISH


TABLE OF CONTENTS

Sl. no  Title  Page no

ABSTRACT  3

ACKNOWLEDGEMENT  4

1  INTRODUCTION

1.1  OBJECTIVES AND PROBLEM STATEMENT  6

1.2  LITERATURE SURVEY  22

2  DESIGN

2.1  BLOCK DIAGRAM OF SYSTEM  23

2.2  STANDARDS  28

2.3  CONSTRAINTS, TRADE-OFFS AND ALTERNATIVES  30

3  IMPLEMENTATION AND ANALYSIS  31-74

3.1  SYSTEM IMPLEMENTATION (ALGORITHM)  31

3.2  RESULTS AND INFERENCES (CODE AND OUTPUT)  34

4  COST ANALYSIS  75

5  CONCLUSION & FUTURE WORK

5.1  CONCLUSION  77

5.2  FUTURE WORK  77

6  REFERENCES  79-81

BIODATA  81-82

DEMO VIDEO LINK :


https://drive.google.com/drive/folders/19NZl0O5nIICdZ0SzRSiOWULelEgBoce5?
usp=sharing

PRESENTATION LINK:
https://docs.google.com/presentation/d/1hKWIMIlOKZjm4WqwTS6OFmOVFx0sJ9-
bJtPEpQJOjrI/edit?pli=1#slide=id.gdd69ada1c9_3_0
CHAPTER 1
INTRODUCTION

1.1 BACKGROUND AND MOTIVATION

In this modern world everyone wants to move up a level. In this era every person is running
a race to be the best, and "the best" is generally associated with high-paying jobs, the
corporate world, a big startup idea, and the like.

The usual movement of people has been from rural areas to urban areas, which is quite
understandable: over the past few decades most technological modernization has taken place
in urban areas, which also offer better-paying jobs.

With the COVID outbreak, as we all know, nothing has been spared, and the same is true of
jobs. Many reputed companies have laid off employees in huge numbers, and we can only
imagine the impact on employees of smaller or economically weaker companies.

Hence we have seen a reverse movement from urban to rural areas, because agriculture has
been a more dependable and reliable job sector.

So we planned to build an environment that is technologically sound and can help farmers
ease their work. We have also accommodated various global goals, so that the largest sector
of our country can do its work in a sustainable way.

1.2 PROBLEM STATEMENT AND OBJECTIVES

This project is a step toward meeting the ever-increasing demands of humans and the
evolution of technology. The problem statement is based on a simple measurement scale
known as the SOCIAL PROGRESS INDEX (SPI).

Our idea is inspired by a recent meeting held at the UN, which discussed how the world can
become a better place by 2030. Understanding the need for these goals, we as a team agreed
to act upon them and align our model with these aims, while also taking into account some of
the global issues they seek to rectify.

The global goals discussed at the UN have several targets and sub-goals. We sum up all these
targets using the SPI, a tool that aggregates the targets the global goals are trying to achieve
and can be used as a benchmark. The lowest-performing country, the Central African
Republic, scores 31, whereas the best-performing country, Norway, scores 88. The world
average is 61, and achieving the global goals would take us to 75.

There are no economic indicators in the Social Progress Index. Countries with huge
economies, such as Russia and China, underperform on the SPI due to their lack of headway
on human rights and environmental issues.

Countries with smaller economies, such as Costa Rica, perform much better on the SPI. This
is due to their focus on well-being and on most of the 17 listed global goals rather than on
wealth.

Hence, in our project we try to incorporate some of the 17 major global goals, listed below.
By implementing our model we were able to address:

1 - AFFORDABLE AND CLEAN ENERGY

2 - DECENT WORK AND ECONOMIC GROWTH

3 - INDUSTRY, INNOVATION AND INFRASTRUCTURE

4 - CLIMATE ACTION

5 - LIFE ON LAND
CHAPTER 2

2.1 LITERATURE SURVEY

S. No | Name of the paper | Work done | Inferences

(FIRE DETECTION AND SENDING ALERT)
1. About automatic fire alarm systems research
Huide Liu, Suwei Li, Lili Gao, Tao Wu. 2010 2nd IEEE International Conference on
Information Management and Engineering. DOI: 10.1109/ICIME.2010.5477835

Work done: This paper describes the overall structure of a fire alarm system and the design
of its fire alarm control software. The fire detectors use a two-wire method to reduce wall
wiring and to improve reliability and ease of construction and installation. The system
includes fire alarm, fault alarm and fire-priority handling; storage and database management
of detector log information; fire-fault information storage; information printing; fire and
fault audio; control of linkage devices; automatic switching to standby power; and an LCD
screen with Chinese-language display.

Inferences: We can infer from this paper that stable operation, timely fire and fault alarms,
and a good man-machine interface let an automatic fire alarm system play a decisive role in
building fire prevention. When a fire occurs, the system must be accurate, timely and rapid:
it shows the time, place and other details of the fire on its liquid crystal display, helping
monitoring personnel keep abreast of the fire so that rescue personnel can quickly arrive at
the scene and conduct rescue operations efficiently.
2. Smart and automated fire and power monitoring system
G. K. Baddewithana, G. A. H. S. Godigamuwa, P. S. Gauder, D. C. N. Hapuarachchi,
Udaya Dampage, R. Wijesiriwardana. 2013 IEEE 8th International Conference on Industrial
and Information Systems. DOI: 10.1109/ICIInfS.2013.6732042

Work done: This paper discusses a system for automated fire alarm generation (fire
monitoring) and for electrical power consumption and power quality monitoring (power
monitoring). The system can detect fire and send alarms via mobile messages to
predetermined recipients, and it distinguishes between electrical and non-electrical fires.
Upon detecting a fire, it identifies the fire's location, enables the fire extinguishing system at
that location and sends mobile messages to fire departments. Moreover, the system provides
voice warnings and messages for trapped people to evacuate to safer zones, and also detects
damage to the electrical wiring. The power monitoring part measures and records voltage,
current, power and frequency fluctuation parameters.

Inferences: We can infer from this paper that the system and prototype are well defined and
that data management is good: the data is analysed, and messages are generated and sent to
the respective units. The system is designed for industrial as well as home usage. The paper's
"Prototype and Test" section gives further detail.
3. GSM Based Low-cost Gas Leakage, Explosion and Fire Alert System with Advanced
Security
Pritam Ghosh, Palash Kanti Dhar. 2019 International Conference on Electrical, Computer
and Communication Engineering (ECCE). DOI: 10.1109/ECACE.2019.8679411

Work done: This paper proposes a system that can detect not only gas leakage but also
explosion and fire, and can take protective steps. It is equipped with a gas sensor to detect
leaked gas and a flame sensor to detect explosion and fire. It has an exhaust fan to clear
leaked gas and a solenoid valve to let in water or carbon dioxide (CO2) if an explosion or
fire occurs. The explosion security system responds independently when there is a fire
unrelated to a gas leak. If any incident occurs, the information is sent to the owner over
wireless media, a display shows the alert message and a buzzer sounds the alarm. A Global
System for Mobile communications (GSM) modem sends the information to the owner via
Short Message Service (SMS), ensuring immediate preventive action even in the absence of
people on-site.

Inferences: The main aim of this paper is to design a system that gives people security at
low cost, so that they can feel safe using LPG cylinders. Much research has been done in this
area, but a system was needed that could give advanced security at low cost, and the
proposed system meets that demand. A prototype was developed and tested with Liquefied
Petroleum Gas (LPG) and fire.
4. Computer Vision Based Fire Detection with a Video Alert System
G. Sathyakala, V. Kirthika, B. Aishwarya. 2018 International Conference on Communication
and Signal Processing (ICCSP). DOI: 10.1109/ICCSP.2018.8524216

Work done: This paper proposes a fire detection method using OpenCV and a Raspberry Pi
to detect fire, raise an alarm in the same building and send a short video to a remote fire
alarm control unit. Computer-vision technology detects fire from a surveillance camera feed
and sends an alert to a remote fire station.

Inferences: We can infer from this paper that a camera, a microcontroller and a good
algorithm can be combined to detect fire. Through the video, the fire station can see how
many people are inside the room and send sufficient rescuers accordingly. An advantage is
that no flame or smoke sensors, which might give false alarms, are used; and if the proposed
system does raise a false alarm, it can be verified by checking the video.

(FLOOD DETECTION AND SOLAR POWERED FENCING)
5. Automated Irrigation System Using Solar Power
Jia Uddin, S. M. Taslim Reza, Qader Newaz, Jamal Uddin, Touhidul Islam, Jong-Myon
Kim. 2012 7th International Conference on Electrical and Computer Engineering, 20-22
December 2012, Dhaka, Bangladesh.
https://www.researchgate.net/publication/261470241_Automated_irrigation_system_using_solar_power

Work done: This paper proposes a model of a variable-rate, automatic, microcontroller-based
irrigation system.

Inferences: An automated irrigation model is proposed and successfully implemented using
different circuits, as demonstrated in the paper's figures. The model was designed and
implemented with low cost, reliability, an alternate source of electric power and automatic
control in mind. Because the model is automatically controlled, it helps farmers irrigate their
fields properly; it always ensures a sufficient water level in the paddy field, avoiding both
under-irrigation and over-irrigation. Farmers can also switch the motor ON/OFF remotely by
cell phone.
6. Flood Detection using Sensor Network and Notification via SMS and Public Network
Mohamed Ibrahim Khalaf, Azizah Suliman. College of Information Technology, Department
of Systems and Networking, Universiti Tenaga Nasional (UNITEN). Student Conference On
Research And Development (SCOReD 2011), 2 November 2011, Administration Gallery,
UNITEN.
https://www.researchgate.net/publication/263088726_Flood_Detection_using_Sensor_Network_and_Notification_via_SMS_and_Public_Network

Work done: This paper presents a description of an alert-generating system for flood
detection. It focuses on the development of a system that determines the current water level
by means of sensors and, using a wireless sensor network, provides notification via a GSM
modem.

Inferences: The system determines the current water level using a wireless sensor network
and provides SMS notification via a GSM modem. SMS is a helpful alert-communication
tool that can distribute information to flood victims in a particular area. The system can
detect the water level and send that data to the main flood control centre, whether the centre
is close to or far away from the sensor that detects the water level. The radio module in this
project is used as the medium to send data from the transmitter module to the receiver
module.

7. Real-time flood monitoring and warning system
Jirapon Sunkpho and Chaiwat Ootamakorn. School of Engineering and Resources, Walailak
University. Songklanakarin J. Sci. Technol. 33 (2), 227-235, 2011.
https://www.researchgate.net/publication/263922229_Real-time_flood_monitoring_and_warning_system

Work done: The two main objectives of the developed system are 1) to serve as an
information channel on flooding between the involved authorities and experts, enhancing
their responsibilities and collaboration, and 2) to serve as a web-based information source
for the public, responding to their need for information on water conditions and flooding.

Inferences: The developed system is composed of three major components: 1) a sensor
network, 2) processing and transmitting modules, and 3) a database and application server.
The GPRS sensor network is implemented at 15 remote sites where network infrastructure is
not available; connectivity is provided through wireless GPRS tunnels. The sensor network
measures water-related data, while the processing and transmission module transmits the
measured data to the database and application server. The database and application server is
implemented as a web-based application that allows users to view real-time as well as
historical water-related data.
8. Flood Monitoring and Early Warning System Using Ultrasonic Sensor
J. G. Natividad and J. M. Mendez. 1) ICT Department, Isabela State University – Ilagan
Campus, City of Ilagan; 2) College of Computer Studies and Engineering, Lorma Colleges,
San Fernando City, La Union, Philippines. DOI: 10.1088/1757-899X/325/1/012020
https://iopscience.iop.org/article/10.1088/1757-899X/325/1/012020/pdf

Work done: This work uses an Arduino, ultrasonic sensors, a GSM module, web monitoring
and an SMS early-warning system to help stakeholders mitigate flood-related casualties. The
paper envisions helping flood-prone areas.

Inferences: The project contributes to the economy and to the citizens. It envisions a safe,
prepared community with fewer casualties before, during and after typhoon devastation. The
model also promotes real-time monitoring through the developed web-based application and
an SMS notification system, an easy medium for disseminating information, particularly in
remote areas. By allowing two-way communication, the system gives more flexibility in
providing important information to the community.
9. Design of solar powered energizer and on-line monitoring of electric fencing system
M. Anantha Kumar, Assistant Professor, Dept. of Electrical and Electronics Engg., Sri
Krishna College of Engg. and Tech. DOI: 10.1109/ISCO.2014.7103946
https://ieeexplore.ieee.org/document/7103946

Work done: This paper proposes the design of a solar-powered energizer and an on-line
fence-voltage monitoring system for electric fences.

Inferences: This paper presents the design and safety requirements of an electric fencing
system. Simulation of the energizer circuit clearly shows that this kind of circuit is
appropriate for use as an electric fence energizer, because it complies with the safety
standard requirements. The impulse waveform generated in the simulation approximates the
measured impulse waveform, and the results validate the design method. Monitoring of the
electric fencing system is tested on an actual fence with the help of a capacitor-divider
arrangement and a Zigbee wireless transmission system. Since the monitored data is
interfaced with an internet server, the operator can observe the fence status from anywhere
in the world. This system is less economical compared with the other types of
fault-monitoring systems.
10. Solar Powered Smart Irrigation System
S. Harishankar, R. Sathish Kumar, Sudharsan K. P., U. Vignesh and T. Viveknath. Advance
in Electronic and Electric Engineering, ISSN 2231-1297, Volume 4, Number 4 (2014), pp.
341-346. Research India Publications. http://www.ripublication.com/aeee.htm

Work done: This paper proposes a system in which solar energy from solar panels is used to
pump water from a bore well directly into a ground-level storage tank. In this single-stage
energy-consumption scheme, water is pumped into the ground-level tank, from which a
simple valve mechanism controls the flow of water into the field. A valve controller
regulates the flow of water depending on the moisture present in the soil, measured by a soil
moisture sensor.

Inferences: The proposed eco-friendly system aims at conserving energy through optimal
usage of water, reducing wastage and reducing human intervention for farmers. The excess
energy from the solar panels can also act as revenue for the farmers. Solar pumps offer clean
solutions with no danger of borehole contamination, and the system requires minimal
maintenance and attention as the pumps are self-starting.

11. Plant Disease Identification Based on Deep Learning Algorithm in Smart Farming
Yan Guo, Jin Zhang, Chengxin Yin, Xiaonan Hu, Yu Zou, Zhipeng Xue and Wei Wang.
Hindawi Discrete Dynamics in Nature and Society, Volume 2020, Article ID 2479172, 11
pages. https://doi.org/10.1155/2020/2479172

Work done: This paper proposes a mathematical model of plant disease detection and
recognition based on deep learning, which improves accuracy, generality and training
efficiency. First, a region proposal network (RPN) is used to recognize and localize the
leaves in complex surroundings. Then, images segmented on the basis of the RPN results
have their symptom features extracted with the Chan–Vese (CV) algorithm. Finally, the
segmented leaves are input into a transfer learning model trained on a dataset of diseased
leaves against a simple background.

Inferences: This paper proposes a recognition model integrating the RPN, CV and transfer
learning (TL) algorithms, which can effectively solve the problem of plant disease
identification in complex environments. The model not only adapts to complex
environments but also increases identification accuracy. Compared with the traditional
model, the proposed model guarantees the robustness of the convolutional neural network
while reducing the quantity and quality requirements that the network places on the dataset,
and it obtains better results. The results show that the accuracy of the method is 83.57%,
which is better than the traditional method.

2.2 EACH COMPONENT FLOW DIAGRAM

2.2.1 FLOOD DETECTION


2.2.2 AIR QUALITY MONITORING
2.2.3 SOLAR POWERED SMART IRRIGATION
2.2.4 FIRE DETECTION
2.2.5 LEAF DISEASE DETECTION
2.3 CLASS DIAGRAM
2.4 ACTIVITY DIAGRAM
2.5 USE CASE DIAGRAM
2.6 OVERALL FLOW DIAGRAM
2.7 ELABORATED FLOW DIAGRAM
CHAPTER 3

3.1 ALGORITHMS FOR EACH COMPONENT

3.1.1 SOLAR POWER AUTOMATIC IRRIGATION SYSTEM

Algorithm 1: Initial Step

Step 1: Start.

Step 2: Check the water level in the tank.

Step 3: If full, wait for the farmer's initiation.

Step 4: If not full, check the SOC (state of charge) of the battery.

Step 5: If the SOC is full, start watering.

Step 6: If the tank is full, go to Step 3.

Algorithm 2: Watering

Step 1: Verify the syntax of the message sent.

Step 2: If valid, check the water level.

Step 3: If the water level is full, open the valve.

Step 4: If below 1/3 of the level and the SOC is full, open the valve, turn the motor on, and
go to Step 6.

Step 5: If below 1/3 of the level and the SOC is not sufficient, send a "job not done"
message and go to Step 7.

Step 6: When watering has run for the specified time, send the message "Watering done"
and go to Step 7.

Step 7: Stop the process and go to Step 1 of Algorithm 1.
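The two step lists above can be sketched as plain decision functions. This is a hypothetical Python rendering for illustration only; the function names and return strings are our own, while the thresholds (full tank, the 1/3 level) come from the steps themselves.

```python
# Hypothetical sketch of the two irrigation algorithms above.

def initial_step(tank_full, soc_full):
    """Algorithm 1: decide whether to wait, water, or charge."""
    if tank_full:
        return "wait for farmer initiation"      # Step 3
    if soc_full:
        return "start watering"                  # Step 5
    return "wait for battery to charge"          # SOC not yet full

def watering(message_valid, tank_level, soc_full):
    """Algorithm 2: handle a watering request; tank_level is in [0, 1]."""
    if not message_valid:                        # Step 1 fails
        return "invalid message"
    if tank_level >= 1.0:                        # Step 3
        return "open valve"
    if tank_level < 1 / 3:
        if soc_full:                             # Step 4
            return "open valve and start motor"
        return "job not done"                    # Step 5
    # Mid levels are not covered by the step list; assume valve only.
    return "open valve"
```

On the actual hardware these decisions are interleaved with sensor reads and SMS handling, but the control flow reduces to exactly these branches.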

3.1.2 FLOOD DETECTION

START

READ the WATER LEVEL;
READ the FLOW RATE;

CONNECT WiFi module;
START storing data;

IF (WATER LEVEL > THRESHOLD && FLOW RATE > THRESHOLD)
    SEND ALERT;
    CONTINUE;
ELSE
    DELAY 500 MS;
    CONTINUE;

END;
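The alert condition in the loop above reduces to a single boolean check. A minimal Python sketch, with placeholder threshold values that are ours rather than the report's:

```python
# Assumed placeholder thresholds; the report does not fix numeric values.
WATER_LEVEL_THRESHOLD = 100   # cm
FLOW_RATE_THRESHOLD = 50      # L/min

def check_flood(water_level, flow_rate):
    """Return True (send alert) only when BOTH readings exceed their thresholds."""
    return (water_level > WATER_LEVEL_THRESHOLD
            and flow_rate > FLOW_RATE_THRESHOLD)
```

Requiring both conditions, rather than either one, reduces false alerts from a momentarily noisy single sensor.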

3.1.3 AIR QUALITY MONITORING

This system calculates the air-pollutant concentration in the environment using an MQ135
sensor and an Arduino Uno.

For transferring the data, we use the ThingSpeak platform with the ESP8266 WiFi module.

TinkerCad is used for online simulation.

The analog output voltage returned by the MQ135 is assumed to be directly proportional to
the concentration of pollutants in ppm.
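Under that proportionality assumption, the raw 10-bit ADC reading is treated as the ppm value and binned into quality bands. A small Python sketch of the classification used in the algorithm below:

```python
def classify_air(t):
    """Classify an MQ135 reading t (treated directly as ppm) into the
    report's three bands: Fresh Air, Poor Air, Very Poor."""
    if t <= 500:
        return "Fresh Air"
    elif t <= 1000:
        return "Poor Air"
    return "Very Poor"
```

Note that chaining the conditions this way makes the bands mutually exclusive; the boundary values 500 and 1000 each fall into exactly one band.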

Algorithm

START

READ the POLLUTANT CONCENTRATION t;

CONNECT WiFi module;

if (t <= 500) {
  lcd.print("Fresh Air");
  Serial.print("Fresh Air"); }

else if (t > 500 && t <= 1000) {
  lcd.print("Poor Air");
  Serial.print("Poor Air"); }

else {
  lcd.print("Very Poor");
  Serial.print("Very Poor"); }

SEND ALERT

END;

Components Required

● Arduino Uno
● 16X2 Character LCD
● ESP8266 Wi-Fi Module
● MQ135 Gas Sensor

CIRCUIT DESIGN:

3.1.4 EARLY FIRE DETECTION USING DEEP LEARNING


3.1.5 Leaf Disease Detection:

● Imported the various libraries used
● Pre-processed the dataset
● Converted each image to an array
● Classified the dataset into different diseases
● Assigned parameters to the model
● Trained the model
● Plotted the graphs of validation and training accuracy
● Tested the dataset
● Printed the accuracy
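The first few steps above (convert image to array, split the dataset) can be illustrated without TensorFlow. A minimal NumPy-only sketch, using a random-valued array in place of a real leaf image; the helper names are ours, and the split helper only mimics scikit-learn's `train_test_split` used in the actual code:

```python
import numpy as np

def image_to_array(img, size=(224, 224)):
    """Place the image into a fixed 224x224x3 float array (simple crop/pad)
    and scale pixel values from [0, 255] to [0, 1]."""
    h, w = size
    out = np.zeros((h, w, 3), dtype=np.float32)
    ch, cw = min(h, img.shape[0]), min(w, img.shape[1])
    out[:ch, :cw] = img[:ch, :cw] / 255.0
    return out

def split_train_test(x, y, test_frac=0.2, seed=0):
    """Shuffle sample indices and split arrays into train and test sets."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    n_test = int(len(x) * test_frac)
    test, train = idx[:n_test], idx[n_test:]
    return x[train], x[test], y[train], y[test]
```

The real pipeline in Section 3.2.5 uses `load_img`/`img_to_array` and MobileNetV2's `preprocess_input` instead, but the shape and scaling conventions are the same.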

3.2 IMPLEMENTATION OF EACH COMPONENT (CODE AND OUTPUT)

3.2.1 SOLAR POWERED IRRIGATION SYSTEM

CODE:

#include <LiquidCrystal.h>
const int LM35 = A0;      // LM35 temperature sensor on analog pin A0
const int motor = 13;     // water pump relay
const int LedRed = 12;    // pump-on indicator
const int LedGreen = 11;  // pump-off indicator

LiquidCrystal lcd(2, 3, 4, 5, 6, 7);


void setup() {
Serial.begin(9600);
lcd.begin(16, 2);
lcd.print("Automated Plant");
lcd.setCursor(0,1);
lcd.print("Watering System!");
pinMode(motor, OUTPUT);
pinMode(LedRed, OUTPUT);
pinMode(LedGreen, OUTPUT);
delay(2000);
lcd.clear();
lcd.print("Temp= ");
lcd.setCursor(0,1);
lcd.print("WaterPump= ");
}
void loop() {

int value = analogRead(LM35);


float Temperature = value * 500.0 / 1023.0;
lcd.setCursor(6,0);
lcd.print(Temperature);
lcd.setCursor(11,1);

if (Temperature > 50){


digitalWrite(motor, HIGH);
digitalWrite(LedRed, HIGH);
digitalWrite(LedGreen, LOW);
lcd.print("ON ");
}
else {
digitalWrite(motor, LOW);
digitalWrite(LedRed, LOW);
digitalWrite(LedGreen, HIGH);
lcd.print("OFF");
}

delay(1000);
}
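The conversion `value * 500.0 / 1023.0` in the sketch above follows from the Uno's 10-bit ADC (0–1023 over 0–5 V) and the LM35's 10 mV/°C output: 5 V / 10 mV = 500 °C full scale. The same formula can be checked in Python:

```python
def lm35_celsius(adc_value):
    """Convert a raw 10-bit analogRead() value (0-1023 over 0-5 V) to
    degrees Celsius for an LM35 (10 mV per degree C)."""
    return adc_value * 500.0 / 1023.0
```

So a full-scale reading of 1023 corresponds to 500 °C and the pump threshold of 50 °C corresponds to a raw reading of about 102.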

OUTPUT:
3.2.2 AIR QUALITY MONITORING

CODE:

#include <SoftwareSerial.h>
#include <LiquidCrystal.h> // needed for the lcd object used below

float t = 0;
char data = 0;
String cmd;

String apiKey = "JLBZ0ODIQ32HXY2K"; // ThingSpeak Write API key

SoftwareSerial ser(8, 9); // RX, TX
LiquidCrystal lcd(2, 3, 4, 5, 6, 7); // LCD pin mapping assumed, as in Section 3.2.1

void setup()
{
  Serial.begin(9600); // enable debug serial at a baud rate of 9600
  ser.begin(9600);    // enable software serial to the ESP8266

  lcd.begin(16, 2); // initialize the LCD
  lcd.setCursor(0, 0);
  lcd.print("    Welcome");
  lcd.setCursor(0, 1);
  lcd.print("      To");
  delay(3000);
  lcd.clear();
  lcd.setCursor(0, 0);
  lcd.print("      AIR");
  lcd.setCursor(0, 1);
  lcd.print("QUALITY MONITOR");
  delay(3000);

  ser.println("AT");           // attention (test) command
  delay(1000);
  ser.println("AT+GMR");       // view firmware version info
  delay(1000);
  ser.println("AT+CWMODE=3");  // set WiFi mode (station + access point)
  delay(1000);
  ser.println("AT+RST");       // restart the module
  delay(5000);
  ser.println("AT+CIPMUX=1");  // enable multiple connections
  delay(1000);
  cmd = "AT+CWJAP=\"SSID\",\"PASSWORD\""; // connect to Wi-Fi
  ser.println(cmd);
  delay(1000);
  ser.println("AT+CIFSR");     // get the local IP address
  delay(1000);

  lcd.clear();
  lcd.setCursor(0, 0);
  lcd.print("     WIFI");
  lcd.setCursor(0, 1);
  lcd.print("   CONNECTED");
}

void loop()
{
  delay(1000);
  t = analogRead(A0); // read the MQ135 sensor value into t
  Serial.print("Airquality = ");
  Serial.println(t);

  lcd.clear();
  lcd.setCursor(0, 0);
  lcd.print("Air Qual: ");
  lcd.print(t);
  lcd.print(" PPM");
  lcd.setCursor(0, 1);
  if (t <= 500) {
    lcd.print("Fresh Air");
    Serial.print("Fresh Air");
  }
  else if (t > 500 && t <= 1000) {
    lcd.print("Poor Air");
    Serial.print("Poor Air");
  }
  else {
    lcd.print("Very Poor");
    Serial.print("Very Poor");
  }
  delay(10000);

  lcd.clear();
  lcd.setCursor(0, 0);
  lcd.print(" SENDING DATA");
  lcd.setCursor(0, 1);
  lcd.print("   TO CLOUD");
  esp_8266();
}

void esp_8266()
{
  cmd = "AT+CIPSTART=\"TCP\",\""; // open a TCP connection (restored; lost in the original listing)
  cmd += "184.106.153.149";       // api.thingspeak.com
  cmd += "\",80";
  ser.println(cmd);
  Serial.println(cmd);
  if (ser.find("Error"))
  {
    Serial.println("AT+CIPSTART error");
    return;
  }
  String getStr = "GET /update?api_key="; // build the HTTP GET request
  getStr += apiKey;
  getStr += "&field1=";
  getStr += String(t);
  getStr += "\r\n\r\n";
  cmd = "AT+CIPSEND="; // send data: AT+CIPSEND=length
  cmd += String(getStr.length());
  ser.println(cmd);
  Serial.println(cmd);
  delay(1000);
  ser.print(getStr);
  Serial.println(getStr);
  delay(17000);
}

OUTPUT:
3.2.3 FLOOD DETECTION

int PIR = 0;
int Distance = 0;
long readUltrasonicDistance(int triggerPin, int echoPin)
{
pinMode(triggerPin, OUTPUT); // Clear the trigger
digitalWrite(triggerPin, LOW);
delayMicroseconds(2);
// Sets the trigger pin to HIGH state for 10 microseconds
digitalWrite(triggerPin, HIGH);
delayMicroseconds(10);
digitalWrite(triggerPin, LOW);
pinMode(echoPin, INPUT);
// Reads the echo pin, and returns the sound wave travel time in microseconds
return pulseIn(echoPin, HIGH);
}
void setup()
{
pinMode(13, INPUT);
pinMode(12, OUTPUT);
pinMode(6, OUTPUT);
}
void loop()
{
PIR = digitalRead(13);
delay(10); // Wait for 10 millisecond(s)
if (PIR == HIGH) {
digitalWrite(12, HIGH);
delay(1); // Wait for 1 millisecond(s)
} else {
digitalWrite(12, LOW);
}
Distance = 0.01723 * readUltrasonicDistance(5, 4);
if (Distance <= 100) {
tone(6, 880, 125); // play an 880 Hz tone (A5) for 125 ms
delay(125); // Wait for 125 millisecond(s)
} else {
noTone(6);
}
}
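The constant 0.01723 in the sketch above converts the echo time from `pulseIn()` into centimetres: sound travels about 0.0343 cm/µs, and the echo covers the distance twice, so distance ≈ pulse_µs × 0.0343 / 2 ≈ pulse_µs × 0.01715 (the sketch rounds slightly differently). A quick Python check:

```python
def echo_to_cm(pulse_us):
    """Convert an ultrasonic echo time (microseconds) to distance in cm,
    using the same constant as the Arduino sketch."""
    return 0.01723 * pulse_us
```

An echo of roughly 5800 µs therefore corresponds to the 100 cm alarm threshold used in the sketch.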

OUTPUT :

ALARM NOT RAISED WHEN BELOW CERTAIN LEVEL.

WHEN REACHES A CERTAIN LEVEL(THRESHOLD), ALARM RAISED.


3.2.4 WATER LEVEL DETECTION

#include <LiquidCrystal.h>
LiquidCrystal lcd(7,4,3,2,A2,A3);

String ssid = "Simulator Wifi";


String password = "";
String host = "api.thingspeak.com";
const int httpPort = 80;
String url = "/update?api_key=38R1GMOVPJIX9Z5J";
int motor=10, sensor=12;
float dist, percent;

int setupESP8266(void)
{
Serial.begin(115200);
Serial.println("AT");
delay(10);
if (!Serial.find("OK"))
return 1;
Serial.println("AT+CWJAP=\"" + ssid + "\",\"" + password + "\"");
delay(10);
if (!Serial.find("OK"))
return 2;
Serial.println("AT+CIPSTART=\"TCP\",\"" + host + "\"," + httpPort);
delay(50);
if (!Serial.find("OK"))
return 3;
return 0;
}

void setup()
{
  lcd.begin(16, 2);
  setupESP8266();
  pinMode(motor, OUTPUT); // configure the pump pin as an output before driving it
  digitalWrite(motor, LOW);
}

void loop()
{
int motorstatus=0;
pinMode(sensor, OUTPUT);
digitalWrite(sensor, LOW);
delayMicroseconds(2);
digitalWrite(sensor, HIGH);
delayMicroseconds(5);
digitalWrite(sensor, LOW);
pinMode(sensor, INPUT);
dist = pulseIn(sensor, HIGH);
dist = dist/29/2;
percent = (dist * -100.0)/330 + 100;
if(percent<30)
{
motorstatus=1;
digitalWrite(motor, HIGH);
}
if(percent>90)
{
motorstatus=0;
digitalWrite(motor, LOW);
}
lcd.setCursor(0,0);
lcd.print("Percent : ");
lcd.print(percent);
lcd.setCursor(0,1);
lcd.print("Motor: ");
if(motorstatus==0)
lcd.print("Off");
else
lcd.print("On ");
String httpPacket = "GET " + url + "&field1=" + percent + "&field2=" + motorstatus + " HTTP/1.1\r\nHost: " + host + "\r\n\r\n";
int length = httpPacket.length();
Serial.print("AT+CIPSEND=");
Serial.println(length);
delay(10);
Serial.print(httpPacket);
delay(10);
Serial.println(dist);
if (!Serial.find("SEND OK\r\n"))
return;
}
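The percentage formula in the loop above maps the sensor-to-water distance linearly: 0 cm (water right at the sensor, tank full) gives 100 %, and 330 cm (the assumed tank depth) gives 0 %. Together with the two `if` blocks it forms a hysteresis controller, sketched here in Python:

```python
def tank_percent(dist_cm, depth_cm=330.0):
    """Convert sensor-to-water distance into a fill percentage
    (0 cm -> 100 %, depth_cm -> 0 %), as in the sketch above."""
    return (dist_cm * -100.0) / depth_cm + 100.0

def motor_state(percent, current_state):
    """Hysteresis: start the motor below 30 %, stop above 90 %,
    otherwise keep the current state (1 = on, 0 = off)."""
    if percent < 30:
        return 1
    if percent > 90:
        return 0
    return current_state
```

The dead band between 30 % and 90 % prevents the motor from rapidly toggling when the level hovers near a single threshold.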

OUTPUT :
3.2.5 EARLY FIRE DETECTION USING DEEP LEARNING AND SENDING
WHATSAPP ALERT USING SELENIUM

CODE:

from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import AveragePooling2D
from tensorflow.keras.layers import Dropout
from tensorflow.keras.layers import Flatten
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input
from tensorflow.keras.preprocessing.image import img_to_array
from tensorflow.keras.preprocessing.image import load_img
from tensorflow.keras.utils import to_categorical
from sklearn.preprocessing import LabelBinarizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from imutils import paths
import matplotlib.pyplot as plt
import numpy as np
import argparse
import os

from google.colab import drive
drive.mount('/content/drive')

Mounted at /content/drive

TO MOUNT WITH GOOGLE DRIVE

def prep(data_folder_path):
    # Walk the class folders ('0' = no fire, '1' = fire) and collect
    # image paths with their labels.
    dirs = os.listdir(data_folder_path)
    images = []
    labels = []
    for dir_name in dirs:
        print(dir_name)
        if dir_name == '0':
            label = 0
        elif dir_name == '1':
            label = 1
        subject_dir_path = data_folder_path + "/" + dir_name
        subject_images_names = os.listdir(subject_dir_path)
        for image_name in subject_images_names:
            if image_name.startswith("."):
                continue
            image_path = subject_dir_path + "/" + image_name
            images.append(image_path)
            labels.append(label)
    return images, labels

APPENDING IMAGES

path = "/content/drive/My Drive/archive/Train"
train_img, train_lab = prep(path)

path = "/content/drive/My Drive/archive/test"
test_img, test_lab = prep(path)

tl = len(train_img)
train_data = []
c = 0
for path in train_img:
    img = load_img(path, target_size=(224, 224))
    img = img_to_array(img)
    img = preprocess_input(img)
    train_data.append(img)
    c += 1
print(c)

tl = len(test_img)
test_data = []
c = 0
for path in test_img:
    img = load_img(path, target_size=(224, 224))
    img = img_to_array(img)
    img = preprocess_input(img)
    test_data.append(img)
    c += 1
print(c)

# Note: trainY/testY hold the image arrays, trainX/testX hold the labels
trainY, testY = np.array(train_data), np.array(test_data)
trainX, testX = np.array(train_lab), np.array(test_lab)

CONVERTING TO ARRAYS

trainY.shape

(536, 224, 224, 3)


import pickle
with open("fire.pickle", "wb") as f:
    pickle.dump([trainY, trainX, testY, testX], f)

SAVING PRE PROCESSED IMAGES

aug = ImageDataGenerator(
rotation_range=20,

zoom_range=0.15,

width_shift_range=0.2,

height_shift_range=0.2,

shear_range=0.15,

horizontal_flip=True,

fill_mode="nearest")

DATA AUGMENTATION

baseModel = MobileNetV2(weights="imagenet", include_top=False,

input_tensor=Input(shape=(224, 224, 3)))

headModel = baseModel.output

headModel = AveragePooling2D(pool_size=(7, 7))(headModel)

headModel = Flatten(name="flatten")(headModel)

headModel = Dense(128, activation="relu")(headModel)

headModel = Dropout(0.5)(headModel)

headModel = Dense(2, activation="softmax")(headModel)

model = Model(inputs=baseModel.input, outputs=headModel)

for layer in baseModel.layers:

layer.trainable = False

LAYER OF NEURAL NETWORK

INIT_LR = 1e-4

EPOCHS = 100

BS = 64

opt = Adam(lr=INIT_LR,decay=INIT_LR/EPOCHS)

model.compile(optimizer=opt,loss="sparse_categorical_crossentropy",metrics=["accuracy"])
from tensorflow import keras

class callbacks(keras.callbacks.Callback):

def on_epoch_end(self,epochs,logs={}):

if logs.get('accuracy') > 0.95 and logs.get('val_accuracy') > 0.86:

print("\n Accuracy reached \n")

self.model.stop_training = True

callbacks = callbacks()

H = model.fit(

aug.flow(trainY,trainX),

steps_per_epoch=len(trainX)//BS,

validation_data=(testY,testX),

validation_steps=len(testX)//BS,epochs=EPOCHS,shuffle=True,callbacks = [callbacks])

TRAINING

acc = H.history['accuracy']

val_acc = H.history['val_accuracy']

loss = H.history['loss']

val_loss = H.history['val_loss']

epochs = range(len(acc))

plt.figure(figsize=(10,6))

plt.plot(epochs,acc)

plt.plot(epochs,val_acc)

plt.figure()

plt.plot(epochs,loss)

plt.plot(epochs,val_loss)

PLOTTING TRAINING AND TESTING ACCURACY AND LOSS FUNCTION


from keras.models import model_from_json
model_json = model.to_json()
with open("model_fire_arch.json", "w") as json_file:
    json_file.write(model_json)

model.save_weights("model_fire_weights.h5")

SAVING TRAINED MODEL


from tensorflow.keras.models import model_from_json
json_file=open('/content/drive/MyDrive/archive/model_fire_arch.json','r')
loaded_model_json = json_file.read()

json_file.close()

model = model_from_json(loaded_model_json)
model.load_weights("/content/drive/My Drive/archive/model_fire_weights.h5")
LOADING TRAINED MODEL

import cv2
import time

clas = ["no fire", "fire"]

#path = "/content/drive/My Drive/archive/test/1/100.jpg"
path = "/content/drive/My Drive/archive/test/0/450.jpg"

img = load_img(path, target_size=(224, 224))
img1 = cv2.imread(path)
img1 = cv2.cvtColor(img1, cv2.COLOR_BGR2RGB)

img = img_to_array(img)
img = preprocess_input(img)
img = np.expand_dims(img, axis=0)

last = time.time()
pred = model.predict(img)
#print(time.time() - last)

op = clas[np.argmax(pred)]
e = np.argmax(pred)
x = round(pred[0][e] * 100, 2)
conf = str(x) + '%'
conf

if x > 80.0:
    cv2.putText(img1, op, (25, 50), cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 200, 0), 3)
plt.imshow(img1)
TESTING WITH THREE DIFFERENT IMAGES

from IPython.display import display, Javascript
from google.colab.output import eval_js
from base64 import b64decode

def take_photo(filename='photo.jpg', quality=0.8):
  js = Javascript('''
    async function takePhoto(quality) {
      const div = document.createElement('div');
      const capture = document.createElement('button');
      capture.textContent = 'Capture';
      div.appendChild(capture);

      const video = document.createElement('video');
      video.style.display = 'block';
      const stream = await navigator.mediaDevices.getUserMedia({video: true});

      document.body.appendChild(div);
      div.appendChild(video);
      video.srcObject = stream;
      await video.play();

      // Resize the output to fit the video element.
      google.colab.output.setIframeHeight(document.documentElement.scrollHeight, true);

      // Wait for Capture to be clicked.
      await new Promise((resolve) => capture.onclick = resolve);

      const canvas = document.createElement('canvas');
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      canvas.getContext('2d').drawImage(video, 0, 0);
      stream.getVideoTracks()[0].stop();
      div.remove();
      return canvas.toDataURL('image/jpeg', quality);
    }
    ''')
  display(js)
  data = eval_js('takePhoto({})'.format(quality))
  binary = b64decode(data.split(',')[1])
  with open(filename, 'wb') as f:
    f.write(binary)
  return filename

FOR CAMERA : REAL TIME ANALYSIS

filename = take_photo()
import cv2
import time

clas = ["no fire", "fire"]
path = filename

img = load_img(path, target_size=(224, 224))
img1 = cv2.imread(path)
img1 = cv2.cvtColor(img1, cv2.COLOR_BGR2RGB)

img = img_to_array(img)
img = preprocess_input(img)
img = np.expand_dims(img, axis=0)

last = time.time()
pred = model.predict(img)
#print(time.time() - last)

op = clas[np.argmax(pred)]
e = np.argmax(pred)
x = round(pred[0][e] * 100, 2)
conf = str(x) + '%'
conf

if x > 80.0:
    cv2.putText(img1, op, (25, 50), cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 200, 0), 3)
plt.imshow(img1)

OUTPUT
SENDING WHATSAPP ALERT USING SELENIUM:

filename = take_photo()
import cv2
import time
from selenium import webdriver  # this import was missing from the listing

clas = ["no fire", "fire"]
path = filename

img = load_img(path, target_size=(224, 224))
img1 = cv2.imread(path)
img1 = cv2.cvtColor(img1, cv2.COLOR_BGR2RGB)

img = img_to_array(img)
img = preprocess_input(img)
img = np.expand_dims(img, axis=0)

last = time.time()
pred = model.predict(img)
#print(time.time() - last)

op = clas[np.argmax(pred)]
e = np.argmax(pred)
x = round(pred[0][e] * 100, 2)
conf = str(x) + '%'
print(x)
print(op)

op = op + '-' + conf
if x > 80.0:
    cv2.putText(img1, op, (25, 50), cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 200, 0), 3)
plt.imshow(img1)

if e == 1:
    driver = webdriver.Chrome('/content/drive/My Drive/selenium/chromedriver_win32/chromedriver.exe')
    driver.get('http://web.whatsapp.com')
    msg = "FIRE DETECTED PLEASE HELP..."

    # name = input('Enter the name of user or group : ')
    name = 'Fire'
    input("Enter after screen load")

    user = driver.find_element_by_xpath('//span[@title = "{}"]'.format(name))
    user.click()

    msg_box = driver.find_element_by_xpath('//*[@id="main"]/footer/div[1]/div[2]/div/div[2]')
    msg_box.send_keys(msg)
    driver.find_element_by_xpath('//*[@id="main"]/footer/div[1]/div[3]/button/span').click()

OUTPUT:
COST ANALYSIS:

Camera : Rs 1200

Raspberry Pi : Rs 700

TOTAL COST : Rs 1900

For Leaf Disease Detection

Code:

Different Libraries imported:

import numpy as np

import pickle

import cv2

from os import listdir

from sklearn.preprocessing import LabelBinarizer

from keras.models import Sequential

from keras.layers.normalization import BatchNormalization


from keras.layers.convolutional import Conv2D

from keras.layers.convolutional import MaxPooling2D

from keras.layers.core import Activation, Flatten, Dropout, Dense

from keras import backend as K

from keras.preprocessing.image import ImageDataGenerator

from keras.optimizers import Adam

from keras.preprocessing import image

from keras.preprocessing.image import img_to_array

from sklearn.preprocessing import MultiLabelBinarizer

from sklearn.model_selection import train_test_split

import matplotlib.pyplot as plt

Preprocessing of dataset:

EPOCHS = 20

INIT_LR = 1e-3

BS = 32

default_image_size = tuple((256, 256))

image_size = 0

directory_root = "C:\\Users\\Kshitij Jain\\Pictures\\plantdisease_dataset"

width=256

height=256

depth=3

Function to convert image to array:

def convert_image_to_array(image_dir):

try:

image = cv2.imread(image_dir)

if image is not None :


image = cv2.resize(image, default_image_size)

return img_to_array(image)

else :

return np.array([])

except Exception as e:

print(f"Error : {e}")

return None

List of directories present in the dataset:

image_list, label_list = [], []

try:

print("[INFO] Loading images ...")

root_dir = listdir(directory_root)

for directory in root_dir :

# remove .DS_Store from list

if directory == ".DS_Store" :

root_dir.remove(directory)

for plant_folder in root_dir :

plant_disease_folder_list = listdir(f"{directory_root}/{plant_folder}")

for disease_folder in plant_disease_folder_list :

# remove .DS_Store from list

if disease_folder == ".DS_Store" :

plant_disease_folder_list.remove(disease_folder)

for plant_disease_folder in plant_disease_folder_list:


print(f"[INFO] Processing {plant_disease_folder} ...")

plant_disease_image_list =
listdir(f"{directory_root}/{plant_folder}/{plant_disease_folder}/")

for single_plant_disease_image in plant_disease_image_list :

if single_plant_disease_image == ".DS_Store" :

plant_disease_image_list.remove(single_plant_disease_image)

for image in plant_disease_image_list[:200]:

image_directory =
f"{directory_root}/{plant_folder}/{plant_disease_folder}/{image}"

if image_directory.endswith(".jpg") or image_directory.endswith(".JPG"):

image_list.append(convert_image_to_array(image_directory))

label_list.append(plant_disease_folder)

print("[INFO] Image loading completed")

except Exception as e:

print(f"Error : {e}")

Saving the dataset in the disk:

image_size = len(image_list)


label_binarizer = LabelBinarizer()

image_labels = label_binarizer.fit_transform(label_list)

pickle.dump(label_binarizer,open('label_transform.pkl', 'wb'))

n_classes = len(label_binarizer.classes_)
Various Classification of diseases:

print(label_binarizer.classes_)

Normalisation of array:

np_image_list = np.array(image_list, dtype=np.float16) / 255.0  # scale pixel values to [0, 1]

Splitting of data to Training and testing:

print("[INFO] Splitting data to train, test")

x_train, x_test, y_train, y_test = train_test_split(np_image_list, image_labels, test_size=0.2, random_state=42)

Data augmentation:

aug = ImageDataGenerator(

rotation_range=25, width_shift_range=0.1,

height_shift_range=0.1, shear_range=0.2,

zoom_range=0.2,horizontal_flip=True,

fill_mode="nearest")


Model Parameters Summary:

model = Sequential()
inputShape = (height, width, depth)

chanDim = -1

if K.image_data_format() == "channels_first":

inputShape = (depth, height, width)

chanDim = 1

model.add(Conv2D(32, (3, 3), padding="same",input_shape=inputShape))

model.add(Activation("relu"))

model.add(BatchNormalization(axis=chanDim))

model.add(MaxPooling2D(pool_size=(3, 3)))

model.add(Dropout(0.25))

model.add(Conv2D(64, (3, 3), padding="same"))

model.add(Activation("relu"))

model.add(BatchNormalization(axis=chanDim))

model.add(Conv2D(64, (3, 3), padding="same"))

model.add(Activation("relu"))

model.add(BatchNormalization(axis=chanDim))

model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Dropout(0.25))

model.add(Conv2D(128, (3, 3), padding="same"))

model.add(Activation("relu"))

model.add(BatchNormalization(axis=chanDim))

model.add(Conv2D(128, (3, 3), padding="same"))

model.add(Activation("relu"))

model.add(BatchNormalization(axis=chanDim))

model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Dropout(0.25))

model.add(Flatten())
model.add(Dense(1024))

model.add(Activation("relu"))

model.add(BatchNormalization())

model.add(Dropout(0.5))

model.add(Dense(n_classes))

model.add(Activation("softmax"))


model.summary()

Training Model:

# opt was not defined in the listing; reconstructed from INIT_LR and EPOCHS above
opt = Adam(lr=INIT_LR, decay=INIT_LR / EPOCHS)

model.compile(loss="binary_crossentropy", optimizer=opt, metrics=["accuracy"])

# train the network

print("[INFO] training network...")

[INFO] training network...


history = model.fit(

aug.flow(x_train, y_train, batch_size=BS),

validation_data=(x_test, y_test),

steps_per_epoch=len(x_train) // BS,

epochs=EPOCHS, verbose=1

)
Plotting of Model Parameters:

accuracy = history.history['accuracy']

val_accuracy = history.history['val_accuracy']

loss = history.history['loss']

val_loss = history.history['val_loss']

epochs = range(1, len(accuracy) + 1)

#Train and validation accuracy

plt.plot(epochs, accuracy, 'b', label='Training accuracy')

plt.plot(epochs, val_accuracy, 'r', label='Validation accuracy')

plt.title('Training and Validation accuracy')

plt.legend()

plt.figure()

#Train and validation loss

plt.plot(epochs, loss, 'b', label='Training loss')

plt.plot(epochs, val_loss, 'r', label='Validation loss')

plt.title('Training and Validation loss')

plt.legend()

plt.show()
Calculating Model accuracy:

print("[INFO] Calculating model accuracy")

scores = model.evaluate(x_test, y_test)

print(f"Test Accuracy: {scores[1]*100}")


Output:

3.3 STANDARDS AND CONSTRAINTS

1) SOLAR POWERED IRRIGATION SYSTEM: (Software used – Tinkercad)

PV Sizing: Different sizes of PV module produce different amounts of power. To size the PV
module, the total peak wattage (Wp) to be produced must be known. The peak wattage produced
depends on the size of the PV module and the climate of the site location. To determine the
sizing of the PV modules, calculate as follows:

STEP 1: Calculation of Total Load Connected

Total Load Connected =[D.C Pump Power Rating * Time of usage] + [Remaining
Components Power Rating* Time of usage]

STEP 2: Calculation of Total PV Panels Energy Needed

Total PV panels energy needed= Total Load Connected + Losses

STEP 3: Calculation of Total Wp Of PV Panel Capacity Needed

Total Wp of PV Panel Capacity Needed = Total PV panels energy needed / Number of illumination hours

STEP 4: Calculation of No. of PV Panels Required

No. of PV panels = Total WP of PV panel capacity needed / Rating of the PV Panel
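The four sizing steps above can be run end-to-end as a short calculation. All figures below — the load ratings, the 30% loss margin, the 5 illumination hours, and the 100 Wp panel rating — are assumed example values for illustration, not measurements from our setup:

```python
import math

# Illustrative assumptions, not measured data.
pump_watts, pump_hours = 120, 4      # DC pump rating and daily usage (assumed)
other_watts, other_hours = 10, 24    # remaining components (assumed)

# STEP 1: total load connected (Wh/day)
total_load = pump_watts * pump_hours + other_watts * other_hours

# STEP 2: total PV panel energy needed = load + losses (assume 30% losses)
pv_energy_needed = total_load * 1.3

# STEP 3: total Wp of PV capacity needed (assume 5 illumination hours/day)
total_wp = pv_energy_needed / 5

# STEP 4: number of panels for an assumed 100 Wp panel rating, rounded up
num_panels = math.ceil(total_wp / 100)

print(total_load, round(pv_energy_needed), round(total_wp, 1), num_panels)
```

With these assumptions the daily load is 720 Wh, requiring about 187 Wp of capacity, i.e. two 100 Wp panels.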

Battery Sizing: The Amp-hour (Ah) Capacity of a battery tries to quantify the amount of
usable energy it can store at a nominal voltage. All things equal, the greater the physical
volume of a battery, the larger its total storage capacity.

STEP 1: Calculation of total Load Connected

Total Load Connected = Sum of all appliances (power rating of each device * Time of usage)

STEP 2: Calculation of Battery Capacity (Ah)

Battery Capacity (Ah) = (Total Load Connected * Days of Autonomy) / (Battery Losses * Depth of Discharge * Nominal Battery Voltage (N.B.V))
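The battery formula can be sketched the same way; the autonomy days, loss factor, depth of discharge and the 12 V nominal battery voltage below are all assumed example figures:

```python
# Battery Ah sizing sketch; every figure here is an assumed example.
total_load_wh = 720        # Wh/day, from STEP 1 of the PV sizing (assumed)
days_of_autonomy = 2       # cloudy days the battery must bridge (assumed)
battery_losses = 0.85      # round-trip efficiency factor (assumed)
depth_of_discharge = 0.6   # usable fraction of rated capacity (assumed)
nominal_voltage = 12       # nominal battery voltage in volts (assumed)

capacity_ah = (total_load_wh * days_of_autonomy) / (
    battery_losses * depth_of_discharge * nominal_voltage)

print(round(capacity_ah, 1))  # → 235.3
```

So under these assumptions the system would need roughly a 235 Ah battery bank at 12 V.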


The following design constraints were considered for the Solar powered Automatic irrigation
system:

1. Design a sustainable low-cost, low-power, irrigation system to control and reduce water
waste in farming

2. The sub-system should use power from solar energy

3. The sub-system should be able to operate in real-time, and remotely without human
intervention

4. Economics should be kept in mind to keep parts and configuration costs as low as possible

5. The design should be tailored to the type of crops being grown and the landscape

6. Weather conditions are to be taken into consideration.

2) FOR AIR QUALITY MANAGEMENT:

Converting air pollutant concentration:

1. Converting milligrams per cubic meter to ppm: ppmv = (mg/m^3) x (0.08205 x T) / M

2. Converting ppm to milligrams per cubic meter: mg/m^3 = ppmv x M / (0.08205 x T)

Where,

mg/m^3 = milligrams of pollutant per cubic meter of air

ppmv = air pollutant concentration, in parts per million by volume

T = ambient temperature in kelvin

0.08205 = Universal gas constant, in L·atm/(mol·K)

M = Molecular weight of air pollutant
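The two conversion formulas can be wrapped in a pair of helper functions. The CO2 example values below (M = 44.01 g/mol, 298 K, 400 ppm) are our own illustrative choices, used only to show that the two conversions round-trip:

```python
GAS_CONST = 0.08205  # universal gas constant, L·atm/(mol·K)

def mgm3_to_ppm(mg_m3, molar_mass, temp_k):
    """Formula 1 above: mg/m^3 -> ppmv."""
    return mg_m3 * (GAS_CONST * temp_k) / molar_mass

def ppm_to_mgm3(ppmv, molar_mass, temp_k):
    """Formula 2 above: ppmv -> mg/m^3."""
    return ppmv * molar_mass / (GAS_CONST * temp_k)

# Round trip with assumed example values: CO2 (M = 44.01 g/mol) at 298 K
c = ppm_to_mgm3(400.0, 44.01, 298.0)
assert abs(mgm3_to_ppm(c, 44.01, 298.0) - 400.0) < 1e-9
print(round(c, 1))  # ≈ 720 mg/m^3
```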

SOFTWARE : Tinkercad and thingspeak

3) FLOOD DETECTION :

Water level is measured in litres.

Water level for water level monitoring is measured as a percentage of the total volume of the
container, and hence is variable.
For Example :

Tank Capacity: To find the capacity of a rectangular or square tank, multiply length (L) by
width (W) to get area (A); multiply area by height (H) to get volume (V); multiply volume
by 7.48 gallons per cubic foot to get capacity (C).

Suppose the capacity comes to around 30 gallons: if the water level drops below 30%, i.e.
below 9 gallons in this case, the alarm is raised or the motor is turned on automatically
(in the case of water level monitoring).
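The capacity and threshold rule above can be checked numerically. The tank dimensions and the current level reading below are assumed example values:

```python
# Rectangular tank capacity and 30% low-level threshold (assumed example).
length_ft, width_ft, height_ft = 2.0, 1.0, 2.0  # tank dimensions in feet (assumed)

area = length_ft * width_ft          # A = L x W
volume = area * height_ft            # V = A x H, in cubic feet
capacity_gal = volume * 7.48         # 7.48 gallons per cubic foot

threshold_gal = 0.30 * capacity_gal  # alarm / motor-on level at 30%

current_gal = 8.0                    # current water level reading (assumed)
alarm = current_gal < threshold_gal

print(round(capacity_gal, 2), round(threshold_gal, 2), alarm)  # → 29.92 8.98 True
```

This matches the worked example: a roughly 30-gallon tank with the trigger near 9 gallons.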

SOFTWARE : Tinkercad and thingspeak

3.4 DESIGN TRADEOFF

1. Accuracy and Precision: Sensors vary in their accuracy. Users need to determine the
level of precision required to credibly measure the variable(s) of interest.
Manufacturers often provide a sensor specification data sheet, which contains
information about sensor sensitivity, range, energy requirements, and operating
conditions. Yet field-testing of sensors is required in the actual location of
deployment, as lab and field conditions may vary.
2. Frequency and Duration: Another trade-off is frequency (how often) and duration
(total period) of data reading and logging. High frequency monitoring—for example,
with samples taken multiple times per minute—can be costly in terms of power, data
storage, and data transmission. In general, there are two methods for logging data:

Continuous logging: Collecting data throughout the sampling period. Example: measuring
environmental conditions, such as temperature or noise pollution, at regular intervals until
the sampling period ends or the device fails.

Trigger-based logging: Logging data only if a parameter reaches some threshold value.
Example: to monitor hand-washing behaviour after latrine use, you would design a water flow
meter that is activated only when an adjacent latrine-use monitor is triggered. The water
flow sensor would then return to “off” mode once a zero value is logged.

3. Data Retrieval: There are two primary options for storing and retrieving sensing
data: manually or remotely. Manual data storage and retrieval is usually done
through a micro secure data (SD) card incorporated into the sensor. This is relatively
cheap and minimizes the size and cost of sensors being deployed. This option requires
end-user interaction on-site to manually download the data, either through data cables
or through Bluetooth, RFID or near field communications (NFC).
4. Remote Monitoring and Calibration: Sensors can be programmed to transmit data
to the cloud for sharing and visualization in real or near-real time, using services like
Xively or Open.Sen.se. Some platforms provide custom dashboards to view streaming
data, create instant reports, or update sensor calibration and reporting parameters
remotely. This can be particularly useful when many sensors are deployed over a
large area and the data collection strategy is expected to change after a period of time.
5. Power Requirements: In resource-constrained environments, where access to
continuous power is challenging or impossible, sensors often rely on stored power.
For users, there is a clear trade-off between measurement frequency and energy
consumption: the lower the frequency, the longer the battery life. Since field trips to
replace batteries can be time-consuming and expensive, organizations and engineers
often design innovative methods to decrease the sensor's energy requirements.
6. Cost: The price tag for any sensor or sensor network will include fixed costs
(materials) as well as expenses for operations and maintenance, including data
transmission or retrieval, batteries or other power supply, and labor. In addition,
technical support may be required to deploy, calibrate, troubleshoot, or repair sensors
in the field. Sensors deployed in resource-constrained environments may require
additional packaging to protect against precipitation or extreme heat; this can also
drive up costs.
7. Community Response: When deploying sensors for the SMART AGRICULTURE
SYSTEM, organizations need to consider closely how the community will react to the
devices, and how this can affect data quality. Are you measuring human behaviors, or
capturing environmental data? Where will the sensors be placed: in homes, or outside
in public areas? How does the monitored population interact with the devices? Are
you carrying out participatory measurement, with individual community members
helping to collect the data?

To overcome these issues, organizations can carry out a reactivity study that compares
data from a group with knowledge of the sensor, to one that does not know when the
sensor is being deployed. It is also possible to triangulate between different data
collection methods (e.g., comparing sensing data with survey data that measure the
same variables) to estimate the bias in each data generating process.

8. Land use tradeoffs (FOR FLOOD DETECTION SYSTEM): Hydrological


processes respond to changes in land use. Thus, hydrological ecosystem services can
be affected by land use trade-offs and need to be considered in both land use
management and water management.
9. Parameter and constraints tradeoff (FOR FLOOD DETECTION AND AIR
QUALITY ): There is a clear trade-off along the axis of complexity. While the more
simple models can constrain their parameters quite well, they fail to get the
hydrological signatures right. It is the other way around for the more complex models.
The method of evapotranspiration only influences the parameters directly related to it.

3.5 TECHNICAL ANALYSIS

TIME COMPLEXITY:
FOR WATER LEVEL , AIR QUALITY AND SOLAR POWERED AUTOMATIC
IRRIGATION

Since the code has only one if-condition checking the threshold level in each module, and
since those statements are O(1), the total time for the loop over N readings is N * O(1),
which is O(N) overall.
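A minimal sketch of that per-module loop, showing why the cost is O(N); the threshold values here are assumed placeholders, not our calibrated settings:

```python
WATER_LEVEL_MIN_PCT = 30   # assumed placeholder threshold
AIR_QUALITY_MAX_PPM = 400  # assumed placeholder threshold

def check_readings(readings):
    """One O(1) if-check per reading => O(N) total for N readings."""
    alerts = []
    for r in readings:
        if r["water_pct"] < WATER_LEVEL_MIN_PCT:
            alerts.append("water low")
        if r["air_ppm"] > AIR_QUALITY_MAX_PPM:
            alerts.append("air poor")
    return alerts

print(check_readings([{"water_pct": 25, "air_ppm": 350},
                      {"water_pct": 80, "air_ppm": 450}]))  # → ['water low', 'air poor']
```

Only O(1) temporary space is used per iteration, matching the space analysis below.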

FOR FIRE DETECTION MODULE :

The DNN contains hidden layers and is trained on images; we use MobileNet, which has very
few parameters compared to other models, and we use only the fire category. The total time
complexity of the model is O(N^2). This method saves a lot of computation time in screening
fire pixels: it operates quickly and does not involve complex calculations.

FOR LEAF DISEASE DETECTION :

The CNN model we used relies on different libraries, each with its own time complexity. The
dominant cost is O(n^3), arising from the three nested for-loops used in converting images
to arrays.

SPACE COMPLEXITY:

FOR WATER LEVEL , AIR QUALITY AND SOLAR POWERED AUTOMATIC


IRRIGATION

Each iteration of the loop needs O(1) space for temporary variables, but this space is
reused: once an iteration finishes, the space for those temporaries is no longer needed and
can be reused in the next iteration. Therefore, the total space needed is just O(1).

FOR FIRE DETECTION MODULE :

The number of units is a measure of space complexity, and the model does not involve complex
calculations. In this model we trained and tested images with the neural network, so the
total space complexity is O(n^2).
FOR LEAF DISEASE DETECTION :

Space complexity is the amount of space required by the model, so space complexity of CNN
model is O(n^2 k).

3.6 FEASIBILITY ANALYSIS

ECONOMIC FEASIBILITY :

Considering battery life expectancy and reliable single-hop communication abilities, IoT
monitoring systems are believed to be among the most reliable solutions for IAQ measurement.
With lower latencies and lower power consumption, these systems also demand less maintenance
effort. Since the model only uses a basic Arduino (ATmega) board, sensors like the MQ135 and
an ultrasonic sensor, and basic modules like a Wi-Fi module, it is very pocket-friendly, and
the whole setup is made so that farmers do not feel an economic burden.

ENVIRONMENTAL FEASIBILITY :

IoT sensors reduce energy consumption, generate renewable energy on-site, and measure
carbon consumption plus waste. IoT-powered precision agriculture can be another facilitator
of change. Producing more and wasting less is the major rationale behind smart, data-driven
agricultural solutions. IoT is already in place when it comes to monitoring crop and soil
conditions; with this model we can monitor and reduce greenhouse gas emissions as well.

DEMOGRAPHIC FEASIBILITY :

Data from devices can guide farmers’ decisions, helping them farm smarter and safer and
adapt more quickly to changing conditions.

The ability to monitor farm conditions and infrastructure remotely can free up time, labour
and capital to invest, allowing farmers to focus on other things.

Connecting physical resources on farms to the internet promotes:

● remote monitoring of farm conditions and infrastructure, saving time and labour on
routine farm checks

● improving producers’ decision making through data analytics


● faster insights from real-time data across the value chain, helping farmers
respond to what the market wants

● efficiency in how we produce food to ensure less wastage, expediency to market, and
enhanced traceability to demonstrate safe and sustainable food to our customers

● building the capabilities to respond to new and emerging technologies and investing
in research and development to contribute to ongoing innovation and improved productivity.

CHAPTER 4

COST ANALYSIS

4.1 WATER LEVEL DETECTION

INDIVIDUAL COMPONENTS AND COSTS :


Arduino ATMEGA : Rs 1000

WIFI Module : Node MCU - Rs 349

Ultrasonic Sensor - Rs 100

LCD Module - Rs 400

Miscellaneous - Rs 200

Water Tank - Rs 2000 for 1000L(AquaTech)

TOTAL COST :

Rs 2049 (excluding water tank)

Rs 4049 (including water tank)

4.2 SOLAR POWERED AUTOMATIC IRRIGATION SYSTEM

INDIVIDUAL COMPONENTS AND COSTS :

1. Arduino ATMEGA/UNO : Rs 1000/Rs 800

2. LM35 Temperature Sensor – Rs 65

3. DHT11 Humidity Sensor – Rs 120

4. LCD Module - Rs 400

5. GSM Module(TTL SIM800) – Rs 350

6. 12v Relay – Rs 20

7. Soil Moisture Sensor – Rs 135

8. Voltage Regulator(IC LM317) – Rs 20

9. Solar Panels – Rs 700 (acc. to required area msq.)

10. Other Accessories (Transistor (BC547)+Connecting Wires+Power Supply+Water


Pump+Resistors+Variable Resistors+Terminal Connectors) – Rs 730

TOTAL COST : Rs 3540

4.3 AIR QUALITY MONITORING


COST COMPONENT WISE :

Arduino UNO : Rs 800

16X2 Character LCD : Rs 400

ESP8266 Wi-Fi Module : Rs 600

MQ135 Sensor : Rs 365

TOTAL COST : Rs 2165

4.4 LEAF DISEASE DETECTION

COST ANALYSIS:

Camera : Rs 900

16X2 Character LCD : Rs 700

ESP8266 Wi-Fi Module : Rs 600

Raspberry Pi : Rs 700

Total Cost : Rs 2900

CHAPTER 5

CONCLUSION AND RECOMMENDATION FOR FUTURE WORK

5.1 CONCLUSION

The SMART AGRICULTURE SYSTEM was built and implemented for our target audience, mainly
farmers. This prototype can be used to study the condition of the land, ease farm labour,
make the farm smarter and safer, and help it adapt more quickly to changing conditions. The
ability to monitor farm conditions and infrastructure remotely can free up time, labour and
capital, allowing farmers to focus on other things.
The system uses an Arduino for the irrigation, air quality and water level detection
modules, and a Raspberry Pi controller for faster computation and minimum delay in the
deep-learning-based fire detection.
The preliminary test results were promising. Due to the lack of availability of open areas
and free movement, a complete real-time analysis could not be conducted, so we finalised
our observations based on online simulation using Tinkercad.
This project was an immense opportunity to understand interdisciplinary work and the
practical relationship between our specializations. As far as possible, the work was
carried out with vigour and excitement. The SMART AGRICULTURE SYSTEM was successfully
completed to the best of our abilities.

5.2 RECOMMENDATIONS FOR FUTURE WORK

1. We can further add features to our project, such as SOLAR POWERED FENCING for
security and for protecting the farms from wild animals.
2. After conducting surveys with farmers, and based on the area, we can suggest
solar paints on houses and electricity-producing wells.
3. Modifying the electronic circuit with advanced, up-to-date sensors will lead to better
precision and accuracy in results.
4. Improved sensor placement can help the system detect conditions accurately
enough to prevent information repetition.
5. The threshold limits for water level and air quality can be changed based on the
conditions of the location.
6. The data set obtained from the initial phase can be used for research purposes.
7. Design and integration of crop monitoring via a web application or an Android-based
app is also an interesting route.

CHAPTER 6

REFERENCES

[1] S. Harishankar, R. Sathish Kumar, Sudharsan K.P, U. Vignesh and T. Viveknath, “Solar
Powered Smart Irrigation System”, © Research India Publications, Advance in Electronic
and Electric Engineering, ISSN 2231-1297, Volume 4, Number 4 (2014), pp. 341-346
http://www.ripublication.com/aeee.htm

[2] Abdelouahed Selmani, Mohamed Outanoute, Hassan Oubehar, Abdelali Ed-Dahhak,


Abdeslam Lachhab, Mohammed Guerbaoui, Benachir Bouchikhi, “An Embedded Solar-
Powered Irrigation System Based on a Cascaded Fuzzy Logic Controller”, Asian Journal Of
Control July (2019), Sensors, Electronic & Instrumentation Team, Faculty of Sciences,
Moulay Ismaïl University, B.P. 11201, Meknes, Morocco, Vol. 21, Issue 4, pp. 1941-1951,
DOI: 10.1002/asjc.2220

[3] Shuxiao Wang, Jiming Hao, “Air quality management in China: Issues, challenges, and
options”, Journal Of Environmental Sciences (2012), Vol. 24, Issue 1,pp. 2-13, ISSN 1001-
0742 CN 11-2629/X, DOI: 10.1016/S1001-0742(11)60724-9
http://www.jesc.ac.cn

[4] AI Abdelkerim, MMR Sami Eusuf, MJE Salami, A. Aibinu and M A


Eusuf,”Development of Solar Powered Irrigation System”, IOP Conference Series: Materials
Science and Engineering, Volume 53, 5th International Conference on Mechatronics (ICOM'
13) 2–4 July 2013, Kuala Lumpur, Malaysia, DOI:10.1088/1757-899X/53/1/012005
[6] G. K. Baddewithana, G. A. H. S. Godigamuwa, P. S. Gauder, D. C. N. Hapuarachchi,
Udaya Dampage and R. Wijesiriwardana, “Smart and automated fire and power monitoring
system”, 2013 IEEE 8th International Conference on Industrial and Information Systems,
DOI: 10.1109/ICIInfS.2013.6732042

[7] Pritam Ghosh and Palash Kanti Dhar, “GSM Based Low-cost Gas Leakage Explosion and Fire
Alert System with Advanced Security”, 2019 International Conference on Electrical,
Computer and Communication Engineering (ECCE), DOI: 10.1109/ECACE.2019.8679411

[8] G. Sathyakala, V. Kirthika and B. Aishwarya, “Computer Vision Based Fire Detection with
a Video Alert System”, 2018 International Conference on Communication and Signal
Processing (ICCSP), DOI: 10.1109/ICCSP.2018.8524216
 
[9] Automated Irrigation System Using Solar Power, Jia Uddin, S. M. Taslim Reza, Qader
Newaz, Jamal Uddin, Touhidul Islam, and Jong-Myon Kim, 2012 7th International Conference
on Electrical and Computer Engineering, 20-22 December 2012, Dhaka, Bangladesh
https://www.researchgate.net/publication/261470241_Automated_irrigation_system_using_solar_power
 
[10] Flood Detection using Sensor Network and Notification via SMS and Public Network,
Mohamed Ibrahim Khalaf and Azizah Suliman, College of Information Technology, Department of
System and Network, Universiti Tenaga Nasional (UNITEN), Student Conference on Research and
Development (SCOReD 2011), 2nd November 2011, Administration Gallery, UNITEN
https://www.researchgate.net/publication/263088726_Flood_Detection_using_Sensor_Network_and_Notification_via_SMS_and_Public_Network

[11] Real-time flood monitoring and warning system, Jirapon Sunkpho and Chaiwat
Ootamakorn, School of Engineering and Resources, Walailak University
J. Sunkpho & C. Ootamakorn / Songklanakarin J. Sci. Technol. 33 (2), 227-235, 2011
https://www.researchgate.net/publication/263922229_Real
time_flood_monitoring_and_warning_system
 
[12] Flood Monitoring and Early Warning System Using Ultrasonic Sensor
J G Natividad and J M Mendez 1)ICT Department, Isabela State University – Ilagan Campus,
City of Ilagan ,2)College of Computer Studies and Engineering, Lorma Colleges, San
Fernando City, La Union, Philippines DOI: 10.1088/1757-899X/325/1/012020
https://iopscience.iop.org/article/10.1088/1757-899X/325/1/012020/pdf
 
[13] Design of solar powered energizer and on-line monitoring of electric fencing system,
M.Anantha kumar Assistant Professor, Dept. of Electrical and Electronics Engg Sri Krishna,
College of Engg and Tech, DOI: 10.1109/ISCO.2014.7103946
https://ieeexplore.ieee.org/document/7103946
 
[14] Forest Fire Detection Using a Rule-Based Image Processing Algorithm and Temporal
Variation, Mubarak A. I. Mahmoud and Honge Ren, Volume 2018,
DOI: https://doi.org/10.1155/2018/7612487
 
[15] Implementation of Fire Image Processing for Land Fire Detection Using Color Filtering
Method, Ahmad Zarkasi, Siti Nurmaini, Deris Stiawan,Firdaus Masdung, Journal of Physics
Conference Series March 2019, DOI: 10.1088/1742-6596/1196/1/012003

BIODATA

Name : Prayansh S Joshi
Mobile Number : 9784320042
E-mail : prayanshshsank.joshi2018@vitstudent.ac.in
Permanent Address : Type4/22-A, Anukiran Colony, Rawatbhata, Rajasthan

Name : Swati Tiwari
Mobile Number : 9587119360
E-mail : swati.tiwari2018@vitstudent.ac.in
Permanent Address : P-385/4, Lancer Line, MES Colony, Army Area, Jodhpur, Rajasthan

Name : Shikhar Kumar Padhy
Mobile Number : 9438052098
E-mail : shikharkumar.padhy2018@vitstudent.ac.in
Permanent Address : Gandhinagar 10th Lane, Bijipur, Berhampur, Odisha

Name : Kshitij Jain
Mobile Number : 9664339875
E-mail : kshitij.jain2018@vitstudent.ac.in
Permanent Address : Type4/22-A, Anukiran Colony, Rawatbhata, Rajasthan

Name : Monica Singh
Mobile Number : 8527869199
E-mail : monica.singh2018@vitstudent.ac.in
Permanent Address : RZF28A, Palam Colony, Sadh Nagar 2, New Delhi-110045

Name : Ashish Kumar
Mobile Number : 9110066731
E-mail : ashish.kumar2018a@vitstudent.ac.in
Permanent Address : Vill-Chaksahbaj, P.O.-Marar, P.S.-Parsa, Dist.-Saran, Pin-841219
