18bec1080 Ece3999 Report
SMART AGRICULTURE SYSTEM

by

PRAYANSH S JOSHI (18BEC1229)
SWATI TIWARI (18BEC1080)
KSHITIJ JAIN (18BEC1145)
SHIKHAR KUMAR PADHY (18BEC1078)
ASHISH KUMAR (18BEC1294)
MONICA SINGH (18BEC1205)

Under the guidance of
Dr. VETRIVELAN. P

ECE3999
TECHNICAL ANSWERS FOR REAL WORLD PROBLEMS

in

B.Tech. ELECTRONICS AND COMMUNICATION ENGINEERING
CERTIFICATE

Certified that this project report entitled “Smart Agriculture System” is a bonafide work of
Prayansh S Joshi (18BEC1229), Swati Tiwari (18BEC1080), Kshitij Jain (18BEC1145),
Shikhar Kumar Padhy (18BEC1078), Ashish Kumar (18BEC1294) and Monica Singh
(18BEC1205), who carried out the project work under my supervision and guidance for
ECE3999 - Technical Answers for Real World Problems.

Dr. VETRIVELAN. P
ABSTRACT

Agriculture is one of the broadest economic sectors and plays an important role in the overall
economic development of a nation. Technological advancement in agriculture can markedly
increase the efficiency of many farming activities.
In our project we propose a methodology for smart farming that links a smart sensing system
and a smart irrigation system through wireless communication technology. Our system
focuses on the measurement of physical parameters such as water level and air quality, along
with fire detection and leaf-disease detection, all of which play a vital role in farming
activities. Based on these essential physical and chemical parameters, we can help farmers
plan their farming strategies.
ACKNOWLEDGEMENT
We wish to express our sincere thanks and deep sense of gratitude to our project guide, Dr.
Vetrivelan. P, Associate Professor Senior, School of Electronics Engineering, for his
consistent encouragement and valuable guidance offered to us in a pleasant manner
throughout the course of the project work.
We express our thanks to our Head of the Department Dr. Vetrivelan. P for his
support throughout the course of this project.
We also take this opportunity to thank all the faculty of the School for their support
and their wisdom imparted to us throughout the course.
We thank our parents, family, and friends for bearing with us throughout the course of
our project and for the opportunity they provided us in undergoing this course in such a
prestigious institution.
TABLE OF CONTENTS

ABSTRACT
ACKNOWLEDGEMENT
1 INTRODUCTION
2 DESIGN
2.2 STANDARDS
3.1 SYSTEM IMPLEMENTATION (ALGORITHM)
4 COST ANALYSIS
5.1 CONCLUSION
6 REFERENCES
BIODATA
PRESENTATION LINK:
https://docs.google.com/presentation/d/1hKWIMIlOKZjm4WqwTS6OFmOVFx0sJ9-bJtPEpQJOjrI/edit?pli=1#slide=id.gdd69ada1c9_3_0
CHAPTER 1
INTRODUCTION
From a general perspective, everyone in this modern world wants to move one level up. In
this era every person is running a race to be the best, and "the best" is generally associated
with high-paying jobs, the corporate world, a big startup idea, and the like.
The usual movement of people has been from rural areas to urban areas, which is quite
understandable: for the past few decades most technological modernization has taken place
in urban areas, which also offer better-paying jobs.
Now that the world is facing the COVID outbreak, nothing has been spared, and jobs are no
exception. Many reputed companies have laid off employees in huge numbers, and one can
only imagine the impact on employees of smaller or economically weaker companies.
Hence we have seen a reverse movement from urban to rural areas, because agriculture has
remained a more dependable and reliable job sector.
So we planned to build an environment that is technologically sound and can help farmers
ease their work. We have also accommodated various global goals so that the largest sector
of our country can do its work in a sustainable way.
It is a step forward to meet the ever-increasing demands of humans and the evolution of
technologies.
The problem statement is based on a simple measurement scale known as the SOCIAL
PROGRESS INDEX (SPI).
Our idea is inspired by a recent meeting held at the UN, which discussed how the world can
become a better place by 2030. Understanding the need for these goals, we as a team agreed
to act on them and align our model with our aims, while also trying to address some of the
global issues.
The global goals discussed at the UN comprise several targets and sub-goals. We can
summarize all these targets using the SPI, a tool that sums up everything the global goals are
trying to achieve and can be used as a benchmark. The lowest performing country, the
Central African Republic, scores 31, whereas the best performing country, Norway, scores
88. The world average is 61, and achieving the global goals could raise it to 75.
There are no economic indicators in the Social Progress Index. Countries with huge
economies like Russia and China underperform on the SPI due to their lack of headway on
human rights or environmental issues.
But countries like Costa Rica, with smaller economies, perform much better on the SPI
because they focus on well-being and on most of the 17 listed global goals rather than on
wealth.
Hence, in our project we try to incorporate some of the 17 major global goals. By
implementing our model we were able to address:
13 - CLIMATE ACTION
15 - LIFE ON LAND
CHAPTER 2
... with the help of a capacitor divider arrangement and by using a Zigbee wireless
transmission system. Since the monitored data is interfaced with an internet server, the
operator can observe the fence status from anywhere in the world. This system is less
economical when compared with the other types of fault detection systems.
10. Solar Powered Smart Irrigation System
S. Harishankar, R. Sathish Kumar, Sudharsan K.P, U. Vignesh and T. Viveknath
Advance in Electronic and Electric Engineering, ISSN 2231-1297, Volume 4, Number 4
(2014), pp. 341-346, © Research India Publications, http://www.ripublication.com/aeee.htm

This paper proposes a system where solar energy from solar panels is utilized to pump water
from a bore well directly into a ground-level storage tank, with a single stage of energy
consumption. A simple valve mechanism controls the flow of water from the tank into the
field; the valve controller regulates the flow depending upon the moisture present in the
soil, measured by a soil moisture sensor.

The proposed eco-friendly system aims at conserving energy through optimal usage of water,
reducing wastage and human intervention for farmers. The excess energy from the solar
panels can also act as revenue for the farmers. Solar pumps offer clean solutions with no
danger of borehole contamination, and the system requires minimal maintenance and
attention as it is self-starting.
Step 1: Start
Step 4: If the water is below 1/3 of the tank level and the battery SOC is full, open the valve, turn the motor on, and go to Step 6
Step 6: If watering has run for the specified time, send the message "Watering done" and go to Step 7
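The watering steps above can be sketched in Python. This is a minimal illustrative model of the algorithm, not the report's hardware code; the names `level_fraction`, `soc_full`, and the valve/motor flags are assumptions introduced here.

```python
def irrigation_step(level_fraction, soc_full, watering_elapsed, watering_time):
    """Return (valve_open, motor_on, message) for one pass of the algorithm.

    level_fraction   -- current water level as a fraction of the tank (0..1)
    soc_full         -- True when the battery state of charge is full
    watering_elapsed -- seconds the motor has been running
    watering_time    -- specified watering duration in seconds
    """
    valve_open, motor_on, message = False, False, None
    # Step 4: below 1/3 of the tank level and SOC full -> open valve, motor on
    if level_fraction < 1 / 3 and soc_full:
        valve_open, motor_on = True, True
    # Step 6: watering has run for the specified time -> notify and stop
    if watering_elapsed >= watering_time:
        valve_open, motor_on = False, False
        message = "Watering done"
    return valve_open, motor_on, message
```

On the actual hardware the same decisions are made by the Arduino against the ultrasonic level reading, with the message sent over the Wi-Fi module.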
This system calculates the air-pollutant concentration in the environment using an MQ135
sensor and an Arduino Uno.
For transferring the data, we use the ThingSpeak platform with the ESP8266 Wi-Fi module.
The analog output voltage returned by the MQ135 is assumed to be directly proportional to
the concentration of pollutants in ppm.
Algorithm
START
if (t <= 500) {
  lcd.print("Fresh Air"); }
else if (t < 1000) {
  lcd.print("Poor Air");
  Serial.print("Poor Air"); }
else {
  lcd.print("Very Poor");
  Serial.print("Very Poor");
  SEND ALERT }
END;
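The threshold logic above can be checked in isolation with a small Python mirror, assuming `t` is the raw MQ135 reading and using the 500/1000 cut-offs from the algorithm; the function name is introduced here for illustration.

```python
def classify_air(t):
    """Classify a raw MQ135 reading using the report's thresholds."""
    if t <= 500:
        return "Fresh Air"
    elif t < 1000:
        return "Poor Air"
    else:
        # in the full system this band also triggers the alert
        return "Very Poor"
```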
Components Required
● Arduino Uno
● 16x2 Character LCD
● ESP8266 Wi-Fi Module
● MQ135 Gas Sensor
CIRCUIT DESIGN:
CODE:
#include <LiquidCrystal.h>
const int LM35 = A0;     // temperature sensor analog input
const int motor = 13;    // irrigation motor / relay pin
const int LedRed = 12;   // alert indicator LED
const int LedGreen = 11; // normal-state indicator LED
// (the setup() and loop() bodies of this listing were lost in extraction;
// only the closing lines of loop() survive)
delay(1000);
}
OUTPUT:
3.2.2 AIR QUALITY MONITORING
CODE:
#include <SoftwareSerial.h>
#include <LiquidCrystal.h>

// The pin assignments and ThingSpeak key below are illustrative placeholders;
// the original listing omitted these declarations.
SoftwareSerial ser(9, 10);              // RX, TX connected to the ESP8266
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);  // assumed LCD wiring
String apiKey = "YOUR_THINGSPEAK_KEY";  // placeholder write API key
float t = 0;                             // MQ135 analog reading
char data = 0;

void setup()
{
  Serial.begin(9600);
  ser.begin(9600);
  lcd.begin(16, 2);
  lcd.setCursor(0,0);
  lcd.print("    Welcome");
  lcd.setCursor(0,1);
  lcd.print("      To");
  delay(3000);
  lcd.clear();
  lcd.setCursor(0,0);
  lcd.print("      AIR");
  lcd.setCursor(0,1);
  lcd.print("QUALITY MONITOR");
  delay(3000);
  ser.println("AT");        // attention / handshake command
  delay(1000);
  ser.println("AT+GMR");    // view firmware version info (ESP-01: 00160901, ESP-12: 0018000902-AI03)
  delay(1000);
  // (the AT commands for mode selection and joining the access point were
  // lost in extraction; only their delays survive)
  delay(1000);
  delay(5000);
  delay(1000);
  String cmd = "AT+CIFSR";  // assumed: query the assigned IP address
  ser.println(cmd);
  delay(1000);
  lcd.clear();
  lcd.setCursor(0,0);
  lcd.print("     WIFI");
  lcd.setCursor(0,1);
  lcd.print("   CONNECTED");
}

void loop()
{
  t = analogRead(A0);       // assumed: MQ135 on analog pin A0
  delay(1000);
  Serial.print("Airquality = ");
  Serial.println(t);
  lcd.clear();
  lcd.print(t);
  lcd.setCursor(0,1);
  if (t <= 500) {
    lcd.print("Fresh Air");
  } else if (t < 1000) {
    lcd.print("Poor Air");
    Serial.print("Poor Air");
  } else {
    lcd.print("Very Poor");
    Serial.print("Very Poor");
  }
  delay(10000);
  lcd.clear(); lcd.setCursor(0,0);
  lcd.print("   UPLOADING");  // assumed first line of the status message
  lcd.setCursor(0,1);
  lcd.print("   TO CLOUD");
  esp_8266();
}

void esp_8266()
{
  // Open a TCP connection to the ThingSpeak endpoint commonly used with the
  // ESP8266 AT firmware (address assumed; the original line was lost).
  String cmd = "AT+CIPSTART=\"TCP\",\"184.106.153.149";
  cmd += "\",80";
  ser.println(cmd);
  Serial.println(cmd);
  if (ser.find("Error")) {
    Serial.println("AT+CIPSTART error");
    return;
  }
  String getStr = "GET /update?api_key=";  // assumed request prefix
  getStr += apiKey;
  getStr += "&field1=";
  getStr += String(t);
  getStr += "\r\n\r\n";
  cmd = "AT+CIPSEND=";
  cmd += String(getStr.length());
  ser.println(cmd);
  Serial.println(cmd);
  delay(1000);
  ser.print(getStr);
  Serial.println(getStr);
  delay(17000);  // ThingSpeak's free tier accepts roughly one update per 15 s
}
OUTPUT:
3.2.3 FLOOD DETECTION
int PIR = 0;
int Distance = 0;
long readUltrasonicDistance(int triggerPin, int echoPin)
{
pinMode(triggerPin, OUTPUT); // Clear the trigger
digitalWrite(triggerPin, LOW);
delayMicroseconds(2);
// Sets the trigger pin to HIGH state for 10 microseconds
digitalWrite(triggerPin, HIGH);
delayMicroseconds(10);
digitalWrite(triggerPin, LOW);
pinMode(echoPin, INPUT);
// Reads the echo pin, and returns the sound wave travel time in microseconds
return pulseIn(echoPin, HIGH);
}
void setup()
{
pinMode(13, INPUT);
pinMode(12, OUTPUT);
pinMode(6, OUTPUT);
}
void loop()
{
PIR = digitalRead(13);
delay(10); // Wait for 10 millisecond(s)
if (PIR == HIGH) {
digitalWrite(12, HIGH);
delay(1); // Wait for 1 millisecond(s)
} else {
digitalWrite(12, LOW);
}
Distance = 0.01723 * readUltrasonicDistance(5, 4);
if (Distance <= 100) {
tone(6, 880, 125); // play tone 69 (A5 = 880 Hz)
delay(125); // Wait for 125 millisecond(s)
} else {
noTone(6);
}
}
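The constant 0.01723 in the loop above converts the echo time in microseconds to a distance in centimetres: sound travels roughly 0.0343 cm per microsecond, and the echo covers the distance twice. A quick sanity check of that factor (function name introduced here for illustration):

```python
# Sound at ~343 m/s covers ~0.0343 cm per microsecond; the ultrasonic echo
# travels out and back, so the one-way distance is half the product.
SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_us_to_cm(duration_us):
    """One-way distance (cm) from an ultrasonic echo duration (us)."""
    return duration_us * SPEED_OF_SOUND_CM_PER_US / 2
```

This gives 0.01715 cm/us, which matches the sketch's 0.01723 to within the tolerance of the assumed speed of sound (the sketch's value corresponds to air at a slightly higher temperature).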
OUTPUT :
3.2.4 WATER LEVEL MONITORING
CODE:
#include <LiquidCrystal.h>
LiquidCrystal lcd(7,4,3,2,A2,A3);
// The identifiers below are used later in the listing but their declarations
// were lost in extraction; these are illustrative placeholders.
String ssid = "YOUR_SSID";
String password = "YOUR_PASSWORD";
String host = "api.thingspeak.com";      // assumed ThingSpeak host
int httpPort = 80;
String url = "/update?api_key=YOUR_KEY"; // placeholder update URL
const int motor = 13;   // motor relay pin (assumed)
const int sensor = 8;   // single-pin ultrasonic sensor (assumed)
long dist = 0;
float percent = 0;
int setupESP8266(void)
{
Serial.begin(115200);
Serial.println("AT");
delay(10);
if (!Serial.find("OK"))
return 1;
Serial.println("AT+CWJAP=\"" + ssid + "\",\"" + password + "\"");
delay(10);
if (!Serial.find("OK"))
return 2;
Serial.println("AT+CIPSTART=\"TCP\",\"" + host + "\"," + httpPort);
delay(50);
if (!Serial.find("OK"))
return 3;
return 0;
}
void setup()
{
lcd.begin(16, 2);
setupESP8266();
digitalWrite(motor, LOW);
}
void loop()
{
int motorstatus=0;
pinMode(sensor, OUTPUT);
digitalWrite(sensor, LOW);
delayMicroseconds(2);
digitalWrite(sensor, HIGH);
delayMicroseconds(5);
digitalWrite(sensor, LOW);
pinMode(sensor, INPUT);
dist = pulseIn(sensor, HIGH);
dist = dist/29/2;
percent = (dist * -100.0)/330 + 100;
if(percent<30)
{
motorstatus=1;
digitalWrite(motor, HIGH);
}
if(percent>90)
{
motorstatus=0;
digitalWrite(motor, LOW);
}
lcd.setCursor(0,0);
lcd.print("Percent : ");
lcd.print(percent);
lcd.setCursor(0,1);
lcd.print("Motor: ");
if(motorstatus==0)
lcd.print("Off");
else
lcd.print("On ");
String httpPacket = "GET " + url + "&field1=" + percent + "&field2=" + motorstatus + " HTTP/1.1\r\nHost: " + host + "\r\n\r\n";
int length = httpPacket.length();
Serial.print("AT+CIPSEND=");
Serial.println(length);
delay(10);
Serial.print(httpPacket);
delay(10);
Serial.println(dist);
if (!Serial.find("SEND OK\r\n"))
return;
}
OUTPUT :
3.2.5 EARLY FIRE DETECTION USING DEEP LEARNING AND SENDING
WHATSAPP ALERT USING SELENIUM
CODE:
import matplotlib.pyplot as plt
import numpy as np
import argparse
import os
from google.colab import drive
drive.mount('/content/drive')
# output: Mounted at /content/drive
def prep(data_folder_path):
    # The loop headers below were lost in extraction and are restored here
    # from the body's indentation and variable names.
    dirs = os.listdir(data_folder_path)
    images = []
    labels = []
    for dir_name in dirs:
        print(dir_name)
        if dir_name == '0':
            label = 0
        elif dir_name == '1':
            label = 1
        subject_dir_path = data_folder_path + "/" + dir_name
        subject_images_names = os.listdir(subject_dir_path)
        for image_name in subject_images_names:
            if image_name.startswith("."):
                continue
            image_path = subject_dir_path + "/" + image_name
            images.append(image_path)
            labels.append(label)
    return images, labels
APPENDING IMAGES
# The for-loop headers in this cell were lost in extraction; restored from context.
tl = len(train_img)
train_data = []
c = 0
for path in train_img:
    img = load_img(path, target_size=(224, 224))
    img = img_to_array(img)
    img = preprocess_input(img)
    train_data.append(img)
    c += 1
print(c)
tl = len(test_img)
test_data = []
c = 0
for path in test_img:
    img = load_img(path, target_size=(224, 224))
    img = img_to_array(img)
    img = preprocess_input(img)
    test_data.append(img)
    c += 1
print(c)
CONVERTING TO ARRAYS
trainY.shape
DATA AUGMENTATION
aug = ImageDataGenerator(
    rotation_range=20,
    zoom_range=0.15,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.15,
    horizontal_flip=True,
    fill_mode="nearest")
# The base-model construction and output layer were lost in extraction; the
# report's text says a MobileNet backbone is used, so the added lines are a
# plausible reconstruction, not the verbatim original.
headModel = baseModel.output
headModel = Flatten(name="flatten")(headModel)
headModel = Dropout(0.5)(headModel)
headModel = Dense(2, activation="softmax")(headModel)  # fire / no-fire (assumed)
model = Model(inputs=baseModel.input, outputs=headModel)
for layer in baseModel.layers:
    layer.trainable = False   # freeze the pretrained base
INIT_LR = 1e-4
EPOCHS = 100
BS = 64
opt = Adam(lr=INIT_LR,decay=INIT_LR/EPOCHS)
model.compile(optimizer=opt,loss="sparse_categorical_crossentropy",metrics=["accuracy"])
from tensorflow import keras
class callbacks(keras.callbacks.Callback):
    def on_epoch_end(self, epochs, logs={}):
        # NOTE: the stopping condition was lost in extraction; as written this
        # would stop after the first epoch.
        self.model.stop_training = True
callbacks = callbacks()
H = model.fit(
    aug.flow(trainX, trainY),        # (x, y) argument order corrected
    steps_per_epoch=len(trainX)//BS,
    validation_data=(testX, testY),  # (x, y) argument order corrected
    validation_steps=len(testX)//BS, epochs=EPOCHS, shuffle=True, callbacks=[callbacks])
TRAINING
acc = H.history['accuracy']
val_acc = H.history['val_accuracy']
loss = H.history['loss']
val_loss = H.history['val_loss']
epochs = range(len(acc))
plt.figure(figsize=(10,6))
plt.plot(epochs,acc)
plt.plot(epochs,val_acc)
plt.figure()
plt.plot(epochs,loss)
plt.plot(epochs,val_loss)
model.save_weights("model_fire_weights.h5")
# Reload the architecture from JSON (the file-opening lines were lost in
# extraction; the file name here is assumed).
with open("model_fire.json", "r") as json_file:
    loaded_model_json = json_file.read()
model = model_from_json(loaded_model_json)
model.load_weights("/content/drive/My Drive/archive/model_fire_weights.h5")
LOADING TRAINED MODEL
#path = "/content/drive/My Drive/archive/test/1/100.jpg"
path = "/content/drive/My Drive/archive/test/0/450.jpg"
#path = "/content/drive/My Drive/archive/test/0/450.jpg"
img = load_img(path, target_size=(224,224))
img1 = cv2.imread(path)
img1 = cv2.cvtColor(img1, cv2.COLOR_BGR2RGB)
#re = cv2.imread(path)
#img = cv2.cvtColor(re, cv2.COLOR_BGR2RGB)
#img = cv2.resize(img,(224,224))
img = img_to_array(img)
img = preprocess_input(img)
img = np.expand_dims(img, axis=0)
last = time.time()
pred = model.predict(img)
#print(time.time()-last)
op = clas[np.argmax(pred)]
e = np.argmax(pred)
x = round(pred[0][e]*100, 2)
conf = str(x) + '%'
conf
if x > 80.0:
    cv2.putText(img1, op, (25,50), cv2.FONT_HERSHEY_SIMPLEX, 2, (0,200,0), 3)
plt.imshow(img1)
TESTING WITH THREE DIFFERENT IMAGES
def take_photo(filename='photo.jpg', quality=0.8):
    # (parts of this Colab camera-capture snippet were lost in extraction;
    # only the surviving lines are reflowed here)
    js = Javascript('''
        div.appendChild(capture);
        const video = document.createElement('video');
        video.style.display = 'block';
        document.body.appendChild(div);
        div.appendChild(video);
        video.srcObject = stream;
        await video.play();
        google.colab.output.setIframeHeight(document.documentElement.scrollHeight, true);
        const canvas = document.createElement('canvas');
        canvas.width = video.videoWidth;
        canvas.height = video.videoHeight;
        canvas.getContext('2d').drawImage(video, 0, 0);
        stream.getVideoTracks()[0].stop();
        div.remove();
    ''')
    return filename

filename = take_photo()
import cv2
import time
clas = ["no fire", "fire"]
path = filename
img = load_img(path, target_size=(224,224))
img1 = cv2.imread(path)
img1 = cv2.cvtColor(img1, cv2.COLOR_BGR2RGB)
#re = cv2.imread(path)
#img = cv2.cvtColor(re, cv2.COLOR_BGR2RGB)
#img = cv2.resize(img,(224,224))
img = img_to_array(img)
img = preprocess_input(img)
img = np.expand_dims(img, axis=0)
last = time.time()
pred = model.predict(img)
#print(time.time()-last)
op = clas[np.argmax(pred)]
e = np.argmax(pred)
x = round(pred[0][e]*100, 2)
conf = str(x) + '%'
conf
if x > 80.0:
    cv2.putText(img1, op, (25,50), cv2.FONT_HERSHEY_SIMPLEX, 2, (0,200,0), 3)
plt.imshow(img1)
OUTPUT
SENDING WHATSAPP ALERT USING SELENIUM:
filename = take_photo()
import cv2
import time
clas = ["no fire", "fire"]
path = filename
img = load_img(path, target_size=(224,224))
img1 = cv2.imread(path)
img1 = cv2.cvtColor(img1, cv2.COLOR_BGR2RGB)
#re = cv2.imread(path)
#img = cv2.cvtColor(re, cv2.COLOR_BGR2RGB)
#img = cv2.resize(img,(224,224))
img = img_to_array(img)
img = preprocess_input(img)
img = np.expand_dims(img, axis=0)
last = time.time()
pred = model.predict(img)
#print(time.time()-last)
op = clas[np.argmax(pred)]
e = np.argmax(pred)
x = round(pred[0][e]*100, 2)
conf = str(x) + '%'
print(x)
print(op)
op = op + '-' + conf
if x > 80.0:
    cv2.putText(img1, op, (25,50), cv2.FONT_HERSHEY_SIMPLEX, 2, (0,200,0), 3)
plt.imshow(img1)
if e == 1:
    driver = webdriver.Chrome('/content/drive/My Drive/selenium/chromedriver_win32/chromedriver.exe')
    driver.get('http://web.whatsapp.com')
    msg = "FIRE DETECTED PLEASE HELP..."
    name = 'Fire'
    user = driver.find_element_by_xpath('//span[@title = "{}"]'.format(name))
    user.click()
    msg_box = driver.find_element_by_xpath('//*[@id="main"]/footer/div[1]/div[2]/div/div[2]')
    msg_box.send_keys(msg)
    driver.find_element_by_xpath('//*[@id="main"]/footer/div[1]/div[3]/button/span').click()
OUTPUT:
COST ANALYSIS:
Camera: Rs 1200
TOTAL COST: Rs 1900
Code:
import numpy as np
import pickle
import cv2
Preprocessing of dataset:
EPOCHS = 20
INIT_LR = 1e-3
BS = 32
image_size = 0
width=256
height=256
depth=3
def convert_image_to_array(image_dir):
    try:
        image = cv2.imread(image_dir)
        if image is not None:
            return img_to_array(image)
        else:
            return np.array([])
    except Exception as e:
        print(f"Error : {e}")
        return None
# The nested for-loop headers in this cell were lost in extraction and are
# restored here from the loop bodies and variable names; the list
# initialisations are added so the appends below have a target.
image_list, label_list = [], []
try:
    root_dir = listdir(directory_root)
    for directory in root_dir:
        if directory == ".DS_Store":
            root_dir.remove(directory)
    for plant_folder in root_dir:
        plant_disease_folder_list = listdir(f"{directory_root}/{plant_folder}")
        for disease_folder in plant_disease_folder_list:
            if disease_folder == ".DS_Store":
                plant_disease_folder_list.remove(disease_folder)
        for plant_disease_folder in plant_disease_folder_list:
            plant_disease_image_list = listdir(f"{directory_root}/{plant_folder}/{plant_disease_folder}/")
            for single_plant_disease_image in plant_disease_image_list:
                if single_plant_disease_image == ".DS_Store":
                    plant_disease_image_list.remove(single_plant_disease_image)
            for image in plant_disease_image_list:
                image_directory = f"{directory_root}/{plant_folder}/{plant_disease_folder}/{image}"
                image_list.append(convert_image_to_array(image_directory))
                label_list.append(plant_disease_folder)
except Exception as e:
    print(f"Error : {e}")
image_size = len(image_list)
label_binarizer = LabelBinarizer()
image_labels = label_binarizer.fit_transform(label_list)
pickle.dump(label_binarizer,open('label_transform.pkl', 'wb'))
n_classes = len(label_binarizer.classes_)
Various Classification of diseases:
print(label_binarizer.classes_)
Normalisation of array:
Data augmentation:
aug = ImageDataGenerator(
rotation_range=25, width_shift_range=0.1,
height_shift_range=0.1, shear_range=0.2,
zoom_range=0.2,horizontal_flip=True,
fill_mode="nearest")
model = Sequential()
inputShape = (height, width, depth)
chanDim = -1
if K.image_data_format() == "channels_first":
    chanDim = 1
# NOTE: the Conv2D lines preceding each Activation were lost in extraction;
# the filter counts below are a plausible reconstruction, not the verbatim report.
model.add(Conv2D(32, (3, 3), padding="same", input_shape=inputShape))
model.add(Activation("relu"))
model.add(BatchNormalization(axis=chanDim))
model.add(MaxPooling2D(pool_size=(3, 3)))
model.add(Dropout(0.25))
model.add(Conv2D(64, (3, 3), padding="same"))
model.add(Activation("relu"))
model.add(BatchNormalization(axis=chanDim))
model.add(Conv2D(64, (3, 3), padding="same"))
model.add(Activation("relu"))
model.add(BatchNormalization(axis=chanDim))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Conv2D(128, (3, 3), padding="same"))
model.add(Activation("relu"))
model.add(BatchNormalization(axis=chanDim))
model.add(Conv2D(128, (3, 3), padding="same"))
model.add(Activation("relu"))
model.add(BatchNormalization(axis=chanDim))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(1024))
model.add(Activation("relu"))
model.add(BatchNormalization())
model.add(Dropout(0.5))
model.add(Dense(n_classes))
model.add(Activation("softmax"))
model.summary()
Training Model:
opt = Adam(lr=INIT_LR, decay=INIT_LR / EPOCHS)  # optimizer (line elided in the extracted listing)
model.compile(loss="binary_crossentropy", optimizer=opt, metrics=["accuracy"])
history = model.fit(
    aug.flow(x_train, y_train, batch_size=BS),  # augmented generator (restored from context)
    validation_data=(x_test, y_test),
    steps_per_epoch=len(x_train) // BS,
    epochs=EPOCHS, verbose=1
)
Plotting of Model Parameters:
accuracy = history.history['accuracy']
val_accuracy = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
# (the plt.plot calls were lost in extraction; restored in the usual pattern)
epochs_range = range(len(accuracy))
plt.plot(epochs_range, accuracy, label='train accuracy')
plt.plot(epochs_range, val_accuracy, label='val accuracy')
plt.legend()
plt.figure()
plt.plot(epochs_range, loss, label='train loss')
plt.plot(epochs_range, val_loss, label='val loss')
plt.legend()
plt.show()
Calculating Model accuracy:
PV Sizing: Different sizes of PV modules will produce different amounts of power. To find
out the sizing of the PV module, the total peak watt produced is needed. The peak watt (WP)
produced depends on size of the PV module and climate of site location. To determine the
sizing of the PV modules, calculate as follows:
Total Load Connected =[D.C Pump Power Rating * Time of usage] + [Remaining
Components Power Rating* Time of usage]
Battery Sizing: The Amp-hour (Ah) Capacity of a battery tries to quantify the amount of
usable energy it can store at a nominal voltage. All things equal, the greater the physical
volume of a battery, the larger its total storage capacity.
Total Load Connected = Sum of all appliances (power rating of each device * Time of usage)
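The sizing arithmetic above can be sketched as a small Python helper. The appliance ratings, nominal voltage, and depth-of-discharge figure in the example are hypothetical placeholders, not values from the report; they should be replaced with the actual farm loads.

```python
def total_load_wh(appliances):
    """Total load connected: sum of (power rating in W * time of usage in h)."""
    return sum(watts * hours for watts, hours in appliances)

def battery_capacity_ah(load_wh, nominal_voltage=12.0, depth_of_discharge=0.5):
    """Amp-hour capacity needed to supply the load at a nominal voltage,
    de-rated by the usable depth of discharge (both values assumed here)."""
    return load_wh / (nominal_voltage * depth_of_discharge)

# Example: a 100 W DC pump for 2 h plus 10 W of electronics for 24 h
load = total_load_wh([(100, 2), (10, 24)])  # 440 Wh
```

The peak-watt figure for PV sizing follows the same pattern: divide the daily load in Wh by the site's equivalent full-sun hours to get the required panel wattage.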
1. Design a sustainable low-cost, low-power, irrigation system to control and reduce water
waste in farming
3. The sub-system should be able to operate in real-time, and remotely without human
intervention
4. Economics should be kept in mind to keep parts and configuration costs as low as possible
5. The design should be tailored to the type of crops being grown and the landscape
3) FLOOD DETECTION:
Water level for water-level monitoring is measured as a percentage of the total volume of the
container, and hence is variable.
For example:
Tank capacity: to find the capacity of a rectangular or square tank, multiply length (L) by
width (W) to get area (A); multiply area by height (H) to get volume (V); multiply volume
by 7.48 gallons per cubic foot to get capacity (C).
Suppose the capacity comes to around 30 gallons. Then if the water level is below 30%, i.e.
below 9 gallons in this case, the alarm will be raised, or the motor will be turned on
automatically (in the case of water level monitoring).
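The tank-capacity arithmetic above can be written as a small helper. The dimensions in the test are illustrative; the 7.48 gal/ft³ factor and the 30 % alarm threshold come from the text, while the function names are introduced here.

```python
def tank_capacity_gallons(length_ft, width_ft, height_ft):
    """C = L * W * H * 7.48, with dimensions in feet."""
    return length_ft * width_ft * height_ft * 7.48

def low_water_alarm(current_gallons, capacity_gallons, threshold=0.30):
    """True when the water level falls below the threshold fraction
    of the tank's capacity (30 % in the report's example)."""
    return current_gallons < threshold * capacity_gallons
```

For the 30-gallon example in the text, the alarm fires once the level drops below 0.30 * 30 = 9 gallons.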
1. Accuracy and Precision: Sensors vary in their accuracy. Users need to determine the
level of precision required to credibly measure the variable(s) of interest.
Manufacturers often provide a sensor specification data sheet, which contains
information about sensor sensitivity, range, energy requirements, and operating
conditions. Yet field-testing of sensors is required in the actual location of
deployment, as lab and field conditions may vary.
2. Frequency and Duration: Another trade-off is frequency (how often) and duration
(total period) of data reading and logging. High frequency monitoring—for example,
with samples taken multiple times per minute—can be costly in terms of power, data
storage, and data transmission. In general, there are two methods for logging data:
3. Data Retrieval: There are two primary options for storing and retrieving sensing
data: manually or remotely. Manual data storage and retrieval is usually done
through a micro secure digital (SD) card incorporated into the sensor. This is relatively
cheap and minimizes the size and cost of sensors being deployed. This option requires
end-user interaction on-site to manually download the data, either through data cables
or through Bluetooth, RFID or near field communications (NFC).
4. Remote Monitoring and Calibration: Sensors can be programmed to transmit data
to the cloud for sharing and visualization in real or near-real time, using services like
Xively or Open.Sen.se. Some platforms provide custom dashboards to view streaming
data, create instant reports, or update sensor calibration and reporting parameters
remotely. This can be particularly useful when many sensors are deployed over a
large area and the data collection strategy is expected to change after a period of time.
5. Power Requirements: In resource-constrained environments, where access to
continuous power is challenging or impossible, sensors often rely on stored power.
For users, there is a clear tradeoff between measurement frequency and energy
consumption: the lower the frequency, the higher the battery life. Since field trips to
replace batteries can be time-consuming and expensive, organizations and engineers
often design innovative methods to decrease the sensor's energy requirements.
6. Cost: The price tag for any sensor or sensor network will include fixed costs
(materials) as well as expenses for operations and maintenance, including data
transmission or retrieval, batteries or other power supply, and labor. In addition,
technical support may be required to deploy, calibrate, troubleshoot, or repair sensors
in the field. Sensors deployed in resource-constrained environments may require
additional packaging to protect against precipitation or extreme heat; this can also
drive up costs.
7. Community Response: When deploying sensors for the SMART AGRICULTURE
SYSTEM, organizations need to consider closely how the community will react to the
devices, and how this can affect data quality. Are you measuring human behaviors, or
capturing environmental data? Where will the sensors be placed: in homes, or outside
in public areas? How does the monitored population interact with the devices? Are
you carrying out participatory measurement, with individual community members
helping to collect the data?
To overcome these issues, organizations can carry out a reactivity study that compares
data from a group with knowledge of the sensor, to one that does not know when the
sensor is being deployed. It is also possible to triangulate between different data
collection methods (e.g., comparing sensing data with survey data that measure the
same variables) to estimate the bias in each data generating process.
TIME COMPLEXITY:
FOR WATER LEVEL, AIR QUALITY AND SOLAR POWERED AUTOMATIC
IRRIGATION:
Since the code in each module only has one if-condition that checks the threshold level,
and since its statements are O(1), the total time for the loop over N readings is N * O(1),
which is O(N) overall.
FOR FIRE DETECTION:
The DNN contains hidden layers and training images, and we use a MobileNet backbone,
which has very few parameters compared to other models; here we use the fire category
only. So the total time complexity of the model is O(N^2). This method saves a lot of
computation time in screening the fire pixels; it operates quickly and does not involve
complex calculations.
FOR LEAF DISEASE DETECTION:
The CNN model that we used relies on different libraries with different time complexities;
the highest is O(n^3), from the three nested loops used in converting images to arrays.
SPACE COMPLEXITY:
Each iteration of the loop needs O(1) space for temporary variables, but this space is reused:
after each iteration finishes, the space for those temporaries is no longer needed and can be
reused in the next iteration. Therefore, the total space needed is just O(1).
The number of units is a measure of space complexity, and the model does not involve
complex calculations. In this model we did training and testing of images for neural
networks, so the total space complexity is O(n^2).
FOR LEAF DISEASE DETECTION :
Space complexity is the amount of space required by the model, so space complexity of CNN
model is O(n^2 k).
ECONOMIC FEASIBILITY :
Considering the battery life expectancy and reliable single-hop communication abilities, IoT
monitoring systems are believed to be among the most reliable solutions for IAQ
measurement. With lower latencies and lower power consumption, these systems also
demand less maintenance effort. Since the model only uses the basic Arduino module
(ATmega) and sensors like the MQ135 and an ultrasonic sensor, plus basic modules like the
Wi-Fi module, it is very pocket friendly, and the whole setup is made in a way that farmers
don't feel an economic burden.
ENVIRONMENTAL FEASIBILITY :
IoT sensors reduce energy consumption, generate renewable energy on-site, and measure
carbon consumption plus waste. IoT-powered precision agriculture can be another facilitator
of change. Producing more and wasting less is the major rationale behind smart, data-driven
agricultural solutions. IoT is already in place when it comes to monitoring crops and soil
conditions; with this model we can monitor and reduce greenhouse gas emissions as well.
DEMOGRAPHIC FEASIBILITY :
Data from devices can guide farmers’ decisions, helping them farm smarter and safer and
adapt more quickly to changing conditions.
The ability to monitor farm conditions and infrastructure remotely can free up time, labour
and capital to invest, allowing farmers to focus on other things.
● remote monitoring of farm conditions and infrastructure, saving time and labour on
routine farm checks
● efficiency in how we produce food to ensure less wastage, expediency to market, and
enhanced traceability to demonstrate safe and sustainable food to our customers
● building the capabilities to respond to new and emerging technologies and investing
in research and development to contribute to ongoing innovation and improved productivity.
CHAPTER 4
COST ANALYSIS
6. 12V Relay – Rs 20
Miscellaneous – Rs 200
TOTAL COST:
CHAPTER 5
5.1 CONCLUSION
The SMART AGRICULTURE SYSTEM was built and implemented for our target audience,
mainly farmers. This prototype can be used to study the conditions of the land, ease farm
labour, make the farm smarter and safer, and adapt more quickly to changing conditions.
The ability to monitor farm conditions and infrastructure remotely can free up time, labour
and capital to invest, allowing farmers to focus on other things.
The system is implemented using the Arduino for the irrigation, air and water detection
systems, and uses a Raspberry Pi controller for faster computation and minimum delay in
fire detection using deep learning.
The preliminary test results were promising. Due to the lack of availability of open areas and
free movement, a complete real-time analysis could not be conducted, and hence we finalised
our observations based on online simulation using Tinkercad.
This project was an immense opportunity at understanding interdisciplinary work and the
practical relationship between our specializations. As much as possible the work was carried
out with vigour and excitement. The SMART AGRICULTURE SYSTEM was successfully
carried out to the best of our abilities.
1. We can further add more features to our project, like SOLAR POWERED FENCING
for security and for protecting the farms from wild animals.
2. After conducting surveys with the farmers and based on the area, we can suggest
solar paints on houses and electricity-producing wells.
3. Modifying the electronic circuit with advanced and latest sensors will lead to better
precision and accuracy in results.
4. Improved sensor placement can help the system detect the conditions accurately
enough to prevent information repetition.
5. The threshold limits for water level and air quality can be changed based on the
conditions of the locations.
6. The data set obtained from the initial phase can be used for research purposes.
7. Design and integration of crop monitoring via a web application or an Android-based
app is also an interesting route.
CHAPTER 6
REFERENCES
[1] S. Harishankar, R. Sathish Kumar, Sudharsan K.P, U. Vignesh and T. Viveknath, “Solar
Powered Smart Irrigation System”, Advance in Electronic and Electric Engineering, Research
India Publications, ISSN 2231-1297, Volume 4, Number 4 (2014), pp. 341-346.
http://www.ripublication.com/aeee.htm
[3] Shuxiao Wang, Jiming Hao, “Air quality management in China: Issues, challenges, and
options”, Journal of Environmental Sciences (2012), Vol. 24, Issue 1, pp. 2-13, ISSN
1001-0742, CN 11-2629/X. DOI: 10.1016/S1001-0742(11)60724-9. http://www.jesc.ac.cn
[11] Jirapon Sunkpho and Chaiwat Ootamakorn, “Real-time flood monitoring and warning
system”, School of Engineering and Resources, Walailak University, Songklanakarin J. Sci.
Technol. 33 (2), 227-235, 2011.
https://www.researchgate.net/publication/263922229_Real-time_flood_monitoring_and_warning_system
[12] J G Natividad and J M Mendez, “Flood Monitoring and Early Warning System Using
Ultrasonic Sensor”, 1) ICT Department, Isabela State University – Ilagan Campus, City of
Ilagan; 2) College of Computer Studies and Engineering, Lorma Colleges, San Fernando
City, La Union, Philippines. DOI: 10.1088/1757-899X/325/1/012020.
https://iopscience.iop.org/article/10.1088/1757-899X/325/1/012020/pdf
[13] M. Anantha Kumar, “Design of solar powered energizer and on-line monitoring of
electric fencing system”, Dept. of Electrical and Electronics Engg, Sri Krishna College of
Engg and Tech. DOI: 10.1109/ISCO.2014.7103946.
https://ieeexplore.ieee.org/document/7103946
[14] Mubarak A. I. Mahmoud and Honge Ren, “Forest Fire Detection Using a Rule-Based
Image Processing Algorithm and Temporal Variation”, Volume 2018.
DOI: https://doi.org/10.1155/2018/7612487
[15] Ahmad Zarkasi, Siti Nurmaini, Deris Stiawan, Firdaus Masdung, “Implementation of
Fire Image Processing for Land Fire Detection Using Color Filtering Method”, Journal of
Physics Conference Series, March 2019. DOI: 10.1088/1742-6596/1196/1/012003
BIODATA

Swati Tiwari
Mobile Number: 9587119360
E-mail: swati.tiwari2018@vitstudent.ac.in
Permanent Address: P-385/4, Lancer Line, MES Colony, Army Area, Jodhpur, Rajasthan

Shikhar Kumar Padhy
Mobile Number: 9438052098
E-mail: shikharkumar.padhy2018@vitstudent.ac.in
Permanent Address: Gandhinagar 10th Lane, Bijipur, Berhampur, Odisha

Kshitij Jain
Mobile Number: 9664339875
E-mail: kshitij.jain2018@vitstudent.ac.in
Permanent Address: Type 4/22-A, Anukiran Colony, Rawatbhata, Rajasthan

Ashish Kumar
Mobile Number: 9110066731
E-mail: ashish.kumar2018a@vitstudent.ac.in
Permanent Address: Vill. Chaksahbaj, P.O. Marar, P.S. Parsa, Dist. Saran, PIN 841219