
IOT BASED PESTICIDE SPRAYING

ROBOT

A Project Report Submitted to the Faculty of Engineering & Technology at Eastern


University in partial fulfillment of the requirements for the Award of Degree of
Bachelor of Science in Electrical and Electronic Engineering.

Submitted By

Xxx
Xxx
xxx

Supervised by
xxx
Assistant Professor
Department of Electrical & Electronic Engineering Faculty of Engineering &
Technology

Eastern University

Declaration
It is hereby declared that the whole thesis or part of it has not been taken from other
works without reference. It is also declared that this thesis or part of it has not been
submitted elsewhere for the award of any degree or diploma.

Xx
Xx
Xx

Approval

The thesis titled "IoT Based Pesticide Spraying Robot", submitted by xxx, has
been accepted as satisfactory in partial fulfillment of the requirements for the
degree of Bachelor of Science in Electrical and Electronic Engineering.

Board of Examiners

1.
Prof. Dr. Mirza Golam Rabbani Chairman
Chairperson
Department of Electrical and Electronic Engineering
Faculty of Engineering and Technology
Eastern University Dhaka, Bangladesh

2.
Mehedy Hasan Sumon Member
Lecturer
Department of Electrical and Electronic Engineering
Faculty of Engineering and Technology
Eastern University Dhaka, Bangladesh

3.
Kazi Rizwanul Huq Member

Assistant Professor

Department of Electrical and Electronic Engineering
Faculty of Engineering and Technology
Eastern University Dhaka, Bangladesh

4.
Md. Maidul Islam Supervisor

Assistant Professor
Department of Electrical and Electronic Engineering
Faculty of Engineering and Technology
Eastern University Dhaka, Bangladesh

Acknowledgement

First of all, we would like to express our sincere gratitude to our advisor Md. Maidul
Islam for giving us the opportunity to conduct research work with him during our
undergraduate studies. We would like to thank him for his continuous mentoring,
support, and guidance throughout the work. We could not have imagined having a
better advisor and mentor for our B.Sc. Engineering degree. We would also like to
thank our family and friends for their encouragement and insightful comments.
Again, special thanks to Md. Maidul Islam for providing data and research guidance.
This study would not have been accomplished without the research assistantship from
our advisor. Thank you all.

Dedicated to

Our Parents

Abstract

The management of pest insects is a critical component of agricultural production,
especially in fertigation-based farms. Although fertigation farms in Malaysia have
advantages in fertilization and irrigation management, they still lack an adequate pest
management system. Since most insects and pests live under the crop's leaves, spraying
under the leaves is difficult, labor-intensive work. Most agricultural plants are damaged,
weakened, or killed by insect pests. This results in reduced yields, lowered quality, and
damaged plants or plant products that cannot be sold. Even after harvest, insects
continue their damage in stored or processed products. Therefore, the aim of this study
is to design and develop an autonomous pesticide sprayer for the chili fertigation
system. The study also intends to implement a flexible sprayer arm to spray pesticide
under the crop's leaves. This work involves the development of an unmanned pesticide
sprayer that can be mobilized autonomously, because pesticide is a hazardous substance
that can affect human health if workers are exposed to it during manual spraying,
especially in a closed area such as a greenhouse. The flexible sprayer boom can also be
controlled in both greenhouse and outdoor environments such as open-field farms. A
successful pesticide management system in fertigation-based farms is expected using
the autonomous pesticide sprayer robot. Besides chili, the proposed sprayer can also be
used for various crops such as rock melon, tomato, papaya, pineapple, and vegetables.
The sprayer is a device for precise pesticide application capable of handling amorphous
shapes and variable object targets. The device incorporates a single spray pump motor
with automatically adjustable spraying using ultrasonic sensors, all mounted on a
pan-tilt unit. The site-specific spraying device aims to spray specific targets while
reducing the use of pesticides. The proposed framework includes the development of
an object-specific sprayer configuration. The developed device reduces pesticide
application by spraying individual targets specifically, setting the spray distance
according to the target. The spraying device is capable of decreasing the amount of
pesticide applied; actual savings depend on spraying lengths, target size, and
distribution. We believe that such a device can be used in modern farming and can be
combined with a robotic sprayer navigating autonomously along crop fields. Such a
device will contribute to reduced pesticide application.

Contents
Name of Contents Page No
Declaration i
Acknowledgement ii
Abstract iii
Contents iv
List of Figures vi
List of Tables vii
Abbreviations viii

Chapter 1: Introduction

1.1 Introduction 1
1.2 Objective of the Project 2
1.3 Motivation 2
1.4 Proposed System 3
1.5 Organization of the Project Book 4

Chapter 2: System Design

2.1 Introduction 5
2.2 Block Diagram 5
2.2.1 Flow Chart 6
2.3 What is IoT? 6
2.4 Agriculture Pesticide Robot 8

Chapter 3: Hardware Implementation

3.1 Introduction 9
3.1.1 Required Components 9
3.2 L298N Motor Driver 12
3.2.1 Pins of L298N Motor Driver 12
3.3 Water Pump 15
3.4 DC Gear Motor 15
3.5 Jumper Wire 16
3.6 DC Power Supply 16
3.7 Smart Phone 17
3.8 ESP 32 Cam 17
3.8.1 Pins of ESP 32 CAM 18
3.9 Arduino IDE Software 18
3.10 Spraying Nozzle 20
3.11 Circuit Diagram 22
3.12 Required software and their set up process 22
3.13 Software 22
3.13.1 Arduino IDE Installation 23

3.13.2 Installation and setup of the Arduino software 23
3.14 Programming 24
3.14.1 Arduino program development 24

Chapter 4: Experimental Result

4.1 Introduction 26
4.2 Final Result 27
4.3 Project Outlook 28
4.4 Discussion 28

Chapter 5: Conclusion

5.1 Introduction 30
5.2 Advantages 30
5.3 Limitations 30
5.4 Future Improvement 30
5.5 Cost Analysis 31
5.6 Conclusion 31

References 32

Appendix 34

Programming Code For this project 34

List of Figures

Figure No. Figure Caption Page No
Figure 2.1 Block Diagram 5
Figure 2.2 Flow Chart 6
Figure 2.3 IoT (Internet of Things) 7
Figure 2.4 Flowchart of IoT (Internet of Things) 7
Figure 2.5 Agriculture Pesticide Spraying Robot 8
Figure 3.1 Motor Driver 12
Figure 3.2 Water Pump 15
Figure 3.3 DC Gear Motor 15
Figure 3.4 Jumper Wire 16
Figure 3.5 18650 LiPo Battery 16
Figure 3.6 Mobile Phone 17
Figure 3.7 ESP 32 CAM 17
Figure 3.8 Arduino IDE 19
Figure 3.9 Spraying Nozzle 21
Figure 3.10 Circuit Diagram 22
Figure 3.11 Program installation Process-1 23
Figure 3.12 Program installation Process-2 24
Figure 3.13 Flowchart of the compiling process 24
Figure 4.1 Controlling View of Robot, Camera 27
Figure 4.2 Project Outlook 28

List of Tables

Table No. Table Caption Page No
Table 3.1 Pins of L298N Motor Driver 12
Table 3.2 Pins of ESP 32 CAM 18
Table 5.1 Cost Analysis of this project 31

Abbreviations

SL No Short Name Full Form
1 IoT Internet of Things
2 MCU Microcontroller Unit
3 I2C Inter-Integrated Circuit
4 UART Universal Asynchronous Receiver Transmitter

Chapter 1
Introduction

1.1 Introduction
Farming is Bangladesh's cornerstone. In our nation, approximately 14.3 million
hectares of soil are irrigated crop land. The Economic Survey says that there is a need
to improve farm mechanization in the nation. Pest control plays a significant role in
increasing productivity, and farmers are facing significant issues in managing pest
infestation. Pests are undesirable insects or germs that interfere with human activity
and can bite, ruin food plants, or make life harder for farmers. A key point in crop
management is early detection and avoidance of pests. Effective control of pests
needs some understanding of pests and their habitats. Farmers currently spray
pesticides around their fields manually.
The main disadvantages of this method are as follows: the pesticide may come into
contact with the farmer during spraying, which may trigger skin cancer and asthma.
Excessive pesticide spraying can also impact consumer health as residues enter the
food chain. Pesticides are sometimes sprayed on unaffected crops as well, resulting in
waste. We have therefore created an automated robotic system that sprays pesticides
in restricted quantities, only where pests are discovered, to solve the above-mentioned
problems. Not only does this save the farmer from life-threatening illnesses and
physical issues, but it also saves money because of restricted pesticide use, which in
turn helps farmers, and the nation, to develop economically. Using this form of robot,
the time consumed in spraying the pesticide liquid is decreased, and it will also help
farmers reduce their workload and do the job in any season and conditions. In
Bangladesh, farming is still performed using traditional methods. The absence of
adequate knowledge among most of our farmers makes it even more erratic. A big
part of farming and agricultural operations is based on projections, which sometimes
fail; farmers must then bear enormous losses, which sometimes even drives them to
suicide. Since we know the advantages of proper soil moisture and its consistency,
air quality, and irrigation, these criteria cannot be ignored in crop growth. Therefore,
we produced a fresh concept of using IoT to monitor crops and practice intelligent
farming. Because of its reliability and remote monitoring, we think our idea will be a
benchmark in agribusiness. Our concept is the digitalization of agriculture and
farming operations so farmers can track crop requirements and predict development
correctly. Surely this idea will speed up their business to achieve new heights and be
more lucrative as well. Implementing our project relies mainly on farmers'
awareness, which we think will be readily generated owing to its countless benefits.

1.2 Objective of this Project

 To spray pesticide autonomously
 To control the robot using a mobile phone from a distance
 To view the field in real time using the camera and a mobile phone

1.3 Motivation
With the flourishing technology introduced in the 21st century, numerous types of
robots are being used in agricultural activity, from the cultivation process to the
production process. Autonomous robots have been introduced in various applications
such as underwater operation [4], rescue [5], and line following based on metal
detection [6]. In the agricultural field, the use of robotics in farm operations can help
increase production and improve efficiency [7]. One type of robot used in agriculture
is for pesticide spraying, with the ability to navigate the farm, recognize the target,
and regulate the spraying mechanism [8]. The use of an autonomous robot pesticide
sprayer as a substitute for a worker using a conventional pesticide sprayer is therefore
applicable. Besides, the demand for agricultural robots also stimulates awareness of
how important their role is for current and future generations. Surveys show that the
demand for robots and drones in agriculture is expected to rise from 2018 to 2038.
Hence, the usage of autonomous robots is assumed to rise, gradually replacing current
manual labor. A granular 20-year market forecast covering agricultural robots and
drones across 16 market categories predicts that by the end of 2038 [7] the market in
these categories will approach 35 billion, given viable technology and ongoing market
demand. Nevertheless, the common problem with autonomous robots in agricultural
activity is the navigation method needed to make the robot fully operational with
decision-making capability. Some research has been done on navigating the whole
field [8-11]; it can be done with or without supporting infrastructure. Some research
on RFID-based navigation has been conducted for use as a navigation tool [12-13].
As artificial intelligence (AI) starts to emerge, current robots should be able to plan
the next movement by adapting to the surrounding environment and deciding which
path to take. The typical detection method is based on the targeted object's orientation
or on a signal reflected back to the sensor itself, from which the distance in between is
calculated [13-18]. Other robots use vision observation and then fuse all the acquired
data, enabling the robot to navigate itself through the farm.

1.4 Proposed System


Plant diseases and pest attacks have huge after-effects: the quality and quantity of
agricultural products decrease significantly. Early detection of pests is a major
problem in planting. In the first phase, the crop is carefully and periodically
monitored. The affected plants are then identified and photographs of the affected
crop parts are obtained using scanners or cameras. These images are pre-processed,
transformed, and grouped. The images are then sent to the processor as input and
compared by the processor. If a picture shows contamination, the automatic pesticide
sprayer is used to spray.
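A minimal sketch of that decision step in C++ (the pixel-difference score and the 20% threshold are our own illustrative placeholders; this report does not specify the actual comparison algorithm):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdlib>
#include <vector>

// Fraction of pixels that differ noticeably between a reference image of a
// healthy plant and the captured image (both as 8-bit grayscale values).
double differenceRatio(const std::vector<int>& healthy,
                       const std::vector<int>& captured,
                       int pixelTolerance = 25) {
    int differing = 0;
    for (std::size_t i = 0; i < healthy.size(); ++i)
        if (std::abs(healthy[i] - captured[i]) > pixelTolerance) ++differing;
    return healthy.empty() ? 0.0 : double(differing) / double(healthy.size());
}

// Spray only when enough of the image deviates from the healthy reference.
bool shouldSpray(double ratio, double threshold = 0.2) {
    return ratio > threshold;
}
```

In the actual system the comparison would run on full camera frames; here the idea is only that the sprayer is triggered when the deviation exceeds a threshold, not on every image.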

3
1.5 Organization of the Project Book
This book consists of five chapters.

Chapter 1:
Gives a brief discussion of the project introduction, project objectives, project
motivation, and background.

Chapter 2:
Focuses on the system design and the theoretical background of how the project is
designed and how the system works.

Chapter 3:
Focuses on the hardware we used, its theoretical background, and applications: each
component's details, how the hardware works, and how it is implemented.

Chapter 4:
Focuses on the experimental results. How we obtained the best results from our
project is described in this chapter, and all results are shown visually.

Chapter 5:
Concludes the overall project, lists its advantages, and discusses how this project can
help in daily work.

Chapter 2
System Design

2.1 Introduction
This chapter presents the system design of the agriculture pesticide spraying robot:
how it works, its behavior, its specifications, and related details.

2.2 Block Diagram

Figure 2.1: Block Diagram of IOT Based Pesticide Sprayer

All the motors are connected to the motor driver, and the motor driver is connected to
the ESP32. A switch is connected to the battery to operate the spray system. The
mobile app also has buttons for controlling the car.
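The control path just described can be sketched in plain C++ (the command letters and the pin-state encoding below are our own illustrative assumptions, not taken from the actual firmware):

```cpp
#include <cassert>

// Hypothetical direction-pin states for one motor (e.g., L298N IN1/IN2).
struct MotorState {
    bool in1; // direction pin 1
    bool in2; // direction pin 2
};

// Whole-robot state: two drive motors plus the spray pump switch.
struct DriveState {
    MotorState left;
    MotorState right;
    bool pumpOn;
};

// Map a single-character app command to motor/pump states.
// 'F' forward, 'B' backward, 'S' stop, 'P' toggle the spray pump.
DriveState applyCommand(DriveState s, char cmd) {
    switch (cmd) {
        case 'F': s.left = {true, false};  s.right = {true, false};  break;
        case 'B': s.left = {false, true};  s.right = {false, true};  break;
        case 'S': s.left = {false, false}; s.right = {false, false}; break;
        case 'P': s.pumpOn = !s.pumpOn; break;
    }
    return s;
}
```

On the real robot these states would be written to the motor driver inputs and the pump switch with digitalWrite() inside the ESP32 firmware.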

2.2.1 Flow Chart

Figure 2.2: Flow Chart of IOT Based Pesticide Sprayer Robot

2.3 What is IoT?


The internet of things, or IoT, is a system of interrelated computing devices,
mechanical and digital machines, objects, animals or people that are provided with
unique identifiers (UIDs) and the ability to transfer data over a network without
requiring human-to-human or human-to-computer interaction. A thing in the internet
of things can be a person with a heart monitor implant, a farm animal with a biochip
transponder, an automobile that has built-in sensors to alert the driver when tire
pressure is low or any other natural or man-made object that can be assigned an
Internet Protocol (IP) address and is able to transfer data over a network. Increasingly,
organizations in a variety of industries are using IoT to operate more efficiently, better
understand customers to deliver enhanced customer service, improve decision-making

and increase the value of the business.

How does IoT work? An IoT ecosystem
consists of web-enabled smart devices that use embedded systems, such as processors,
sensors and communication hardware, to collect, send and act on data they acquire
from their environments. IoT devices share the sensor data they collect by connecting
to an IoT gateway or other edge device where data is either sent to the cloud to be
analyzed or analyzed locally. Sometimes, these devices communicate with other
related devices and act on the information they get from one another. The devices do
most of the work without human intervention, although people can interact with the
devices -- for instance, to set them up, give them instructions or access the data.

Figure 2.3: IoT (Internet of Things)

Figure 2.4: Flowchart of IoT (Internet of Things)

2.4 IoT Based Pesticide Spraying Robot
This is an agricultural robot for spraying fertilizers and pesticides in agricultural
fields. In order to keep costs to a minimum, the fertilizer and pesticide spraying robot
prototype was assembled using simple, cost-effective, off-the-shelf components. The
agricultural robot developed for this work focuses on two applications, namely
fertilizer and pesticide spraying as well as general crop monitoring. The prototype is a
two-wheeled robot that consists of a mobile base, a wireless controller for controlling
the movement of the robot, and a camera providing a live video feed for general crop
health and growth monitoring as well as detecting the presence of pests in the field.
The agricultural robot operates in accordance with the commands of an operator.

Figure 2.5: IOT Based Pesticide Spraying Robot

Chapter 3

Hardware and Circuit Implementation

3.1 Introduction
In this chapter we describe all the hardware used in our project, along with the circuit
diagram, block diagram, software, etc.

3.1.1 Required Components


1. Battery
2. DC Motor
3. Motor Driver
4. Servo Motor
5. Water Pump
6. Spraying Nozzle
7. ESP 32 Cam

3.2 L298N Motor Driver

The L298N motor driver module is a high-power motor driver module for driving DC
and stepper motors. It consists of an L298 motor driver IC and a 78M05 5 V
regulator. The L298N module can control up to four DC motors, or two DC motors
with direction and speed control.

Figure 3.1: Motor Driver

3.2.1 Pins of L298N Motor Driver IC

Table 3.1: Pins of L298N Motor Driver

SL No Pin Name Description
1 IN1 & IN2 Motor A input pins, used to control the spinning direction of Motor A
2 IN3 & IN4 Motor B input pins, used to control the spinning direction of Motor B
3 ENA Enables PWM signal for Motor A
4 ENB Enables PWM signal for Motor B
5 OUT1 & OUT2 Output pins of Motor A
6 OUT3 & OUT4 Output pins of Motor B
7 12V 12 V input from DC power source
8 5V Supplies power for the switching logic circuitry inside the L298N IC
9 GND Ground pin
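As an illustration of Table 3.1, the IN1/IN2 combinations for one motor can be captured in a small helper (which polarity counts as "forward" depends on the motor wiring, so treat that mapping as an assumption):

```cpp
#include <cassert>
#include <utility>

// Direction request for Motor A on the L298N.
enum class Dir { Stop, Forward, Reverse };

// Returns the logic levels to write to IN1 and IN2 (true = HIGH).
// ENA would additionally carry a PWM duty cycle for speed control.
std::pair<bool, bool> motorAPins(Dir d) {
    switch (d) {
        case Dir::Forward: return {true, false};  // IN1 HIGH, IN2 LOW
        case Dir::Reverse: return {false, true};  // IN1 LOW,  IN2 HIGH
        default:           return {false, false}; // both LOW: motor coasts/stops
    }
}
```

Setting both inputs to the same level stops the motor; opposite levels select the rotation direction, exactly as the input-pin rows of the table describe.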

3.3 Water Pump

This is a low-cost mini submersible water pump that works on 12 V DC. It is
extremely simple and easy to use: just immerse the pump in water, connect a suitable
pipe to the outlet, and power the motor with 12 V to start pumping water. It is great
for building science projects, fire extinguishers, firefighting robots, fountains,
waterfalls, plant watering systems, etc.

Figure 3.2:  Water Pump

3.4 DC Gear Motor

A direct current (DC) motor is a rotating electrical device that converts direct current
electrical energy into mechanical energy. An inductor (coil) inside the DC motor
produces a magnetic field that creates rotary motion as DC voltage is applied to its
terminals.

Figure 3.3:  DC Gear Motor

3.5 Jumper Wire

A jump wire (also known as a jumper wire, or jumper) is an electrical wire, or group
of them in a cable, with a connector or pin at each end (or sometimes without them,
simply "tinned"), which is normally used to interconnect the components of a
breadboard or other prototype or test circuit, internally or with other equipment or
components, without soldering.

Figure 3.4: Jumper wires

3.6 DC Power Supply

We need two power supplies for our system: one for the main device and one 12 V
DC supply for our pump. As the main supply we use three 18650 cells connected in
series, giving a nominal voltage of about 11.1 V (3 x 3.7 V).

Figure 3.5: 18650 Li-Po Battery

3.7 Smart Phone

To see the real-time condition of the field and also control the pump and other
electrical devices, we need an Android or iOS device with an internet connection.

Figure 3.6: Mobile Phone

3.8 ESP32-CAM

The ESP32-CAM is a development board with an ESP32-S chip, an OV2640 camera,
a microSD card slot, and several GPIOs for connecting peripherals.

Figure 3.7: ESP 32 CAM

3.8.1 Pins of ESP 32 CAM

Table 3.2: Pins of ESP 32 CAM

SL No OV2640 CAMERA ESP32 Variable name in code

1 D0 GPIO 5 Y2_GPIO_NUM

2 D1 GPIO 18 Y3_GPIO_NUM

3 D2 GPIO 19 Y4_GPIO_NUM

4 D3 GPIO 21 Y5_GPIO_NUM

5 D4 GPIO 36 Y6_GPIO_NUM

6 D5 GPIO 39 Y7_GPIO_NUM

7 D6 GPIO 34 Y8_GPIO_NUM

8 D7 GPIO 35 Y9_GPIO_NUM

9 XCLK GPIO 0 XCLK_GPIO_NUM

10 PCLK GPIO 22 PCLK_GPIO_NUM

11 VSYNC GPIO 25 VSYNC_GPIO_NUM

12 HREF GPIO 23 HREF_GPIO_NUM

13 SDA GPIO 26 SIOD_GPIO_NUM

14 SCL GPIO 27 SIOC_GPIO_NUM

15 PWDN (power down) GPIO 32 PWDN_GPIO_NUM
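The mapping in Table 3.2 corresponds to the pin constants used in the standard ESP32-CAM camera code (values transcribed from the table; this matches the common AI-Thinker board, on which the camera RESET line is not broken out, so firmware usually sets it to -1):

```cpp
#include <cassert>

// OV2640-to-ESP32 pin mapping for the ESP32-CAM, as in Table 3.2.
#define Y2_GPIO_NUM     5   // camera data D0
#define Y3_GPIO_NUM    18   // camera data D1
#define Y4_GPIO_NUM    19   // camera data D2
#define Y5_GPIO_NUM    21   // camera data D3
#define Y6_GPIO_NUM    36   // camera data D4
#define Y7_GPIO_NUM    39   // camera data D5
#define Y8_GPIO_NUM    34   // camera data D6
#define Y9_GPIO_NUM    35   // camera data D7
#define XCLK_GPIO_NUM   0   // external clock to the camera
#define PCLK_GPIO_NUM  22   // pixel clock from the camera
#define VSYNC_GPIO_NUM 25   // vertical sync
#define HREF_GPIO_NUM  23   // horizontal reference
#define SIOD_GPIO_NUM  26   // SCCB (I2C-like) data
#define SIOC_GPIO_NUM  27   // SCCB (I2C-like) clock
#define PWDN_GPIO_NUM  32   // camera power-down
#define RESET_GPIO_NUM -1   // not broken out on this board
```

In the camera firmware these constants are assigned to the fields of the camera driver's configuration structure before the camera is initialized.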

3.9 Arduino IDE Software

To make this project we used two pieces of software. One is the programming code
editor, the Arduino IDE; the other is an Android app for controlling our appliances.
To program the ESP32-CAM we need an IDE that is compatible with it. The Arduino
IDE is one of the easiest and most compatible IDEs for this board, so we chose it for
our coding.

The Arduino Integrated Development Environment (IDE) is a cross-platform
application (for Windows, macOS, Linux) that is written in functions from C and
C++. It is used to write and upload programs to Arduino-compatible boards, and
also, with the help of third-party cores, other vendor development boards. The source
code for the IDE is released under the GNU General Public License, version 2. The
Arduino IDE supports the languages C and C++ using special rules of code
structuring. It supplies a software library from the Wiring project, which provides
many common input and output procedures. User-written code only requires two
basic functions, for starting the sketch and the main program loop, which are
compiled and linked with a program stub main() into an executable cyclic executive
program with the GNU toolchain, also included with the IDE distribution. The
Arduino IDE employs the program avrdude to convert the executable code into a text
file in hexadecimal encoding that is loaded into the Arduino board by a loader
program in the board's firmware. By default, avrdude is used as the uploading tool to
flash the user code onto official Arduino boards. With the rising popularity of
Arduino as a software platform, other vendors started to implement custom
open-source compilers and tools (cores) that can build and upload sketches to other
microcontrollers that are not supported by Arduino's official line of
microcontrollers. [4]

Figure 3.8: Arduino IDE

3.10 Spraying Nozzle
A spray nozzle is a precision device that facilitates the dispersion of liquid into a
spray. Nozzles are used for three purposes: to distribute a liquid over an area, to
increase liquid surface area, and to create impact force on a solid surface.[1] A wide
variety of spray nozzle applications use a number of spray characteristics to describe
the spray.[2] Spray nozzles can be categorized based on the energy input used to
cause atomization, the breakup of the fluid into drops.[3][4] Spray nozzles can have
one or more outlets; a multiple-outlet nozzle is known as a compound nozzle.
Multiple outlets are present on spray balls, which have been used in the brewing
industry for many years for cleaning casks and kegs.[5] Spray nozzles range from
heavy-duty industrial uses to light-duty spray cans or spray bottles.[6]

Figure 3.9: Spraying Nozzle

3.11 Circuit Diagram
The hardware connections follow the circuit diagram. First we program the
microcontroller, then we connect all the equipment as per our circuit diagram.

Figure 3.10: Circuit Diagram of An IOT Based Pesticide Sprayer Robot

3.12 Required Software and Their Setup Process

To complete our project we need the following software:

1. Arduino IDE 1.8.9

3.13 Software

The software that is used to program the microcontroller is open-source software and
can be downloaded for free from www.arduino.cc. With this Arduino software we
can write small programs for the microcontroller. These programs are called
"sketches".

In the end the sketches are transferred to the microcontroller over a USB cable. More
on that later, under the subject "programming".

3.13.1 Arduino IDE Installation

The Arduino software and the USB driver for the board now have to be installed one
after the other.

3.13.2 Installation and setup of the Arduino software

1. We downloaded the Arduino software from www.arduino.cc and installed it on the
computer (the board was NOT yet connected to the PC). After that we opened the
software folder and ran the installer named arduino.exe.

Two settings in the program are important and should be checked.

a) The board that we want to connect has to be selected in the Arduino software,
e.g., "Arduino/Genuino Uno", "Nano", or whichever board is in use.

Figure 3.11: Program installation process-1

b) We have to choose the right serial port, to let the computer know to which port
the board has been connected. That is only possible if the USB driver has been
installed correctly. It can be checked this way: while the board is not connected to
the PC, choose "Port" under the "Tools" menu and note the ports already listed
(COM1/COM2/COM3...). After connecting the board, a new port should appear;
that one belongs to the board.

Figure 3.12: Program installation process -2

3.14 Programming

The development cycle is divided into four phases:

Edit → Compile → Upload → Run

Figure 3.13: Flowchart of the compiling process

Compile: Compile means to translate the sketch into machine language, also known
as object code.
Run: The Arduino sketch is executed as soon as the uploading step onto the board
terminates.

3.14.1 Arduino Program Development

• Based on C++, without a large part (roughly 80%) of its instructions.
• A handful of new commands.
• Programs are called "sketches".
• Sketches need two functions:
• void setup()
• void loop()
• setup() runs first, and only once.
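Conceptually, every sketch therefore has the shape below; the small harness mimics the hidden main() of the Arduino core in plain C++ so the structure can be demonstrated off-board (the harness and counters are our own illustration, not part of any real sketch):

```cpp
#include <cassert>

int loopCount = 0;
bool initialized = false;

// setup() runs exactly once after reset; hardware initialization goes here.
void setup() { initialized = true; }

// loop() then repeats for as long as the board is powered.
void loop() { ++loopCount; }

// Stand-in for the hidden main() the Arduino core provides:
// call setup() once, then loop() repeatedly (here a finite number of times).
int runSketch(int iterations) {
    loopCount = 0;
    setup();
    for (int i = 0; i < iterations; ++i) loop();
    return loopCount;
}
```

On a real board the for-loop is an endless while-loop, so loop() keeps running until power is removed or the board is reset.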

Chapter 4
Experimental Result

4.1 Introduction
The function of a greenhouse is to create the optimal growing conditions for the full
life of the plants [1]. Achievement of the desired conditions often requires the use of
pesticides, fungicides, high temperatures and increased carbon dioxide and humidity
levels [2]. Prolonged exposure of greenhouse workers to these conditions leads to an
uncomfortable and hazardous work environment [3, 4], contravening modern
occupational health and safety principles. Often, there are substantial risks involved
in greenhouse work due to the hazardous environment or dangerous operations.
Specific examples of these include: repetitive strain injury, toxic chemical exposure,
extreme heat exposure, extreme humidity (heatstroke) [3], and working at heights.
Automating tasks within the greenhouse will enable the avoidance of unwanted or
hazardous human exposure whilst potentially leading to an increase in overall
efficiency and productivity. The main application of robots in the commercial sector
has been concerned with the substitution of manual human labour by robots or
mechanised systems to make the work more time efficient, accurate, uniform and less
costly [5]-[10]. One may argue the social implications of such developments, for
example, the effects on employment through loss of blue collar jobs to the more
efficient robotic counterpart; there are also ethical considerations that may be argued.
Whilst there may well be some validity to the argument in some cases, this project is
unique in the number of stakeholders that are affected in a positive sense. The
farmer's benefits are found in more efficient maintenance of the crops and either less
work for themselves or a decreased need for the employment of others (arguably, an
expensive process). Increased demand on growers has begun to be met with increased
specific automation in many fields, as producers believe that automation is a viable
and sometimes necessary [7] method to ensure maximum profits with minimum costs
[6]. Indeed Hopkins [6] argues that automation enables the expansion of a greenhouse
without having to invest more financial resources on labour. Merchants may benefit
from increased sales due to a lower cost product; the consumers will benefit, likewise,
from a lower cost product of comparable quality. The stakeholders that benefit most,
at least from an ethical or social perspective, however, are the greenhouse workers.
This chapter presents the design and construction of an autonomous robot that seeks
to address some of the human health concerns associated with greenhouses. This
robot is designed as a base for developing systems to enable the automation of
greenhouse processes such as the spraying of pesticides, picking of fruit and the
caring for diseased plants. The system is designed to be as modular as possible,
enabling the development and/or modification of any of the individual tasks.

4.2 Final Result
The final result of our project is that it is able to spray pesticide. It can be run using a
mobile app, and it has a built-in camera for observation.

Figure 4.1: Controlling view of robot camera and pump

4.3 Project Outlook

Here is our project's outlook, showing what the finished project looks like. The
hardware has performed the basic operations of the robotic arm as expected. The
main problem to overcome in this project was interfacing the ESP32-CAM board
with the Android device via WiFi.

Figure 4.2: Project Outlook

4.4 Discussion
The proposed system is part of ongoing research intended to replace standard
spraying procedures with an agricultural robotic sprayer. The robot navigates
autonomously along the crop rows and performs targeted spraying toward detected
targets. For site-specific spraying, the target must first be recognized and then
sprayed. This work focuses on the spraying technique, so as to thoroughly cover the
target while limiting the amount of material sprayed. Continuing research focuses on
target detection and on the development of a fully operational agricultural spraying
robot. The width of the spray is set by the shape and size of the target, as in a recently
proposed patent that suggests a variable nozzle aperture. The existing technique was
designed, produced, and executed in real conditions and included experimental
frameworks and investigations for evaluation and validation of the spraying device
for amorphous agricultural shapes. The evaluated spraying strategies were as follows:
1. Fixed Nozzle Spacing: In this method, a set of nozzles is arranged vertically on a
spraying arm with predetermined spacing. The nozzle positions and spray widths are
set before the spraying process, regardless of the target's shape and size. As the
sprayer vehicle moves along the crop row, the nozzles spray simultaneously (using an
electric valve) in order to cover the target.
2. Optimal Spray Coverage: In this method, the spraying is performed using a single
spraying nozzle attached to a pan-tilt unit (PTU), which is capable of orienting the
nozzle. The spray width of the nozzle is set before the spraying process. Since the
spray width is fixed, every target requires several sprays for full coverage.

Chapter 5
Conclusion

5.1 Introduction
In this chapter we describe our project's advantages, limitations, future improvements, and conclusions.
5.2 Advantages
 The workload on farmers is reduced by using this type of agrobot, which also lowers their risk of breathing problems.
 With tracks fitted, the robot can operate properly on slippery and uneven surfaces.
 Farmers do not have to enter the field, because the robot does the work properly and effectively.
 The robot sprays liquids in less time than a human can, improving working efficiency.
 Live video streaming.

5.3 Limitations
 Internet speed is a bottleneck for our project.
 Because we used a free server, it sometimes fails to send data properly.
 The battery level must be monitored constantly.
 Robots like this may cost some people their jobs.
5.4 Future Improvement
 Control could be improved with a voice-recognition mechanism.
 The robot could be enhanced to perform an authentication process.
 GPS could be added for tracking the robot's location.
 An AI system could be integrated.

5.5 Cost Analysis

Equipment                      Price (Taka)
L298 Motor Driver                       300
Gear Motor (x4) with wheels            1000
ESP32-CAM                              1600
Power Supply                           1500
Water Pump                              750
Spraying Nozzle                         800
PVC Board                               400
Others                                 1000
Total                                  7350

Table 5.1: Cost Analysis

5.6 Conclusion
The Agrobot, a robot for agricultural purposes, is a concept whose performance and cost, once optimized, will prove worthwhile in agricultural spraying operations. The workload on farmers is decreased, and their health problems are reduced as well. We succeeded in constructing a robot that can travel on rough surfaces while carrying the load of the pump and other equipment, and whose construction is strong enough to withstand the challenges of the field. We are confident that, once this concept is presented in a manner suited to the Bangladeshi market, it will help bring down the 15% mortality rate found among Bangladeshi farmers associated with agricultural spraying operations.

References

[1] S. I. Cho and N. H. Ki, “Autonomous speed sprayer guidance using machine
vision and fuzzy logic,” Trans. Amer. Soc. Agricult. Eng., vol. 42, no. 4, pp.
1137–1144, 1999.
[2] S. Dasgupta, C. Meisner, D. Wheeler, K. Xuyen, and N. T. Lam, “Pesticide
poisoning of farm workers— Implications of blood test results from Vietnam,”
Int. J. Hygiene Environ. Health, vol. 210, no. 2, pp. 121–132, 2007.
[3] W. J. Rogan and A. Chen, “Health risks and benefits of bis(4-chlorophenyl)-
1,1,1-trichloroethane (DDT),” Lancet, vol. 366, no. 9787, pp. 763–773, 2005.
[4] D. Pimentel and H. Lehman, The Pesticide Question: Environment,
Economics, and Ethics. London, U.K.: Chapman & Hall, 1993.
[5] J. Reus et al., “Comparison and evaluation of eight pesticide environmental
risk indicators developed in Europe and recommendations for future use,”
Agricult. Ecosyst. Environ., vol. 90, no. 2, pp. 177–187, 2002.
[6] S. H. Swan et al., “Semen quality in relation to biomarkers of pesticide
exposure,” Environ. Health Perspect., vol. 111, no. 12, pp. 1478–1484, 2003.
[7] J. Canny, “A computational approach to edge detection,” IEEE Trans. Pattern
Anal. Mach. Intell., vol. PAMI-8, no. 6, pp. 679–698, Nov. 1986.
[8] M. Sharifi, M. Fathy, and M. T. Mahmoudi, “A classified and comparative
study of edge detection algorithms,” in Proc. Int. Conf. Inf. Technol. Coding
Comput., Apr. 2002, pp. 117–120.
[9] M. C. Shin, D. Goldgof, and K. W. Bowyer, “An objective comparison
methodology of edge detection algorithms using a structure from motion
task,” in Proc. Conf. IEEE Comput. Vis. Pattern Recognit., Jun. 1998, pp.
190–195.
[10] L. Breiman, J. Friedman, C. J. Stone, and R. A. Olshen, Classification and
Regression Trees. Boca Raton, FL, USA: CRC Press, 1984.
[11] K. Kapach, E. Barnea, R. Mairon, Y. Edan, and O. Ben-Shahar, “Computer
vision for fruit harvesting robots—State of the art and challenges ahead,” Int.
J. Comput. Vis. Robot., vol. 3, nos. 1–2, pp. 4–34, 2012.

[12] R. Berenstein, “A human-robot cooperative vineyard selective sprayer,” Ph.D.
dissertation, Dept. Ind. Eng. Manage., Ben-Gurion Univ. Negev, Beersheba,
Israel, 2016.
[13] R. Berenstein and Y. Edan, “Robotic precision spraying methods,” presented
at the ASABE Annu. Int. Meeting, Dallas, TX, USA, 2012, paper no:
121341054.
[14] R. Berenstein and Y. Edan, “Evaluation of marking techniques for a human-
robot selective vineyard sprayer,”in Proc. Int. Conf. Agricult. Eng. (CIGR-
AgEng), Valencia, Spain, 2012, p. C-1090.
[15] R. Berenstein and Y. Edan, “Human-robot cooperative precision spraying:
Collaboration levels and optimization function,” in Proc. Symp. Robot Control
(SYROCO), Dubrovnik, Croatia, 2012, pp. 799– 804.
[16] R. Berenstein, Y. Edan, and I. Ben Halevi, “A remote interface for a human-
robot cooperative vineyard sprayer,” in Proc. Int. Soc. Precision Agricult.
(ICPA), Indianapolis, IN, USA, 2012, pp. 15–18.
[17] R. Berenstein, O. B. Shahar, A. Shapiro, and Y. Edan, “Grape clusters and
foliage detection algorithms for autonomous selective vineyard sprayer,”
Intell. Service Robot., vol. 3, no. 4, pp. 233–243, 2010.
[18] R. Berenstein, “A human-robot cooperative vineyard selective sprayer,” Ph.D.
dissertation, Dept. Ind. Eng. Manage., Ben-Gurion Univ. Negev, Beersheba,
Israel, 2016.
[19] S. Thilagamani and N. Shanthi, “Object Recognition Based on Image
Segmentation and Clustering,” Journal of Computer Science, vol. 7, no. 11,
pp. 1741–1748, 2011.
[20] S. Thilagamani and N. Shanthi, “Gaussian and Gabor filter approach for
object segmentation,” Journal of Computing and Information Science in
Engineering, vol. 14, no. 2, p. 021006, 2014.

Appendix

A. Programming Code for the Project

#include "esp_camera.h"
#include <Arduino.h>
#include <WiFi.h>
#include <AsyncTCP.h>
#include <ESPAsyncWebServer.h>
#include <vector>   // required for std::vector<MOTOR_PINS>
#include <sstream>  // used to parse "key,value" WebSocket messages

struct MOTOR_PINS
{
int pinEn;
int pinIN1;
int pinIN2;
};

std::vector<MOTOR_PINS> motorPins =
{
{12, 13, 15}, //RIGHT_MOTOR Pins (EnA, IN1, IN2)
{12, 14, 2}, //LEFT_MOTOR Pins (EnB, IN3, IN4)
};
#define LIGHT_PIN 4

#define UP 1
#define DOWN 2
#define LEFT 3
#define RIGHT 4
#define STOP 0
#define RIGHT_MOTOR 0
#define LEFT_MOTOR 1

#define FORWARD 1
#define BACKWARD -1

const int PWMFreq = 1000; /* 1 KHz */


const int PWMResolution = 8;
const int PWMSpeedChannel = 2;
const int PWMLightChannel = 3;

//Camera related constants


#define PWDN_GPIO_NUM 32
#define RESET_GPIO_NUM -1
#define XCLK_GPIO_NUM 0
#define SIOD_GPIO_NUM 26
#define SIOC_GPIO_NUM 27
#define Y9_GPIO_NUM 35
#define Y8_GPIO_NUM 34
#define Y7_GPIO_NUM 39
#define Y6_GPIO_NUM 36
#define Y5_GPIO_NUM 21
#define Y4_GPIO_NUM 19
#define Y3_GPIO_NUM 18
#define Y2_GPIO_NUM 5
#define VSYNC_GPIO_NUM 25
#define HREF_GPIO_NUM 23
#define PCLK_GPIO_NUM 22

const char* ssid = "MyWiFiCar";


const char* password = "12345678";

AsyncWebServer server(80);
AsyncWebSocket wsCamera("/Camera");
AsyncWebSocket wsCarInput("/CarInput");
uint32_t cameraClientId = 0;

const char* htmlHomePage PROGMEM = R"HTMLHOMEPAGE(


<!DOCTYPE html>
<html>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1, user-scalable=no">
<style>
.arrows {
font-size:40px;
color:red;
}
td.button {
background-color:black;
border-radius:25%;
box-shadow: 5px 5px #888888;
}
td.button:active {
transform: translate(5px,5px);
box-shadow: none;
}

.noselect {
-webkit-touch-callout: none; /* iOS Safari */
-webkit-user-select: none; /* Safari */
-khtml-user-select: none; /* Konqueror HTML */
-moz-user-select: none; /* Firefox */
-ms-user-select: none; /* Internet Explorer/Edge */
user-select: none; /* Non-prefixed version, currently
supported by Chrome and Opera */
}
.slidecontainer {
width: 100%;
}

.slider {
-webkit-appearance: none;
width: 100%;
height: 15px;
border-radius: 5px;
background: #d3d3d3;
outline: none;
opacity: 0.7;
-webkit-transition: .2s;
transition: opacity .2s;
}

.slider:hover {
opacity: 1;
}

.slider::-webkit-slider-thumb {
-webkit-appearance: none;
appearance: none;
width: 25px;
height: 25px;
border-radius: 50%;
background: red;
cursor: pointer;
}

.slider::-moz-range-thumb {
width: 25px;
height: 25px;
border-radius: 50%;
background: red;
cursor: pointer;
}

</style>

</head>
<body class="noselect" align="center" style="background-color:white">

<!--h2 style="color: teal;text-align:center;">Wi-Fi Camera &#128663; Control</h2-->

<table id="mainTable" style="width:400px;margin:auto;table-layout:fixed" cellspacing="10">
<tr>
<td colspan="3"><img id="cameraImage" src="" style="width:400px;height:250px"></td>
</tr>
<tr>
<td></td>
<td class="button" ontouchstart='sendButtonInput("MoveCar","1")'
ontouchend='sendButtonInput("MoveCar","0")'><span class="arrows"
>&#8679;</span></td>
<td></td>
</tr>
<tr>
<td class="button" ontouchstart='sendButtonInput("MoveCar","3")'
ontouchend='sendButtonInput("MoveCar","0")'><span class="arrows"
>&#8678;</span></td>
<td class="button"></td>
<td class="button" ontouchstart='sendButtonInput("MoveCar","4")'
ontouchend='sendButtonInput("MoveCar","0")'><span class="arrows"
>&#8680;</span></td>
</tr>
<tr>
<td></td>
<td class="button" ontouchstart='sendButtonInput("MoveCar","2")'
ontouchend='sendButtonInput("MoveCar","0")'><span class="arrows"
>&#8681;</span></td>
<td></td>
</tr>
<tr/><tr/>
<tr>
<td style="text-align:left"><b>Speed:</b></td>
<td colspan=2>
<div class="slidecontainer">
<input type="range" min="0" max="255" value="150" class="slider"
id="Speed" oninput='sendButtonInput("Speed",value)'>
</div>
</td>
</tr>
<tr>
<td style="text-align:left"><b>Light:</b></td>
<td colspan=2>
<div class="slidecontainer">
<input type="range" min="0" max="255" value="0" class="slider" id="Light"
oninput='sendButtonInput("Light",value)'>
</div>
</td>
</tr>
</table>

<script>
var webSocketCameraUrl = "ws:\/\/" + window.location.hostname + "/Camera";
var webSocketCarInputUrl = "ws:\/\/" + window.location.hostname + "/CarInput";
var websocketCamera;
var websocketCarInput;

function initCameraWebSocket()
{
websocketCamera = new WebSocket(webSocketCameraUrl);
websocketCamera.binaryType = 'blob';
websocketCamera.onopen = function(event){};
websocketCamera.onclose = function(event)
{setTimeout(initCameraWebSocket, 2000);};
websocketCamera.onmessage = function(event)
{
var imageId = document.getElementById("cameraImage");
imageId.src = URL.createObjectURL(event.data);
};
}

function initCarInputWebSocket()
{
websocketCarInput = new WebSocket(webSocketCarInputUrl);
websocketCarInput.onopen = function(event)
{
var speedButton = document.getElementById("Speed");
sendButtonInput("Speed", speedButton.value);
var lightButton = document.getElementById("Light");
sendButtonInput("Light", lightButton.value);
};
websocketCarInput.onclose = function(event)
{setTimeout(initCarInputWebSocket, 2000);};
websocketCarInput.onmessage = function(event){};
}

function initWebSocket()
{
initCameraWebSocket ();
initCarInputWebSocket();
}
function sendButtonInput(key, value)
{
var data = key + "," + value;
websocketCarInput.send(data);
}

window.onload = initWebSocket;
document.getElementById("mainTable").addEventListener("touchend",
function(event){
event.preventDefault()
});
</script>
</body>
</html>
)HTMLHOMEPAGE";

void rotateMotor(int motorNumber, int motorDirection)
{
if (motorDirection == FORWARD)
{
digitalWrite(motorPins[motorNumber].pinIN1, HIGH);
digitalWrite(motorPins[motorNumber].pinIN2, LOW);
}
else if (motorDirection == BACKWARD)
{
digitalWrite(motorPins[motorNumber].pinIN1, LOW);
digitalWrite(motorPins[motorNumber].pinIN2, HIGH);
}
else
{
digitalWrite(motorPins[motorNumber].pinIN1, LOW);
digitalWrite(motorPins[motorNumber].pinIN2, LOW);
}
}

void moveCar(int inputValue)
{
Serial.printf("Got value as %d\n", inputValue);
switch(inputValue)
{

case UP:
rotateMotor(RIGHT_MOTOR, FORWARD);
rotateMotor(LEFT_MOTOR, FORWARD);
break;

case DOWN:
rotateMotor(RIGHT_MOTOR, BACKWARD);
rotateMotor(LEFT_MOTOR, BACKWARD);
break;

case LEFT:
rotateMotor(RIGHT_MOTOR, FORWARD);
rotateMotor(LEFT_MOTOR, BACKWARD);
break;

case RIGHT:
rotateMotor(RIGHT_MOTOR, BACKWARD);
rotateMotor(LEFT_MOTOR, FORWARD);
break;

case STOP:
rotateMotor(RIGHT_MOTOR, STOP);
rotateMotor(LEFT_MOTOR, STOP);
break;

default:
rotateMotor(RIGHT_MOTOR, STOP);
rotateMotor(LEFT_MOTOR, STOP);
break;
}
}

void handleRoot(AsyncWebServerRequest *request)
{
request->send_P(200, "text/html", htmlHomePage);
}

void handleNotFound(AsyncWebServerRequest *request)
{
request->send(404, "text/plain", "File Not Found");
}

void onCarInputWebSocketEvent(AsyncWebSocket *server,
AsyncWebSocketClient *client,
AwsEventType type,
void *arg,
uint8_t *data,
size_t len)
{
switch (type)
{
case WS_EVT_CONNECT:
Serial.printf("WebSocket client #%u connected from %s\n", client->id(), client->remoteIP().toString().c_str());
break;
case WS_EVT_DISCONNECT:
Serial.printf("WebSocket client #%u disconnected\n", client->id());
moveCar(0);
ledcWrite(PWMLightChannel, 0);
break;
case WS_EVT_DATA:
AwsFrameInfo *info;
info = (AwsFrameInfo*)arg;
if (info->final && info->index == 0 && info->len == len && info->opcode ==
WS_TEXT)
{
std::string myData = "";
myData.assign((char *)data, len);
std::istringstream ss(myData);
std::string key, value;
std::getline(ss, key, ',');
std::getline(ss, value, ',');
Serial.printf("Key [%s] Value[%s]\n", key.c_str(), value.c_str());
int valueInt = atoi(value.c_str());
if (key == "MoveCar")
{
moveCar(valueInt);
}
else if (key == "Speed")
{
ledcWrite(PWMSpeedChannel, valueInt);
}
else if (key == "Light")
{
ledcWrite(PWMLightChannel, valueInt);
}
}
break;
case WS_EVT_PONG:
case WS_EVT_ERROR:
break;
default:
break;
}
}

void onCameraWebSocketEvent(AsyncWebSocket *server,
AsyncWebSocketClient *client,
AwsEventType type,
void *arg,
uint8_t *data,
size_t len)
{
switch (type)
{
case WS_EVT_CONNECT:
Serial.printf("WebSocket client #%u connected from %s\n", client->id(), client->remoteIP().toString().c_str());
cameraClientId = client->id();
break;
case WS_EVT_DISCONNECT:
Serial.printf("WebSocket client #%u disconnected\n", client->id());
cameraClientId = 0;
break;
case WS_EVT_DATA:
break;
case WS_EVT_PONG:
case WS_EVT_ERROR:
break;
default:
break;
}
}

void setupCamera()
{
camera_config_t config;
config.ledc_channel = LEDC_CHANNEL_0;
config.ledc_timer = LEDC_TIMER_0;
config.pin_d0 = Y2_GPIO_NUM;
config.pin_d1 = Y3_GPIO_NUM;
config.pin_d2 = Y4_GPIO_NUM;
config.pin_d3 = Y5_GPIO_NUM;
config.pin_d4 = Y6_GPIO_NUM;
config.pin_d5 = Y7_GPIO_NUM;
config.pin_d6 = Y8_GPIO_NUM;
config.pin_d7 = Y9_GPIO_NUM;
config.pin_xclk = XCLK_GPIO_NUM;
config.pin_pclk = PCLK_GPIO_NUM;
config.pin_vsync = VSYNC_GPIO_NUM;
config.pin_href = HREF_GPIO_NUM;
config.pin_sscb_sda = SIOD_GPIO_NUM;
config.pin_sscb_scl = SIOC_GPIO_NUM;
config.pin_pwdn = PWDN_GPIO_NUM;
config.pin_reset = RESET_GPIO_NUM;
config.xclk_freq_hz = 20000000;
config.pixel_format = PIXFORMAT_JPEG;

config.frame_size = FRAMESIZE_VGA;
config.jpeg_quality = 10;
config.fb_count = 1;

// camera init
esp_err_t err = esp_camera_init(&config);
if (err != ESP_OK)
{
Serial.printf("Camera init failed with error 0x%x", err);
return;
}

if (psramFound())
{
heap_caps_malloc_extmem_enable(20000);
Serial.printf("PSRAM initialized. malloc to take memory from psram above this size");
}
}

void sendCameraPicture()
{
if (cameraClientId == 0)
{
return;
}
unsigned long startTime1 = millis();
//capture a frame
camera_fb_t * fb = esp_camera_fb_get();
if (!fb)
{
Serial.println("Frame buffer could not be acquired");
return;
}

unsigned long startTime2 = millis();


wsCamera.binary(cameraClientId, fb->buf, fb->len);
esp_camera_fb_return(fb);

//Wait for message to be delivered


while (true)
{
AsyncWebSocketClient * clientPointer = wsCamera.client(cameraClientId);
if (!clientPointer || !(clientPointer->queueIsFull()))
{
break;
}
delay(1);
}

unsigned long startTime3 = millis();


Serial.printf("Time taken Total: %d|%d|%d\n",startTime3 - startTime1, startTime2 -
startTime1, startTime3-startTime2 );
}

void setUpPinModes()
{
//Set up PWM
ledcSetup(PWMSpeedChannel, PWMFreq, PWMResolution);
ledcSetup(PWMLightChannel, PWMFreq, PWMResolution);

for (int i = 0; i < motorPins.size(); i++)
{
pinMode(motorPins[i].pinEn, OUTPUT);
pinMode(motorPins[i].pinIN1, OUTPUT);
pinMode(motorPins[i].pinIN2, OUTPUT);

/* Attach the PWM Channel to the motor enb Pin */


ledcAttachPin(motorPins[i].pinEn, PWMSpeedChannel);
}
moveCar(STOP);

pinMode(LIGHT_PIN, OUTPUT);
ledcAttachPin(LIGHT_PIN, PWMLightChannel);
}

void setup(void)
{
setUpPinModes();
Serial.begin(115200);
WiFi.softAP(ssid, password);
IPAddress IP = WiFi.softAPIP();
Serial.print("AP IP address: ");
Serial.println(IP);

server.on("/", HTTP_GET, handleRoot);


server.onNotFound(handleNotFound);

wsCamera.onEvent(onCameraWebSocketEvent);
server.addHandler(&wsCamera);

wsCarInput.onEvent(onCarInputWebSocketEvent);
server.addHandler(&wsCarInput);

server.begin();
Serial.println("HTTP server started");

setupCamera();
}

void loop()
{
wsCamera.cleanupClients();
wsCarInput.cleanupClients();
sendCameraPicture();
Serial.printf("SPIRam Total heap %d, SPIRam Free Heap %d\n",
ESP.getPsramSize(), ESP.getFreePsram());
}

