Pest NGP


ABSTRACT

Insect infestations threaten yield efficiency in agricultural areas. Since insects reproduce massively, they not only reduce crop yield and quality, but expenditures on biological pesticides also form a huge portion of total expenses. From a long-term perspective, however, blind chemical pest control in agricultural areas has turned out to be less than miraculous. Widespread adoption of chemical pesticides contributed to unprecedented increases in crop yields, but it also resulted in the poisoning of farmworkers and rural residents, contamination of food and drinking water, destruction of wildlife habitats, and decimation of wildlife. Rather than chemical methods, biotechnical approaches such as pheromone traps allow more effective and smarter pest control scenarios, provided the reproduction stages of the insects can be observed. A pheromone trap attracts male insects into the trap; since the males can no longer mate with the females, massive reproduction is prevented. However, pheromone traps require physical patrolling, and the labour cost of this patrolling is their most important disadvantage. Expert staff who can recognize different kinds of insects are required to inspect the traps, and the human factor in the cycle causes many problems, such as errors in counting and recording the collected data. The agricultural pest monitoring device presented here is a system that monitors the amount of pests in a particular field: it calculates the number of pests present and suggests the amount of pesticide to be sprayed.
CHAPTER 1
INTRODUCTION

Beginning with the motto "SAVE THE AGRICULTURE", a main concern of agriculture is predicting climatic changes. Here we use IoT to monitor the weather and atmospheric changes throughout the crop field by deploying several systems in different fields as clients, each of which continually reports the current atmospheric conditions at its location to a server. Watering and pesticides can then be applied based on the conditions of the field. The image captured by the camera is processed to identify disease-affected plants, onto which pesticide is then sprayed. In this system a Raspberry Pi controls the operation. A small tank holds the pesticide, with a motor attached to spray it. Whenever the sensors detect a diseased plant, a signal is given to the Raspberry Pi, which turns on the motor and starts spraying. With some modification, the system can be used for other applications as well.

Agriculture is the backbone of Indian economy. It is a source of income for nearly

50% of the Indian population. It is mainly produce oriented and hence the profit or loss

depends on the yield obtained. One of the major issues in agriculture is the control of weeds

growing among the plantation crops. At present these kinds of plants are being removed

manually, wherever possible, or weedicides are being sprayed uniformly all over the field to

keep them under check. In conventional weed control systems, herbicides are sprayed

uniformly all over the field. This technique is very inefficient as only about 20% of the spray

reaches the plant and less than 1% of the chemical actually contributes to weed control,

leading to wastage, contamination of the environment and health problems in people [1]. To
avoid these consequences, a smart weed control system should be employed. These systems

must be capable of locating weeds in the field and herbicide sprayers are directed to spray

right on the desired spots. Such a system also reduces costly labour and minimizes the use of herbicides that harm the normal growth of plants. The machine vision [1] based approach

uses shape, texture, colour and location based features individually or in combination of these

to discriminate between weed and crop. An imaging sensor is a key component of almost any

weed detection and classification system. Individual plant classification has been successfully

demonstrated with either spectral or colour imaging. The spatial resolutions of spectral

systems are typically not adequate for accurate individual plant or leaf detection. Weed

control is a critical issue and can significantly affect crop yield. Herbicides play an important role in weed control, but their use is criticized because they are applied excessively and have potentially harmful effects. Many studies indicate that herbicide use can be reduced by patch spraying. Manual scouting for patch spraying requires considerable resources and is not a feasible option. Many researchers have investigated patch spraying using remote sensing and

machine vision. Machine vision systems are suitable for plant scale whereas remote sensing

can be employed on plot basis. Both of these systems essentially require image acquisition

and image processing. Image sizes range in the order of megabytes; processing time therefore depends on image resolution, crop and weed type, the algorithm used and the hardware configuration. The first step in identifying weeds within an image involves classifying the

pixels. The purpose of segmenting the image into plant and background pixels is to detect the

amount of plant material within a specific area. If the amount of plant material reaches a

specific threshold, that area is targeted for herbicidal spray application. A system that could

make use of the spatial distribution information in real-time and apply only the necessary

amounts of herbicide to the weed-infested area would be much more efficient and minimize
environmental damage. Therefore, a high spatial resolution, real-time weed infestation

detection system seems to be the solution for site-specific weed management.
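The plant/background pixel classification and the coverage threshold described above can be sketched in a few lines. This is an illustrative example only, assuming RGB input and an excess-green vegetation index with an arbitrary threshold; the actual system may use different features and thresholds:

```python
import numpy as np

def excess_green(rgb):
    # ExG = 2G - R - B, a common index that highlights green plant material
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return 2 * g - r - b

def plant_mask(rgb, exg_thresh=20.0):
    # classify each pixel as plant (True) or background (False)
    return excess_green(rgb) > exg_thresh

def target_for_spray(rgb, coverage_thresh=0.05):
    # target the area for herbicide if plant pixels exceed a coverage threshold
    return plant_mask(rgb).mean() > coverage_thresh

# a toy 2x2 image: one green pixel, three soil-coloured pixels
img = np.array([[[10, 200, 10], [120, 100, 80]],
                [[110, 90, 70], [130, 105, 85]]], dtype=np.uint8)
print(plant_mask(img).sum())   # 1 plant pixel
print(target_for_spray(img))   # True: 25% coverage exceeds 5%
```

In a real deployment the same decision would be made per region of interest rather than per image, so that only the infested spots are sprayed.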

Prediction depicts the way things will happen in the future, though not always based on experience or knowledge. Prediction is helpful in various fields, and it brings together past and current data as a basis to develop reasonable expectations regarding the future. The main objective of this work is to predict the occurrence of the risk posed to apple by apple scab in Himalayan regions. Firstly, we obtained readings of the necessary environmental parameters, such as temperature, humidity and leaf wetness duration, which lead to the growth of disease and pests, by interfacing sensors with the Raspberry Pi board, and calculated the infection index of the apple disease. As a prediction model, the beta regression model was used as a standard equation from which the severity index was derived; in the prediction subsystem, the Python programming language was then used to predict the severity of apple scab, a disease of apple caused by the ascomycete fungus Venturia inaequalis.

Using Python and by analysing a pest surveillance data set of apple scab, we developed a model for the prediction of pests. Further, we used database connectivity to send the data and the required outputs to a server where authorized officials could access them.

The results showed that the Raspberry Pi and Python successfully predicted the pest attack in advance. In this way it is a novel, easy-to-handle, economical system for apple growers and farmers. In India almost every farmer uses insecticides and pesticides to protect his crops from diseases and pests. Farmers interpret the weather on the basis of their experience, and when they judge that conditions favour a disease or pest attack, they spray pesticides to protect their crop. Although these chemicals save the crop, soil fertility is decreasing day by day.


In the existing system, agricultural problems are found manually, so there is a greater chance of losses to farmers. Nowadays agricultural growth is declining because of increasing pollution and pests. Most Indian farmers grow sugarcane but do not get good yields due to bugs and larvae in the sugarcane. Using spectral camera technology, UAVs can capture images of farmland, and these images can be utilized to analyse the occurrence of pests and diseases in crops. In this work, we attempt to design an agriculture framework that provides profound insights into the specific relationship between the occurrence of pests/diseases and weather parameters. Firstly, considering that most farms are usually located in remote areas far from infrastructure, making it hard to deploy agricultural IoT devices due to a limited energy supply, a sun tracker device is designed to automatically adjust the angle between the solar panel and the sunlight to improve the energy-harvesting rate. To avoid the problems above, the proposed design has been developed with image processing and a motorized pesticide spraying system driven by the processor, so that the problem can be identified and solved easily.

There are many challenges like precision agriculture, disease forecasting and pest

management in the field of farming. Due to pests and disease attacks on the crops, farmers

have to face economic losses every year. To control the pests and diseases, many chemical pesticides are being used by farmers in their fields. Widespread use of these agrochemicals has resulted in a damaged agricultural ecosystem, scoring low on product quality and affecting human health. So there is a need for pest management, a method to control pests effectively while reducing our dependence on pesticides.

Use of these pesticides can be reduced with the help of forecasting of diseases and pest

infections. The infection rate and disease severity are highly dependent on environmental parameters like temperature, humidity, leaf wetness duration and rainfall. Using the correlation of these parameters with the infection rate, a mathematical prediction model is devised to estimate the future value of infection. It predicts risk or no risk for the particular infection to occur on that particular crop. Advance information about the severity of the risk helps alert farmers to manage the quality and quantity of pesticides for a particular pest and disease. For this work, apple scab, which is caused by a fungus, has been considered.
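As a toy illustration of such a risk/no-risk prediction, the rule below flags infection risk from temperature, humidity and leaf wetness duration. The thresholds are hypothetical placeholders for illustration, not the fitted beta regression coefficients used in the actual work:

```python
def scab_risk(temp_c, humidity_pct, wet_hours):
    # hypothetical rule: infection risk when temperature is mild, humidity is
    # high, and the leaf surface has stayed wet long enough for spores to
    # germinate; the real model derives a continuous severity index instead
    if 6.0 <= temp_c <= 25.0 and humidity_pct >= 85.0 and wet_hours >= 9.0:
        return "risk"
    return "no risk"

print(scab_risk(18.0, 92.0, 12.0))  # risk
print(scab_risk(30.0, 40.0, 2.0))   # no risk
```

In the deployed system these inputs would come from the sensors interfaced to the Raspberry Pi, and the output would be pushed to the server for the authorized officials.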

The contemporary world is in a transition stage where problems concerning global

issues, such as global warming and alternative energy sources, are combined with new

challenges demanding immediate solutions. Society’s focus has shifted from economic

growth to sustainable development, where environmental, social, and economic aspects are

considered together, rather than separately. Policies that promote sustainability in all sectors

of the economy (manufacturing, agriculture, and services) are now considered as a part of

good governance. Problems such as climate change, population growth, and poverty

(especially hunger), occur in a context of a gradual depletion of natural resources and the fear

of diminishing coal energy reserves. These are some of the global issues that are thought to

require multidisciplinary approaches in order to be addressed successfully. This system focuses on agricultural production and cultivation. This overall process has a significant role in

fulfilling the basic human need for food. The production, preparation, packaging, distribution,

etc. of food also generates a lot of income. The aim of this project is to exploit modern

technologies and tools to improve monitoring and management of crops, in order to improve

the efficiency and sustainability of farming and food production. To this end, we have

designed a system for precision agriculture, which relies on a wireless sensor network

combined with a service to provide individual farmers with access to data that they find

useful. The system utilizes wireless sensor nodes that collect and transmit data about the

quality of the water supply, the soil, and other parameters in an agricultural field. While such

sensor-based systems have been investigated earlier, one of the key innovations to be explored in this project is the combination of these sensor systems with a service-driven
business model to increase their ease of use and to amplify the gains that can be realized via

an integrated system. The goal is to give a farmer a more complete picture of the current and

historic crop status in order to foster better informed decision making. It is expected that such

decisions will benefit both farming and irrigation by saving time and resources. Factors such

as the diversity of conditions which vary depending on location (for example weather,

presence of insects, and disease) combined with the inability to predict the future

characteristics of the environment during the different seasons over time complicate the

decision making process and require specialized knowledge. This project is an attempt to

bring some of these micro-environmental sources of information into the decision making

process of farmers.

Day by day, the electronics and electrical industry develops different systems as per the requirements of people. As engineers, we always think about people's needs and try to fulfil them. As per the requirements of society, we designed this system, which is a combination of different subsystems; using these subsystems we can produce an important and intelligent device. This project can help people with various problems arising between agricultural farming and pesticide spraying. Agriculture is one of our most important industries, providing the food, feed and fuel necessary for our survival. Robots are certainly playing an important role in the field of agriculture by carrying out the farming process autonomously. Normally, the farming process includes planting, irrigation, fertilization, monitoring and harvesting of a crop of any kind.


CHAPTER 2
LITERATURE REVIEW

1. AUTOMATED PARAQUAT SPRAYER BY USING RASPBERRY-PI
SIVARANJANI M, SWATHI V, VASANTHI G, REVATHI Y, VENKAT S

The conventional way of killing weeds in a crop plantation is to spray herbicides

throughout the plantation. This results in contamination of the food crops and also the yield

becomes less as some of the crop plants die along with the weeds. Thus there is a need for a

smart weed control system. In this paper, an image processing algorithm is used to take

images of the plantation rows at regular intervals and upon identifying the weeds in the

image, the herbicide is sprayed directly and only on the weeds. The algorithm predominantly

uses an Erosion and Dilation approach to detect weeds. The colour image is converted to

binary by extracting the green parts of the image. The amount of white pixels present in the

region of interest is determined and regions with higher white pixel count than the predefined

threshold are considered as weeds. The herbicide is stored in a container fitted with water

pump motors attached to spray nozzles. Once the weeds are identified, a signal is sent from

Raspberry-Pi to the motor driver IC controlling the water pump motors to spray the chemicals

over the weeds.

The first step in identifying weeds within an image involves classifying the pixels.

The purpose of segmenting the image into plant and background pixels is to detect the

amount of plant material within a specific area. If the amount of plant material reaches a
specific threshold, that area is targeted for herbicidal spray application. A system that could

make use of the spatial distribution information in real-time and apply only the necessary

amounts of herbicide to the weed-infested area would be much more efficient and minimize

environmental damage. Therefore, a high spatial resolution, real-time weed infestation

detection system seems to be the solution for site-specific weed management.

The first step is image acquisition, which is accomplished by the Raspberry-Pi camera. The camera is mounted facing downwards on an arm extended from the chassis of the robot at a height of about 30 centimetres from the ground. Image acquisition is done in the presence of natural light. The next step is the processing of the captured image. The image is subjected to morphological modifications like thresholding, erosion and dilation to detect the presence of plants in the Region of Interest (ROI) and, if present, to determine whether each is a weed or the plantation crop.

The hardware includes a powerful ARM processor on a Raspberry-Pi controller. The algorithm explained above is implemented in C code on the Raspberry-Pi. The Raspberry-Pi camera is used to take pictures of the plants. The images thus taken are processed, a decision is made based on the amount of activity (changes in pixels between white and black) in the processed image, and the herbicide is sprayed onto that part of the ROI. The spraying is accomplished by water spray motors with a nozzle attached at one end. These components are mounted on a land robot that moves through the field, taking images at predefined intervals, processing each image taken, and thereby spraying the herbicide.
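The erosion-and-dilation step of this paper's algorithm can be sketched without any imaging library. A minimal sketch, assuming a binary green-pixel mask, a 4-neighbour structuring element, and an arbitrary white-pixel-count threshold (all illustration choices, not the paper's exact parameters):

```python
import numpy as np

def dilate(mask):
    # grow True regions by one pixel in each of the four directions
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def erode(mask):
    # erosion is dilation of the complement, complemented
    return ~dilate(~mask)

def weed_rois(green_mask, rois, count_thresh):
    # morphological opening (erode then dilate) removes speckle noise; each
    # region of interest whose white-pixel count exceeds the threshold is
    # flagged as containing weeds
    cleaned = dilate(erode(green_mask))
    return [i for i, sl in enumerate(rois) if cleaned[sl].sum() > count_thresh]

# toy binary image: dense vegetation in the left half, one noisy pixel on the right
mask = np.zeros((6, 6), dtype=bool)
mask[:, :3] = True
mask[2, 5] = True  # isolated speckle, removed by the opening
rois = [(slice(None), slice(0, 3)), (slice(None), slice(3, 6))]
print(weed_rois(mask, rois, count_thresh=10))  # [0]
```

In practice an OpenCV pipeline would use its built-in erosion/dilation on the binarized green channel, but the logic is the same: clean the mask, count white pixels per ROI, and spray only the regions over threshold.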

2. AN IOT-BASED WIRELESS IMAGING AND SENSOR NODE SYSTEM FOR REMOTE GREENHOUSE PEST MONITORING
DAN JERIC ARCEGA RUSTIA, TA-TE LIN
This study focuses on designing an Internet of Things (IoT) based remote greenhouse pest

monitoring system using wireless imaging and sensor nodes (WiSN). The system designed

can continuously monitor the number of pest insects detected on yellow sticky papers

distributed in multiple locations and can measure the environmental parameters

simultaneously. The pest insects are counted through image processing and machine learning algorithms, in which they are classified into black and white objects, specifically whiteflies and fruit flies, with an average accuracy of 98% and a computation time of 8-9 seconds per

image. Data collection was done at a greenhouse in National Taiwan University Experimental

Farm, in which cabbages were grown as the main crop. The greenhouse, which does not use pesticides,

is occasionally infested by white flies and fruit flies. Furthermore, the data are processed by

the server in which the real time analysis and monitoring are shown in a monitoring website

that also includes the latest acquired yellow sticky paper images as well as the data plots. The

feasibility of the system was tested, and the experimental results show that light intensity has

the highest correlation with the pest insect count. The developed system is effective at acquiring pest counts accurately and automatically, providing important spatial-temporal information that allows for efficient integrated pest management in greenhouse operations.

Each WiSN is composed of a Raspberry Pi 3, Raspberry Pi camera, and a multi-

environmental sensor module. The Raspberry Pi 3 is a system on chip module board that runs

with a 1.2 GHz 64-bit quad-core ARMv8 CPU and includes WIFI and Bluetooth

communication functions. The Raspberry Pi camera is a visible light camera with an 8-

megapixel native resolution image sensor and is capable of 3280x2464 pixel static image

acquisition (Raspberry Pi Foundation). The multi-environmental sensor includes a humidity

sensor, temperature sensor, atmospheric pressure sensor, and light intensity sensor, together making up the experimental set-up of the wireless node.


For each greenhouse, in case WIFI connection is not available, there is a 4G router

that allows remote internet connection to the sensor nodes using a star WSN topology.

Through the internet, the collected images and environmental conditions are sent remotely.

Each node's camera takes images 8 cm away from the paper. The camera module is set to take images at 3280x2464 resolution with daylight white balancing for automatic image colour correction. The images are taken every 10 minutes from 7:00 AM to 6:00 PM each day, while the environmental conditions are sent every 5 minutes throughout the entire day. In this work, the set-up included two nodes distributed evenly inside

the greenhouse. The server handles the image processing, data processing, and acts as the

main web server. An image receiving program is running in the server to process the images

and to obtain the corresponding insect pest count. All the data are stored inside a MySQL

database and are processed using PHP. Each greenhouse and its nodes are uniquely registered

to the server with corresponding IDs. The monitoring website features real-time

monitoring of the insect pest count and environmental conditions, data plots, and in-depth

analysis such as correlation between the insect pest count and the individual environmental

conditions. The different greenhouse locations can be selected in order to view its individual

nodal data as well as its database number, which indicates the specific monitoring period. The website also presents a simplified overall structure of the insect pest monitoring system and sample pages, specifically the home and pest monitoring pages.

The images obtained are pre-processed by splitting the image into four equally

divided regions. The images are equalized using histogram-based contrast and brightness

adjustment using reference histograms of images obtained from previous testing results. Each

image is converted into the CMYK colour space using two different threshold value sets for black and white, then undergoes k-means colour clustering to separate and classify the objects into black and white insects. Erosion and dilation morphological operations are applied to the thresholded images, and blob counting is done using pre-defined blob radius, area, and convexity. The insect pest count obtained is filtered and processed according to the previous

image references from the same hour. This approach detects glares and inaccuracies with the

counting results and its specific region. During any of the aforementioned instances, the

algorithm is repeated with the adjusted contrast and brightness to reduce the error in

counting. Each process takes up to 8-9 seconds computation time without considering the

adaptive brightness and contrast adjustment process.
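The k-means colour-clustering step can be illustrated with a minimal one-dimensional Lloyd's algorithm on blob brightness values. Two clusters (dark fruit flies vs. bright whiteflies) are assumed for the sketch; the real system clusters in a colour space after CMYK thresholding:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=10):
    # Lloyd's algorithm: assign each value to the nearest centre,
    # then move each centre to the mean of its assigned values
    values = np.asarray(values, dtype=float)
    centers = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        labels = np.abs(values[:, None] - centers[None, :]).argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = values[labels == j].mean()
    return labels, centers

def count_black_white(blob_brightness):
    # split the detected blobs into dark and bright insect counts
    labels, centers = kmeans_1d(blob_brightness)
    dark = int((labels == centers.argmin()).sum())
    return dark, len(blob_brightness) - dark

# brightness of five detected blobs: three dark, two bright
print(count_black_white([10, 15, 12, 200, 210]))  # (3, 2)
```

The per-class counts would then feed the filtering step that compares against reference images from the same hour.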

3. LOW-COST SENSOR BASED EMBEDDED SYSTEM FOR PLANT PROTECTION AND PEST CONTROL
AUTHORED BY DATTATRAYAVHATKARSHIVLING

Prediction depicts the way things will happen in the future, though not always based on experience or knowledge. Prediction is helpful in various fields, and it brings together past and current data as a basis to develop reasonable expectations regarding the future. The main objective of this work is to predict the occurrence of the risk posed to apple by apple scab in Himalayan regions. Firstly, we obtained readings of the necessary environmental parameters, such as temperature, humidity and leaf wetness duration, which lead to the growth of disease and pests, by interfacing sensors with the Raspberry Pi board, and calculated the infection index of the apple disease. As a prediction model, the beta regression model was used as a standard equation from which the severity index was derived; in the prediction subsystem, the Python programming language was then used to predict the severity of apple scab, a disease of apple caused by the ascomycete fungus Venturia inaequalis. Using Python and by analyzing a pest surveillance data set of apple scab, we developed a model for the prediction of pests. Further, we used database connectivity to send the data and the required outputs to a server where authorized officials could access them. The result showed that the Raspberry Pi and Python successfully predicted the pest attack in advance. In this way it is a novel, easy-to-handle, economical system for apple growers and farmers.

4. PLANT PROTECTION AND PEST CONTROL USING LOW COST SENSOR BASED EMBEDDED SYSTEM
Miss Shital Shinde, Mr. Atul Srivastava

This paper aimed to investigate an intelligent system which employed an embedded system for plant protection and pest control, especially of apple, using a beta regression model. This beta regression model is based mainly on humidity and temperature values. Using a temperature sensor and a humidity sensor, real-time values in the environment are sensed; these values are analyzed and monitored by a Raspberry Pi, and using the Python language the beta regression model is implemented on the Raspberry Pi for further action. The result shows that the Raspberry Pi and Python successfully predicted the pest attack in advance. In this way it is a novel, easy-to-handle, economical system for apple growers and farmers. The system was found to be comfortable for farmers to use, as they could effectively control the farm, resulting in cost reduction, asset saving, and productive management in farming.

5. WIRELESS AGRICULTURE MONITORING USING RASPBERRY PI
PROF. VIPINKUMAR R. PAWAR, PHD SCHOLAR, UNIVERSITY OF MUMBAI, PROFESSOR, SANGHAVI COLLEGE OF ENGINEERING; MR. SWAPNIL A. AFRE

This paper focuses on the fact that today's energy resources are becoming scarcer and therefore more valuable. In conjunction with the population growth over the last century, the need for finding new, more efficient, and sustainable methods of agricultural cultivation and food production has become more critical. To facilitate this process, we are designing, building,

and evaluating a system for precision agriculture which provides farmers with useful data

about the soil, the water supply, and the general condition of their fields in a user friendly,
easily accessible manner. Our system aims to make cultivation and irrigation more efficient

as the farmer is able to make better informed decisions and thus save time and resources. The

diversity of location and climatic effects upon agricultural cultivation, along with other

environmental parameters over time makes the farmer’s decision-making process more

complicated and requires additional empirical knowledge. Applying wireless sensor networks

for monitoring environmental parameters and combining this information with a user-

customized web service may enable farmers to exploit their knowledge in an efficient way in

order to extract the best results from their agricultural cultivation.

Irrigation systems can also be automated through information on volumetric water

content of soil, using dielectric moisture sensors to control actuators and save water, instead

of a predetermined irrigation schedule at a particular time of the day and with a specific

duration. The technological development in Wireless Sensor Networks has made it possible to use them in the monitoring and control of greenhouse parameters in precision agriculture.

The agricultural monitoring system that will be implemented would be feasible and

cost effective for optimizing water resources for agricultural production. The system would

provide feedback control system which will monitor and control all the activities of drip

irrigation system efficiently. This irrigation system will allow cultivation in places with water

scarcity, thereby improving sustainability. Using this system, one can save manpower and water, improve production and ultimately increase profit.
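The drip-irrigation feedback control described above can be sketched as a simple hysteresis controller on volumetric soil moisture. The moisture thresholds below are assumed values for illustration, not calibrated figures from the paper:

```python
def pump_command(moisture_pct, pump_on, low=30.0, high=55.0):
    # start the pump when the soil dries below `low`, stop it once the
    # soil is wetter than `high`; in between, keep the current state so
    # the pump does not chatter around a single set point
    if moisture_pct < low:
        return True
    if moisture_pct > high:
        return False
    return pump_on

# simulated readings from a dielectric moisture sensor
state = False
for reading in [28.0, 40.0, 60.0, 50.0]:
    state = pump_command(reading, state)
    print(reading, state)
```

Replacing a fixed irrigation schedule with this kind of sensor-driven loop is what allows the system to save water in scarce conditions.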

6. AUTOMATIC FARMER FRIENDLY PESTICIDE SPRAYING ROBOT WITH CAMERA SURVEILLANCE SYSTEM
SIDDHI S. MANE, NIKITA N. PAWAR, SNEHA A. PATIL, PROF. D. O. SHIRSATH

Compared to spraying pesticides manually, the greenhouse environment is more closed and has high temperature, humidity and so on, so for operating the spray work in the greenhouse we use Bluetooth communication to interface the controller and an Android device. The controller can be interfaced to the Bluetooth module through the UART protocol. According to the commands received from the Android device, the robot's motion can be controlled. The consistent output of a robotic system, along with its quality and repeatability, is unmatched. Although the productivity of the prototype is not quite efficient, the robot still meets the requirements of pesticide spraying in the greenhouse without human operators.

This project describes pesticide spraying with a robot vehicle. In a previous paper the pesticide spraying robot was built with the help of a microcontroller. In some systems a camera was not used, so the spraying mechanism could not be closely observed. The previous systems were based only on solar energy, which created many problems. In this project we used a PIC controller, Bluetooth module, Android app, IR sensor and camera; using these, the farmer can easily control the robot and spray pesticide on the crops while viewing the operation on a computer, camera or digital video recorder.

The robot can basically complete the work under automatic control and meet the spraying requirements in the greenhouse. The robot also met the economic and time constraints it was subject to. The robot was able to drive up and back along the tracks in the greenhouse. The induction proximity sensors detected the rails effectively, and the spraying device designed by another thesis student was able to selectively spray designated groups of plants in the greenhouse whilst moving along the rails. The spray coverage coated the plants in an adequate and consistent dosage.
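The Bluetooth/UART command handling in such a robot can be sketched as a lookup from single-byte commands to motor actions. The command letters below are hypothetical illustrations; the actual Android app may use a different protocol:

```python
def parse_command(byte_msg):
    # map one-byte commands received over UART to robot actions;
    # anything unrecognized stops the robot as a fail-safe
    actions = {b"F": "forward", b"B": "reverse", b"L": "left",
               b"R": "right", b"S": "stop", b"P": "spray"}
    return actions.get(byte_msg, "stop")

print(parse_command(b"F"))  # forward
print(parse_command(b"?"))  # stop
```

On the hardware, each returned action would be translated into the pin levels the PIC (or driver IC) expects; defaulting unknown bytes to "stop" keeps a noisy Bluetooth link from driving the robot unpredictably.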


CHAPTER 3
SYSTEM DESIGN

EXISTING SYSTEM:

Among the class of organophosphate pesticides, chlorpyrifos is widely used in

vegetables. Chlorpyrifos has toxic effects on the human body particularly on brain and

nervous system. In this paper, design and development of sensors for pesticide residue

detection using parameters like electrical conductivity, pH etc. are proposed. It was found

that the relative percentage deviation between the value of conductivity in pesticide free

samples and the pesticide containing samples of bitter gourd, bottle gourd and tomatoes are

31.4%, 10.7% and 19.09% and also between pesticide free samples and market samples are

33.5%, 8.7% and 16.56% respectively. This large variation among different samples shows

the presence of pesticide residue. Hence, the method can be successfully used for the

detection of pesticide residues in vegetable samples. The proposed sensor system is an easy, rapid and undemanding method, so this electronic device can also be used to check for impurities in any other liquid, like water, milk etc.

PROPOSED SYSTEM:
The proposed system detects insects in a paddy crop field. Computer vision provides image acquisition, processing, analysis and understanding of images and, in general, of high-quality images from the real world, in order to produce numerical or symbolic information in the form of decisions. It provides not only comfort but also efficiency and time saving. In this technique, the camera captures the leaf of the crop, analyzes the colour of the leaf and detects the infected part of the leaf. The camera captures images of the crops and sends them to a processor, which processes the images and detects the disease of the crops. The camera, fitted to a Pi, moves randomly through the agricultural field and takes images. The Raspberry-Pi controls the camera and sends those images to the cloud. A local server interprets the data from the cloud and processes the images to analyse the disease and count the density of insects in the farm field. Our proposed system concentrates on an automatic detection system, which is required to examine the pest infestation and to classify the type of pest. Nowadays there are a number of techniques to identify pests and to detect plant diseases.

Image Acquisition

The first stage of any vision system is the image acquisition stage. After the image has been obtained, various methods of processing can be applied to it to perform the many different vision tasks required today. Digital image acquisition is the creation of photographic images, such as of a physical scene or of the interior structure of an object.
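As a sketch of this stage, the loop below polls a frame source a fixed number of times. On the actual hardware the `grab_frame` callable would wrap a Pi Camera or `cv2.VideoCapture(0).read()` call (an assumption, since the report shows no code); here a synthetic NumPy frame stands in so the loop runs anywhere.

```python
import numpy as np

def acquire_frames(grab_frame, n_frames=3):
    """Collect n_frames images from any frame source.

    grab_frame: a zero-argument callable returning an HxWx3 uint8 array.
    On the Raspberry Pi this would wrap the Pi Camera / OpenCV capture
    read; that wiring is assumed, not taken from the report.
    """
    return [grab_frame() for _ in range(n_frames)]

def synthetic_frame(h=48, w=64):
    """Stand-in for a real camera: a random RGB image."""
    return np.random.randint(0, 256, size=(h, w, 3), dtype=np.uint8)

frames = acquire_frames(synthetic_frame)
print(len(frames), frames[0].shape)
# 3 (48, 64, 3)
```

The captured frames would then be handed to the processing stage described next.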
Image Processing and Classification

Image processing is a method of performing operations on an image in order to obtain an enhanced image. It can be done using OpenCV (Open Source Computer Vision Library) and Python. It is a type of signal processing in which the input is an image and the output may be an image or characteristics/features associated with that image.
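A minimal illustration of the leaf-color analysis described earlier, using plain NumPy in place of the OpenCV pipeline. The simple "green channel dominates" test is a stand-in assumption, not the report's actual method, which would more likely use HSV thresholding:

```python
import numpy as np

def infection_ratio(rgb):
    """Fraction of leaf pixels that are not predominantly green.

    rgb: HxWx3 uint8 array. A pixel counts as "healthy" when its green
    channel exceeds both red and blue; everything else is treated as a
    possibly infected region.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    healthy = (g > r) & (g > b)
    return 1.0 - healthy.mean()

# Synthetic leaf: left half green (healthy), right half brown (infected).
leaf = np.zeros((10, 10, 3), dtype=np.uint8)
leaf[:, :5] = (0, 200, 0)
leaf[:, 5:] = (150, 75, 0)
print(infection_ratio(leaf))
# 0.5
```

A ratio above some calibrated threshold would flag the leaf as diseased.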

Detection

Once image processing and classification are done, the system notifies the user if a harmful insect that affects the crop has been found. It also suggests the pesticides that can be used to eradicate such pests. The notification message is sent to the user over GSM so that it can be viewed on a mobile phone.
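The GSM notification step can be sketched as the standard text-mode AT command sequence used by common modems such as the SIM800/SIM900 (the modem model and phone number are assumptions for illustration). On real hardware each string would be written to the modem's serial port, e.g. with pyserial, followed by a carriage return:

```python
def sms_at_sequence(phone, text):
    """Ordered AT commands for sending one SMS in text mode.

    AT, AT+CMGF and AT+CMGS are the standard GSM text-mode commands;
    the message body is terminated with Ctrl+Z (chr(26)).
    """
    return [
        "AT",                    # check the modem is responding
        "AT+CMGF=1",             # select SMS text mode
        f'AT+CMGS="{phone}"',    # set the recipient number
        text + chr(26),          # message body + Ctrl+Z terminator
    ]

# "+911234567890" is a placeholder number, not from the report.
seq = sms_at_sequence("+911234567890", "Pest alert: 25 insects detected")
print(seq[1])
# AT+CMGF=1
```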

Remedies

If the plant is contaminated, an automatic sprayer is used to apply pesticide. The motor is supplied from a DC battery; according to the programmed logic, the L293D motor driver commands the motor to spray the pesticide.


CHAPTER 4
SYSTEM ARCHITECTURE

BLOCK DIAGRAM:

MODULES:

• Capturing the image and analyzing the count for the amount of pesticide to be used. The Pi Camera module is a camera which can be used to take pictures and high-definition video. The Raspberry Pi board has a CSI (Camera Serial Interface) connector to which the Pi Camera module can be attached directly. Using this module, the images are taken and analyzed.

• Using a CNN algorithm, we analyze the number of insects captured in the trap and predict the results.

• Controlling the motor based on the prediction: based on the number of pests trapped, the motor is driven by the driver circuit to spray the pesticide.
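The count-to-spray decision in the last bullet can be sketched as a small pure function. The threshold and per-pest dose below are illustrative assumptions, not values from the report; a real deployment would calibrate them per crop:

```python
def spray_plan(pest_count, area_m2, dose_ml_per_pest=0.5, threshold=10):
    """Decide whether to spray and how much pesticide to use.

    Returns (spray?, millilitres for the whole field). Below the
    threshold the infestation is treated as too light to justify
    running the sprayer motor at all.
    """
    if pest_count < threshold:
        return False, 0.0
    return True, pest_count * dose_ml_per_pest * area_m2

print(spray_plan(5, 100))    # light infestation: motor stays off
# (False, 0.0)
print(spray_plan(20, 10))    # heavy infestation: spray 100 ml
# (True, 100.0)
```

The driver circuit would then be switched on only when the first element is True.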

ALGORITHM USED:

A Convolutional Neural Network (ConvNet/CNN) is a Deep Learning algorithm which can take in an input image, assign importance (learnable weights and biases) to various aspects/objects in the image and differentiate one from the other. The pre-processing required in a ConvNet is much lower as compared to other classification algorithms. While in primitive methods filters are hand-engineered, with enough training, ConvNets have the ability to learn these filters/characteristics.

The architecture of a ConvNet is analogous to that of the connectivity pattern of Neurons in

the Human Brain and was inspired by the organization of the Visual Cortex. Individual

neurons respond to stimuli only in a restricted region of the visual field known as the

Receptive Field. A collection of such fields overlap to cover the entire visual area.

A ConvNet is able to successfully capture the Spatial and Temporal dependencies in an image

through the application of relevant filters. The architecture performs a better fitting to the
image dataset due to the reduction in the number of parameters involved and reusability of

weights. In other words, the network can be trained to understand the sophistication of the

image better.

In the case of images with multiple channels (e.g. RGB), the kernel has the same depth as that of the input image. Matrix multiplication is performed between the Kn and In stacks ([K1, I1]; [K2, I2]; [K3, I3]) and all the results are summed with the bias to give us a squashed one-depth channel convolved feature output.

There are two types of results to the operation — one in which the convolved feature is

reduced in dimensionality as compared to the input, and the other in which the dimensionality

is either increased or remains the same. This is done by applying Valid Padding in case of the

former, or Same Padding in the case of the latter.
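The two padding modes above can be demonstrated with a toy NumPy convolution (written from the description here, not from the report's own code): 'valid' shrinks the output, while 'same' zero-pads so the output keeps the input's height and width for odd-sized kernels.

```python
import numpy as np

def conv2d(image, kernel, padding="valid"):
    """2-D sliding-window convolution with 'valid' or 'same' padding."""
    kh, kw = kernel.shape
    if padding == "same":
        # Zero-pad by half the kernel size on each side.
        image = np.pad(image, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Elementwise product of the window with the kernel, summed.
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

img = np.arange(16.).reshape(4, 4)
k = np.ones((3, 3))
print(conv2d(img, k).shape, conv2d(img, k, "same").shape)
# (2, 2) (4, 4)
```

With a 3x3 kernel on a 4x4 image, valid padding reduces the output to 2x2, while same padding preserves the 4x4 shape, matching the dimensionality behaviour described above.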


CHAPTER 5
HARDWARE DESCRIPTION
• Raspberry Pi

• Web-cam

• Power Supply

• GSM

• Motor Driver

• Motor

RASPBERRY PI:

The Raspberry Pi 3 Model B+ is the latest product in the Raspberry Pi 3 range, boasting a 64-bit quad-core processor running at 1.4GHz, dual-band 2.4GHz and 5GHz wireless LAN, Bluetooth 4.2/BLE, faster Ethernet, and PoE capability via a separate PoE HAT. The dual-band wireless LAN comes with modular compliance certification, allowing the board to be designed into end products with significantly reduced wireless LAN compliance testing, improving both cost and time to market. The Raspberry Pi 3 Model B+ maintains the same mechanical footprint as both the Raspberry Pi 2 Model B and the Raspberry Pi 3 Model B.

The Raspberry Pi Compute Module 3+ (CM3+) is a range of DDR2-SODIMM-mechanically-

compatible System on Modules (SoMs) containing processor, memory, eMMC Flash (on

non-Lite variants) and supporting power circuitry. These modules allow a designer to

leverage the Raspberry Pi hardware and software stack in their own custom systems and form

factors. In addition these modules have extra IO interfaces over and above what is available

on the Raspberry Pi model A/B boards, opening up more options for the designer. The CM3+

contains a BCM2837B0 processor (as used on the Raspberry Pi 3B+), 1Gbyte LPDDR2

RAM and eMMC Flash. The CM3+ is currently available in 4 variants, CM3+/8GB,

CM3+/16GB, CM3+/32GB and CM3+ Lite, which have 8, 16 and 32 Gigabytes of eMMC

Flash, or no eMMC Flash, respectively. The CM3+ Lite product is the same as CM3+ except

the eMMC Flash is not fitted, and the SD/eMMC interface pins are available for the user to

connect their own SD/eMMC device. Note that the CM3+ is electrically identical and, with

the exception of higher CPU z-height, physically identical to the legacy CM3 products.
CM3+ modules require a software/firmware image dated November 2018 or newer to function correctly.

Hardware

• Low cost

• Low power

• High availability

• High reliability – tested over the millions of Raspberry Pis produced to date; module IO pins have 15 micro-inch hard gold plating over 2.5 micron nickel

Peripherals

• 48x GPIO

• 2x I2C

• 2x SPI

• 2x UART

• 2x SD/SDIO

• 1x HDMI 1.3a

• 1x USB2 HOST/OTG

• 1x DPI (Parallel RGB Display)

• 1x NAND interface (SMI)

• 1x 4-lane CSI Camera Interface (up to 1Gbps per lane)


• 1x 2-lane CSI Camera Interface (up to 1Gbps per lane)

• 1x 4-lane DSI Display Interface (up to 1Gbps per lane)

• 1x 2-lane DSI Display Interface (up to 1Gbps per lane)

Software

• ARMv8 Instruction Set

• Mature and stable Linux software stack – Latest Linux Kernel support – Many drivers

upstreamed – Stable and well supported userland – Full availability of GPU functions using

standard APIs

Mechanical Specification The CM3+ modules conform to JEDEC MO-224 mechanical

specification for 200 pin DDR2 (1.8V) SODIMM modules and therefore should work with

the many DDR2 SODIMM sockets available on the market. (Please note that the pinout of

the Compute Module is not the same as a DDR2 SODIMM module; they are not electrically

compatible.) The SODIMM form factor was chosen as a way to provide the 200 pin

connections using a standard, readily available and low cost connector compatible with low

cost PCB manufacture. The maximum component height on the underside of the Compute
Module is 1.2mm. The maximum component height on the top side of the Compute Module

is 2.5mm. The Compute Module PCB thickness is 1.0mm +/- 0.1mm. Note that the location

and arrangement of components on the Compute Module may change slightly over time due

to revisions for cost and manufacturing considerations; however, maximum component

heights and PCB thickness will be kept as specified. Figure 2 gives the CM3+ mechanical dimensions.

The Compute Module 3+ has six separate supplies that must be present and powered at all times; you cannot leave any of them unpowered, even if a specific interface or GPIO bank is unused. The six supplies are as follows:

1. VBAT is used to power the BCM2837 processor core. It feeds the SMPS that generates the chip core voltage.
2. 3V3 powers various BCM2837 PHYs, IO and the eMMC Flash.
3. 1V8 powers various BCM2837 PHYs, IO and SDRAM.
4. VDAC powers the composite (TV-out) DAC.
5. GPIO0-27 VREF powers the GPIO 0-27 IO bank.
6. GPIO28-45 VREF powers the GPIO 28-45 IO bank.

WEBCAM:

A webcam is a video camera that feeds or streams an image or video in real time to or

through a computer to a computer network, such as the Internet. Webcams are typically small

cameras that sit on a desk, attach to a user's monitor, or are built into the hardware. Webcams

can be used during a video chat session involving two or more people, with conversations

that include live audio and video. For example, Apple's iSight camera, which is built into

Apple laptops, iMacs and a number of iPhones, can be used for video chat sessions, using

the Messages instant messaging program. Webcam software enables users to record a video

or stream the video on the Internet. As video streaming over the Internet requires

much bandwidth, such streams usually use compressed formats. The maximum resolution of


a webcam is also lower than most handheld video cameras, as higher resolutions would be

reduced during transmission. The lower resolution enables webcams to be relatively

inexpensive compared to most video cameras, but the effect is adequate for video chat

sessions.

The term "webcam" (a clipped compound) may also be used in its original sense of a video

camera connected to the Web continuously for an indefinite time, rather than for a particular

session, generally supplying a view for anyone who visits its web page over the Internet.

Some of them, for example, those used as online traffic cameras, are expensive,

rugged professional video cameras.

Image sensors can be CMOS or CCD, the former being dominant for low-cost cameras, but CCD cameras do not necessarily outperform CMOS-based cameras in the low-price range.

Most consumer webcams are capable of providing VGA-resolution video at a frame rate of

30 frames per second. Many newer devices can produce video in multi-

megapixel resolutions, and a few can run at high frame rates such as the PlayStation Eye,

which can produce 320×240 video at 120 frames per second. The Wii Remote contains an

image sensor with a resolution of 1024×768 pixels.

As the Bayer filter is proprietary, any webcam contains some built-in image processing, separate from compression.

Various lenses are available, the most common in consumer-grade webcams being a

plastic lens that can be manually moved in and out to focus the camera. Fixed-focus lenses,

which have no provision for adjustment, are also available. As a camera system's depth of

field is greater for small image formats and is greater for lenses with a large f-number (small

aperture), the systems used in webcams have a sufficiently large depth of field that the use of

a fixed-focus lens does not impact image sharpness to a great extent.


Most models use simple, focal-free optics (fixed focus, factory-set for the usual distance from

the monitor to which it is fastened to the user) or manual focus.

Digital video streams are represented by huge amounts of data, burdening its transmission

(from the image sensor, where the data is continuously created) and storage alike.

Most if not all cheap webcams come with built-in ASIC to do video compression in real-time.

Support electronics read the image from the sensor and transmit it to the host computer. The

camera pictured to the right, for example, uses a Sonix SN9C101 to transmit its image

over USB. Typically, each frame is transmitted uncompressed in RGB or YUV or

compressed as JPEG. Some cameras, such as mobile-phone cameras, use a CMOS sensor

with supporting electronics "on die", i.e. the sensor and the support electronics are built on a

single silicon chip to save space and manufacturing costs. Most webcams feature built-

in microphones to make video calling and videoconferencing more convenient.

Various proprietary as well as free and open-source software is available to handle the UVC stream; for example, Guvcview or GStreamer-based software. One could also attach multiple USB cameras to the host computer and broadcast multiple streams at once over (wireless) Ethernet using MotionEye. MotionEye can be installed on a Raspberry Pi as MotionEyeOS, or afterwards on Raspbian; it can also be set up on Debian, of which Raspbian is a variant.

POWER SUPPLY:

TRANSFORMER:

This document presents the solution for a 12V 1A flyback converter based on the Infineon OPTIREG™ TLE8386-2EL controller and the IPD50N08S4-13 OptiMOS™-T2. The user is guided through the component selection and the circuit design, and finally an overview of the experimental results is presented. The TLE8386-2EL is part of the Automotive OPTIREG™ family and implements a low-side-sense current-mode controller with built-in protection features. The device is AEC-Q100 qualified. The IPD50N08S4-13 is an AEC-Q101 qualified 80V N-channel enhancement-mode MOSFET in the OptiMOS™-T2 family.

This material is intended for power supply design engineers, application engineers, students, etc., who need to design a flyback converter for automotive power applications where galvanic isolation between two voltage domains is required. In particular, the focus is on a battery-connected flyback that delivers up to 12W at 12V output voltage; the intention is to provide the user with all of the information needed to fully design and characterize the SMPS, bringing it from an engineering concept to production. Specific features and applications are: 48V to 12V automotive applications; isolated current-mode SMPS; flyback regulators with auxiliary sensing.

Centre Tapped Transformer Specifications

• Step-down centre-tapped transformer

• Input voltage: 220V AC at 50Hz

• Output voltage: 24V, 12V or 0V

• Output current: 1A

• Vertical mount type

• Low cost and small package


A centre-tapped transformer, also known as a two-phase three-wire transformer, is normally used for rectifier circuits. When a digital project has to work from AC mains, a transformer is used to step down the voltage (in our case, to 24V or 12V), which is then converted to DC using a rectifier circuit. In a centre-tapped transformer the peak inverse voltage is twice that of a bridge rectifier, hence this transformer is commonly used in full-wave rectifier circuits.

The operation and theory behind a centre-tapped transformer are very similar to those of a normal secondary transformer. A voltage will be induced in the primary coil (I1 and I3) and, due to magnetic induction, the voltage will be transferred to the secondary coil. In the secondary coil of a centre-tapped transformer there is an additional wire (T2) placed exactly at the centre of the secondary coil, so the voltage at this tap will always be zero.

If we combine this zero-potential wire (T2) with either T1 or T3, we get a voltage of 12V AC. If this wire is ignored and the voltage across T1 and T3 is considered, we get a voltage of 24V AC. This feature is very useful for the operation of a full-wave rectifier.

Let us consider the voltage given by the first half of the secondary coil as Va and the voltage across the second half of the secondary coil as Vb, as shown.
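Using the 12V half-winding figure from the specification above, the ideal full-wave rectifier quantities can be worked out numerically. Ideal diodes and no load are assumed; the numbers are illustrative, not measured values from the report:

```python
import math

def centre_tap_outputs(v_rms_half=12.0):
    """Ideal centre-tapped full-wave rectifier figures for one half-winding."""
    v_peak = v_rms_half * math.sqrt(2)   # peak of each 12V RMS half-winding
    v_dc = 2 * v_peak / math.pi          # average of a full-wave rectified sine
    piv = 2 * v_peak                     # peak inverse voltage across the off diode
    return v_peak, v_dc, piv

v_peak, v_dc, piv = centre_tap_outputs()
print(round(v_peak, 2), round(v_dc, 2), round(piv, 2))
# 16.97 10.8 33.94
```

The PIV being twice the winding peak is exactly the "peak inverse voltage is twice that of a bridge rectifier" property noted above.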


RECTIFIER CIRCUIT:

We have learnt in rectifier circuits about converting a sinusoidal AC voltage into its corresponding pulsating DC. Apart from the DC component, this pulsating DC voltage has unwanted AC components, such as the component at the supply frequency along with its harmonics (together called ripple). The ripple is highest for a single-phase half-wave rectifier, lower for a single-phase full-wave rectifier, and minimum for 3-phase rectifier circuits. Such a supply is not useful for driving complex electronic circuits: for most purposes a constant DC voltage is required rather than the pulsating output of the rectifier, and feeding a circuit directly from a rectifier makes its operation poor. If the rectifier output is smoothed into a steady voltage before being passed on as the supply, the overall operation of the circuit becomes better. Thus, the output of the rectifier has to be passed through a filter circuit to remove the AC components. The filter is a device that passes the DC component to the load and blocks the AC component of the rectifier output, so the output of the filter circuit is a steady DC voltage. The filter circuit can be constructed from combinations of components such as capacitors, resistors and inductors: an inductor is used for its property of passing DC while opposing AC, and a capacitor is used to block DC while passing AC. The main combinations and their working are explained in detail below.

Series Inductor Filter

The circuit diagram of a full-wave rectifier with a series inductor filter is given below. As the name of the filter circuit suggests, the inductor L is connected in series between the rectifier circuit and the load. The inductor opposes any change in the current that flows through it; in other words, it offers high impedance to the ripple and almost no impedance to the desired DC component, so the ripple components are attenuated. When the rectifier output current rises above a certain value, energy is stored in the inductor in the form of a magnetic field, and this energy is given up when the output current falls below the average value. Thus all the sudden changes in current that occur in the circuit are smoothed by placing the inductor in series between the rectifier and the load. The waveform below shows the use of the inductor in the circuit. For the zero-frequency (DC) component, the choke resistance Ri in series with the load resistance RL forms a voltage divider, and the DC voltage across the load is V′dc = RL/(Ri + RL) × Vdc, where Vdc is the output of the full-wave rectifier. In practice, the value of Ri is negligibly small compared with RL. The effect of the higher harmonic voltages can easily be neglected, as better filtering of the higher harmonic components takes place: with the increase in frequency, the reactance of the inductor also increases. It should be noted that a decrease in the value of load resistance, i.e. an increase in the value of load current, decreases the amount of ripple in the circuit. So the series inductor filter is mostly used in cases of high load current or small load resistance. A simple series inductor alone is rarely adequate; it is always better to use a shunt capacitor (C) with the series inductor (L) to form an LC filter.

Shunt Capacitor Filter

As the name suggests, a capacitor is used as the filter, and this high-value capacitor is shunted, i.e. placed across, the load impedance. When placed across a rectifier, the capacitor charges and stores energy during the conduction period. When the rectifier is not conducting, the energy stored in the capacitor is delivered back to the load. Through this energy storage and delivery process, the time during which current flows through the load resistor is increased and the ripple is reduced by a great amount. Thus, for a ripple component at frequency f, the capacitor C offers a very low impedance, whose value can be written as: Shunt Capacitor Impedance = 1/(2πfC). Only the DC component of the input signal, along with a small residual ripple, is allowed to pass through the load resistance RLoad, while the bulk of the ripple current is bypassed through the capacitor C. One can then examine the working of half-wave and full-wave rectifiers with capacitor filters, their filtered output waveforms, ripple factor, merits and demerits in detail.
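The shunt-capacitor formula above can be checked numerically. The 1000 µF value and 100 Hz ripple (full-wave rectified 50 Hz mains) are illustrative assumptions:

```python
import math

def capacitor_impedance(f_hz, c_farads):
    """Magnitude of a capacitor's reactance, Xc = 1/(2*pi*f*C).

    The reactance falls as frequency rises, which is why the ripple is
    bypassed through the capacitor while DC (f -> 0, Xc -> infinity)
    is forced through the load.
    """
    return 1.0 / (2 * math.pi * f_hz * c_farads)

# 100 Hz ripple across a 1000 uF filter capacitor:
print(round(capacitor_impedance(100, 1000e-6), 2))
# 1.59
```

About 1.6 Ω at the ripple frequency is far below a typical load resistance, so almost all of the ripple current takes the capacitor path.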

MOTOR DRIVER:

The device is a monolithic integrated high-voltage, high-current four-channel driver designed to accept standard DTL or TTL logic levels and drive inductive loads (such as relays, solenoids, DC and stepping motors) and switching power transistors. To simplify use as two bridges, each pair of channels is equipped with an enable input. A separate supply input is provided for the logic, allowing operation at a lower voltage, and internal clamp diodes are included. This device is suitable for use in switching applications at frequencies up to 5 kHz.

A DC motor is any of a class of electrical machines that converts direct current

electrical power into mechanical power. The most common types rely on the forces produced

by magnetic fields. Nearly all types of DC motors have some internal mechanism, either

electromechanical or electronic, to periodically change the direction of current flow in part of


the motor. Most types produce rotary motion; a linear motor directly produces force and

motion in a straight line.
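The L293D channel behaviour described above can be sketched as a truth table: each motor channel has an enable pin and two logic inputs. This follows the standard datasheet behaviour (disabled outputs float so the motor coasts; opposite inputs set direction; equal inputs brake), modelled here in plain Python rather than on real GPIO pins:

```python
def motor_state(enable, in1, in2):
    """One L293D channel: (enable, input1, input2) -> motor behaviour."""
    if not enable:
        return "coast"      # outputs high-impedance, motor free-runs
    if in1 and not in2:
        return "forward"
    if in2 and not in1:
        return "reverse"
    return "brake"          # both inputs equal: motor terminals shorted

print(motor_state(True, True, False))
# forward
```

In the sprayer, the Raspberry Pi would drive the two input pins through the L293D to start and stop the pump motor.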

CHAPTER 6
SOFTWARE DESCRIPTION

PYTHON 3.7:

Python is an interpreted, high-level, general-purpose programming language.

Created by Guido van Rossum and first released in 1991, Python's design philosophy

emphasizes code readability with its notable use of significant whitespace.

Python is an easy to learn, powerful programming language. It has efficient high-level data structures and a simple but effective approach to object-oriented programming. Python's elegant syntax and dynamic typing, together with its interpreted nature, make it an ideal language for scripting and rapid application development in many areas on most platforms. The Python interpreter and the extensive standard library are freely available in source or binary form for all major platforms from the Python web site, https://www.python.org/, and may be freely distributed. The same site also contains distributions of and pointers to many free third-party Python modules, programs and tools, and additional documentation. The Python interpreter is

easily extended with new functions and data types implemented in C or C++ (or other

languages callable from C). Python is also suitable as an extension language for

customizable applications. This tutorial introduces the reader informally to the basic

concepts and features of the Python language and system. It helps to have a Python

interpreter handy for hands-on experience, but all examples are self-contained, so the

tutorial can be read off- line as well. For a description of standard objects and

modules, see library-index. Reference-index gives a more formal definition of the

language. To write extensions in C or C++, read extending-index and c-api-index.

There are also several books covering Python in depth. This tutorial does not attempt

to be comprehensive and cover every single feature, or even every commonly used

feature. Instead, it introduces many of Python's most noteworthy features, and will

give you a good idea of the language’s flavor and style. After reading it, you will be

able to read and write Python modules and programs, and you will be ready to learn

more about the various Python library modules described in library-index. If you do
much work on computers, eventually you find that there's some task you'd like to

automate. For example, you may wish to perform a search-and-replace over a large

number of text files, or rename and rearrange a bunch of photo files in a complicated

way. Perhaps you’d like to write a small custom database, or a specialized

GUI application or a simple game. If you’re a professional software developer,

you may have to work with several C/C++/Java libraries but find the usual

write/compile/test/re-compile cycle is too slow. Perhaps you’re writing a test suite for

such a library and find writing the testing code a tedious task. Or maybe you’ve

written a program that could use an extension language, and you don’t want to design

and implement a whole new language for your application.

Typing an end-of-file character (Control-D on Unix, Control-Z on Windows) at

the primary prompt causes the interpreter to exit with a zero exit status. If that doesn’t

work, you can exit the interpreter by typing the following command: quit(). The

interpreter’s line-editing features include interactive editing, history substitution and

code completion on systems that support read line.

Perhaps the quickest check to see whether command line editing is supported is

typing Control-P to the first Python prompt you get. If it beeps, you have command

line editing; see Appendix Interactive Input Editing and History Substitution for an

introduction to the keys. If nothing appears to happen, or if ^P is echoed, command

line editing isn’t available; you’ll only be able to use backspace to remove characters

from the current line. The interpreter operates somewhat like the Unix shell: when

called with standard input connected to a tty device, it reads and executes commands

interactively; when called with a file name argument or with a file as standard input, it

reads and executes a script from that file. A second way of starting the interpreter is

python -c command [arg] ..., which executes the statement(s) in command, analogous
to the shell’s -c option. Since Python statements often contain spaces or other

characters that are special to the shell, it is usually advised to quote the command in its entirety with single quotes. Some Python modules are also useful as scripts. These

can be invoked using python -m module [arg] ..., which executes the source file for the

module as if you had spelled out its full name on the command line. When a script

file is used, it is sometimes useful to be able to run the script and enter interactive

mode afterwards. This can be done by passing -i before the script.

There are tools which use doc strings to automatically produce online or printed

documentation or to let the user interactively browse through code; it’s good practice

to include doc strings in code that you write, so make a habit of it. The execution of a function introduces a new symbol table used for the local variables of the function. More precisely, all variable assignments in a function store the value in the local symbol table; whereas variable references first look in the local symbol table, then in the local symbol tables of enclosing functions, then in the global symbol table, and

finally in the table of built-in names. Thus, global variables cannot be directly

assigned a value within a function (unless named in a global statement), although they

may be referenced. The actual parameters (arguments) to a function call are

introduced in the local symbol table of the called function when it is called; thus,

arguments are passed using call by value (where the value is always an object

reference, not the value of the object).1 When a function calls another function, a new

local symbol table is created for that call. A function definition introduces the

function name in the current symbol table. The value of the function name has a type

that is recognized by the interpreter as a user-defined function. This value can be

assigned to another name which can then also be used as a function.


Annotations are stored in the __annotations__ attribute of the function as a dictionary and have no effect on any other part of the function. Parameter annotations are defined by a colon after the parameter name, followed by an expression evaluating to the value of the annotation. Return annotations are defined by a literal ->, followed by an expression, between the parameter list and the colon denoting the end of the def statement.
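For example (the function name and parameters are invented for this report's domain, not from any library):

```python
def dose(pesticide: str, volume_ml: float = 50.0) -> str:
    """Annotated function: parameter and return annotations."""
    return f"{volume_ml} ml of {pesticide}"

# Annotations live in the function's __annotations__ dictionary:
print(dose.__annotations__)
# {'pesticide': <class 'str'>, 'volume_ml': <class 'float'>, 'return': <class 'str'>}
```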

The comparison operators in and not in check whether a value occurs (does not occur) in a sequence. The operators is and is not compare whether two objects are really the same object; this only matters for mutable objects like lists. All comparison operators have the same priority, which is lower than that of all numerical operators.

Comparisons can be chained. For example, a < b == c tests whether a is less than b and moreover b equals c. Comparisons may be combined using the Boolean operators and and or, and the outcome of a comparison (or of any other Boolean expression) may be negated with not. These have lower priorities than comparison operators; between them, not has the highest priority and or the lowest, so that A and not B or C is equivalent to (A and (not B)) or C. As always, parentheses can be used to express the desired composition. The Boolean operators and and or are so-called short-circuit operators: their arguments are evaluated from left to right, and evaluation stops as soon as the outcome is determined. For example, if A and C are true but B is false, A and B and C does not evaluate the expression C. When used as a general value and not as a Boolean, the return value of a short-circuit operator is the last evaluated argument.
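The chaining and short-circuit behaviour above can be demonstrated directly:

```python
# Chained comparison: a < b == c means (a < b) and (b == c).
a, b, c = 1, 2, 2
assert (a < b == c) is True

# Short-circuit "or" returns the last evaluated argument:
result = "" or "default"      # "" is falsy, so evaluation moves on
print(result)
# default

# "and" stops as soon as the outcome is determined:
calls = []
def trace(name, value):
    calls.append(name)
    return value

trace("A", True) and trace("B", False) and trace("C", True)
print(calls)
# ['A', 'B']
```

Here C was never evaluated, because once B returned a false value the overall result of the and-chain was already determined.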

Classes provide a means of bundling data and functionality together. Creating a

new class creates a new type of object, allowing new instances of that type to be

made. Each class instance can have attributes attached to it for maintaining its state.
Class instances can also have methods (defined by its class) for modifying its state.

Compared with other programming languages, Python’s class mechanism adds

classes with a minimum of new syntax and semantics. It is a mixture of the class

mechanisms found in C++ and Modula-3. Python classes provide all the standard

features of Object Oriented Programming: the class inheritance mechanism allows

multiple base classes, a derived class can override any methods of its base class or

classes, and a method can call the method of a base class with the same name. Objects

can contain arbitrary amounts and kinds of data. As is true for modules, classes

partake of the dynamic nature of Python: they are created at runtime, and can be

modified further after creation. In C++ terminology, normally class members

(including the data members) are public (except see below Private Variables), and all

member functions are virtual. As in Modula-3, there are no shorthands for referencing the object's members from its methods: the method function is declared with an explicit first argument representing the object, which is provided implicitly by the call. As in Smalltalk, classes themselves are objects. This provides semantics for importing and renaming. Unlike C++ and Modula-3, built-in types can be used as base classes for extension by the user. Also, like in C++, most built-in operators with special syntax (arithmetic operators, subscripting etc.) can be redefined for class instances. (Lacking universally accepted terminology to talk about classes, I will make occasional use of Smalltalk and C++ terms. I would use Modula-3 terms, since its object-oriented semantics are closer to those of Python than C++, but I expect that few readers have heard of it.)
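A minimal class example showing the mechanisms just described: instance state, inheritance, method overriding, and calling the base-class method by name. The pest names are invented for this report's domain:

```python
class Pest:
    def __init__(self, name):
        self.name = name                # instance attribute (state)

    def describe(self):
        return f"{self.name}: unknown risk"

class CropPest(Pest):                   # inheritance from a base class
    def describe(self):                 # overrides the base-class method
        base = Pest.describe(self)      # calls the base-class version explicitly
        return base.replace("unknown", "high")

print(CropPest("stem borer").describe())
# stem borer: high risk
```

Note the explicit first argument `self`: Python provides it implicitly at the call site, exactly as the text explains.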

Objects have individuality, and multiple names (in multiple scopes) can be bound

to the same object. This is known as aliasing in other languages. This is usually not

appreciated on a first glance at Python, and can be safely ignored when dealing with
immutable basic types (numbers, strings, tuples). However, aliasing has a possibly surprising effect on the semantics of Python code involving mutable objects such as

lists, dictionaries, and most other types. This is usually used to the benefit of the

program, since aliases behave like pointers in some respects. For example, passing an

object is cheap since only a pointer is passed by the implementation; and if a function

modifies an object passed as an argument, the caller will see the change — this

eliminates the need for two different argument passing mechanisms as in Pascal.

A namespace is a mapping from names to objects. Most namespaces are currently implemented as Python dictionaries, but that's normally not noticeable in any way (except for performance), and it may change in the future. Examples of namespaces are: the set of built-in names (containing functions such as abs(), and built-in

exception names); the global names in a module; and the local names in a function

invocation. In a sense the set of attributes of an object also form a namespace. The

important thing to know about namespaces is that there is absolutely no relation

between names in different namespaces; for instance, two different modules may both

define a function maximize without confusion — users of the modules must prefix it

with the module name. By the way, I use the word attribute for any name following a

dot — for example, in the expression z. real, real is an attribute of the object z.
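The independence of namespaces can be demonstrated without writing two source files by creating module objects directly; the module names mod_a and mod_b and their maximize functions are invented for illustration.

```python
import types

# Two independent module objects, each defining its own 'maximize' with no conflict:
mod_a = types.ModuleType("mod_a")
mod_a.maximize = lambda x, y: x if x >= y else y      # two-argument version

mod_b = types.ModuleType("mod_b")
mod_b.maximize = lambda values: max(values)           # same name, different meaning

print(mod_a.maximize(3, 7))       # 7 — disambiguated by the module prefix
print(mod_b.maximize([1, 5, 2]))  # 5

# An attribute is any name after a dot, e.g. the real part of a complex number:
z = 3 + 4j
print(z.real)                     # 3.0
```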

Strictly speaking, references to names in modules are attribute references: in the expression modname.funcname, modname is a module object and funcname is an attribute of it. In this case there happens to be a straightforward mapping between the module's attributes and the global names defined in the module: they share the same namespace! Attributes may be read-only or writable. In the latter case, assignment to attributes is possible. Module attributes are writable: you can write modname.the_answer = 42. Writable attributes may also be deleted with the del statement. For example, del modname.the_answer will remove the attribute the_answer from the object named by modname.

Namespaces are created at different moments and have different lifetimes. The namespace containing the built-in names is created when the Python interpreter starts up, and is never deleted. The global namespace for a module is created when the module definition is read in; normally, module namespaces also last until the interpreter quits. The statements executed by the top-level invocation of the interpreter, either read from a script file or interactively, are considered part of a module called __main__, so they have their own global namespace. (The built-in names actually also live in a module; this is called builtins.) The local namespace for a function is created when the function is called, and deleted when the function returns or raises an exception that is not handled within the function. (Actually, forgetting would be a better way to describe what actually happens.) Of course, recursive invocations each have their own local namespace.
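The writable-attribute behavior described above can be tried out on a throwaway module object (created here with types.ModuleType so the sketch needs no real module on disk):

```python
import types

# A module object standing in for an imported module named 'modname'.
modname = types.ModuleType("modname")

modname.the_answer = 42                  # assignment to a writable attribute
print(modname.the_answer)                # 42

del modname.the_answer                   # the del statement removes the attribute
print(hasattr(modname, "the_answer"))    # False
```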

To speed up loading modules, Python caches the compiled version of each module in the __pycache__ directory under the name module.version.pyc, where the version encodes the format of the compiled file; it generally contains the Python version number. For example, in CPython release 3.3 the compiled version of spam.py would be cached as __pycache__/spam.cpython-33.pyc. This naming convention allows compiled modules from different releases and different versions of Python to coexist. Python checks the modification date of the source against the compiled version to see if it's out of date and needs to be recompiled. This is a completely automatic process. Also, the compiled modules are platform-independent, so the same library can be shared among systems with different architectures.

Python does not check the cache in two circumstances. First, it always recompiles and does not store the result for the module that's loaded directly from the command line. Second, it does not check the cache if there is no source module. To support a non-source (compiled only) distribution, the compiled module must be in the source directory, and there must not be a source module. Some tips for experts:

• You can use the -O or -OO switches on the Python command to reduce the size of a compiled module. The -O switch removes assert statements, the -OO switch removes both assert statements and doc strings. Since some programs may rely on having these available, you should only use this option if you know what you're doing. "Optimized" modules have an opt- tag and are usually smaller. Future releases may change the effects of optimization.

• A program doesn't run any faster when it is read from a .pyc file than when it is read from a .py file; the only thing that's faster about .pyc files is the speed with which they are loaded.

• The module compileall can create .pyc files for all modules in a directory.

• There is more detail on this process, including a flow chart of the decisions involved.
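The __pycache__ naming convention can be inspected from the standard library; this sketch asks the running interpreter where the compiled version of a hypothetical spam.py would be cached:

```python
import importlib.util
import sys

# Where would the compiled version of spam.py be cached on this interpreter?
path = importlib.util.cache_from_source("spam.py")
print(path)   # e.g. __pycache__/spam.cpython-311.pyc on CPython 3.11

# The version tag embedded in the file name comes from the running interpreter:
print(sys.implementation.cache_tag)   # e.g. 'cpython-311'
```

The exact tag in the output depends on the interpreter release, which is precisely what lets compiled modules from different versions coexist.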
THONNY IDE:

Thonny is a small and lightweight Integrated Development Environment. It was developed to provide a small and fast IDE, which has only a few dependencies on other packages. Another goal was to be as independent as possible from a particular desktop environment like KDE or GNOME, so Thonny only requires the GTK2 toolkit, and therefore you only need the GTK2 runtime libraries installed to run it.

For compiling Thonny yourself, you will need the GTK (>= 2.6.0) libraries and header files. You will also need the Pango, Glib, and ATK libraries and header files. All these files are available at http://www.gtk.org. Furthermore you need, of course, a C compiler and the Make tool; a C++ compiler is also required for the included Scintilla library. The GNU versions of these tools are recommended.

Compiling Thonny is quite easy. The following should do it:

% ./configure

% make

% make install
The configure script supports several common options; for a detailed list, type

% ./configure --help

There are also some compile-time options which can be found in src/Thonny.h. Please see Appendix C for more information. In the case that your system lacks dynamic linking loader support, you probably want to pass the option --disable-vte to the configure script. This prevents compiling Thonny with dynamic linking loader support for automatically loading libvte.so.4 if available. Thonny has been successfully compiled and tested under Debian 3.1 Sarge, Debian 4.0 Etch, Fedora Core 3/4/5, Linux From Scratch and FreeBSD 6.0. It also compiles under Microsoft Windows.

At startup, Thonny loads all files from the last time Thonny was launched. You can disable this feature in the preferences dialog (see Figure 3-4). If you specify some files on the command line, only these files will be opened, but you can find the files from the last session in the file menu under the "Recent files" item. By default this contains the last 10 recently opened files. You can change the number of recently opened files in the preferences dialog. You can start several instances of Thonny, but only the first will load files from the last session. To run a second instance of Thonny, do not specify any file names on the command line, or disable opening files in a running instance using the appropriate command line option.

Thonny detects an already running instance of itself and opens files from the command line in the already running instance. So, Thonny can be used to view and edit files by opening them from other programs such as a file manager. If you do not like this for some reason, you can disable using the first instance by using the appropriate command line option.

If you have installed libvte.so in your system, it is loaded automatically by Thonny, and you will have a terminal widget in the notebook at the bottom. If Thonny cannot find libvte.so at startup, the terminal widget will not be loaded. So there is no need to install the package containing this file in order to run Thonny. Additionally, you can disable the use of the terminal widget by command line option; for more information see Section 3.2. You can use this terminal (from now on called VTE) nearly as a usual terminal program like xterm. There is basic clipboard support. You can paste the contents of the clipboard by pressing the right mouse button to open the popup menu and choosing Paste. To copy text from the VTE, just select the desired text and then press the right mouse button and choose Copy from the popup menu. On systems running the X Window System you can paste the last selected text by pressing the middle mouse button in the VTE (on 2-button mice, the middle button can often be simulated by pressing both mouse buttons together).

As long as a project is open, the Make and Run commands will use the project's settings instead of the defaults. These will be used whichever document is currently displayed. The current project's settings are saved when it is closed, or when Thonny is shut down. When restarting Thonny, the previously opened project file that was in use at the end of the last session will be reopened. Execute will run the corresponding executable file, shell script or interpreted script in a terminal window. Note that the Terminal tool path must be correctly set in the Tools tab of the Preferences dialog; you can use any terminal program that runs a Bourne-compatible shell and accepts the "-e" command line argument to start a command. After your program or script has finished executing, you will be prompted to press the return key. This allows you to review any text output from the program before the terminal window is closed.

By default the Compile and Build commands invoke the compiler and linker with only the basic arguments needed by all programs. Using Set Includes and Arguments you can add any include paths and compile flags for the compiler, any library names and paths for the linker, and any arguments you want to use when running Execute.

Thonny has basic printing support. This means you can print a file by passing the filename of the current file to a command which actually prints the file. However, the printed document contains no syntax highlighting.

CHAPTER 7
RESULTS

CHAPTER 8
CONCLUSION
An algorithm to detect weeds among plantations was successfully developed, and a prototype of the Automatic Weed Detection and Smart Herbicide Sprayer Robot was designed and implemented successfully. The robot comprises hardware that can support and effectively spray herbicides on the detected weeds in real time. A test track was designed with a row of ragi (E. coracana) plants and randomly placed broad-leaved weeds. The weeds were successfully identified and sprayed upon by the robot, and this process took approximately 3 seconds. All the weed plants were detected properly, with a few ragi (E. coracana) plants being detected as weeds in cases where they grew in large clusters.

FUTURE SCOPE

The system designed has a lot of scope for improvement, and many more features can be added to this robot.

• The major shortcoming of the robot is its dependency on good and stable lighting conditions. So, it can be equipped with a light intensity sensor to automatically adjust variables such as the threshold for extracting green pixels, shutter speed and white balance.

• The image analysis for weed detection can be further improved by dividing the image into a greater number of regions and having as many nozzles to spray the chemicals.

• It can be turned into a very robust closed-loop system by incorporating a memory module. The image processing algorithm can be developed further so that the detection becomes more generic.
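The green-pixel extraction mentioned above can be sketched in a few lines; this is an illustrative stand-in, not the project's actual algorithm, and the margin value of 20 is an assumed threshold of the kind the proposed light sensor would adjust.

```python
def green_mask(pixels, margin=20):
    """Flag pixels whose green channel dominates red and blue by `margin`.

    `pixels` is a list of (r, g, b) tuples; `margin` is an illustrative
    threshold, not a value taken from the implemented system.
    """
    return [(g > r + margin and g > b + margin) for (r, g, b) in pixels]

def excess_green(pixels):
    """Alternative index sometimes used for vegetation: ExG = 2g - r - b."""
    return [2 * g - r - b for (r, g, b) in pixels]

# A plant pixel, a soil-colored pixel, and a bright-leaf pixel:
sample = [(30, 120, 40), (100, 90, 95), (10, 200, 5)]
print(green_mask(sample))      # [True, False, True]
```

Splitting the image into regions then reduces to applying such a mask per region and counting the flagged pixels against a per-region threshold.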

References:
[1] Kamarul Hawari Ghazali, Mohd. Marzuki Mustafa, Aini Hussain, "Machine Vision System for Automatic Weeding Strategy using Image Processing Technique," American-Eurasian J. of Agriculture & Environmental Science, vol. 3, no. 3, pp. 451-458, 2008.
[2] Amir H. Kargar B, Ali M. Shirzadifar, "Automatic Weed Detection System and Smart Herbicide Sprayer Robot for corn fields," IEEE Int. Conf. on Robotics and Mechatronics, Tehran, pp. 468-473, Feb. 2013.
[3] Muhammad H. Siddiqi, Irshad Ahmad, Suziah Bt Sulaiman, "Weed Recognition Based on Erosion and Dilation Segmentation Algorithm," Int. Conf. on Education Technology and Computer, Singapore, Apr. 2009, pp. 224-228.
[4] Rafael C. Gonzalez, Richard E. Woods, "Morphological Image Processing," in Digital Image Processing, 3rd ed., New Jersey, Prentice-Hall Inc., 2006, pp. 649-657.
[5] Asnor Juraiza Ishak, Siti Salasiah Mokri, Mohd. Marzuki Mustafa, and Aini Hussain, "Weed Detection utilizing Quadratic Polynomial and ROI Techniques," in Proc. of 5th Student Conf. on Research and Development, Malaysia, Dec. 2007, pp. 1-5.
[6] Raspberry-Pi Foundation, "Raspberry-Pi Documentation," [Online]. Available: https://www.raspberrypi.org/documentation/. Last updated: 1st May 2015.
[7] Ajit G. Deshmukh, V.A. Kulkarni, "Advanced Robotic Weeding System," Trans. on Electrical and Electronics Engineering, vol. 1, no. 3, 2013, pp. 232-236.
[8] Chengliang Liu, Mingjun Wang, and Jun Zhou, "Coordinating control for an agricultural vehicle with individual wheel speeds and steering angles," IEEE Control Systems Magazine, vol. 28, no. 5, Oct. 2008, pp. 21-24.
[9] H. Pota, R. Eaton, J. Katupitiya, S.D. Pathirana, "Agricultural robotics: A streamlined approach to realization of autonomous farming," 2nd Int. Conf. on Industrial and Information Systems, IEEE, Aug. 2007, pp. 85-90.
[10] Manivannan M. and Kumaresan N., "Embedded web server & GPRS based Advanced Industrial Automation using Linux RTOS," International Journal of Engineering Science and Technology, vol. 2, no. 11, 2010, pp. 6074-6081.
[11] Clyde C. W. Robson, Samuel Silverstein, and Christian Bohm, "An Operation-Server Based Data Acquisition System Architecture," IEEE Transactions on Nuclear Science, vol. 55, no. 1, 2008.
[12] Du Y. and Liu C., "Testing Method for Embedded Real-time System Software," Control & Automation, vol. 23, no. 4-2, 2007, pp. 86-88.
[13] Krithi Ramamritham, John A. Stankovic, "Scheduling Algorithms and Operating Systems Support for Real-Time Systems," Proceedings of the IEEE, vol. 82, no. 1, 1994, pp. 55-67.
[14] Li J., Zhang B., Qiu D.Y., "Multi-computer communication based on Modbus RTU in power quality monitoring system," Electric Power Automation Equipment, vol. 27, no. 1, 2007, pp. 93-96.
[15] Peng D.G., Zhang H., Jiang J.N., "Design and Realization of Embedded Web Server Based on ARM and Linux," Mechatronics, vol. 14, no. 10, 2008, pp. 37-40.
