
FYP/Forms and Proformas

DEPARTMENT OF ELECTRICAL ENGINEERING


UNIVERSITY OF GUJRAT

FYP PROJECT PROPOSAL


ACADEMIC SESSION 2023/2024

Name / Registration No.:
1. Faryal Naeem Mehmood / 20013122-001
2. Syeda Ambreen Zahra / 20013122-022

Supervisor Name with Designation: Dr. Syed Muhammad Wasif, Assistant Professor

Project Title: Steering control of Ackerman architecture weed managing mobile robot

Nature of Project:
Fundamental   Analytical
Experimental/Hardware ✓   Simulation
Design/Fabrication ✓   Programming ✓
Others (please specify)

1. INTRODUCTION

A weed control robot detects and removes weeds from crops. A weed is unwanted vegetation that grows among crop plants and extracts their nutrients. Weeds take moisture from the main crops and negatively affect their growth. They can cause losses in crop production and can be dangerous to the health of humans and animals [1]. Because of these adverse effects, weed removal is necessary, yet it is a laborious task for farmers. For many years, herbicides, weedicides, and other chemicals have been used to remove weeds from crops, but these chemicals pollute the environment and also damage plants. Agricultural operations use farm machinery such as tractors, harvesters, and weeders that speed up the process and minimize labour. Some landowners can afford such heavy machinery, but many cannot; moreover, these machines need fuel, which is expensive today and causes environmental pollution, so this method is not recommended either [2]. To increase crop productivity and growth, farmers use fertilizer sprayers, which require additional labour and time. Weeds not only reduce crop yield; they also cause inconvenience by interfering with agricultural operations. To avoid these problems, a weed control robot is used. This approach has gained popularity in the community because it is environmentally friendly and economically beneficial.

Many weed control robots perform selective chemical spraying, mechanical weeding (using a hoe tool), or electrical discharging [3]. In this project the electrical discharging method will be used. There are two electrical discharge methods for weed removal: spark discharge and continuous contact. The first involves giving plants short bursts of high-voltage electricity; it is used for tasks such as killing weeds, thinning plants, and making fruit ripen faster. The second method uses a pair of electrodes, one on each side of a plant, with the energy delivered in a short pulse (e.g., 1 µs). It is used for tasks such as cutting off parts of plants, drying out the leaves of root crops, weed control, and thinning out rows of plants. It damages the plant tissue either through a sudden burst of electricity or through the heat generated by the current flowing through it.
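To give a feel for the energy scales involved, a back-of-the-envelope Ohm's-law estimate can be made. The resistance value below is a purely illustrative assumption, not a measured plant impedance; the 15 kV figure comes from the system in [12] and the 1 µs pulse width from the text above:

```python
# Back-of-the-envelope energy in one discharge pulse, E = (V^2 / R) * t.
# The 20 kOhm plant-circuit resistance is an illustrative assumption;
# 15 kV matches the spark voltage reported in [12], and 1 us is the
# pulse width mentioned above.
V = 15e3    # volts
R = 20e3    # ohms (assumed)
t = 1e-6    # seconds
energy_joules = (V ** 2 / R) * t
print(energy_joules)  # roughly 0.011 J per pulse
```

Even at very high voltage, a single microsecond pulse carries only on the order of tens of millijoules under these assumptions, which is why repeated pulses or sustained contact are needed to destroy plant tissue.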

In this project, an Ackermann architecture will be implemented that follows a four-wheel independent steering mechanism. It includes Ackermann steering, in which the inner and outer wheels of the vehicle turn at different radii; active front and rear steering, which allows the front wheels to turn in the opposite direction to the rear wheels; crab steering, which allows the vehicle to move diagonally by turning all wheels in the same direction; and spinning, which allows the vehicle to rotate around a central point. This control system makes it easy to switch between the four steering modes according to the maneuvering task at hand. The Ackermann steering structure solves the problem of different steering angles caused by the different steering radii of the left and right wheels during a turn. According to Ackermann's steering geometry [4], when a robot turns along a curve, the steering angle of its inner wheel can be increased to approximately 2–4° greater than that of its outer wheel by using the equal crank of the four-link structure. The center of the four wheel paths approximately intersects the extension line of the rear axle, giving the steering center, so that smooth turning can be achieved. Ackermann steering is associated with high-efficiency movement and a high payload. However, because of their large size, Ackermann steering structures cannot be used in small spaces and are mainly found in vehicles. The robot will employ machine vision and image processing techniques. Equipped with cameras, it will capture images of the agricultural field. Image processing algorithms will analyze these images, identifying and localizing weeds based on their distinct features. Once weeds are detected, the robot's control system will instruct its mechanical arm or tool to remove the weeds from the soil.
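The inner/outer angle relationship described above can be sketched numerically. The following is a minimal illustration of ideal Ackermann geometry; the wheelbase, track width, and turning radius are hypothetical values, not taken from the robot's actual design:

```python
import math

def ackermann_angles(wheelbase, track_width, turn_radius):
    """Ideal Ackermann steering: inner and outer front-wheel angles (radians)
    for a turning center on the extension of the rear axle. turn_radius is
    measured from the turning center to the vehicle centerline."""
    inner = math.atan(wheelbase / (turn_radius - track_width / 2.0))
    outer = math.atan(wheelbase / (turn_radius + track_width / 2.0))
    return inner, outer

# Hypothetical dimensions (not from the actual robot): 0.8 m wheelbase,
# 0.6 m track width, 2.5 m turning radius.
inner, outer = ackermann_angles(0.8, 0.6, 2.5)
print(round(math.degrees(inner), 1), round(math.degrees(outer), 1))
```

For these dimensions the inner wheel steers about 4° more than the outer wheel, in line with the 2–4° difference quoted above.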

2. PROBLEM STATEMENT

A group from a previous batch worked on an autonomous weed control robot, but it was left with some limitations: the robot detected weeds in a virtual environment instead of a natural environment, and no weed removal mechanism was developed; secondly, no mechanism was developed for navigation of the Ackermann steering control.

3. OBJECTIVES
The main aims of the project are:

1. To develop a system that distinguishes weeds from other crops in a cotton field using image processing techniques.
2. To build a system for removing weeds from the cotton field via electrical discharging.
3. To develop a mechanism for navigation of the Ackermann steering control.
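A common baseline for the first objective is to separate green vegetation from soil before classifying weed versus crop. The sketch below uses the excess-green (ExG) index, a standard vegetation index; note that ExG and the 0.1 threshold are illustrative choices, since the proposal does not fix a specific algorithm:

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.1):
    """Flag likely-vegetation pixels using the Excess Green index,
    ExG = 2g - r - b, computed on chromaticity-normalized channels."""
    img = rgb.astype(np.float64)
    total = img.sum(axis=2)
    total[total == 0] = 1.0          # avoid dividing by zero on black pixels
    r = img[..., 0] / total
    g = img[..., 1] / total
    b = img[..., 2] / total
    return (2.0 * g - r - b) > threshold

# Toy 1x2 "image": a green plant pixel next to a brown soil pixel.
toy = np.array([[[40, 160, 40], [120, 90, 60]]], dtype=np.uint8)
print(excess_green_mask(toy))  # [[ True False]]
```

On the toy image the green pixel is flagged as vegetation and the brown pixel is not; telling weed apart from crop would then require further features such as shape and colour analysis.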

4. LITERATURE REVIEW

Quan Qiu et al. in [5] introduced a groundbreaking extended Ackerman Steering Principle (ASP) to address the coordinated movement control challenges of a specialized agricultural mobile platform designed for tasks such as soil/crop data collection and spraying/fertilizing. This four-wheel drive platform, equipped with in-wheel motors and front/rear steering gears, required innovative control strategies to minimize linear velocity deviations during turns. The novel strategy they proposed consisted of three integral components: firstly, the utilization of a steering-motor-turning-angle based model for estimating front/rear wheel steering angles; secondly, the selection of an optimal virtual turning center to minimize deviations in steering-toe-in angles, enabling the application of the Ackerman Steering Principle; and finally, the employment of ASP to generate linear velocities for inside and outside wheels. Extensive experimentation across varying terrains substantiated the strategy's efficacy, reducing motor energy consumption and enhancing curve tracking with minimal slippage. The results underline the potential of this research to significantly decrease slippage in the mobile platform's curve tracking, promising improved performance in open farmland operations and laying the foundation for future autonomous navigation capabilities.

In the realm of agricultural robotics, where precision and maneuverability are paramount, Yunxiang Ye et al. in [6] explored innovative solutions to address the challenges of navigating car-like robotic equipment in confined orchard spaces constrained by tree rows and obstacles. Their research introduced a four-wheel-independent-steering (4WIS) mechanism within a specialized robotic bin management system known as the "bin-dog system." This groundbreaking approach necessitated the development of a four-mode steering strategy, encompassing Ackermann steering, active front and rear steering (AFRS), crab steering, and spinning, to enable the bin-dog system to efficiently manage fruit bins in these tight orchard environments. The study focused on evaluating the auto-steering performance of both Ackermann and AFRS modes using a pure pursuit method with GPS-based navigation. Through extensive field testing, the research confirmed the ability of Ackermann and AFRS steering to achieve satisfactory path tracking within their respective physical constraints. Additionally, the findings highlighted the advantages of crab steering and spinning modes in reducing spatial requirements for merging and cornering compared to traditional steering approaches. As a promising step forward, the study underlined the need for dynamic control systems to select the most suitable steering strategies based on variables such as longitudinal speed, look-ahead distance, and path curvature, paving the way for more efficient and adaptable robotic agricultural equipment in orchard settings.

In the domain of agricultural automation and weed control, W. S. Lee et al. in [7] developed a real-time intelligent robotic system that showcased exceptional capabilities in selective herbicide application to in-row weeds through the integration of machine vision and precision chemical application. The system's robotic vision component efficiently processed images at a rate of 0.34 seconds per frame, covering an 11.43 cm by 10.16 cm region and identifying 10 plant objects within, thus enabling the prototype robotic weed control system to maintain a continuous speed of 1.20 km/h. The paper discusses the overall performance of this innovative robotic system in real-world applications within a commercial processing tomato field, as well as in simulated trials. The research's conclusion highlighted the successful development and testing of this real-time intelligent robotic weed control system, specifically emphasizing the impressive image processing speed, which enabled precise and efficient weed detection and selective spraying. The system's real-time identification of 73.1% of tomatoes and 68.8% of weeds in commercial tomato fields underscores its practical viability, offering promising prospects for the future of intelligent and sustainable weed control in agriculture.

Bjorn Astrand et al. in [8] introduced a pioneering autonomous agricultural mobile robot designed for mechanical weed control in outdoor agricultural settings. This innovative robot integrated two distinct vision systems: a gray-level vision system for recognizing crop rows and guiding the robot along them, and a color-based vision system for distinguishing single crops from weed plants. The latter vision system controlled a weeding tool capable of removing weeds within the crop rows. The novel row-recognition algorithm in the vision system underwent rigorous outdoor field testing, demonstrating remarkable accuracy with a precision of ±2 cm. The feasibility of color vision for single-plant identification, particularly in discriminating between crops and weeds, was also established. The research showcased the successful collaboration of various subsystems, with an initial greenhouse trial confirming the robot's proficiency in weed control within crop rows. The conclusion emphasized the comprehensive system design, featuring forward and downward-looking camera systems, a four-wheeled mobile robot based on Ackerman steering, and a mechanical weeding tool. The promise of the robot's ability to address weed control in ecologically grown fields was reinforced. Future endeavors were outlined, with a focus on the development of robust and high-performance algorithms for improved plant-weed differentiation and extensive field testing across diverse agricultural settings, expanding its applicability to various row-cultivated crops. This research marks a significant stride towards autonomous agricultural robotics and precision weed management.

In [9], Prakash Kanade et al. describe a robot that helps remove weeds in fields. The robot used a computer program to examine pictures of the field and locate the weeds, targeting the edges of plants and using a dedicated method to pick out weeds. To make it work quickly and accurately, they used an R-CNN based on the VGG-16 feature extraction network, which lets the robot see and understand its surroundings in real time. This vision system examines the pictures, determines where the weeds are, and so allows the robot to remove them without damaging the good plants. In this way farmers can better care for their crops, making farming more efficient and helping to produce more food. The authors implemented this technology in a corn field to remove weeds and concluded that it had significant practical implications for promoting the use of intelligent weeding robots in the field.

In [10], Prof. V. A. Kulkarni et al. developed a four-wheeled robotic vehicle, steered by a DC motor and IR sensor arrangement, for controlling the growth of weeds in fields. The robot was trained to detect particular rows per column at a fixed distance in a natural environment. A mechanism for obstacle detection using different sensors, together with path planning, was also developed. It was an efficient solution to the weed problem and could work without human assistance. It had two main mechanisms: the first was a four-wheel vehicle that could move through the field, and the second cut the weeds in the crop rows. The whole algorithm, calculation, processing, and monitoring were designed with motors and sensors interfaced with a microcontroller. The authors predicted that the system had the potential to do even more: it could be upgraded to check soil moisture with a dedicated sensor and would then be able to adjust the amount of water in the soil to match the needs of the seeds and crops, adding more water if needed. In addition, measuring other quantities on the farm, such as how well the crops were growing or how many weeds there were and of what type, could be improved. It could even monitor one or more systems using a GSM module that allows the robot to be monitored and controlled remotely.

The research presented in [11] addresses the optimization of a wheeled mobile robot's motion mode
switching (MMS) performance, a critical aspect of its operation in agricultural applications
involving detection, weeding, and information monitoring. The investigation aimed to identify the
optimal control parameters that ensure the smooth MMS of a four-wheel steering system. Through a
combination of single-factor tests, a binary quadratic general rotation combination test, and an
entropy weight method, the study sought to construct a comprehensive index for evaluating these
parameters. The results unequivocally demonstrated the significant impact of specific factors and
their interactions on the comprehensive index. Ultimately, the study determined the best
combinations for the speed of the stepper motor and locking voltage under different steering angles,
further verifying the efficacy of these combinations through prototype testing. The findings shed
light on a practical approach to enhancing the MMS of wheeled robots, contributing to the growing
body of knowledge in the field of experimental methods for optimizing wheeled robot performance
in agriculture.

A study on robotic weed control by machine vision for vegetable crops was presented in [12]. The machine consists of four parts, a primary vision system, a secondary vision system, an inertia control unit, and arm control, running on four parallel PCs (personal computers). The robotic arm was used to reach the weed plants, which were removed by a generated electric spark of 15 kV, and all subsystems were connected through a single CAN bus. The robot was not economical because of its need for a high-power supply. The test outcomes show that 99% of lettuce plants and 84% of weeds were detected.

In [13], the authors developed a fully managed autonomous weed detection robot, capable of machine vision (a sub-field of computer vision), weed detection, full coverage of field plants, and automatic recharging, for less than 400 dollars. The main aim of this model was to develop a robot for weed control that passes through the crop line (one footstep of distance) and detects weed plants with machine learning algorithms, after which herbicides are sprayed on them for weed removal.

Megha P. Arakeri et al. developed a computer-vision-based robotic weed control system (WCS) for controlling weed plants in onion fields in [14]. This system was capable of identifying weed plants and dropped an appropriate quantity of toxin specifically on them. It was made for the automatic removal and control of weed plants, thus reducing the problems of maintaining crop fields. The proposed robotic model was based on image processing, namely pictorial representation of weed plants, along with artificial intelligence and the internet of things (IoT).

5. Sustainable Development Goals (SDGs to be addressed in the project)

[Details of UN SDGs can be found on the internet; remove the tick marks from irrelevant SDGs]
1. No Poverty
2. Zero Hunger
3. Good Health and Well Being ✓
4. Quality Education
5. Gender Equality
6. Clean Water and Sanitation
7. Affordable and Clean Energy
8. Decent Work and Economic Growth ✓
9. Industry, Innovation and Infrastructure
10. Reduced Inequalities
11. Sustainable Cities and Communities
12. Responsible Consumption and Production
13. Climate Action
14. Life Below Water
15. Life on Land
16. Peace, Justice and Strong Institutions
17. Partnerships for the Goals

Justification of Sustainable Development Goals



Good Health and Well Being (SDG-3):


Weed control robots can help reduce the need for chemical herbicides, which can have harmful
effects on human health when used in excess. By using robots to target and remove weeds, the
project contributes to a healthier environment and minimizes the risk of pesticide exposure for
agricultural workers and nearby communities.
Decent Work and Economic Growth (SDG-8):
Autonomous weed removal minimizes herbicide and labour expenses, fostering economic growth. This method outperforms conventional approaches, ensuring precise and efficient weed removal with minimal chemical usage, promoting healthier and higher crop yields, and contributing to overall economic prosperity, which ensures decent work.

6. FLOW CHART / BLOCK DIAGRAM

Camera 1 (for weed detection):
  Image Pre-Processing (e.g., enhancement, filtration, gray scaling)
  -> Image Segmentation (e.g., thresholding, edge detection, binarization)
  -> Feature Extraction (e.g., shape analysis, colour analysis)

Camera 2 (for path planning):
  Multi-target depth ranging (provides a 3-D representation of the environment and the distance to multiple objects in the field)
  -> Localization (determines the current position using different techniques)
  -> Path Planning (safe and efficient route, obstacle detection and avoidance)
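The path-planning branch above has to be turned into steering commands for the Ackermann geometry. A pure pursuit tracker, the method used for auto-steering in [6], is one common choice; the sketch below is illustrative, and the wheelbase and goal point are assumed values:

```python
import math

def pure_pursuit_steering(dx, dy, wheelbase, lookahead):
    """Pure pursuit tracking: given a goal point (dx ahead, dy to the left)
    in the robot frame at distance `lookahead`, return the Ackermann
    steering angle (radians) whose arc passes through that point."""
    alpha = math.atan2(dy, dx)  # heading error toward the goal point
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

# Hypothetical numbers: goal point 1.0 m ahead and 0.2 m to the left,
# 0.8 m wheelbase. A positive angle means steer left.
delta = pure_pursuit_steering(1.0, 0.2, 0.8, math.hypot(1.0, 0.2))
print(round(math.degrees(delta), 1))
```

In practice the goal point would be taken from the planned path at a fixed look-ahead distance, and the resulting angle fed to the steering actuators.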

7. Range of Problem Identification and Solving


[Complex Engineering Problems have characteristic WP1 and some or all of WP2 to WP7; remove the tick marks from irrelevant WPs]

WP1: Depth of Knowledge Required: Cannot be resolved without in-depth engineering knowledge at the level of one or more of WK3, WK4, WK5, WK6 or WK7 which allows a fundamentals-based, first principles analytical approach. ✓
WP2: Range of conflicting requirements: Involve wide-ranging and/or conflicting technical, non-technical issues (such as ethical, sustainability, legal, political, societal) and consideration of future requirements. ✓
WP3: Depth of analysis required: Have no obvious solution and require abstract thinking, creativity and originality in analysis to formulate suitable models. ✓
WP4: Familiarity of issues: Involve infrequently encountered issues or novel problems.
WP5: Extent of applicable codes: Address problems not encompassed by standards and codes of practice for professional engineering. ✓
WP6: Extent of stakeholder involvement and conflicting requirements: Involve collaboration across engineering disciplines, other fields, and/or diverse groups of stakeholders with widely varying needs. ✓
WP7: Interdependence: Address high level problems with many components or sub-problems that may require a systems approach. ✓

Justification / Description aligned with the project objectives and outcomes

Depth of Knowledge required (WP1):


The steering control of an Ackermann-architecture weed managing mobile robot involves a combination of mechanical, electrical, and software systems working together. This project requires a deep understanding of robotics and control systems. In-depth knowledge of computer vision and image processing is required to ensure accurate and reliable weed detection.


Range of conflicting requirements (WP2):
In this project a Raspberry Pi is used, although there are several alternatives, such as the Libre Computer Board AML-S905X-CC. The Raspberry Pi boasts a large and active community, providing extensive online resources and tutorials for troubleshooting and development, which are limited for the Libre Computer Board AML-S905X-CC. Its well-established ecosystem offers a wide range of accessories and add-ons, simplifying the integration of components like cameras, sensors, and motor controllers.
There are many methods for killing weeds, e.g., mechanical hoeing, electrical discharging, etc. In this project the electrical discharging method is used. The precision of electrical discharging, targeting weeds at the root level, ensures thorough control and minimizes the chances of regrowth. In contrast, mechanical hoeing is often limited to weeds with shallow roots, making it less versatile than other methods; it can also sometimes remove the main crop along with the weeds, whereas electrical discharge targets only the weeds.
Depth of analysis required (WP3):
Each task in this project is itself a complex engineering problem, and depth of analysis is required to develop and implement different algorithms for addressing these problems. After implementation, the output will be analyzed; e.g., for the image processing tasks, different algorithms are used, such as filtering algorithms and segmentation algorithms.

Extent of applicable codes (WP5):

This project follows the IEEE P7010 standard for Autonomous and Intelligent Systems. Through the integration of artificial intelligence, computer vision, and robotic systems, the weed control robot can navigate its environment, recognize different plant species, and selectively target and eliminate weeds. By utilizing algorithms and sensors, the robot can make real-time decisions about which plants to target, optimizing the use of resources and minimizing the impact on desirable crops.
Extent of stakeholder involvement and conflicting requirements (WP6):

This project involves collaboration across engineering disciplines only. A weed control robot using the Ackermann steering mechanism combines mechanical, electrical, and software systems working together. The design of the Ackermann steering is based on the mechanical system, the various algorithms for image processing techniques will be implemented in software, while the control system is electrical.
Interdependence (WP7):

Every task in this project depends on other tasks; e.g., image processing tasks cannot be performed without capturing images. Similarly, feature extraction is based on the image processing tasks. The steering control system is interdependent with other subsystems of the weed managing robot, such as the sensing and actuation systems. Adjustments in steering affect the trajectory of the robot, influencing its ability to effectively manage weeds.

8. GANTT CHART

Timeline: October 2023 – June 2024 (months 10, 11, 12, 01, 02, 03, 04, 05, 06); each ✓ marks one month of activity.

1. Image Acquisition ✓✓
2. Image Processing ✓✓
3. Image Segmentation ✓✓
4. Feature Extraction ✓✓
5. Path Planning & Localization ✓✓
6. Navigation of Ackermann Architecture ✓✓
7. Weed Removal ✓
8. Testing ✓
9. Report Writing ✓

9. PROJECT REQUIREMENTS: HARDWARE / SOFTWARE


Both hardware and software are required for the weed control robot.
Hardware requirements:
1. Robot platform
2. Sensors (e.g., cameras)
3. Actuators (motors)
4. Micro-controllers (e.g., Arduino)
5. Raspberry Pi
Software requirements:
1. Spyder (Python programming)

10. REFERENCES

[1] B. Astrand and A.-J. Baerveldt, "An Agricultural Mobile Robot with Vision-Based Perception for Mechanical Weed Control," Autonomous Robots, vol. 13, no. 1, pp. 21-35, 2002.
[2] A. Kumar, H. B, S. M. K, and A. A. P, "Design and Fabrication of Electric Weeder along with Fertilizer Sprayer," Int. Res. J. Eng. Technol. (IRJET), vol. 10, no. 4, Apr. 2023.
[3] M. F. Diprose and F. A. Benson, "Electrical Methods of Killing Plants," J. Agric. Engng Res., vol. 30, no. 3, pp. 197-209, 1984.
[4] C.-J. Lin, M.-Y. Chang, K.-H. Tang, and C.-K. Huang, "Navigation Control of Ackermann Steering Robot Using Fuzzy Logic Controller," Sensors Mater., vol. 35, no. 3, pp. 781-794, 2023.
[5] Q. Qiu et al., "Extended Ackerman Steering Principle for the coordinated movement control of a four-wheel drive agricultural mobile robot," Computers and Electronics in Agriculture, vol. 152, pp. 40-50, 2018.
[6] Y. Ye et al., "Steering Control Strategies for a Four-Wheel-Independent-Steering Bin Managing Robot," IFAC-PapersOnLine, vol. 49, no. 16, pp. 39-44, 2016.
[7] W. S. Lee et al., "Robotic Weed Control System for Tomatoes," Precision Agriculture, vol. 1, pp. 95-113, 1999.
[8] B. Astrand and A.-J. Baerveldt, "An Agricultural Mobile Robot with Vision-Based Perception for Mechanical Weed Control," Autonomous Robots, vol. 13, no. 1, pp. 21-35, 2002.
[9] P. Kanade et al., "Agricultural Mobile Robots in Weed Management and Control," Int. J. Advanced Networking and Applications, vol. 13, no. 3, pp. 5001-5006, 2021.
[10] V. A. Kulkarni and A. G. Deshmukh, "Advanced Agriculture Robotic Weed Control System," International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering, vol. 2, no. 10, Oct. 2013.
[11] J. Qu et al., "Performance Analysis and Optimization for Steering Motion Mode Switching of an Agricultural Four-Wheel-Steering Mobile Robot," Agronomy, vol. 12, no. 11, p. 2655, 2022.
[12] J. Blasco et al., "Robotic weed control using machine vision," Elsevier Science Ltd / Silsoe Research Institute, 2002.
[13] Y. Du et al., "A Low-cost Robot with Autonomous Recharge and Navigation for Weed Control in Fields with Narrow Row Spacing," Prague, Czech Republic, 2021.
[14] M. P. Arakeri et al., "Computer vision based robotic weed control system for precision agriculture," IEEE, 2017.

Student: Faryal Naeem Mehmood, Syeda Ambreen Zahra

Signature:

Supervisor: Dr. Syed Muhammad Wasif

Signature with date:
