
PROBLEM STATEMENT

The agricultural industry currently faces a pressing challenge in swiftly identifying diseased plants within
fields, resulting in significant crop losses due to delayed intervention. Traditional methods reliant on visual
inspection lack the necessary speed and precision for effective disease management, while overreliance on
pesticides poses sustainability concerns.

To address these issues, there is a clear need for an innovative solution that integrates ultrasonic technology
with advanced multi-sensor data analysis. Such a system would enable early disease detection, enhance
precision agriculture practices, and promote efficient resource utilization. By bridging the gap between
technology and agriculture, this project aims to revolutionize plant disease detection methods, reducing
losses, promoting sustainability, and safeguarding global food supply chains.
Rice Plant Disease Detection

CHAPTER 1
INTRODUCTION

Swiftly identifying diseased plants in agriculture is a critical challenge leading to substantial crop losses. Our
innovative system integrates ultrasonic technology with pH, moisture, image, temperature, humidity, sonar,
and GPS sensors to revolutionize plant health detection.

Our primary goal is to redefine plant disease identification through a comprehensive multi-sensor approach.
By emitting ultrasonic waves onto plants, analyzing their reflections, and merging that data with readings
from the other sensors, the system provides a holistic understanding of plant health and environmental
conditions. This integrated method detects deviations from healthy norms, enabling rapid differentiation
between diseased and healthy plants.

This pioneering system offers non-intrusive, precise assessment, minimizing growth disruption while
maximizing detection efficiency. Early identification, even before visible symptoms appear, curbs the spread
of disease and limits crop losses. The reduced pesticide use that follows from early detection aligns with
sustainable practices, fostering ecologically balanced agriculture.

Our endeavor marks a breakthrough in farming. Merging ultrasonic technology with a sensor suite, including
GPS, empowers farmers to ensure comprehensive crop health and efficient resource management. This
synergy of innovation and sustainability underscores our overarching aim: melding technology with mindful
farming for amplified yields, minimized losses, and global food security. Our solution addresses timely
disease detection, promoting resilient, sustainable agriculture.

Dept. of Computer Science And Engineering, SDMCET, Dharwad. 2



CHAPTER 2
LITERATURE SURVEY


1. Introduction:

Agriculture plays a central role in people's lives and in the economy. In India it is the primary source of
livelihood for roughly 58% of the population [3], and the country ranks among the world's leading crop
producers. In 2018, more than half of the workforce was employed in farming, which contributed about
18-20% of national GDP. Despite this strength, Indian agriculture faces serious problems, including poor
farming practices, insufficient use of inputs such as compost and fertilizers, water scarcity, and plant
disease. Disease is particularly damaging to crop growth: roughly 20-30% of yield is lost to it, a substantial
reduction in the food farms produce. Detecting these diseases early and intervening is therefore essential.

Traditionally, farmers diagnosed sick plants by visual inspection, which was slow and error-prone. Advances
in technology have improved this: plant diseases can now be identified by computer-based systems. The
surveyed system identifies diseases in the leaves of 14 plant species, including apple and strawberry, using
a Convolutional Neural Network (CNN) that examines leaf images and classifies them as healthy or
diseased. This helps farmers treat sick plants quickly and accurately, sustaining crop production.

2. Research Objects and Scope

The survey focuses on leveraging Deep Learning, particularly Convolutional Neural Networks, to develop a
robust system for plant disease detection. A comparative analysis of different CNN variants, namely Faster
R-CNN, R-FCN, and SSD, is conducted to ascertain their efficacy in pinpointing diseased plant regions. The
study places equal emphasis on the preprocessing of data to optimize the accuracy of the models. The
research methodology involves training the chosen CNN variants on a dataset comprising images of plants
with various diseases.

The primary focus of this study is to harness the potential of Deep Learning, specifically Convolutional
Neural Networks (CNNs), for the purpose of detecting plant diseases. The overarching goal is to evaluate the
effectiveness of these advanced technological tools in accurately identifying signs of illness in plants based
on visual imagery. The scope of this research encompasses a thorough investigation into the applicability
and performance of various CNN architectures, including Faster R-CNN, R-FCN, and SSD.


3. Convolutional Neural Networks (CNN)

A specific type of deep neural network is the convolutional neural network (CNN). CNNs combine learned
features with input data, making them particularly suitable for processing 2D data like images. CNNs
eliminate the need for manual feature extraction when classifying images. The model itself extracts features
directly from images during training. CNN architecture includes layers such as Input layer, Output Layer,
Convolutional Layer, Fully Connected Layer, Softmax Layer, and Pooling Layer.[3]

A. VGG 16 Model:

VGG16 is a CNN model used for large-scale image tasks. It excels in both object localization and image
classification. The model comprises the following layer types, each processing specific information:

1. Input Layer: Holds image data with parameters like height, width, depth, and color information (RGB).

2. Convolutional Layer: Also known as the feature extraction layer, it extracts significant features from input
images using dot products.

3. Pooling Layer: Reduces computational power by decreasing the dimensions of featured matrices obtained
from dot products.

4. Fully Connected Layer: Connects neurons from one convolutional layer to another, facilitating complex
connections.

5. Softmax Layer/Logistic Layer: Performs multi-class or binary classification, determining the probability
of an object's presence in an image.

6. Activation Function - ReLU: Rectified Linear Unit (ReLU) transforms weighted input and activates
nodes in convolutional operations, a common activation function in neural networks.

Deep learning involves intricate layers to extract meaningful information, with CNNs being a specialized
architecture for processing images. The VGG16 model exemplifies this by efficiently performing both object
localization and image classification tasks through its distinct layers, each handling specific aspects of the
data.
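The conv -> ReLU -> pooling sequence these layers describe can be illustrated with a minimal NumPy sketch. The 6 x 6 image and the hand-picked 2 x 2 edge filter below are toy values for illustration, not actual VGG16 weights:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution: slide the kernel over the image and take
    dot products, as the convolutional (feature-extraction) layer does."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """ReLU activation: zero out negative responses."""
    return np.maximum(x, 0)

def max_pool(x, size=2):
    """Max pooling: downsample by taking the max of each size x size block,
    reducing the dimensions of the feature matrix."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)   # toy 6 x 6 "image"
kernel = np.array([[-1.0, 1.0], [-1.0, 1.0]])      # toy edge-detecting filter
features = max_pool(relu(conv2d(image, kernel)))   # conv -> ReLU -> pool
print(features.shape)  # (2, 2)
```

In a real CNN the kernels are learned during training rather than hand-picked; the pipeline of operations is the same.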

4. Methodology

Plants are vulnerable to various diseases and problems that affect their growth and health. These issues can
be caused by factors like changes in the environment, such as temperature and humidity, as well as
insufficient nutrients, light, and common diseases like bacterial, viral, and fungal infections. To address this,
we propose using a powerful tool called Convolutional Neural Network (CNN) to detect diseases in plant
leaves. CNN is known for its ability to achieve high accuracy when the data is of good quality.

A. Dataset:

For our study, we utilized the Plant Village Dataset, which contains a collection of 54,303 images showing
both healthy and unhealthy plant leaves [3]. These images are categorized into 38 groups based on the plant
species and the type of disease. Our analysis involved examining over 50,000 of these images, each with
labels indicating its category. We resized the images to a standardized 256 × 256 pixels and conducted
further optimization and model predictions using these modified images [2].
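The 256 x 256 standardization step can be sketched with a simple nearest-neighbour resize in NumPy. This is an illustration only; a real pipeline would use a library resizer, and the raw input shape here is hypothetical:

```python
import numpy as np

def resize_nearest(image, size=256):
    """Nearest-neighbour resize of an image to size x size, standardizing
    inputs as done for the Plant Village leaf images."""
    h, w = image.shape[:2]
    rows = np.arange(size) * h // size   # source row for each output row
    cols = np.arange(size) * w // size   # source column for each output column
    return image[rows][:, cols]

leaf = np.zeros((480, 640, 3), dtype=np.uint8)   # a hypothetical raw photo
standardized = resize_nearest(leaf)
print(standardized.shape)  # (256, 256, 3)
```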

B. Data Processing and Augmentation:

In order to build an effective image classifier, image augmentation techniques are used. These methods are
crucial because even though a dataset may contain hundreds or thousands of examples, it might still not
capture enough variety to create an accurate model. Augmentation involves making changes to the images,
such as flipping them vertically or horizontally, rotating them at different angles, and adjusting their
brightness. These augmentations expand the range of data available for training. Our dataset's images were
all set to a size of 256 x 256 pixels. We executed data processing and image augmentation using the Keras
deep-learning framework. The specific augmentation techniques included:

- Rotation: Randomly rotating training images at different angles.
- Brightness: Adapting to variations in lighting by training on images with varying brightness levels.
- Shear: Adjusting the shearing angle to account for potential irregularities in the images.
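The report applies these augmentations through Keras; as a self-contained illustration of the same idea, the NumPy sketch below produces several variants of one image via flips, rotations, and brightness jitter (toy stand-ins for the generator settings, and omitting shear for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image):
    """Return augmented copies of one image: flips, a random 90-degree
    rotation, and brightness scaling, expanding the training data."""
    variants = [
        image,
        np.fliplr(image),                       # horizontal flip
        np.flipud(image),                       # vertical flip
        np.rot90(image, k=rng.integers(1, 4)),  # random rotation
    ]
    brightness = rng.uniform(0.7, 1.3)          # brightness jitter
    variants.append(np.clip(image * brightness, 0, 255))
    return variants

image = rng.uniform(0, 255, size=(256, 256))    # one 256 x 256 grayscale image
augmented = augment(image)
print(len(augmented))  # 5 variants from a single source image
```

Library generators such as Keras's additionally interpolate arbitrary rotation angles and shear transforms, but the principle is the same: each source image yields many training examples.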

C. System Architecture Overview and Methodology:

We propose a hybrid model architecture that combines data from the six sensors (sonar, image,
temperature/humidity, pH, nutrient, soil moisture) using Convolutional Neural Networks (CNNs):

1. Input Data: Assume the sensor data is captured as time series or sequential data, and each sensor provides
a sequence of measurements over time.

2. Sensor-Specific CNNs: Design separate CNN branches for each sensor's time series data. Each branch
processes the sequential data from one sensor using 1D convolutional layers.

3. Feature Concatenation: After processing through sensor-specific CNNs, concatenate the output feature
maps from each branch.

4. Shared CNN Layers: Shared 1D CNN layers are added on top of the concatenated features. These layers
capture cross-sensor relationships and higher-level patterns.

5. Fully Connected Layers: The flattened output from shared CNN layers is passed through fully connected
layers to refine the combined features.

6. Output Layer: The architecture is finished with an output layer suitable for the specific prediction task
(e.g., classification, regression).

Detailed System Architecture:

1. Sensor-Specific 1D CNNs:

- Image Branch: 1D Convolutional Layers -> ReLU -> MaxPooling

- Sonar Branch: 1D Convolutional Layers -> ReLU -> MaxPooling

- Temperature/Humidity Branch: 1D Convolutional Layers -> ReLU -> MaxPooling

- pH Branch: 1D Convolutional Layers -> ReLU -> MaxPooling

- Nutrient Branch: 1D Convolutional Layers -> ReLU -> MaxPooling

- Soil Moisture Branch: 1D Convolutional Layers -> ReLU -> MaxPooling

2. Feature Concatenation:

- Concatenate the outputs from all sensor branches.


3. Shared 1D CNN Layers:

- Dense: 1D Convolutional Layers -> ReLU -> MaxPooling

- LSTM/GRU: 1D Convolutional Layers -> ReLU -> MaxPooling

4. Fully Connected Layers:

- Flatten the output from shared CNN layers (using Embedding layers).

- Dense -> ReLU

- LSTM/GRU -> ReLU

5. Output Layer:

- Depending on the prediction task, an appropriate output layer is added to produce the result (e.g., a
Dense layer with softmax for classification, or a Dense layer for regression).

Label Index | Embedding Vector
User 1      | [sensor 1.1, sensor 1.2, sensor 1.3, sensor 1.4, sensor 1.5, sensor 1.6]
User 2      | [sensor 2.1, sensor 2.2, sensor 2.3, sensor 2.4, sensor 2.5, sensor 2.6]
User 3      | [sensor 3.1, sensor 3.2, sensor 3.3, sensor 3.4, sensor 3.5, sensor 3.6]

This architecture processes each sensor's time series data through sensor-specific CNNs, capturing sensor-
specific patterns. The concatenated features are then refined through shared CNN layers to capture cross-
sensor relationships. The fully connected layers further process the combined features, and the output layer
makes predictions based on the fused information.
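The branch-and-concatenate idea can be sketched in plain NumPy: each sensor's time series passes through its own conv -> ReLU -> pool branch, and the resulting features are concatenated. Sequence lengths are toy values, and a single hand-picked filter is shared across branches for brevity; a real implementation would learn separate filters per branch in a framework such as Keras:

```python
import numpy as np

def conv1d(seq, kernel):
    """Valid 1D convolution over one sensor's time series."""
    n, k = len(seq), len(kernel)
    return np.array([np.dot(seq[i:i + k], kernel) for i in range(n - k + 1)])

def branch(seq, kernel):
    """Sensor-specific branch: 1D conv -> ReLU -> max pooling (size 2)."""
    x = np.maximum(conv1d(seq, kernel), 0)
    x = x[:len(x) - len(x) % 2]
    return x.reshape(-1, 2).max(axis=1)

rng = np.random.default_rng(1)
# Hypothetical readings: 16 time steps for each of the six sensors.
sensors = {name: rng.normal(size=16) for name in
           ["sonar", "image", "temp_humidity", "pH", "nutrient", "soil_moisture"]}
kernel = np.array([0.5, 0.0, -0.5])  # toy filter standing in for learned weights

# Steps 2-3: process each sensor through its branch, then concatenate.
features = np.concatenate([branch(s, kernel) for s in sensors.values()])
print(features.shape)  # the fused feature vector fed to the shared layers
```

The shared layers and output head then operate on this concatenated vector, which is where cross-sensor relationships can be captured.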

5. Performance Evaluation:
A. Data Analysis:
The hybrid CNN data analysis process incorporates diverse sensor data types like sonar, temperature, soil
moisture, pH, image, and nutrient measurements. Raw data is collected, pre-processed, and organized for
training and testing. Individualized 1D CNN branches capture unique patterns from each sensor. Image data
is integrated through preprocessing for feature extraction. Concatenating features creates a comprehensive
representation, enhanced by shared 1D CNN layers that capture interrelationships between modalities [4].
Fully connected layers refine fused features for better understanding of interactions. Output layers facilitate
predictions or classifications. The model is rigorously trained, evaluated, and validated, and learned features
are analysed for insights into sensor contributions. Iterative optimization enhances architecture and
hyperparameters. The refined hybrid CNN model offers nuanced interpretation of sensor data for informed
decision-making across domains.
B. Result Analysis:
The result analysis of the hybrid CNN model, combining data from diverse sensors including sonar,
temperature, soil moisture, pH, image, and nutrients, involves thorough testing for predictive accuracy and
interpretability. Insights into sensor contributions and interactions are gained by examining shared layer
features. Image data patterns are deciphered. Anomalies and biases are identified, guiding model
refinements. Iterative fine-tuning improves overall performance and robustness. The analysis validates the
model and reveals correlations between sensor inputs, enabling informed decisions and actionable insights in
applications like environmental monitoring and image-based analysis. The CNN model currently in use,
VGG16, achieves an accuracy of 94.8% [3].
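For context on the reported figure, classification accuracy is simply the fraction of predictions matching the ground truth, and false positives/negatives can be counted the same way. A minimal sketch with made-up labels (0 = healthy, 1 = diseased), not the actual VGG16 results:

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground-truth labels."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float((y_true == y_pred).mean())

# Hypothetical test labels and model outputs for ten plants.
y_true = np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 1])
y_pred = np.array([0, 1, 1, 0, 0, 0, 0, 1, 1, 1])

fp = int(((y_pred == 1) & (y_true == 0)).sum())  # healthy flagged as diseased
fn = int(((y_pred == 0) & (y_true == 1)).sum())  # diseased missed

print(accuracy(y_true, y_pred))  # 0.9
print(fp, fn)                    # 0 1
```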

6. Transfer Learning and Pre-Trained Models:


Leveraging transfer learning and pre-trained models enhances the performance of the hybrid CNN model
integrating data from diverse sensors: sonar, temperature, soil moisture, pH, image, and nutrients. A pre-
trained CNN, originally designed for image analysis, serves as an image feature extractor. Frozen
convolutional layers extract high-level visual features, augmenting image understanding. Sensor-specific
branches tailor to unique aspects of each sensor's data, while nutrient data contributes to shared architecture,
capturing dynamic relationships. Fusion of image-based and sensor-based features occurs in fully connected
layers and the output layer. This harmonized approach enables accurate predictions or classifications across
multiple dimensions. By skilfully combining transfer learning and hybrid architecture, the model not only
benefits from pre-trained knowledge but also excels in interpreting intricate sensor interactions, offering a
robust and versatile solution for complex analysis across domains. Transfer learning is applied to CNNs in
plant disease detection in two stages:
1. Feature Extraction:
The hybrid CNN model's feature extraction process, integrating data from diverse sensors like sonar,
temperature, soil moisture, pH, image, and nutrients, offers a potent approach for deriving meaningful
insights from multi-modal input. Its architecture leverages both image and sensor data to create a
comprehensive representation. Convolutional layers adeptly capture visual patterns, serving as proficient
feature extractors. Sensor-specific branches isolate unique temporal patterns for each sensor, refining
understanding. Nutrient sensors enrich the feature space, capturing inter-sensor dynamics. Fused features
from image and sensor pathways converge, revealing cross-modal relationships. Fully connected layers
synthesize features, identifying complex interactions. This synergy culminates in the output layer's
predictions or classifications, driven by a holistic feature representation. The model harmoniously combines
image-based attributes and sensor-specific patterns, yielding informed decisions, accurate predictions, and
deeper insights across diverse analysis tasks.
2. Fine-Tuning:
Fine-tuning within the hybrid CNN model, amalgamating data from diverse sensors (sonar, temperature, soil
moisture, pH, image, and nutrients), is a strategic process to enhance performance for specific analysis tasks.
Adjustments span shared and sensor-specific layers, balancing pre-learned features with task adaptation. In
shared architecture, hyperparameter optimization refines cross-modal relationship capture. Sensor-specific
branches focus on unique characteristics, adjusting parameters for improved temporal pattern extraction.
Fully connected layers harmonize image and sensor data, with weights and biases optimization. Fine-tuning
of the nutrient sensor aids in leveraging insights from other sensors. Iterative validation ensures
improvements and prevents overfitting. Through iterative fine-tuning, the hybrid CNN model becomes finely
attuned to task nuances, culminating in an optimized framework that effectively leverages synergies among
sensor modalities.
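The freeze-and-fine-tune idea reduces to this: pre-learned weights are simply excluded from gradient updates while task-specific weights are trained. Below is a minimal NumPy sketch with toy data, using a linear map as a stand-in for the frozen convolutional base and a trainable head; it is an analogy for the mechanism, not the actual model:

```python
import numpy as np

rng = np.random.default_rng(2)

# "Pre-trained" feature weights (frozen) and a task-specific head (trainable).
w_frozen = rng.normal(size=(4, 3))   # stands in for pre-trained conv filters
w_frozen_before = w_frozen.copy()
w_head = np.zeros(3)                 # task head, trained from scratch

x = rng.normal(size=(8, 4))          # toy batch of fused sensor features
y = rng.normal(size=8)               # toy regression targets

def loss():
    return float(np.mean((x @ w_frozen @ w_head - y) ** 2))

loss_before = loss()
for _ in range(200):
    h = x @ w_frozen                             # frozen feature extraction
    grad_head = h.T @ (h @ w_head - y) / len(y)  # gradient for the head only
    w_head -= 0.01 * grad_head                   # only the head is updated
    # w_frozen is never touched: that is all "freezing" means

loss_after = loss()
print(loss_after < loss_before)  # True: the head adapted to the task
```

In a deep-learning framework this corresponds to marking the base layers as non-trainable before compiling, so the optimizer skips them.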

7. References:

[1] Kushal M U, Nikitha S, Shashank L M, Partha Sarathi S, Maruthi M N, "Literature Survey of Plant
Disease Detection using CNN." https://www.ijraset.com/research-paper/plant-disease-detection-using-cnn
[2] Sumit Kumar, Veerendra Chaudhary, Supriya Khaitan Chandra, "Plant Disease Detection Using CNN."
https://turcomat.org/index.php/turkbilmat/article/download/7743/6139/13981
[3] Rinu R, Dr Manjula S H, "Plant Disease Detection and Classification using CNN."
https://www.ijrte.org/wp-content/uploads/papers/v10i3/C64580910321.pdf
[4] Alex Krizhevsky, Ilya Sutskever, Geoffrey E Hinton, "ImageNet Classification with Deep Convolutional
Neural Networks," NeurIPS 2012.
https://proceedings.neurips.cc/paper_files/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf


CHAPTER 3
DETAILED DESIGN

Use Case Diagram


The provided figure depicts a comprehensive use case diagram for a Plant Disease Detection System, tailored to assist
farmers in managing their paddy farms effectively. The central actor in this diagram is the "Farmer," representing the
primary user of the system. The farmer engages with four main use cases: Registration, Status of Paddy Farm, Profile
Updates, and SMS Services.

The Registration use case is further subdivided into two sub-use cases: "Login" and "New User." The "Login" sub-use
case enables registered farmers to access the system, while the "New User" sub-use case facilitates the registration
process for new farmers. This division ensures a streamlined user authentication experience.

The Status of Paddy Farm use case is divided into two sub-use cases: "Sensors Status" and "Healthy Plant and Soils."
The "Sensors Status" sub-use case provides real-time data from sensors deployed in the farm, helping farmers monitor
environmental conditions. The "Healthy Plant and Soils" sub-use case evaluates the well-being of plants and soil
quality, assisting farmers in making informed decisions to maintain optimal crop health.

The Profile Update use case is further segmented into two sub-use cases: "User Personal Data" and "Device Data."
The "User Personal Data" sub-use case allows farmers to manage their personal information, ensuring accurate
communication and user-specific features. The "Device Data" sub-use case enables farmers to configure and
personalize their sensor devices for precise data collection.

The SMS Services use case also consists of two sub-use cases: "Alert" and "Per Day Update." The "Alert" sub-use
case notifies farmers of critical situations or disease outbreaks, enabling timely responses. The "Per Day Update" sub-
use case provides regular insights on farm conditions, helping farmers stay well-informed.

Additionally, the system itself acts as another actor, interacting with the farmer and the environment. It receives input
from the farmer's sensors and performs a series of operations to aid disease detection. These operations include "Image
Cleaning and Preprocessing," "Feature Extraction," and "Action of Your System Locks This," culminating in "Report
Generation" to provide the farmer with valuable insights.

An extended feature included in the diagram is the "Change Language" functionality, allowing farmers to interact with
the system in their preferred language, enhancing user-friendliness.

Lastly, an "Admin" actor is introduced, responsible for system management. The admin has access to system logs,
enabling them to track activities and troubleshoot issues. They also possess the authority to update or delete features,
ensuring system flexibility and adaptability. The "Maintenance" sub-use case empowers the admin to keep the system
running smoothly.


Data Flow Diagram (DFD)


Data Flow Diagrams (DFDs) are graphical representations that depict the flow of data within a system. They
showcase processes, data stores, data flows, and external entities, providing a visual overview of how
information moves and is transformed within a system. DFDs help in understanding the system's
functionality and interactions, making it easier to identify potential bottlenecks, errors, or improvements.

In the provided DFD diagram, the "Sensors Module" plays a crucial role in capturing environmental data
from various sensors: temperature, humidity, pH, ultrasonic, and GPS. These sensors collect essential data
about the farm's conditions, such as temperature, humidity levels, pH balance, distance measurements, and
geographical coordinates.

The data from these sensors is transmitted to the "Application," which initially identifies the user and their
associated sensors. This user-specific data is then forwarded to the "Backend" of the application, where a
series of processes occur to analyze and evaluate the collected information.

The "Backend" encompasses several significant steps. First, "Image Cleaning and Preprocessing" is carried
out to enhance the quality of images before further analysis. This step ensures that the input data is in an
optimal state for subsequent operations. Following this, "Feature Extraction" is performed, involving the
extraction of shape, colour, and other relevant features from the plant images.

Next, the processed data is transferred to a "CNN Model," which stands for Convolutional Neural Network.
This model leverages neural network algorithms and is trained on a dataset that contains comparative
feature-extracted data. The "Test Dataset" is used for prediction and comparison. The CNN model assesses
the input data against the test dataset, enabling it to predict whether the plant or leaves are healthy or not.

Based on the collected data and the GPS location, a "Message Generation" process generates a message
indicating the health status of the plant or leaves. This message is then transmitted through an "SMS
Service" to the farmer. Importantly, the message is generated in the farmer's regional language, enhancing
user-friendliness and ensuring effective communication.


Architectural Design

An architectural diagram provides a high-level representation of a system's structure and components,
illustrating how they interact and work together to achieve specific goals. It offers a bird's-eye view of the
system's architecture, helping stakeholders understand its key components and their relationships.

In the provided architectural diagram, the system begins with a set of "Sensors." These sensors include six
types: humidity, temperature, pH, GPS, ultrasonic, and soil moisture sensors. These sensors collect crucial
environmental data from the farm, such as humidity levels, temperature, pH balance, GPS coordinates,
distance measurements, and soil moisture content.

Following the sensor inputs, the system moves to "Data Cleaning and Preprocessing." This step involves
refining and enhancing the raw sensor data, ensuring it's in an optimal state for further analysis. This process
helps eliminate noise, outliers, and inconsistencies that could affect the accuracy of subsequent stages.

The "Feature Extraction" component comes next. Here, relevant features such as shape, color, and other
attributes are extracted from the data. Both the "Test Data" and the "Dataset Data" are utilized in this step.
The "Dataset Data" serves as a reference for comparison, aiding in the identification of patterns and
distinctive traits in the input data.

The processed data is then fed into a "CNN Model" (Convolutional Neural Network). This neural network
model has been trained on the "Dataset Data" and uses this knowledge to assess and predict outcomes based
on the input data. It employs complex algorithms to recognize patterns and relationships, enabling it to
determine whether the plant is healthy or not.

The "Prediction Model" generates the results based on the analysis conducted by the CNN model. If the
prediction indicates an issue with the plant's health, an "Alert Message" is generated. This alert message
contains pertinent information about the detected problem. Importantly, the system uses the farmer's
"Register Number" to ensure that the message is sent to the correct recipient.


CHAPTER 4
PROJECT SPECIFIC REQUIREMENTS
1. Ultrasonic Emission and Reception:
The system must emit ultrasonic waves towards plants and accurately capture their reflections for
analysis.

2. Sensor Integration:
The system should seamlessly integrate diverse sensors including pH, moisture, image, temperature,
humidity, sonar, and GPS.

3. Data Processing and Analysis:
A robust data processing mechanism must be developed to handle and analyze the gathered sensor data
efficiently.

4. Database Integration:
An extensive and diverse database of healthy plant profiles must be established for comparative analysis.

5. Swift Disease Identification:
The system must be capable of rapidly distinguishing between healthy and diseased plants, ideally before
visible symptoms appear.

6. Non-Intrusive Assessment:
The assessment process should be non-intrusive, causing minimal disruption to plant growth and
development.

7. Accuracy and Precision:
The system's accuracy in identifying diseased plants should be high, with minimal false positives or false
negatives.

8. Resource Efficiency:
The system should contribute to efficient resource management by optimizing water, pesticide, and
fertilizer usage.

9. User-Friendly Interface:
The system should feature an intuitive user interface, allowing farmers to easily monitor and interpret
plant health data.

10. Integration with Farming Practices:
The system should seamlessly integrate with existing farming practices and workflows.


CHAPTER 5
IMPLEMENTATION


CHAPTER 6
RESULTS
<<May include few snapshots focusing on meeting of the project specific requirements>>


CHAPTER 7
CONCLUSION

In conclusion, our innovative project represents a transformative leap in the realm of agriculture. By
seamlessly integrating ultrasonic technology with a diverse range of sensors including pH, moisture, image,
temperature, humidity, sonar, and GPS, we have developed a comprehensive and advanced system for the
early detection of plant diseases. Our multi-sensor approach provides a holistic understanding of plant health
and environmental conditions, enabling rapid and accurate differentiation between diseased and healthy
plants.

This project holds immense promise for the agricultural industry, offering a non-intrusive, accurate, and
sustainable solution to the persistent challenge of timely disease detection. By facilitating early intervention
and reducing the need for excessive pesticide use, our system aligns with modern eco-friendly farming
practices, contributing to an ecologically balanced and resource-efficient agricultural ecosystem.

As we move forward, we envision our project making a significant impact on global food security,
sustainable agriculture, and efficient resource management. By empowering farmers with cutting-edge
technology and comprehensive insights into plant health, we strive to create a future where crop losses are
minimized, yields are optimized, and our agricultural systems are resilient in the face of disease challenges.
With a commitment to innovation and sustainability, our project paves the way for a more productive,
resilient, and sustainable future in agriculture.


REFERENCES

<<Include list of references, numbered [1], [2], … >>
