
AI PROJECT CYCLE

Q. What is the Project Cycle: The Project Cycle is a step-by-step process for solving problems using proven scientific methods and drawing inferences from the results. Its stages are:
1. Problem Scoping – Understanding the problem
2. Data Acquisition – Collecting accurate and reliable data
3. Data Exploration – Arranging the data uniformly
4. Modelling – Creating models from the data
5. Evaluation – Evaluating the project

1. Problem Scoping

The 4W's of Problem Scoping are Who, What, Where, and Why. These four W's help in identifying and understanding the problem in a clearer and more efficient manner.
1. Who – The "Who" part helps us comprehend and categorize everyone affected, directly or indirectly, by the problem; these people are called the "Stakeholders".
2. What – The "What" part helps us understand and identify the nature of the problem. Under this block, you also gather evidence to prove that the problem you have selected exists.
3. Where – "Where" does the problem arise: the situation and the location.
4. Why – "Why" is the given problem worth solving?
Problem Statement Template (PST)

The Problem Statement Template helps us summarize all the key points into one single template, so that whenever there is a need to look back at the basis of the problem in the future, we can refer to the Problem Statement Template and understand its key elements.

2. Data Acquisition
It is the process of collecting accurate and reliable data to work with. Data can be in the form of text, video, images, audio, and so on, and it can be collected from various sources such as the internet, journals, newspapers, and so on.
Data Sources

1. Surveys

1. A survey is one of the methods of gathering data from users for the second stage of the AI Project Cycle, Data Acquisition.
2. A survey is a method of gathering specific information from a sample of people. For example, a census survey is conducted periodically to analyse the population.
3. Surveys are conducted in particular areas to acquire data from particular people.
2. Web Scraping

1. Web scraping means collecting data from the web using software tools.
2. It is used for tasks such as monitoring prices, tracking news, and so on.
3. For example, we can do web scraping programmatically using Beautiful Soup, a Python package for parsing web pages.
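The notes mention Beautiful Soup; as a self-contained sketch with no network access (the HTML snippet and the price values below are invented), the same scraping idea can be shown with Python's built-in `html.parser`:

```python
from html.parser import HTMLParser

# A tiny HTML page standing in for a shop's product listing.
# A real scraper would first download the page, then parse it
# (e.g. with Beautiful Soup, as the notes mention).
PAGE = """
<ul>
  <li class="price">Apple: 30</li>
  <li class="price">Banana: 10</li>
</ul>
"""

class PriceScraper(HTMLParser):
    """Collects the text of every <li class="price"> element."""
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if tag == "li" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price and data.strip():
            self.prices.append(data.strip())
            self.in_price = False

scraper = PriceScraper()
scraper.feed(PAGE)
print(scraper.prices)  # ['Apple: 30', 'Banana: 10']
```

The scraped strings then become raw data for the next stages of the project cycle.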
3. Sensors

1. Sensors are very important but simple to understand.
2. Sensors are part of the IoT (Internet of Things).
3. Examples of IoT devices are smart watches and smart fire alarms that automatically detect fire and sound the alarm. This happens when a sensor, such as a fire sensor, sends its data to the IoT device (the smart alarm); if the sensor detects heat or fire, the alarm starts.
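The smart fire alarm described above can be sketched in a few lines: the sensor supplies a stream of readings (the data), and the device's logic decides when to sound the alarm. The threshold and the readings below are invented for illustration.

```python
ALARM_THRESHOLD_C = 60  # assumed trigger temperature, for illustration

def smart_alarm(temperature_c):
    """Mimics a smart fire alarm: the sensor reading is the data,
    and this logic decides whether the alarm should sound."""
    return "ALARM" if temperature_c >= ALARM_THRESHOLD_C else "ok"

readings = [22, 25, 71]  # data stream from a heat sensor
print([smart_alarm(t) for t in readings])  # ['ok', 'ok', 'ALARM']
```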
4. Cameras

1. A camera captures visual information, and that information, called an image, is used as a source of data.
2. Cameras are used to capture raw visual data.
5. Observations

1. When we observe something carefully, we obtain information.
2. For example, scientists observe insects for years and later use that data; so observation is a data source.
3. Observation is a time-consuming data source.
API

1. API stands for Application Programming Interface.
2. Let us take an example to understand an API: when you visit a restaurant, check the menu, and want to order some food, do you go to the kitchen and ask the cook to prepare it? No. You give your order to the waiter, and the waiter passes it to the kitchen.
3. The waiter is a messenger who takes your request, tells the kitchen what you want, and then responds to you with the food.
4. Similarly, an API is a messenger that takes requests from you, tells the system what you want, and then gives you a response.
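The waiter analogy can be sketched in code: the "kitchen" does the real work, and the API function is the only thing the caller ever talks to. The menu, dish names, and status codes below are invented for illustration (the 200/404 codes mimic the convention used by web APIs).

```python
# The "kitchen" is the internal system that actually does the work.
def kitchen(dish):
    menu = {"pizza": "hot pizza", "juice": "cold juice"}
    if dish not in menu:
        raise ValueError("not on the menu")
    return menu[dish]

# The API is the waiter: it carries the request to the kitchen and
# carries the response back, so the caller never touches the kitchen.
def order_api(request):
    try:
        return {"status": 200, "body": kitchen(request)}
    except ValueError as err:
        return {"status": 404, "body": str(err)}

print(order_api("pizza"))   # {'status': 200, 'body': 'hot pizza'}
print(order_api("burger"))  # {'status': 404, 'body': 'not on the menu'}
```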

3. Data Exploration

It is the process of arranging the gathered data uniformly for better understanding. Data can be arranged in the form of a table, a chart, or a database.
To put it simply, the data collected during Data Acquisition needs to be arranged during Data Exploration. For example, if we have data for 50 students in a class, such as their mobile numbers, dates of birth, and class, we can arrange that data in a chart in which all the names are in one place, all the mobile numbers in another, and so on.
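The rearrangement described above, from row-wise records to uniform columns, can be sketched in plain Python. The student records are made-up sample data.

```python
# Raw records gathered during Data Acquisition (invented sample data).
students = [
    {"name": "Asha",  "mobile": "9000000001", "dob": "2009-04-12"},
    {"name": "Ravi",  "mobile": "9000000002", "dob": "2009-07-30"},
    {"name": "Meena", "mobile": "9000000003", "dob": "2009-01-05"},
]

# Data Exploration: rearrange row-wise records into uniform columns,
# so all names sit together, all mobile numbers together, and so on.
columns = {key: [s[key] for s in students] for key in students[0]}

print(columns["name"])    # ['Asha', 'Ravi', 'Meena']
print(columns["mobile"])  # ['9000000001', '9000000002', '9000000003']
```

From this column form, making a table or plotting a chart is straightforward.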

Data Exploration or Visualization Tools

1. Google Charts

Google Chart tools are powerful, simple to use, and free, offering a rich gallery of interactive charts and data tools.
2. Tableau

Tableau is often regarded as the grandmaster of data visualization software, and for good reason.

Tableau has a very large customer base of 57,000+ accounts across many industries, thanks to its ease of use and its ability to produce interactive visualizations far beyond those provided by general BI solutions.
3. Fusion Charts

This is a widely used, JavaScript-based charting and visualization package that has established itself as one of the leaders in the paid-for market.
It can produce 90 different chart types and integrates with a large number of platforms and frameworks, giving a great deal of flexibility.

4. Highcharts

A simple options structure allows for deep customization, and styling can be done via JavaScript or CSS. Highcharts is also extendable and pluggable for experts seeking advanced animations and functionality.
4. Data Modelling

It is the process in which different models are created from the visualized data and checked for their advantages and disadvantages.

Learning Based Approach

The Learning-Based Approach is based on machine learning: the model gains experience from the data fed to it.

Machine Learning (ML)

Machine learning is a subset of artificial intelligence (AI) that gives machines the ability to learn automatically and improve from experience without being explicitly programmed.

Types of Machine Learning

Machine learning can be divided into three types: Supervised Learning, Unsupervised Learning, and Reinforcement Learning.
Supervised learning is where a computer algorithm is trained on input data that has been labelled for a particular output.

For example, a shape with three sides is labelled as a triangle. Classification and Regression models are both types of supervised learning.

What is Classification? Classification is the task in which the algorithm separates labelled data into categories to predict the output. Example: predicting whether a given fruit is an apple or a pineapple.
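The apple-vs-pineapple example can be sketched as a tiny supervised classifier. This is a hand-rolled 1-nearest-neighbour sketch, not a full ML library; the feature values (weight, surface roughness) and labels are invented training data.

```python
# Tiny labelled training set (invented numbers): each fruit is
# described by (weight in g, surface roughness 0-10) plus a label.
train = [
    ((150, 1), "apple"),
    ((180, 2), "apple"),
    ((900, 8), "pineapple"),
    ((1100, 9), "pineapple"),
]

def classify(features):
    """1-nearest-neighbour: predict the label of the closest
    training example - a minimal supervised classifier."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda item: dist(item[0], features))[1]

print(classify((160, 1)))  # apple
print(classify((950, 7)))  # pineapple
```

The key point is that the algorithm only works because the training data is labelled, which is what makes this supervised learning.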

What is Regression?

Regression is a type of supervised learning used to predict a continuous value. Example: predicting tomorrow's temperature; regression is widely used for weather forecasting.
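A minimal regression sketch: fitting a straight line y = m*x + c by ordinary least squares and using it to predict a continuous value. The hour/temperature pairs below are invented data.

```python
# Invented data: hour of the day vs temperature in degrees C.
xs = [6, 9, 12, 15]
ys = [18, 22, 26, 30]

# Ordinary least squares for a straight line y = m*x + c.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
c = mean_y - m * mean_x

# Predict a continuous value for an unseen input (here, 10 am).
print(round(m * 10 + c, 1))  # 23.3
```

Unlike classification, the output is not a category but a number on a continuous scale.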
Unsupervised Learning

In machine learning, unsupervised learning is where a system learns from the data on its own; the training data is not labelled.

Suppose a boy sees someone performing tricks with a ball and then learns the tricks by himself. This is what we call unsupervised learning.
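A common unsupervised technique is clustering: the algorithm groups unlabelled data on its own. Below is a minimal 1-D k-means sketch with k=2; the data points and the fixed starting centres are invented for illustration.

```python
# Unlabelled data: nobody tells the algorithm which group is which.
points = [1.0, 1.5, 2.0, 10.0, 10.5, 11.0]

# Minimal 1-D k-means with k=2, starting from the outermost points.
centres = [points[0], points[-1]]
for _ in range(10):
    groups = [[], []]
    for p in points:
        nearest = min((0, 1), key=lambda i: abs(p - centres[i]))
        groups[nearest].append(p)
    centres = [sum(g) / len(g) for g in groups]

print(centres)  # [1.5, 10.5]
```

The algorithm discovers the two natural groups without ever being given labels, which is the essence of unsupervised learning.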

Reinforcement Learning

Learning through feedback, or the trial-and-error method, is called Reinforcement Learning.

In this type of learning, the system works on a reward-or-penalty policy. An agent performs an action in the environment, which is taken as input by the system; the system then changes the state of the environment, and the agent receives a reward or a penalty.
The system also builds a policy: what action should be taken under a specific condition.
Example: a vending machine. Suppose you put a coin (action) into a juice vending machine (environment). The system detects the amount of the coin (state), and you get the drink corresponding to that amount (reward); if the coin is damaged or there is some other problem, you get nothing (penalty).
Here the machine is building a policy for which drink should be provided under which condition and how to handle an error in the environment.
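The vending-machine story maps neatly onto the reward/penalty idea and can be sketched as code. The coin values, drink names, and reward numbers below are invented for illustration; a real reinforcement learning system would also learn its policy from these rewards over many trials.

```python
# Environment: a juice vending machine. The agent's action is the
# coin it inserts; the machine's response is a reward or a penalty.
PRICES = {5: "small juice", 10: "large juice"}  # assumed coin values

def vending_machine(coin, damaged=False):
    """Returns (reward, outcome). A valid coin earns a drink (reward,
    +1); a damaged or unknown coin earns nothing (penalty, -1)."""
    if damaged or coin not in PRICES:
        return (-1, "nothing")   # penalty
    return (+1, PRICES[coin])    # reward

print(vending_machine(10))                # (1, 'large juice')
print(vending_machine(10, damaged=True))  # (-1, 'nothing')
```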

Rule Based Approach

• The Rule-Based Approach refers to AI modelling where the relationships or patterns in the data are defined by the developer.
• That means the machine works on the rules and information given by the developer and performs the task accordingly.

For example: suppose you have a dataset containing 100 images each of apples and bananas. You build a machine using computer vision and train it with the labelled images of apples and bananas. If you test the machine with an image of an apple, it gives the output by comparing the image against its dataset. This is known as the Rule-Based Approach.
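The defining feature of the rule-based approach is that the developer writes the rules by hand rather than letting the machine learn them. A minimal sketch (the features and rules below are invented):

```python
def classify_fruit(colour, has_spiky_skin):
    """Rules written by the developer, not learnt from data -
    the defining feature of the rule-based approach."""
    if has_spiky_skin:
        return "pineapple"
    if colour == "red":
        return "apple"
    return "unknown"

print(classify_fruit("red", False))    # apple
print(classify_fruit("yellow", True))  # pineapple
```

Contrast this with the learning-based approach, where the equivalent of these if-conditions is discovered automatically from labelled examples.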

5. Evaluation

It is the method of understanding the reliability of an AI model. Evaluation is based on the outputs received by feeding test data into the model and comparing those outputs with the actual answers.
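One simple evaluation measure is accuracy: the fraction of the model's outputs that match the actual answers. The predictions and answers below are an invented test set.

```python
# Model outputs vs the actual answers (invented test set).
predicted = ["apple", "apple", "pineapple", "apple", "pineapple"]
actual    = ["apple", "pineapple", "pineapple", "apple", "pineapple"]

# Accuracy = correct predictions / total predictions.
correct = sum(p == a for p, a in zip(predicted, actual))
accuracy = correct / len(actual)
print(accuracy)  # 0.8
```

Here the model got 4 of 5 answers right, so its accuracy is 80%.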
