
Example

We choose Linear Regression when the data shows linear patterns.
Techniques, all based on the straight-line formula Y = mx + b:
- Straight Line Technique
- Least Square Error Technique
- Gradient Descent Technique
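The two fitting techniques above can be sketched in pure Python. This is a minimal illustration, not a library implementation; the function names (`least_squares_fit`, `gradient_descent_fit`) and the sample data are assumptions for the example.

```python
def least_squares_fit(xs, ys):
    """Least Square Error technique: closed-form (m, b) for Y = mx + b."""
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    # Slope = covariance(x, y) / variance(x).
    m = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) \
        / sum((x - x_mean) ** 2 for x in xs)
    b = y_mean - m * x_mean  # The fitted line passes through the mean point.
    return m, b

def gradient_descent_fit(xs, ys, lr=0.01, steps=5000):
    """Gradient Descent technique: iteratively minimize the mean squared error."""
    m = b = 0.0
    n = len(xs)
    for _ in range(steps):
        # Partial derivatives of MSE with respect to m and b.
        grad_m = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
        grad_b = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
        m -= lr * grad_m
        b -= lr * grad_b
    return m, b

# Perfectly linear data (y = 2x + 1): both techniques recover the same line.
m, b = least_squares_fit([1, 2, 3, 4], [3, 5, 7, 9])  # → m ≈ 2, b ≈ 1
```

Least squares reaches the answer in one closed-form step; gradient descent approaches it iteratively, which scales to models with no closed form.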


When the output (Y) is continuous: Regression Models.

Accuracy Improving Techniques:
- Hyperparameter Tuning
- Feature Engineering / change the features
- Oversampling or Undersampling Techniques
- Evaluation Metrics
- Change the algorithm
- Change the sample or collect more data
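One of the techniques above, random oversampling, can be sketched as follows: duplicate minority-class rows until every class has as many rows as the largest one. The function name and the toy data are assumptions for illustration; libraries such as imbalanced-learn provide production versions.

```python
import random

def random_oversample(rows, labels, seed=0):
    """Duplicate minority-class rows at random until all classes are balanced."""
    rng = random.Random(seed)
    by_class = {}
    for row, lab in zip(rows, labels):
        by_class.setdefault(lab, []).append(row)
    target = max(len(v) for v in by_class.values())  # size of the largest class
    out_rows, out_labels = [], []
    for lab, members in by_class.items():
        extra = [rng.choice(members) for _ in range(target - len(members))]
        out_rows.extend(members + extra)
        out_labels.extend([lab] * target)
    return out_rows, out_labels

rows = [[1], [2], [3], [4], [5]]
labels = [0, 0, 0, 0, 1]          # class 1 is the minority
X, y = random_oversample(rows, labels)  # → 4 rows of each class
```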
Feature Scaling Techniques (we can apply them for Linear Regression, KNN, Neural Networks, SVM):
- Standardization
- RobustScaler
- MinMaxScaler
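The three scaling techniques can be sketched on a single feature. scikit-learn's StandardScaler, MinMaxScaler, and RobustScaler apply the same math per column; this pure-Python version just shows the formulas.

```python
def standardize(xs):
    """Standardization: subtract the mean, divide by the standard deviation."""
    mean = sum(xs) / len(xs)
    std = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - mean) / std for x in xs]

def min_max_scale(xs):
    """MinMaxScaler: squeeze values into the [0, 1] range."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def robust_scale(xs):
    """RobustScaler: center on the median, divide by the IQR (outlier-resistant)."""
    s = sorted(xs)
    def quantile(q):
        # Linear interpolation between the two closest ranks.
        pos = q * (len(s) - 1)
        i, frac = int(pos), q * (len(s) - 1) - int(pos)
        return s[i] if frac == 0 else s[i] + frac * (s[i + 1] - s[i])
    median = quantile(0.5)
    iqr = quantile(0.75) - quantile(0.25)
    return [(x - median) / iqr for x in xs]

scaled = min_max_scale([10, 20, 30, 40])  # first value → 0.0, last → 1.0
```

RobustScaler is the one to reach for when the feature contains outliers, since the median and IQR ignore extreme values that would distort the mean and the min/max.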

Logistic Regression

Machine Learning Math:
- Variance-Bias Tradeoff (Variance Error vs. Bias Error)
- Overfitting & Underfitting
- Optimization & Regularization

Decision Tree Classification algorithms:
- CART
- ID3
- C4.5
- CHAID
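The decision tree families above differ mainly in their split criterion: CART uses Gini impurity, while ID3 and C4.5 use entropy (information gain). A minimal sketch of both measures, with assumed function names:

```python
from collections import Counter
from math import log2

def gini(labels):
    """Gini impurity (CART): 1 - sum of squared class proportions."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Entropy (ID3/C4.5): -sum p * log2(p) over class proportions."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

# A pure node scores 0 under both; a 50/50 node is maximally impure.
pure = gini(["yes", "yes", "yes"])        # → 0
mixed = entropy(["yes", "yes", "no", "no"])  # → 1.0
```

At each node the tree picks the split that lowers the chosen impurity the most; both measures usually pick similar splits, with Gini being slightly cheaper to compute.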

When the output (Y) is discrete: Classification Models.
- Random Forest
- Gradient Boosting
- Support Vector Machines
- KNN Algorithm, using a distance metric:
  1. Euclidean Distance
  2. Manhattan Distance
  3. Minkowski Distance
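The three KNN distance metrics can be sketched in a few lines. Minkowski distance generalizes the other two: p = 2 gives Euclidean, p = 1 gives Manhattan. Function names here are assumptions for illustration.

```python
def minkowski(a, b, p):
    """Minkowski distance of order p between two equal-length points."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

def euclidean(a, b):
    """Straight-line distance: Minkowski with p = 2."""
    return minkowski(a, b, 2)

def manhattan(a, b):
    """City-block distance: Minkowski with p = 1."""
    return minkowski(a, b, 1)

d_e = euclidean((0, 0), (3, 4))   # → 5.0
d_m = manhattan((0, 0), (3, 4))   # → 7.0
```

KNN classifies a point by majority vote among its k nearest neighbors under the chosen metric, which is why feature scaling (see above) matters so much for it: an unscaled feature with a large range dominates the distance.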
