CHEATSHEET: Machine Learning Algorithms (Python and R Codes)

Types:
- Supervised Learning: Decision Tree, Random Forest, kNN, Logistic Regression
- Unsupervised Learning: Apriori algorithm, k-means, Hierarchical Clustering
- Reinforcement Learning: Markov Decision Process, Q-Learning

Linear Regression

Python Code:

```python
# Import Library
# Import other necessary libraries like pandas, numpy...
from sklearn import linear_model
# Load Train and Test datasets
# Identify feature and response variable(s); values must be numeric and numpy arrays
x_train = input_variables_values_training_datasets
y_train = target_variables_values_training_datasets
x_test = input_variables_values_test_datasets
# Create linear regression object
linear = linear_model.LinearRegression()
# Train the model using the training sets and check score
linear.fit(x_train, y_train)
linear.score(x_train, y_train)
# Equation coefficient and intercept
print('Coefficient: \n', linear.coef_)
print('Intercept: \n', linear.intercept_)
# Predict Output
predicted = linear.predict(x_test)
```

R Code:

```r
# Load Train and Test datasets
# Identify feature and response variable(s); values must be numeric
x_train <- input_variables_values_training_datasets
y_train <- target_variables_values_training_datasets
x_test <- input_variables_values_test_datasets
x <- cbind(x_train, y_train)
# Train the model using the training sets and check score
linear <- lm(y_train ~ ., data = x)
summary(linear)
# Predict Output
predicted <- predict(linear, x_test)
```
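As a sanity check, the linear regression recipe above can be run end-to-end. This minimal sketch substitutes synthetic data for the dataset placeholders (the values are illustrative, not from the sheet):

```python
import numpy as np
from sklearn import linear_model

# Synthetic data following y = 2*x + 1 exactly, so the fit is easy to verify
x_train = np.array([[1.0], [2.0], [3.0], [4.0]])
y_train = np.array([3.0, 5.0, 7.0, 9.0])
x_test = np.array([[5.0]])

# Create linear regression object and train it
linear = linear_model.LinearRegression()
linear.fit(x_train, y_train)

print('Coefficient:', linear.coef_)     # recovers the slope, ~2.0
print('Intercept:', linear.intercept_)  # recovers the intercept, ~1.0

# Predict Output
predicted = linear.predict(x_test)      # ~11.0 for x = 5
```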
Logistic Regression

Python Code:

```python
# Import Library
from sklearn.linear_model import LogisticRegression
# Assumed you have X (predictor) and Y (target) for the training data set
# and x_test (predictor) of the test dataset
# Create logistic regression object
model = LogisticRegression()
# Train the model using the training sets and check score
model.fit(X, y)
model.score(X, y)
# Equation coefficient and intercept
print('Coefficient: \n', model.coef_)
print('Intercept: \n', model.intercept_)
# Predict Output
predicted = model.predict(x_test)
```

R Code:

```r
x <- cbind(x_train, y_train)
# Train the model using the training sets and check score
logistic <- glm(y_train ~ ., data = x, family = "binomial")
summary(logistic)
# Predict Output
predicted <- predict(logistic, x_test)
```

Decision Tree

Python Code:

```python
# Import Library
# Import other necessary libraries like pandas, numpy...
from sklearn import tree
# Assumed you have X (predictor) and Y (target) for the training data set
# and x_test (predictor) of the test dataset
# Create tree object
model = tree.DecisionTreeClassifier(criterion='gini')
# for classification, here you can set the criterion to gini or
# entropy (information gain); by default it is gini
# model = tree.DecisionTreeRegressor() for regression
# Train the model using the training sets and check score
model.fit(X, y)
model.score(X, y)
# Predict Output
predicted = model.predict(x_test)
```

R Code:

```r
# Import Library
library(rpart)
x <- cbind(x_train, y_train)
# Grow tree
fit <- rpart(y_train ~ ., data = x, method = "class")
summary(fit)
# Predict Output
predicted <- predict(fit, x_test)
```
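The decision tree recipe above can be exercised on a small toy problem; the data and labels below are illustrative, not part of the cheat sheet:

```python
from sklearn import tree

# AND-like labels over two binary features: label 1 only when both are 1
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]

# criterion can be 'gini' (default) or 'entropy' (information gain)
model = tree.DecisionTreeClassifier(criterion='gini')
model.fit(X, y)

print(model.score(X, y))           # 1.0 on this small separable set
predicted = model.predict([[1, 1]])
print(predicted)                   # [1]
```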
SVM (Support Vector Machine)

Python Code:

```python
# Import Library
from sklearn import svm
# Assumed you have X (predictor) and Y (target) for the training data set
# and x_test (predictor) of the test dataset
# Create SVM classification object
model = svm.SVC()
# there are various options associated with it; this is a simple one for classification
# Train the model using the training sets and check score
model.fit(X, y)
model.score(X, y)
# Predict Output
predicted = model.predict(x_test)
```

R Code:

```r
# Import Library
library(e1071)
x <- cbind(x_train, y_train)
# Fitting model
fit <- svm(y_train ~ ., data = x)
summary(fit)
# Predict Output
predicted <- predict(fit, x_test)
```

Naive Bayes

Python Code:

```python
# Import Library
from sklearn.naive_bayes import GaussianNB
# Assumed you have X (predictor) and Y (target) for the training data set
# and x_test (predictor) of the test dataset
# Create Naive Bayes classification object
model = GaussianNB()
# there are other distributions for multinomial classes, like Bernoulli Naive Bayes
# Train the model using the training sets
model.fit(X, y)
# Predict Output
predicted = model.predict(x_test)
```

R Code:

```r
# Import Library
library(e1071)
x <- cbind(x_train, y_train)
# Fitting model
fit <- naiveBayes(y_train ~ ., data = x)
summary(fit)
# Predict Output
predicted <- predict(fit, x_test)
```
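The Types overview at the top lists k-means under unsupervised learning, but the sheet gives no snippet for it. Here is a minimal sketch in the same style; the two-blob data and the parameter choices (`n_clusters=2`, `n_init=10`, `random_state=0`) are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

# Two well-separated blobs of points
X = np.array([[1.0, 1.0], [1.2, 0.8], [0.8, 1.1],
              [8.0, 8.0], [8.2, 7.9], [7.9, 8.2]])

# Create k-means object and fit it; random_state fixes the seed for repeatability
model = KMeans(n_clusters=2, n_init=10, random_state=0)
model.fit(X)

# Points from the same blob end up with the same cluster label
labels = model.labels_
print(labels)
```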
