Grid Search For Random Forest
Caret Practice
Pham Dinh Khanh
May 15, 2018
Test Algorithm
Tuning is an important part of model building. With a random forest you cannot know the result in advance, because the training process is random. Tuning helps you control that process and obtain better results. In this study we focus on the two main tuning parameters of a random forest model: mtry and ntree. There are other parameters, but these two most likely have the biggest effect on model accuracy.
mtry: Number of variables randomly sampled as candidates at each split.
ntree: Number of trees to grow in the forest.
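To make the mtry default concrete: for classification, randomForest defaults to the (floored) square root of the number of predictors. A quick sketch using the Sonar data that the code below works with:

```r
library(mlbench)
data(Sonar)

# Sonar has 60 predictors plus the Class label
p <- ncol(Sonar) - 1

# Default mtry for classification: floor(sqrt(p)) -> 7 for 60 predictors
floor(sqrt(p))
```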
In the results below we use the repeatedcv method: the dataset is split into 10 cross-validation folds, repeated only 3 times to keep the run time manageable. I will hold back a validation set for back-testing.
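The hold-back itself is not shown in the code that follows; a minimal sketch of how it could be done with caret (the 80/20 split ratio and object names are illustrative assumptions):

```r
library(caret)
library(mlbench)
data(Sonar)

set.seed(123)
# Stratified 80/20 split on the class label
inTrain    <- createDataPartition(Sonar$Class, p = 0.8, list = FALSE)
dataset    <- Sonar[inTrain, ]    # used for tuning
validation <- Sonar[-inTrain, ]   # held back for back-testing
```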
#https://machinelearningmastery.com/tune-machine-learning-algorithms-in-r/
library(randomForest)
library(mlbench)
library(caret)
library(e1071)
# Load dataset
data(Sonar)
dataset <- Sonar
x <- dataset[,1:60]
y <- dataset[,61]

# 10-fold cross-validation repeated 3 times
control <- trainControl(method='repeatedcv',
                        number=10,
                        repeats=3)
set.seed(123)
# Default mtry: square root of the number of predictors
mtry <- sqrt(ncol(x))
tunegrid <- expand.grid(.mtry=mtry)
rf_default <- train(Class~.,
                    data=dataset,
                    method='rf',
                    metric='Accuracy',
                    tuneGrid=tunegrid,
                    trControl=control)
print(rf_default)
https://rpubs.com/phamdinhkhanh/389752 1/12
10/14/21, 5:44 PM RPubs - Random Forest Tunning in Caret
## Random Forest
##
## 208 samples
## 60 predictor
##
## No pre-processing
## Summary of sample sizes: 187, 187, 187, 188, 188, 187, ...
## Resampling results:
##
## Accuracy Kappa
## 0.8408442 0.6765085
##
Random search
Caret can pick random parameter values for you if you do not declare them. The model below generates 15 random values of mtry during tuning, because tuneLength is set to 15.
# library(doParallel)
# cores <- 7
# registerDoParallel(cores = cores)

# mtry: number of variables randomly sampled at each split;
# by default it equals the square root of the number of columns.
control <- trainControl(method = 'repeatedcv',
                        number = 10,
                        repeats = 3,
                        search = 'random')
set.seed(1)
rf_random <- train(Class ~ .,
                   data = dataset,
                   method = 'rf',
                   metric = 'Accuracy',
                   tuneLength = 15,
                   trControl = control)
print(rf_random)
## Random Forest
##
## 208 samples
## 60 predictor
##
## No pre-processing
## Summary of sample sizes: 187, 188, 188, 186, 187, 187, ...
##
## Resampling results across tuning parameters:
##
##   mtry  Accuracy   Kappa
## 4 0.8559380 0.7070142
## 11 0.8476840 0.6907062
## 13 0.8397547 0.6739463
## 23 0.8330159 0.6604453
## 24 0.8249206 0.6446857
## 30 0.8265079 0.6474770
## 35 0.8122150 0.6187142
## 38 0.8184848 0.6311962
## 40 0.8185642 0.6321423
## 42 0.8234127 0.6413732
## 47 0.8024459 0.5993188
## 54 0.8252309 0.6461357
## 55 0.8153824 0.6250915
## 57 0.8106205 0.6162190
##
## Accuracy was used to select the optimal model using the largest value.
plot(rf_random)
Grid search
We can also define a grid of parameter values to tune the model. Each axis of the grid is a tuning parameter, and each point in the grid is a specific combination of parameters. In this example we tune only one parameter, so the grid search has only one dimension: a vector of mtry values.
#Create control function for training with 10-fold cross-validation repeated 3 times. Search method is grid.
control <- trainControl(method = 'repeatedcv',
                        number = 10,
                        repeats = 3,
                        search = 'grid')
#Create tunegrid with 15 values from 1:15 for mtry. train() will vary the number of variables tried at each split according to tunegrid.
#Note: control is not passed to train() here, so caret falls back to its default bootstrap resampling (visible in the sample sizes below).
tunegrid <- expand.grid(.mtry = (1:15))
rf_gridsearch <- train(Class ~ .,
                       data = dataset,
                       method = 'rf',
                       metric = 'Accuracy',
                       tuneGrid = tunegrid)
print(rf_gridsearch)
## Random Forest
##
## 208 samples
## 60 predictor
##
## No pre-processing
## Summary of sample sizes: 208, 208, 208, 208, 208, 208, ...
##
## Resampling results across tuning parameters:
##
##   mtry  Accuracy   Kappa
## 1 0.8249973 0.6474343
## 2 0.8252196 0.6480826
## 3 0.8158734 0.6290701
## 4 0.8152652 0.6279022
## 5 0.8206327 0.6389178
## 6 0.8162241 0.6295473
## 7 0.8194694 0.6360085
## 8 0.8182547 0.6338760
## 9 0.8159730 0.6291169
## 10 0.8118015 0.6208063
## 11 0.8120447 0.6215262
## 12 0.8122707 0.6219964
## 13 0.8124745 0.6222660
## 14 0.8164327 0.6306590
## 15 0.8127498 0.6229256
##
## Accuracy was used to select the optimal model using the largest value.
plot(rf_gridsearch)
These results suggest that the optimal value is mtry = 2, with accuracy = 82.5%.
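As a quick check, caret stores the winning parameter combination and the full resampling table on the fitted object (a sketch assuming the rf_gridsearch object trained above):

```r
# Best parameter combination chosen by caret (highest Accuracy)
rf_gridsearch$bestTune

# Full table of resampled Accuracy/Kappa for each mtry value
rf_gridsearch$results
```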
Tune by tools
The randomForest package provides tuneRF() to search for the optimal mtry value for a given dataset. It relies on the out-of-bag error (OOBError): the mtry with the lowest OOB error is selected as the most accurate for our model.
set.seed(1)
# Call reconstructed from the trace below: improve = 1e-5 appears in the
# output, and stepFactor = 1.5 matches the mtry sequence 7 -> 5 -> 4 and 7 -> 10.
bestMtry <- tuneRF(x, y, stepFactor = 1.5, improve = 1e-5)

## 0.1612903 1e-05
## -0.1923077 1e-05
## -0.2307692 1e-05
print(bestMtry)
## mtry OOBError
## 4.OOB 4 0.1490385
## 5.OOB 5 0.1250000
## 7.OOB 7 0.1490385
## 10.OOB 10 0.1538462
According to these results, mtry = 5 is the best parameter for our model. This differs from the grid search result, which reached its best accuracy of 82.5% at mtry = 2.
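If you want the tuned forest itself rather than just the error table, tuneRF can refit at the best mtry it finds. A minimal sketch (the stepFactor and improve values mirror the run above; doBest is the only new argument):

```r
library(randomForest)

set.seed(1)
# doBest = TRUE returns a randomForest fit using the best mtry found
bestModel <- tuneRF(x, y,
                    stepFactor = 1.5,
                    improve = 1e-5,
                    doBest = TRUE)
print(bestModel)
```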
Tuning manually
For manual tuning we stick with caret, because its results are aligned with the previous models and allow a fair comparison. Moreover, keeping repeated cross-validation in caret reduces the model's overfitting.
This approach creates many caret models with different manually chosen parameters and compares their accuracy. As an example, we evaluate different ntree values while holding mtry constant.
# library(doParallel)
# registerDoParallel(cores = cores)
control <- trainControl(method = 'repeatedcv',
                        number = 10,
                        repeats = 3,
                        search = 'grid')
#create tunegrid: hold mtry constant at the square root of the number of columns
tunegrid <- expand.grid(.mtry = sqrt(ncol(x)))
#train one model per ntree value
modellist <- list()
for (ntree in c(1000, 1500, 2000, 2500)) {
  set.seed(123)
  fit <- train(Class ~ .,
               data = dataset,
               method = 'rf',
               metric = 'Accuracy',
               tuneGrid = tunegrid,
               trControl = control,
               ntree = ntree)
  modellist[[toString(ntree)]] <- fit
}
#Compare results
results <- resamples(modellist)
summary(results)
##
## Call:
## summary.resamples(object = results)
##
## Number of resamples: 30
##
## Accuracy
##
## Kappa
dotplot(results)
# stopCluster(cores)
Our model has its highest accuracy at ntree = 1000, with accuracy = 83.77%. This suggests that ntree <= 1000 is the best range for tuning. In this case we kept mtry equal to the square root of the number of columns of the dataset. We should also try other options, such as mtry = 5 or mtry = 2, in case they have interaction effects with ntree.
Extend caret
With this method we create a new algorithm definition for caret to support. It wraps the same random forest we have been using, but makes the tuning more flexible by supporting multiple parameters; in this case we tune both mtry and ntree.
We create a custom list that sets up the tuning rules: the parameters, type, library, predict and prob functions, and so on. The caret package reads this list to drive the tuning process.
customRF <- list(type = "Classification", library = "randomForest", loop = NULL)
customRF$parameters <- data.frame(parameter = c("mtry", "ntree"),
                                  class = rep("numeric", 2),
                                  label = c("mtry", "ntree"))
customRF$grid <- function(x, y, len = NULL, search = "grid") {}
customRF$fit <- function(x, y, wts, param, lev, last, weights, classProbs, ...)
  randomForest(x, y, mtry = param$mtry, ntree = param$ntree)
#Predict label
customRF$predict <- function(modelFit, newdata, preProc = NULL, submodels = NULL)
  predict(modelFit, newdata)
#Predict prob
customRF$prob <- function(modelFit, newdata, preProc = NULL, submodels = NULL)
  predict(modelFit, newdata, type = "prob")
customRF$sort <- function(x) x[order(x[, 1]), ]
customRF$levels <- function(x) x$classes
Now let's tune the model by calling caret's train() with this customRF definition. The model will be tuned over different combinations of mtry and ntree.
# library(doParallel)
# registerDoParallel(cores = cores)

# train model
control <- trainControl(method = 'repeatedcv',
                        number = 10,
                        repeats = 3,
                        allowParallel = TRUE)
tunegrid <- expand.grid(.mtry = c(1:15),
                        .ntree = c(1000, 1500, 2000, 2500))
metric <- 'Accuracy'
set.seed(123)
custom <- train(Class ~ .,
                data = dataset,
                method = customRF,
                metric = metric,
                tuneGrid = tunegrid,
                trControl = control)
summary(custom)
plot(custom)
# stopCluster(cores)
This run takes several hours to complete on my laptop (Core i7-3720QM, 8 GB RAM), because in reality it trains thousands of models. Consider this cost before running it.
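To cut the wall-clock time, a parallel backend can be registered so that caret's resamples run across cores, matching the commented-out doParallel hints above. A sketch (the core count is illustrative):

```r
library(doParallel)

# Reserve one core for the OS; the rest run caret resamples in parallel
cl <- makePSOCKcluster(max(1, parallel::detectCores() - 1))
registerDoParallel(cl)

# ... run train() here with trainControl(allowParallel = TRUE) ...

# Release the workers when done
stopCluster(cl)
```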
References:
https://topepo.github.io/caret/model-training-and-tuning.html
https://machinelearningmastery.com/tune-machine-learning-algorithms-in-r/