Ensemble Technique
1.1. General overview of ensemble technique
Ensemble means ‘a group producing a single effect’.
In machine learning, an ensemble technique combines several
base models in order to produce one optimal model.
In learning models, noise, variance, and bias are the major
sources of error.
Ensemble methods help minimize these error-causing factors,
thereby improving the accuracy and stability of machine
learning algorithms.
Ensemble learning is a machine learning technique that
enhances the accuracy and resilience of forecasts by merging
predictions from multiple models.
The underlying idea is that combining the outputs of diverse
models yields a more precise prediction than any single model
on its own.
How does Ensemble Learning Work?
An individual model may be only a weak learner. However,
being weak does not imply being useless: you can put several
weak learners together to form an ensemble.
For each new prediction, you run your input data through all
of the models and then average the results.
Ensemble learning is effective because your machine learning
models work differently: each model may perform well on
some data but not on others.
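The averaging step described above can be sketched in a few lines. The four "models" here are stand-in functions chosen for illustration, not trained learners:

```python
# A minimal sketch of averaging an ensemble's predictions.
# The four "models" are hypothetical stand-ins, not trained learners.
def model_a(x): return 0.8 * x
def model_b(x): return 1.1 * x
def model_c(x): return 0.9 * x
def model_d(x): return 1.2 * x

def ensemble_predict(x, models):
    """Run the input through every model and average the results."""
    return sum(m(x) for m in models) / len(models)

models = [model_a, model_b, model_c, model_d]
print(ensemble_predict(10, models))  # averages 8, 11, 9, 12 -> roughly 10.0
```

Because the models over- and under-shoot in different directions, their average lands closer to the truth than most of them do individually.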
Why use Ensemble Learning?
• For several reasons, an ensemble is preferable to a single
model:
Performance: As described in the preceding section, the
outcome of ensemble learning is a strong learner built from
weak learners. As a result, the models' predictive capability
improves, and better performance is achieved than with a
single model.
Error reduction: Machine learning prediction error can be
described in terms of bias and variance. Bias is the difference
between a model's average prediction and the actual outcome;
variance is the model's sensitivity to small changes in the
training set.
A model with low bias and low variance is preferable.
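The variance-reduction effect of averaging can be demonstrated numerically. As a sketch, assume each weak model's prediction is the true value plus independent Gaussian noise (an assumption made for illustration):

```python
import random

random.seed(0)

def noisy_prediction(true_value=5.0):
    """One weak model's prediction: the truth plus random noise (an assumption)."""
    return true_value + random.gauss(0, 1.0)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Compare one model's predictions with the average of 10 models' predictions.
single = [noisy_prediction() for _ in range(10_000)]
averaged = [sum(noisy_prediction() for _ in range(10)) / 10
            for _ in range(10_000)]

print(variance(single))    # roughly 1.0
print(variance(averaged))  # roughly 0.1 -- about 10x smaller
```

Averaging n independent predictors divides the variance by about n, which is exactly the error reduction the slide describes.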
Challenges of Ensemble Learning
While ensemble learning is a powerful tool, and as appealing
as it can look, it still has some drawbacks.
Below are some of the challenges faced by ensemble learning.
When you use an ensemble, you must devote more time and
resources to training your machine learning models.
A random forest with 500 trees, for example, often produces
much better results than a single decision tree, but it also takes
much longer to train.
Running ensemble models can also be
difficult if the algorithms you use require a
large amount of memory
1.2. Describe types of ensemble technique
• There are two types of ensemble techniques:
1. Simple Ensemble Techniques
1. Max Voting
2. Averaging
3. Weighted Averaging
2. Advanced Ensemble techniques
1. Bagging
2. Boosting
3. Stacking
1. Max Voting
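The details of the max-voting slide are not shown here; as a minimal sketch, max voting takes the class label predicted by the most models (the classifier names and labels below are hypothetical):

```python
from collections import Counter

def max_vote(predictions):
    """Return the class label chosen by the most models (majority vote)."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical classifiers label the same input:
print(max_vote(["cat", "dog", "cat"]))  # -> cat
```

Averaging and weighted averaging work the same way for numeric predictions, replacing the vote count with a (weighted) mean.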
1. Bagging
Bagging is a machine learning ensemble technique that can
improve the accuracy and stability of a model by generating
multiple subsets of the training data and training a separate
model on each subset using the same learning algorithm.
Examples of bagging algorithms include Bagged Decision
Trees.
This method involves training multiple independent models on
different subsets of the training data and then aggregating their
predictions through averaging or voting.
Bagging often reduces variance in the predictions and
improves the stability of the model.
The idea behind bagging is combining the results of multiple
models (for instance, all decision trees) to get a generalized
result.
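As a minimal sketch of this idea, using a toy dataset and a deliberately weak "model" that simply predicts the mean of its bootstrap sample (both are assumptions made for illustration):

```python
import random

random.seed(1)

data = [2.0, 4.0, 6.0, 8.0, 10.0]  # toy training targets (an assumption)

def bootstrap_sample(data):
    """Draw a sample of the same size, with replacement."""
    return [random.choice(data) for _ in data]

def train_mean_model(sample):
    """A deliberately weak 'model': always predicts its sample's mean."""
    mean = sum(sample) / len(sample)
    return lambda x: mean

# Bagging: train one model per bootstrap sample, then average predictions.
models = [train_mean_model(bootstrap_sample(data)) for _ in range(50)]
bagged_prediction = sum(m(None) for m in models) / len(models)
print(bagged_prediction)  # close to the overall mean, 6.0
```

Each model sees a different resampled subset, so their individual errors partly cancel when averaged.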
Here’s a question: if you create all the models on the same set of
data and combine them, will it be useful? There is a high chance
that these models will give the same result, since they are getting
the same input.
So how can we solve this problem? One of the techniques is
bootstrapping.
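A bootstrap sample is drawn from the original data with replacement, so each subset differs even though all come from the same dataset. A minimal sketch, using a toy dataset chosen for illustration:

```python
import random

random.seed(42)

training_data = [1, 2, 3, 4, 5]  # toy dataset (an assumption)

# Each bootstrap sample is drawn with replacement, so some points
# repeat and some are left out -- the subsets differ from each other.
samples = [[random.choice(training_data) for _ in training_data]
           for _ in range(3)]
for s in samples:
    print(s)
```

These differing subsets are what give the bagged models their diversity.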
Bootstrapping