Day 17 - Ensemble Techniques in ML - Averaging, Weighted average
Ensemble Techniques in ML
An ensemble method is a machine learning technique that combines several base models to produce one optimal predictive model.
Ensemble methods build multiple models and combine their predictions to achieve better outcomes. In other words, they integrate the conclusions drawn from several models to improve overall performance. Generally speaking, an ensemble produces more accurate results than a single model would.
Just as a diverse group of opinions tends to be more reliable than a single one, a diverse set of models tends to outperform any individual model. In machine learning, this diversification is achieved through ensemble techniques.
Simple Ensemble Techniques
- Max Voting
- Averaging
- Weighted Averaging
Averaging
In averaging, multiple predictions are made for each data point, as in the max voting method. The final prediction is then the average of the predictions from all the models.
Averaging can be used for predictions in regression problems, or to average the predicted probabilities in classification problems.
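As a minimal sketch of this idea, the predictions of several regressors can be averaged element-wise. The three prediction arrays below are hypothetical values, not outputs of real trained models:

```python
import numpy as np

# Hypothetical predictions for five test points from three regressors
pred_model_1 = np.array([3.1, 4.8, 5.0, 6.2, 7.9])
pred_model_2 = np.array([2.9, 5.1, 4.7, 6.0, 8.3])
pred_model_3 = np.array([3.3, 4.9, 5.2, 6.4, 7.7])

# Final ensemble prediction: element-wise mean across the models
final_pred = np.mean([pred_model_1, pred_model_2, pred_model_3], axis=0)
print(final_pred)
```

For a classification problem, the same averaging would be applied to each model's predicted class probabilities instead of raw regression outputs.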
For example, suppose you ask seven of your colleagues to rate a movie (out of 10): four of them rate it an 8, two of them rate it a 2, and one rates it a 3. The averaging method takes the mean of all the values:
ie., ( 8 + 8 + 8 + 8 + 2 + 2 + 3 ) / 7 ≈ 5.57
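The arithmetic above can be reproduced in a few lines, using the seven ratings from the example:

```python
import numpy as np

# The seven movie ratings from the example (four 8s, two 2s, one 3)
ratings = np.array([8, 8, 8, 8, 2, 2, 3])

# Simple average: every rating carries equal weight
average = ratings.mean()
print(f"{average:.2f}")  # prints 5.57
```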
Weighted Average
This is an extension of the averaging method. Each model is assigned a different weight that defines its importance for the final prediction.
For instance, if two of your colleagues are critics (the 1st and 3rd person) while the others have no prior experience in this field, then the ratings from these two colleagues are given more importance than the others: say a weight of 0.3 each for the critics and 0.08 each for the remaining five (the weights sum to 1).
The result is calculated as
[(8*0.3) + (8*0.08) + (8*0.3) + (8*0.08) + (2*0.08) + (2*0.08) + (3*0.08)] = 6.64
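The weighted calculation above is a dot product of the ratings with the weight vector (weights of 0.3 for the two critics and 0.08 for everyone else, as in the example):

```python
import numpy as np

# Ratings and weights from the example: critics (1st and 3rd) weighted 0.3,
# the other five colleagues weighted 0.08; weights sum to 1.0
ratings = np.array([8, 8, 8, 8, 2, 2, 3])
weights = np.array([0.30, 0.08, 0.30, 0.08, 0.08, 0.08, 0.08])

weighted_average = np.dot(ratings, weights)
print(f"{weighted_average:.2f}")  # prints 6.64
```

Note that simple averaging is just the special case where every model receives the same weight, 1/n.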