Day 17 - Ensemble Techniques in ML - Averaging, Weighted average

  By Jerin Lalichan 


Ensemble Techniques in ML


    Ensemble methods are machine learning techniques that combine several base models in order to produce one optimal predictive model.

    Ensemble methods involve building multiple models and combining them to achieve better outcomes. To put it another way, they integrate the conclusions drawn from various models to enhance overall performance. Generally speaking, ensemble methods produce more accurate results than a single model would.

    For example, consider the case in which you need to decide whether to go to a particular movie. A diverse group of people is likely to make better decisions than an individual, so it is better to check online reviews, which aggregate the opinions of hundreds of people from different backgrounds, than to ask only a few of your friends. 
    
    The same is true for a diverse set of models compared with a single model. In machine learning, this diversification is achieved through ensemble techniques.


Simple Ensemble Techniques

  1. Max Voting

  2. Averaging

  3. Weighted Averaging

Averaging

    As in the max voting method, multiple predictions are made for each data point. In this approach, the final prediction is the average of the predictions from all the models. 
    
    Averaging can be used both for calculating probabilities in classification problems and for making predictions in regression problems.
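The idea above can be sketched in a few lines. This is a minimal illustration, assuming scikit-learn is available and using a toy regression dataset with three arbitrarily chosen base models; the final prediction is simply the element-wise mean of the individual predictions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor

# Toy regression data (200 samples, 5 features)
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=42)

# Three diverse base models (an assumption for illustration)
models = [
    LinearRegression(),
    DecisionTreeRegressor(random_state=42),
    KNeighborsRegressor(),
]

# Fit each model and collect its predictions
all_preds = []
for model in models:
    model.fit(X, y)
    all_preds.append(model.predict(X))

# Averaging ensemble: element-wise mean across the models' predictions
ensemble_pred = np.mean(all_preds, axis=0)
```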

    For example, in the case below, the averaging method takes the average of all the values. Suppose you asked 7 of your colleagues to rate a movie (out of 10): four of them rated it an 8, two gave it a 2, and one gave it a 3. 

i.e.,  ( 8 + 8 + 8 + 8 + 2 + 2 + 3 ) / 7  ≈  5.57 
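The calculation above can be reproduced directly:

```python
# Ratings from 7 colleagues: four 8s, two 2s, and one 3
ratings = [8, 8, 8, 8, 2, 2, 3]

# Simple averaging: sum of ratings divided by their count
average = sum(ratings) / len(ratings)
print(round(average, 2))  # 5.57
```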


Weighted Average

    This is an extension of the averaging method. All models are assigned different weights defining the importance of each model for prediction. 
    
    For instance, if two of your colleagues are critics (the 1st and 3rd person) while the others have no prior experience in this field, then the answers of those two colleagues are given more importance than the answers of the rest.

The result is calculated as 

[(8*0.3) + (8*0.08) + (8*0.3) + (8*0.08) + (2*0.08) + (2*0.08) + (3*0.08)] = 6.64



   
  I am doing a challenge - #66DaysofData  in which I will be learning something new from the Data Science field for 66 days, and I will be posting daily topics on my LinkedIn, On my GitHub repository, and on my blog as well.


Stay Curious!  









