AutoModel Performance not Matching Confusion Matrix

DennisBaloglu Member Posts: 11 Newbie
For AutoModel, I looked at the output for the confusion matrix and the performance measures. However, when I calculate the performance measures by hand from the confusion matrix, they don't match the performance measures listed. Am I overlooking something?

Best Answer

    lionelderkrikor Moderator, RapidMiner Certified Analyst, Member Posts: 1,195 Unicorn
    Solution Accepted
    Hi @DennisBaloglu

    Yes, this slight difference is expected:

    The displayed performance is (by default) obtained with a "multi hold-out set validation" on the 40% of the dataset that is not used to train the model.
    This test set is divided into 7 parts, and 7 performances are calculated.
    AutoModel then removes the maximum and the minimum performance (the outliers) and averages the 5 remaining performances.
    Thus the reported performance can differ slightly from the performance you calculate from the confusion matrix.
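    The arithmetic behind that difference can be sketched as follows. This is a minimal illustration, not AutoModel's actual code: the fold accuracies and confusion-matrix counts are made-up numbers chosen only to show that the trimmed fold average and the pooled-matrix accuracy need not coincide.

    ```python
    def trimmed_mean_performance(fold_scores):
        """Drop the single best and single worst fold, average the rest
        (the procedure described above for AutoModel's reported score)."""
        if len(fold_scores) < 3:
            raise ValueError("need at least 3 folds to trim min and max")
        trimmed = sorted(fold_scores)[1:-1]
        return sum(trimmed) / len(trimmed)

    # 7 hypothetical hold-out fold accuracies on the 40% test split
    folds = [0.81, 0.84, 0.85, 0.86, 0.87, 0.88, 0.93]
    reported = trimmed_mean_performance(folds)  # average of the middle 5

    # Accuracy computed directly from a pooled confusion matrix
    # (hypothetical counts: TP, FP, FN, TN over the whole test set)
    tp, fp, fn, tn = 430, 60, 55, 455
    matrix_accuracy = (tp + tn) / (tp + fp + fn + tn)

    print(round(reported, 4))         # trimmed fold average
    print(round(matrix_accuracy, 4))  # confusion-matrix accuracy
    ```

    With these toy numbers the trimmed fold average is 0.86 while the confusion-matrix accuracy is 0.885, so a small mismatch is normal and not a bug.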

    To see the performance calculation methodology, click the information icon on the results panel (the final screen) and go to "Models" -> "Performance" for a description of this methodology.

    Hope this helps,



