error range of classifier

taghaddo Member Posts: 6 Contributor I
edited December 2018 in Help
How can I get a box plot of the errors of my classifier in the cross-validation test? For example, linear regression predicts my label, and I also have the real value. I want to see the error ranges of different classifiers so I can choose the one with the smallest range.
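(For reference, the box-plot statistics of the errors can also be computed directly. A minimal Python sketch with made-up data; the arrays and model names below are hypothetical, not from the thread:)

```python
import statistics

def error_summary(actual, predicted):
    # Residuals: predicted value minus true value
    errors = [p - a for a, p in zip(actual, predicted)]
    q1, median, q3 = statistics.quantiles(errors, n=4)  # quartiles
    return {
        "min": min(errors),
        "q1": q1,
        "median": median,
        "q3": q3,
        "max": max(errors),
        "iqr": q3 - q1,  # the "box" of a box plot: a robust error range
    }

# Hypothetical data: true labels and predictions from two regressors
actual  = [10, 12, 11, 14, 13, 15, 16, 12]
model_a = [11, 12, 10, 15, 13, 14, 17, 12]
model_b = [13, 10, 14, 11, 16, 12, 19,  9]

summary_a = error_summary(actual, model_a)
summary_b = error_summary(actual, model_b)

# The model with the smaller IQR has the tighter error range
print("model A IQR:", summary_a["iqr"])
print("model B IQR:", summary_b["iqr"])
```

Comparing the interquartile range (IQR) rather than max - min is what a box plot visualizes, and it is less sensitive to outliers.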

Best Answer


  • lionelderkrikor Moderator, RapidMiner Certified Analyst, Member Posts: 1,195 Unicorn
    Hi @taghaddo,

    I'm not sure I understand, but I will nevertheless try to provide some elements of an answer:
    I think that the "Auto-Model" feature of RapidMiner can help you.
    After submitting your dataset to the "Auto Model" tool and following the guided steps, you reach the results screen,
    where you can compare the performance metrics of your classifiers (after selecting one of the metrics; you first have to click on Comparison -> Overview).
    The results are presented like this:

    Then you can export the plot by right-clicking on it and choosing, for example, "Save as".

    You can also click on ROC Comparison to compare the ROC curves of your different classifiers.

    I hope it helps,



    Warning: you mentioned "Cross Validation" in your post. In the Auto Model feature, the performance metrics of the classifiers
    are based on a Split Validation (with a default ratio of 0.8 / 0.2).
  • taghaddo Member Posts: 6 Contributor I
    Hi, thanks for your time and effort.
    First, I could not find the results you show in the picture. After running the designed model, the results section shows multiple tabs depending on what you want to extract from the model, such as test, performance, etc. Could you show me step by step?
    Secondly, my question was different. To elaborate: after calculating the error (the difference between the predicted value and the actual value), an important question is how large the range of that error is. There are multiple ways to measure it. The most basic formula is simply Max Error - Min Error; however, it is not a good measure since it is strongly affected by noise. Hence, some people suggest using a 90% confidence interval (by setting alpha = 0.9), meaning that with 90% confidence the error falls between two numbers calculated by a formula. What is that formula?
  • lionelderkrikor Moderator, RapidMiner Certified Analyst, Member Posts: 1,195 Unicorn
    Hi @taghaddo,

    I understood that you want to compare the performance of different classifiers. For this basic task,
    based on a "Split Validation" of the models, "Auto Model" is a suitable tool.
    First, click on the "Auto Model" button at the top of the screen and submit your dataset; at the end
    you reach the "results" screen I showed in my previous post.

    However, I don't know how to calculate the 90% confidence interval in RapidMiner.

    I hope it helps,

