
Interpreting cross validation

yerisderanak Member Posts: 2 Contributor I
edited November 2018 in Help

Hi guys!
I'm a total beginner, so please bear with me.
I have a process set up, with a Cross Validation at the end. Inside it I have Deep Learning, Apply Model, and Performance operators. So far so good; after I run it (4 h later :D) I get a Confusion Matrix and an Accuracy. And here is my question:
So I have accuracy: 35.42% +/- 47.83% (micro: 35.42%)
The accuracy is the average accuracy of all the models trained, right?
And is the +/- 47.83% the variance?
And for the confusion matrix: is it from the last model trained, or is it some kind of summary of all the runs?
To be exact, I use k-fold cross validation, so maybe my understanding of that process is wrong.

Thanks in advance, and sorry for the noob question!

Answers

    MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,507 RM Data Scientist

    Hi,

    You are mostly right. The average accuracy over all applications of your model (one per fold) is 35.42%. The +/- 47.83% is the standard deviation, not the variance, of the accuracies of your k applications. The confusion matrix covers all examples, so it is essentially the sum of the individual confusion matrices of the folds.
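
    To make that aggregation concrete, here is a minimal sketch in Python with scikit-learn (not a RapidMiner process; the MLPClassifier and the synthetic data are just stand-ins for your Deep Learning operator and your data). It shows how the per-fold accuracies give the mean and the +/- standard deviation, and how the fold confusion matrices sum into one overall matrix:

    # Minimal k-fold cross validation sketch (illustration only, not RapidMiner internals).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import StratifiedKFold
    from sklearn.neural_network import MLPClassifier  # stand-in for the Deep Learning operator
    from sklearn.metrics import accuracy_score, confusion_matrix

    # Synthetic binary classification data as a placeholder for the real ExampleSet.
    X, y = make_classification(n_samples=500, n_classes=2, random_state=42)

    k = 10
    skf = StratifiedKFold(n_splits=k, shuffle=True, random_state=42)

    fold_accuracies = []
    total_confusion = np.zeros((2, 2), dtype=np.int64)

    for train_idx, test_idx in skf.split(X, y):
        # Train a fresh model on the training folds, apply it to the held-out fold.
        model = MLPClassifier(max_iter=500, random_state=42)
        model.fit(X[train_idx], y[train_idx])
        preds = model.predict(X[test_idx])

        # Collect the per-fold accuracy and add this fold's confusion matrix to the total.
        fold_accuracies.append(accuracy_score(y[test_idx], preds))
        total_confusion += confusion_matrix(y[test_idx], preds)

    # Mean +/- standard deviation across the k folds, like the Performance output.
    print(f"accuracy: {np.mean(fold_accuracies):.2%} +/- {np.std(fold_accuracies):.2%}")
    print("summed confusion matrix:")
    print(total_confusion)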

     

    ~Martin

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany