
Interpreting cross validation

yerisderanak Member Posts: 2 Learner II
edited November 2018 in Help

Hi guys!
I'm a total beginner, so please bear with me.
I have a process set up, with a cross validation at the end. Inside it I have Deep Learning, Apply Model and Performance operators. So far so good, so after I run it (4h later :D) I get a confusion matrix and an accuracy. And here is my question:
So I have accuracy: 35.42% +/- 47.83% (mikro: 35.42%)
Accuracy is the average accuracy of all the models trained, right?
So is the +/- 47.83% the variance?
And for the confusion matrix, is it from the last model trained, or is it some kind of summary of all the runs?
To be exact I use k-fold cross validation, so maybe my understanding of that process is wrong.

Thx in advance and sorry for the noob question!

Answers

  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,531 RM Data Scientist

    Hi,

    You are mostly right. The average accuracy over all applications of your model is 35%. The +/- 47% is the standard deviation across your k applications, not the variance. The confusion matrix is built from all examples, so it is essentially the sum of the individual per-fold confusion matrices.
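
    For reference, here is a minimal sketch in Python with scikit-learn (not RapidMiner; the dataset and classifier here are made up for the example) of how those reported numbers come about: one accuracy per fold, their mean and standard deviation, and one confusion matrix summed over all folds.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

# Toy data standing in for your ExampleSet.
X, y = make_classification(n_samples=500, n_classes=2, random_state=0)

fold_accuracies = []
pooled_confusion = np.zeros((2, 2), dtype=int)

for train_idx, test_idx in StratifiedKFold(n_splits=10).split(X, y):
    # Train on k-1 folds, evaluate on the held-out fold.
    model = MLPClassifier(max_iter=500).fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    fold_accuracies.append(accuracy_score(y[test_idx], pred))
    # Each fold's confusion matrix is added up, so every example ends up
    # counted exactly once (as part of the hold-out set of its fold).
    pooled_confusion += confusion_matrix(y[test_idx], pred)

print("accuracy: %.2f%% +/- %.2f%%"
      % (100 * np.mean(fold_accuracies), 100 * np.std(fold_accuracies)))
print(pooled_confusion)
```

    As far as I know, the "mikro" value in the result view is the accuracy computed on that pooled matrix, which is why it can differ slightly from the fold average when the folds are not all the same size.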

     

    ~Martin

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany